
The horrendous actions by Cambridge Analytica, a voter-profiling company, and Aleksander Kogan, a Russian-American researcher, raise serious questions about privacy, social media, democracy and fraud.

Amid the justified furore, one temptation should be firmly resisted: for public and private institutions to lock their data down, blocking researchers and developers from delivering the many benefits that data promises — for health, safety, and democracy itself.

The precise facts remain disputed, but according to reports, here’s what happened. Kogan worked at Cambridge University, which has a Psychometrics Centre. The Centre purports to be able to use data from Facebook (including “likes”) to ascertain people’s personality traits. Cambridge Analytica and one of its founders, Christopher Wylie, attempted to work with the Centre for purposes of vote profiling. It refused, but Kogan accepted the offer.

Without disclosing his relationship to Cambridge Analytica, Kogan entered into an agreement with Facebook, which agreed to provide data to him — solely for his own research purposes. Kogan created an app, called “thisisyourdigitallife.” Offering a personality prediction, the app described itself on Facebook as “a research app used by psychologists.” About 270,000 Facebook users agreed to disclose their data (again, for research purposes).

By sharing data with Cambridge Analytica, Kogan violated his agreement with Facebook. According to one report, he ended up providing more than 50 million user profiles to Cambridge Analytica, not for academic research, but to build profiles for partisan political uses.

Armed with those profiles, Cambridge Analytica worked with members of the Ted Cruz and Donald Trump campaigns in 2016. Among other things, the firm helped to model voter turnout, identify audiences for fund-raising appeals and advertisements, and specify the best places for Trump to travel to increase support.

As early as 2015, Facebook learnt that Kogan had shared the data he collected and demanded that Kogan, Cambridge Analytica, and Wylie cease using, and destroy, all the information they had obtained. They certified that they had done so.

That was a lie — which recently led Facebook to suspend all three from its platform. Facebook was careful to add, “People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked.”

All this raises numerous questions — some of which involve difficult trade-offs with respect to privacy and competing values. Aware of the risks, Facebook emphasises that all apps requesting detailed user information have to “go through our App Review process, which requires developers to justify the data they’re looking to collect and how they’re going to use it — before they’re allowed to even ask people for it.”

In view of Kogan’s misconduct, it’s reasonable to ask whether that process contains sufficient safeguards. An external review panel might well be a good addition; continued monitoring of how app developers use Facebook data seems important.

But let’s not overreact. Authorised use of that data can do a great deal of good.

For example, Genes for Good, a project at the University of Michigan, is using a Facebook app to help combat diabetes, cancer, and heart disease. It seeks to learn how genes interact with the environment to produce — or not to produce — serious illness. There’s tremendous potential there.

A more immediate response to health problems is HealthTap, an app that lets users type questions into Facebook’s Messenger and obtain free responses from doctors — or see doctors’ answers to questions similar to their own.

Facebook data can also illuminate democracy itself. Studying expressions of political disagreement on the platform, the Pew Research Center found that they come most often from party leaders — and that they are far more common from Republicans than from Democrats. Sure, those aren’t the most surprising findings, but there is far more to learn about polarisation and partisanship — and Facebook’s data will prove exceedingly valuable.

It is true, of course, that social media users should have a great deal of control over whether and how their information is used, and that app developers should be sharply constrained in their ability to share data.

The US government has faced, and solved, similar problems: Data.gov discloses a great deal of information, with more than 230,000 data sets involving health, safety, travel, energy, and the environment. Available apps, made possible by that information, are helping people to save money and to avoid health risks.

For social media providers, including Facebook, the fiasco underlines the need for more careful vetting of all developers who seek access to their data. But it would be a mistake to take the fiasco as a reason to keep treasure troves of information out of the hands of people who can provide immensely valuable services with it.

Cass R. Sunstein is a columnist. He is the editor of “Can It Happen Here? Authoritarianism in America” and a co-author of “Nudge: Improving Decisions About Health, Wealth and Happiness.”

The Washington Post.