Facebook is keeping a close eye on misinformation in the lead-up to 2018's elections. Which elections, exactly? All of them, according to the team working within the company to combat fake news. That means Turkey in June, Mexico in July, Rwanda in September, Brazil in October, and the US in November, to list just a few. It's a lot to keep track of, even—or perhaps especially—for a company as large, influential, and scrutinized as Facebook.

Which is why the company wants help. Last month, Facebook, together with the non-profit Social Science Research Council, announced an initiative that will connect independent researchers with Facebook's vast and, until now, largely inaccessible troves of data on human behavior. The goal: investigate social media's impact on elections and democracy.

The initiative is significant for many reasons, but here's the big one: It will, for the first time, enable researchers not only to access Facebook's data, but to publish findings from that data without pre-approval from Facebook. That means if scientists uncover something in the social network's data that makes the company look bad, Facebook won't be able to prevent them from making that information public.

At the time it was announced, Facebook and the SSRC provided few details about the then-unnamed initiative. Nearly two months on, the endeavor still has no official name, but some details are beginning to emerge—including how the initiative will protect Facebook users' data from the kind of misuse that landed Mark Zuckerberg in congressional hearings last month.

"Facebook is going to provide encrypted laptops," says political scientist Gary King, director of the Institute for Quantitative Social Science at Harvard University. He calls the laptops virtual clean rooms. They'll provide researchers remote access to Facebook's infrastructure while recording every click and keystroke. "It's not a laptop you'd ever use to send personal messages. That's not its purpose. Its purpose is to provide a level of security similar to what you'd find in a locked room in Menlo Park. Its purpose is to avoid another Cambridge Analytica."

How will the auditing work? The details are TBD, says King, who, together with Stanford legal scholar Nathaniel Persily, developed the industry-academic partnership model that Facebook will use to share its data. But some analyses will happen in real time, via automated scripts. Others will be conducted on a post-hoc basis by experts trained to decipher log files—the record of activity on each laptop, including what information was requested, who requested it, and what they did with it.
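
None of the technical machinery has been made public, but here's a minimal sketch, in Python, of what an automated audit over such a log might look like. Everything in it is an assumption: the JSON-lines log format, the field names, and the pre-approved dataset list are invented purely for illustration.

    import json

    # Hypothetical allowlist: the datasets a research team was pre-approved
    # to query, agreed upon before the laptop was issued. (All names here
    # are invented for illustration.)
    APPROVED_DATASETS = {"election_ads_2018", "page_shares_aggregated"}

    def audit_log(path):
        """Scan a JSON-lines activity log and flag out-of-scope requests."""
        flagged = []
        with open(path) as f:
            for line in f:
                event = json.loads(line)
                # Each event records who asked for what, and when -- the
                # "every click and keystroke" King describes.
                if (event.get("action") == "query"
                        and event.get("dataset") not in APPROVED_DATASETS):
                    flagged.append((event["timestamp"],
                                    event["researcher_id"],
                                    event["dataset"]))
        return flagged

    if __name__ == "__main__":
        for ts, who, what in audit_log("laptop_activity.log"):
            print(f"{ts}: {who} requested out-of-scope dataset '{what}'")

A real-time version would run the same kind of check as each request comes in, rather than over the file after the fact.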

That last bit is crucial: King says researchers will only be allowed to access data relevant to their proposed research; the hypotheses they are testing and the data they'll need to test them will be agreed upon ahead of time. Unlike the data Cambridge Analytica used, the data made available through the initiative will protect the privacy of individuals. No data will be stored on the laptop, and researchers will need permission before removing any data (say, for publication purposes) from the device. If a team of scientists wants to investigate a different question or access other data, they'll need to submit a separate proposal for consideration.
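
As a thumbnail of how that pre-approval rule might behave in software (again, a sketch under invented names; the real review process hasn't been described publicly):

    # Illustrative only: a gate that refuses to release results from the
    # laptop unless the request matches something reviewers cleared in
    # advance. The proposal IDs and file names are invented.
    APPROVED_EXPORTS = {
        ("prop-042", "regression_table.csv"),  # cleared ahead of time
    }

    def request_export(proposal_id: str, artifact: str) -> bool:
        """Allow data to leave the device only with prior sign-off."""
        if (proposal_id, artifact) in APPROVED_EXPORTS:
            return True
        # Anything else -- raw rows, a new dataset, a different research
        # question -- requires a separate proposal and fresh approval.
        return False

    print(request_export("prop-042", "regression_table.csv"))  # True
    print(request_export("prop-042", "raw_user_rows.csv"))     # False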

While Facebook will provide the laptops, King says the company will not monitor how researchers use the hardware. Instead, auditing will be overseen by a commission of independent experts, recruited not by Facebook but by King and Persily. "Facebook may provide some systems operators, but their purpose won't be to spy on us," King says. "It will be to help us ensure everything's running properly."

King and Persily's commission will also prioritize the research it thinks will shed the most light on Facebook's political impact. (Proposed areas include the broad subjects of misinformation, polarizing content, and foreign interference in domestic elections.) The commission will then solicit detailed research proposals from the academic community, which will be evaluated by a group of SSRC-appointed peer reviewers and vetted for ethical and privacy issues. Funding the endeavor is a diverse assortment of non-profits, from the Charles Koch Foundation, famous for its support of libertarian causes, to the John S. and James L. Knight Foundation, an organization best known for its support of journalism. Projects will be evaluated and green-lit independently of Facebook, to prevent the company from cherry-picking the proposals it likes and nixing the ones it doesn't.

Sound complicated? It is. The setup is elaborate even in theory, so there's no telling how it'll play out in practice. But on paper, the initiative is designed to shift the research model for academics accessing Facebook's data from one of individual accountability to one of collective responsibility.

When Cambridge University psychologist Aleksandr Kogan accessed Facebook user data under the pretense of academic research only to give it to Cambridge Analytica, it revealed how the privacy of tens of millions of Facebook users could be compromised by one rogue actor. King says his system of independent checks, balances, audits, and reviews is designed to minimize the possibility of that happening again. No system is perfectly secure, he admits, "but the consequences for anyone violating this kind of security would be huge. It would be a disaster for them."

That's assuming the initiative gets up and running. King and Persily have yet to appoint the members of their expert commission, which means no official research agenda has been drafted and no research proposals solicited. Oh, and the whole operation still needs a name. "Right now we're dealing mostly with complicated legal stuff. Partly because that's the way lawyers are. But also because an initiative like this has never existed before," King says. "But the legal structure will be in place this week or next. Once that happens, the money from the foundations can start flowing and we can really get started."

