
Forget about media outlets and Facebook — worry about readers. 

Facebook’s 2 billion monthly users have come to rely on the social network as a way to keep up with the news. Now Facebook is limiting the reach of news publishers, leaving a vacuum to be filled with… well, it’s anyone’s guess.

The change is simple. Facebook is going to show users more posts that their friends and family have created, shared, or commented on. In turn, Facebook is reducing the reach of pages, including those run by news outlets. That may sound innocuous, but the shift turns up the dial on the signals that help amplify fake news. And there’s no way to tell how bad this is going to get — not even Facebook knows for sure.

We’re already seeing this in action.

CEO Mark Zuckerberg claimed the changes would put more “meaningful” content into people’s feeds. But previous tweaks and tests have shown there’s plenty of downside to showing people less news. The New York Times published a story over the weekend that highlighted how Facebook’s changes have had particularly negative consequences in countries where journalists are at risk and news media is censored. Publishers are having trouble reaching people with real news while fake news spreads.

This follows a report that Facebook was used to spread misinformation and propaganda in Myanmar, where the United Nations says the government is participating in the ethnic cleansing of Rohingya Muslims.

Now, the social network is kneecapping publishers, leaving its News Feed open for whatever posts can get the most comments.

It’s unclear how users are supposed to deal with this change. The average person probably isn’t aware of the changes at all, let alone clearly aware that they’re going to see less news directly from publishers. A person who pulls up Facebook in the coming weeks isn’t going to think, “OK, there are fewer news stories directly from the publications I trust. I should be careful about what I’m seeing, and maybe go to news outlets directly.”

Zuckerberg says the changes are meant to improve “well-being.” Facebook’s research shows that “meaningful” posts from friends and family on social media can do this. How does Facebook tell what’s “meaningful”? Facebook executives have said that comments are the leading indicator, especially longer comments.

“The research shows that when we use social media to connect with people we care about, it can be good for our well-being. We can feel more connected and less lonely, and that correlates with long term measures of happiness and health. On the other hand, passively reading articles or watching videos — even if they’re entertaining or informative — may not be as good,” Zuckerberg wrote in a Facebook post.

Facebook can’t undo what Facebook did

People may end up feeling better in the coming months after visiting Facebook, but it’s hard to see how they’ll avoid being exposed to more misinformation. People aren’t going to stop using Facebook as a news source just because of this tweak. They’re going to keep visiting Facebook and expecting to see news.

And they will see some news. There will be stories that your friends or family share and comment on. That sounds fine, except that this is exactly how fake news spreads on Facebook — and how politicians and interest groups have been trained by Facebook to maximize their reach. 

It’s no accident that Facebook became a destination for news and politics. The social network works closely with political campaigns in the U.S. and abroad, convincing them to spend big money to push their messages. That’s included working with Rodrigo Duterte, the president of the Philippines who has been accused of carrying out extrajudicial killings and shuttering news organizations. Facebook has a team dedicated to developing tools for politicians.

Meanwhile, Facebook’s embrace of publishers made it a destination for news. Almost half of U.S. adults get news from Facebook. There’s a good chance you’re reading this after coming from Facebook.

What we already know about how fake news spreads on Facebook makes this a scary proposition. 

Here’s how: A group wants to spread a particular piece of misinformation or propaganda for whatever reason. They do this by paying Facebook to show that content to people who are likely to share it. Those people see the paid-for posts and then start spreading them across their own networks.

Examples of this tend to center on politics and elections, but there are other types of scams circulating on Facebook. Right now, bitcoin and cryptocurrency are particularly hot.

We are left with a scary timeline: Facebook makes itself into a news destination. Facebook makes itself into a way for propaganda to spread. Facebook removes news and boosts signals that help propaganda spread. 

It’s hard to imagine Facebook hasn’t taken this into consideration, but the past few years are littered with examples of the company not quite realizing what it has created or what’s happening on its network, especially with regard to fake news. One former Facebook employee familiar with the News Feed said the company’s system is so complex that even its engineers can’t predict what will happen when they tweak it.

So there you have it. Readers, who were trained to get their news from Facebook, are now going to see a bunch of posts based on signals that are perfect for the spread of fake news — after Facebook explicitly pushed governments to use its tools to get their message out. 

That’s bad news for readers — and their well-being.
