Next week, Facebook CEO Mark Zuckerberg will testify before Congress about his company's failure to prevent the data firm Cambridge Analytica from siphoning off information belonging to up to 87 million people, the majority of whom are believed to be Americans. In the lead-up to the hearings, the social network has scrambled to respond to increased scrutiny from journalists and the public over its privacy practices.

Steps like overhauling its entire privacy settings menu are a clear benefit. But in other areas, Facebook's hurry to respond to criticism has produced features that could have negative consequences for users.

On Thursday evening, for example, TechCrunch reported that old messages sent by Mark Zuckerberg and other top Facebook executives had been deleted from recipients' Messenger inboxes, a capability not available to anyone else. Facebook says it removed the messages for corporate security reasons after the 2014 Sony hack, but the practice still feels elitist, especially in light of broader critiques of the company's privacy practices.


To head off sustained backlash, Facebook quickly announced that an "unsend" feature will roll out to all users in the next several months, and says Zuckerberg and other executives won't use it until everyone else can, too. (Messenger's end-to-end encrypted mode already lets users set messages to expire after a specified amount of time.) "We will now be making a broader delete message feature available. This may take some time. And until this feature is ready, we will no longer be deleting any executives' messages. We should have done this sooner—and we're sorry that we did not," a Facebook spokesperson said in a statement.

Facebook appears to have decided to make "unsend" part of Messenger because of the TechCrunch story, which makes it hard to believe the company has had time to think through its potential implications. We don't know exactly how Facebook plans to design unsend, and given that its timeline stretches out for months, the company may not know either. But there's a reason the feature doesn't already exist in every other messaging app.

"In the secure messaging space we have a concept called transcript consistency, the idea that all participants in a conversation see and react to the same messages," says Sarah Jamie Lewis, a privacy and anonymity researcher and the executive director of Open Privacy. "When you allow people to delete messages arbitrarily, you lose transcript consistency, and when that happens cause-and-effect can get muddled or lost entirely." Lewis says that it's not hard to conceive of ways such a feature could be abused, say during a sexual harassment investigation.

The ephemeral-messaging debacle is not the only recent instance in which Facebook appears not to have fully thought through how a well-intentioned but hastily rolled-out feature might affect its user base. Earlier this week, for example, Facebook removed the ability to search for people by their phone number on the platform, after malicious actors abused it to scrape information or attempt to access users' accounts. "Given the scale and sophistication of the activity we’ve seen, we believe most people on Facebook could have had their public profile scraped in this way. So we have now disabled this feature," Mike Schroepfer, Facebook's chief technology officer, said in a blog post published Wednesday.

But as others have pointed out, it's still fairly easy to obtain a Facebook user's phone number through other means, such as public groups, many of which were created when someone lost their phone or upgraded to a new one and needed to collect the numbers of friends and family. Facebook was right to kill phone number search, albeit belatedly, but the problem of bad actors collecting users' phone numbers and other information on the platform remains unsolved.


Beyond killing phone number search, Facebook has made a number of other important fixes to its platform in the wake of the Cambridge Analytica scandal. It has limited the information third-party apps can collect, stopped allowing data from third-party providers to be used to target ads, placed stricter requirements on large Facebook pages, and tightened its rules on ads about political campaigns and issues. But some of the fixes, the unsend feature especially, feel half-baked.

It's hard to understand why Facebook didn't implement many of these changes earlier, before the company found itself amid a sustained backlash about its approach to privacy. Facebook also has a long history of apologizing after the fact for violating users' trust. Of course, Facebook is also tasked with a monumental responsibility: It has amassed data about more people than have ever lived in a single country in the history of the world, and now has to figure out how to manage it.

But its business model depends on collecting that information and ensuring that advertisers can access much of it. It's clear that if the company had valued privacy all along, it wouldn't need to be in such a reactive mode. It would have already fixed these problems long ago.

Fortunately, Facebook feels the same way. In an interview with the Financial Times, Facebook's Schroepfer said the social network has changed its approach and is "trying to be much more diligent about trying to understand all the potential bad uses and misuses. We will still be launching new products but prior to launching them we are sitting down and trying to think of all the possible bad uses of them and what bad actors might do with them." Good. Start with Messenger unsend.
