Facebook may have built an influence so large that it's cracking under the weight of its own News Feed.

Mark Zuckerberg began an interview on stage at Techonomy16 by discussing the evolution of the News Feed and Facebook's impact on the election. Post-election, journalists, politicians and pundits have questioned Facebook's role in shaping the campaign and its outcome, debating the merits of Facebook's position of primacy as a source of information.

Zuckerberg defended the News Feed's progress, arguing that the "filter bubble" isn't an issue for Facebook. He suggested the real problem is that people by nature engage with content they like and find agreeable, and dismiss things they don't agree with online just as they would in real life.

"You'd be surprised at how many things we dismiss," he said. "The problem isn't that the diverse information isn't there, but that we haven't gotten people to engage with it in higher proportions."

What are the tools that could help us escape the echo chamber?

If Facebook won't change its algorithms for fear of going against its wildly successful revenue model or expand its Trending Topics product, it needs to implement better features to help diversify the content we see.

  • First, Facebook should hire human journalists to curate stories during elections. They should pick the best stories from a variety of sources and perspectives and flag them on Facebook as good quality and worthy of reading. Also: fact checking. Google did it. Now it's Facebook's turn.
  • Since the personalized News Feed favors what we engage with, and we tend to engage with content we agree with, Facebook should provide an option to turn this off during elections to allow people to see algorithm-free, real-time content.
  • Imagine being able to activate a filter that would show you what your Facebook-specified Republican, Democratic or Libertarian (etc.) friends were sharing.
  • Facebook could create a feature that allows people to declare endorsement for a candidate, and users could then build a feed to see what that pool of friends was posting about, as well as the conversation around their posts.
  • Facebook could curate and flag certain content as partisan, and those stories could appear with a link to an Instant Article of the opposing viewpoint or from an opposing news source (however, since not every issue is purely partisan, and not every news source either, this could get tricky).
  • Trending Topics should be expanded and should display more takes on political stories, not just what the highest number of people are talking about.
  • Facebook could use the Suggested Videos window that pops up when you watch a video to its completion to surface opposing viewpoints.
  • Facebook could show a post from a candidate on the opposing side whenever a politician posts from their account.

Facebook is hiding behind its "we're a tech company, not a media company" guise in an effort to excuse itself from the fact that it hasn't figured out the news. For such an influential platform that preaches social responsibility and prioritizes user experience, it's irresponsible for Facebook to give people such a powerful megaphone for personal expression, only to lock them inside an echo chamber.

Despite what Zuckerberg claims, Facebook profoundly affected the way the U.S. consumed the election, just as it has shaped our news experience whether it wants to or not.

I don't recommend using Facebook as a sole news source. But 44 percent of adults in the U.S. use Facebook as a source for news, a Pew report detailed earlier this year. Another study found that Facebook saw an increase of almost 30 percent on election night compared to a typical evening.

It's safe to say that a solid number of people were banking on Facebook for election updates, live video and as a stage for their own social commentary.

Is this all a pipe dream?

If Facebook routinely showed users things they found distasteful or viewed as incorrect, its audience wouldn't want to use it as much. Facebook's revenue model profits (you know, that cool $7B in revenue in Q3) from a strategy of making its 1.79 billion users feel validated (and more likely to engage) with its personalized algorithm.

It wants to keep us in a bubble of comfort where our views are repeated back to us in the News Feed. So yes, Facebook makes money by algorithmically favoring content that affirms our opinions. Why would it want to change? And are people even ready for a fair Feed? With its massive influence, Facebook may have the ability to change this by offering both sides.

What is Facebook currently doing?

Facebook has offered lip service about breaking out of the echo chamber. Its data was used in the Wall Street Journal's "Blue Feed, Red Feed" experiment to juxtapose a liberal Facebook feed and a conservative one, sourced from users' self-proclaimed political views and what they shared.

This year, Facebook published an odd video in a lackluster plea for us to play nice this election season, offering up its search bar as a resource to discover new viewpoints (Facebook's search may be the least useful function on the platform). Its Election Hub was a hands-on guide for information about the election, aimed at helping people learn about candidates, policy and ballot propositions.

It also supposedly helped over 2 million people become registered voters. But the way users are interacting with the "lean back" News Feed experience is important, too.

Facebook is missing a huge opportunity to use its tech to help us see content through a more bipartisan lens during this politically divided time in U.S. history, when it could also potentially change our proclivity to ignore the other side. As a company that has always prioritized the user experience, Facebook could be doing much more.