Facebook's "like" button, upside down. (Image: Meta)

Facebook Says They Never Had Any Way to Shut Down Fake News Stories. Maybe They Should.

Post-election, a whole lot of people and organizations have been shouting "not it!" when it comes to taking blame for a Trump presidency. Many interlocking forms of institutional oppression led our country to where it is today, so blaming any one institution doesn't cover the whole story, but the media certainly deserves a share of the blame for this one. And, for better or worse, Facebook is the place where many people find said media: according to a Pew survey, 62% of Americans now get news from social media, with Facebook identified as both the most popular social network and the one where people most often find their news.

Yet Facebook continues to claim there was nothing more they could have done to curate News Feed more accurately when it came to stories about the U.S. election. Earlier today, Engadget reported that insiders said Facebook had methods to "shut down fake news" on the platform, but that Facebook had chosen not to implement them. Facebook responded to this report with a denial:

The article’s allegation is not true. We did not build and withhold any News Feed changes based on their potential impact on any one political party. We always work to make News Feed more meaningful and informative, and that includes examining the quality and accuracy of items shared, such as clickbait, spam and hoaxes. Mark himself said “I want to do everything I can to make sure our teams uphold the integrity of our products.” This includes continuously review updates to make sure we are not exhibiting unconscious bias.

This lines up with Facebook CEO Mark Zuckerberg's recent statements that he doesn't believe the propagation of wildly inaccurate news items on Facebook had any effect on the election: "Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way—I think is a pretty crazy idea. Voters make decisions based on their lived experience."

Some of the inaccurate news items that have trended on Facebook in recent months include a story about Fox News firing Megyn Kelly (which never happened), as well as posts with titles like "FBI Agent Suspected in Hillary Email Leaks Found Dead in Apparent Murder-Suicide" and "Pope Francis Shocks World, Endorses Donald Trump for President, Releases Statement." Neither of those things happened, either. But you'd only know that if you bothered to do a little research, as opposed to just glancing at the headlines in your feed.

The larger conversation about Facebook's responsibility regarding news has been going on for months now. It began last April, when an internal company poll at Facebook revealed one employee asking whether the company had a responsibility to "prevent President Trump." A month later, a former Facebook employee spoke out about a perceived liberal bias at the company, sparking a months-long debate over whether Facebook's "Trending Topics" sidebar was purposefully silencing conservative voices. From the outside, it seemed as though political debates were raging internally among Facebook's employees, not so different from the political debates among users on the platform.

In spite of Zuckerberg’s dismissals of the topic, Facebook has copped to influencing voter patterns in the past. Gawker reported on this back in 2014, explaining that Facebook ended up having an effect on the election in 2012 based on–you guessed it–the almighty news feed:

For one such experiment, conducted in the three months prior to Election Day in 2012, Facebook increased the amount of hard news stories at the top of the feeds of 1.9 million users. According to one Facebook data scientist, that change—which users were not alerted to—measurably increased civic engagement and voter turnout.

Zuckerberg is saying it's "crazy" for users to be influenced by false headlines. But he must know, deep down, that it isn't "crazy" at all for users to pay attention to news headlines in their feeds, since his company has been running experiments on user behavior for years now. Rather than acknowledge that responsibility, as media outlets do, Facebook has denied being a media outlet at all, and has dodged the question of whether their platform had any effect on this year's election results.

This past year, Facebook was so worried about coming across as having a "liberal bias" that Mark Zuckerberg ended up meeting with conservative figureheads to assure them otherwise, and the company replaced the human employees on its news curation team with algorithms. Never mind that algorithms can be biased, too; apparently, these algorithms were just fine with letting outright inaccurate news trend on Facebook.

Facebook says they want to "make sure we are not exhibiting unconscious bias," but in order to do that, Facebook would have to acknowledge that their site has biases in the first place. Instead, they want to pretend that installing algorithms means they aren't biased at all, that misinformation spreading online isn't their fault, that there's nothing they can do about it, and so on. The way around these problems isn't algorithms; it's editorial oversight, and more diverse teams behind that oversight.

Zuckerberg does acknowledge that it's a problem that people tend not to click on articles when they disagree with the headline, saying, "I think we would be surprised by how many things that don't conform to our worldview, we just tune out," but he doesn't seem to understand that people are still reading those headlines. So it might help if those headlines were, y'know, factually accurate. But I guess I'm just "crazy" for thinking that human oversight (not just algorithms) over what appears on Facebook would be more ethical and fair, even if that means having tough internal conversations about editorial guidelines.

Here’s just one more thing that I think is “crazy,” Mr. Zuckerberg: the fact that Facebook gives the police access to a portal that allows them to directly remove posts, such as posts that speak out against the police. That’s a realm of “unconscious bias” on Facebook’s part that doesn’t, actually, seem very “unconscious” to me. It’s well past time to get rid of that company practice.

(via Engadget)
