Former Facebook Employee Claims “Trending Topics” Get Controlled By “News Curators,” Not Algorithms



Speculation about what lies behind Facebook’s “Trending Topics” sidebar has circulated for a few years now. It came to a head back in 2014, when Ferguson and the “Black Lives Matter” hashtag didn’t appear in Facebook’s trends despite their dominance on other social networks, such as Twitter. “Black Lives Matter” did eventually appear in Facebook’s “Trending Topics” bar, though. According to what an ex-Facebook employee told Gizmodo, that’s because Facebook staffers were instructed to inject the topic into the sidebar, even though it wasn’t being shared on Facebook often enough to catch the algorithm’s attention.

According to this ex-employee, the “Trending Topics” sidebar does make use of an algorithm that determines which stories are being shared on the service most often, but there’s also some behind-the-scenes editorial massaging on the part of Facebook’s in-house “news curators,” who decide what’s worth putting in the sidebar and what isn’t.
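To make the arrangement these sources describe a bit more concrete, here’s a minimal, purely hypothetical sketch in Python: a popularity pass driven by raw share counts, followed by an editorial pass that can inject a topic (as curators reportedly did with “Black Lives Matter”) or suppress one. Every function name, data point, and number below is invented for illustration; nothing here is Facebook’s actual code or terminology.

```python
from collections import Counter

def rank_by_shares(share_events):
    """'Algorithmic' pass: count shares per topic, most-shared first."""
    counts = Counter(share_events)
    return [topic for topic, _ in counts.most_common()]

def curate(ranked, injected=(), blacklisted=(), limit=10):
    """Editorial pass: curators can push a topic to the top even if the
    algorithm barely noticed it, or keep a topic out of the list entirely."""
    curated = list(injected)  # injected topics jump straight to the top
    curated += [t for t in ranked if t not in curated and t not in blacklisted]
    return curated[:limit]

# Made-up share data: the injected topic has far fewer shares than the rest.
shares = ["#royalbaby"] * 5 + ["#gameofthrones"] * 3 + ["#blacklivesmatter"]
print(curate(rank_by_shares(shares), injected=["#blacklivesmatter"]))
# -> ['#blacklivesmatter', '#royalbaby', '#gameofthrones']
```

The point of the toy example is just that the final list can differ from what raw share counts alone would produce, and a user looking at the sidebar has no way to tell which entries came from which pass.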

Some of the news curators who spoke to Gizmodo claim that the “Trending Topics” sidebar has a liberal bias. For example, even when many Facebook users share an article from a conservative outlet (e.g. Breitbart, Washington Examiner, Newsmax), these curators say they must wait until other news outlets (e.g. CNN, the New York Times) pick up the story as well, and then use those outlets’ coverage to surface the story in the sidebar.

We’ve reached out to Facebook for comment, but haven’t heard back yet; nor has Gizmodo. However, TechCrunch has received a statement from Facebook on the matter, which reads as follows:

We take allegations of bias very seriously. Facebook is a platform for people and perspectives from across the political spectrum. Trending Topics shows you the popular topics and hashtags that are being talked about on Facebook. There are rigorous guidelines in place for the review team to ensure consistency and neutrality. These guidelines do not permit the suppression of political perspectives. Nor do they permit the prioritization of one viewpoint over another or one news outlet over another. These guidelines do not prohibit any news outlet from appearing in Trending Topics.

Readers may recall that only a month ago, a Facebook employee leaked an internal company poll asking whether Facebook has a “responsibility” to “prevent President Trump.” That led to speculation that the service might be limiting how often news stories about the candidate get surfaced.

Facebook’s control over users’ news feeds, as well as the “Trending Topics” sidebar, appears to happen by way of algorithm, but these anonymous employee accounts suggest a level of editorial control that goes well beyond a purely numerical reflection of what’s most popular. The problem is that many users have no idea this editorial control is happening, since it takes place entirely under the hood.

What’s also interesting is that Facebook’s own staff might have more of a “liberal bias” than their user base does. It makes sense that Twitter would have surfaced the “Black Lives Matter” tag more readily, given that Twitter’s demographics skew younger and more diverse than Facebook’s; Twitter users might therefore be more willing to share stories from progressive news outlets than Facebook users are. Facebook’s staff might not be too happy about that. But what responsibility does Facebook have to reflect what its users “want”? Does it get to make the choice about what those users “should” be seeing?

This question doesn’t have an easy answer. I admit, I felt pretty uncomfortable about the idea of Facebook employees attempting to influence election results. I’m against Trump and his proposed policies, but that doesn’t mean that I trust tech industry leaders to know what’s best for the country, either. Mostly, I don’t like the idea of some unseen group of people deciding what I do and don’t get to see, all while giving me the illusion of control.

So, really, my main problem is that Facebook doesn’t make its curation methodology obvious enough. I’m not even opposed to the idea of editorial curation when it comes to the Facebook news feed or “Trending Topics.” I just think it should be clearer to users that those sidebars, and the news feed, are not purely a reflection of algorithms, and that there’s also a “review team” (in Facebook’s own words). I imagine that if the “Trending Topics” sidebar were 100% algorithmic, it would be stuffed to the gills with cute animal videos … but I get the impression that Facebook wants to come across as a bit more intellectual than that.

It’s also a little sad that the “review team” that works on this task has remained anonymous and uncredited. According to Gizmodo’s coverage, many of them are “former journalists” who have now been relegated to the invisible task of curating Facebook topics and even writing headlines.

It seems like Facebook wants to have its cake and eat it too: it wants all the ideological purity that comes from the supposed “objectivity” of an algorithm, but it also wants the benefits that only human subjectivity can offer. Why would news ever need to be “subjective,” you ask? Because curation by a human editorial team can ensure that news gets packaged in a compelling way. If a robot delivered your news, you probably wouldn’t like it, and it sounds like Facebook has already figured that out, which is why it would hire a “review team” in the first place! (Unfortunately, Facebook is still putting a ton of bots into its Messenger service, which is a separate problem. I mean, bots are great. But they can’t do everything. … Yet.)

One more thing: algorithms are biased, too. Even if Facebook fired all of its “news curators” tomorrow and left its news feed up to the workings of the algorithm, the service would still end up reflecting a variety of biases, either due to the programming of the algorithm itself, or the biases of Facebook’s users, or both. By having both an algorithm and human employees in place, Facebook seems to have covered a lot of bases. The only problem is that most people just don’t know how much control Facebook has over the news feed; users still believe they’re the ones curating their own feeds. Turns out, nope. You might want to think a little harder next time you see what’s “trending” … because it might not actually be all that trendy.
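Here’s a toy illustration of that last point, using the same kind of made-up data as the earlier sketch: even a share-count ranking with no human in the loop inherits whatever slant its user base has.

```python
from collections import Counter

# Invented data: suppose 70% of active sharers favor one outlet's framing
# of the same story, and 30% favor another's.
shares = ["outlet-A-story"] * 70 + ["outlet-B-story"] * 30
ranking = [topic for topic, _ in Counter(shares).most_common()]
print(ranking)  # ['outlet-A-story', 'outlet-B-story']
# The algorithm didn't "pick a side," but its output mirrors the audience's skew.
```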

(via Gizmodo, image via C_osett on Flickr)



Maddy Myers, journalist and arts critic, has written for the Boston Phoenix, Paste Magazine, MIT Technology Review, and tons more. She is a host on a videogame podcast called Isometric (relay.fm/isometric), and she plays the keytar in a band called the Robot Knights (robotknights.com).