Facebook Will Train Their Employees to be Politically Unbiased. So, How Do You Teach That?

Facebook has had to consider its political stance multiple times this year, which seems like an unusual problem for a social networking company to have. Then again, if my own Facebook timeline is any indication of current trends, the service is primarily used for getting into election-related debates with one’s relatives and acquaintances. Whether Facebook wants to be political or not, it’s no wonder that people see it that way.

Facebook has also been having some political debates among its staff. The buzz began a few months ago, when an internal company poll about potential topics for a staff meeting was leaked. One employee's suggestion caused quite a stir: “What responsibility does Facebook have to help prevent President Trump in 2017?” As far as we know, Facebook never held an internal meeting on the topic, nor did it take any steps to influence the election. But that was the beginning of people questioning the company’s political stance, and the train just kept on rolling after that.

This past May, a former Facebook staffer who happened to be politically conservative told the media that Facebook’s “Trending Topics” algorithm had a liberal bias. Facebook denied this, and CEO Mark Zuckerberg did a lot of damage control afterward, such as meeting with conservative leaders and explaining to them how the site’s algorithm works. Facebook also released its official employee training documents, which explain exactly how staffers select the stories that end up in “Trending Topics.”

That damage control tour still hasn’t ended. The Hill reported today that Facebook will implement a new employee training program, designed to keep political biases in check. Facebook COO Sheryl Sandberg summed up the goals of the program as follows:

We think a lot about diversity at Facebook, and we have a managing-bias class that all of our leaders and a lot of our employees have taken that I was part of helping to create, and we’ve focused on racial bias, age bias, gender bias, national bias, and we’re going to add in a scenario now on political bias. So that, as part of [how] we think about helping people understand different points of view and being open to different points of view, we’re dealing with political bias as well going forward.

If you’re like me, then you read that and you thought to yourself, “How do you teach people to not have any political bias at all? Doesn’t everyone have a ‘bias’ based on their own unique life experiences and observations? How are you supposed to train people out of the ‘bias’ that makes up their interpretation of the world?” (Very weird that we both had such a specific, identical thought.)

I don’t think you can train people to be “unbiased,” but there are some things that this class could do. Studies have shown that people tend to rely on confirmation bias when evaluating moral and political decisions. They’ll seek out the evidence that supports what they already believe, and they’ll find reasons to dismiss any evidence that contradicts their conclusion. That problem isn’t unique to one political party, either. It’s something that everyone could stand to work on.

Still, no class can make anybody truly “unbiased.” I mean, even robots have a bias (after all, we humans create ’em). The algorithms that Facebook uses have a bias, too! Increased awareness of that bias, and a diverse staff, can help undo some of the institution’s structural inadequacies and help account for a plurality of life experiences and opinions. But achieving a state of perfect objectivity? That doesn’t exist.

Maybe these classes will be valuable for the employees of Facebook, but I’m not sure that they’ll actually address the larger concerns that people (read: me) have with Facebook right now. I just don’t like the idea of Facebook messing around with their users’ behavior; they’ve experimented on their users in the past without telling them, and that’s an unethical way to conduct social science research. Frankly, I don’t trust them to stick with using their influential powers for good. Everybody needs to get their power checked from time to time, and that includes mega-rich social media companies.

For the moment, I think Facebook is pretty focused on harnessing their algorithmic technologies to sell us as many products as possible. But if they ever decided they wanted to influence an election, perhaps to ensure that a candidate would win who’d support their larger corporate goals, what’s to stop them from doing it? We probably won’t find out they’ve done that until it’s already too late.

Next up: my fashion coverage on tinfoil hats!

(via The Verge, image via C_osett/Flickr)

The Mary Sue has a strict comment policy that forbids, but is not limited to, personal insults toward anyone, hate speech, and trolling.

Follow The Mary Sue on Twitter, Facebook, Tumblr, Pinterest, & Google+.

Have a tip we should know? tips@themarysue.com

Author
Maddy Myers
Maddy Myers, journalist and arts critic, has written for the Boston Phoenix, Paste Magazine, MIT Technology Review, and tons more. She is a host on a videogame podcast called Isometric (relay.fm/isometric), and she plays the keytar in a band called the Robot Knights (robotknights.com).
