
FB spreading news with built-in bias

Last Updated 15 May 2016, 18:45 IST

Facebook is the world’s most influential source of news. That’s true according to every available measure of size — the billion-plus people who devour its News Feed every day, the cargo ships of profit it keeps raking in, and the tsunami of online traffic it sends to other news sites.

But Facebook has also acquired a more subtle power to shape the wider news business. Across the industry, reporters, editors and media executives now look to Facebook the same way nesting baby chicks look to their engorged mother — as the source of all knowledge and nourishment, the model for how to behave in this scary new-media world. Case in point: The New York Times, among others, recently began an initiative to broadcast live video. Why do you suppose that might be? Yup, the F word.

Yet few Americans think of Facebook as a powerful media organisation, one that can alter events in the real world. When blowhards rant about the mainstream media, they do not usually mean Facebook, the mainstreamiest of all social networks. That’s because Facebook operates under a veneer of empiricism. Many people believe that what you see on Facebook represents some kind of data-mined objective truth unmolested by the subjective attitudes of fair-and-balanced human beings.

None of that is true. Recently, Facebook rushed to deny a report in Gizmodo that said the team in charge of its “trending” news list routinely suppressed conservative points of view. Last month, Gizmodo also reported that Facebook employees asked Mark Zuckerberg, the social network’s chief executive, if the company had a responsibility to “help prevent President Trump in 2017.” Facebook denied it would ever try to manipulate elections.

Even if you believe that Facebook isn’t monkeying with the trending list or actively trying to swing the vote, the reports serve as timely reminders of the ever-increasing potential dangers of Facebook’s hold on the news. That drew the attention of Sen. John Thune, who heads the Senate’s Commerce Committee. Thune sent a letter Tuesday asking Zuckerberg to explain how Facebook polices bias.

The question isn’t whether Facebook has outsize power to shape the world — of course it does, and of course you should worry about that power. If it wanted to, Facebook could try to sway elections, favour certain policies, or just make you feel a certain way about the world, as it once proved it could do in an experiment devised to measure how emotions spread online.

There is no evidence Facebook is doing anything so alarming now. The danger is nevertheless real. The biggest worry is that Facebook doesn’t seem to recognise its own power, and doesn’t think of itself as a news organisation with a well-developed sense of institutional ethics and responsibility, or even a potential for bias. Neither does its audience, which might believe that Facebook is immune to bias because it is run by computers.

That myth should die. It’s true that beyond the Trending box, most of the stories Facebook presents to you are selected by its algorithms, but those algorithms are as infused with bias as any other human editorial decision.

“Algorithms equal editors,” said Robyn Caplan, a research analyst at Data & Society, a research group that studies digital communications systems. “With Facebook, humans are never not involved. Humans are in every step of the process — in terms of what we’re clicking on, who’s shifting the algorithms behind the scenes, what kind of user testing is being done, and the initial training data provided by humans.”
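
To make Caplan’s point concrete, here is a minimal, entirely hypothetical sketch of a feed ranker, written in Python. Nothing in it comes from Facebook: the story fields, the signals and the weights are invented for illustration. The point is simply that every choice in such a system, which signals to include, how heavily to weight them, what data trained the “predicted Like” model, is a human editorial judgment.

# Hypothetical sketch, not Facebook's code: a toy feed ranker.
# Every numeric choice below was made by a person, which is where bias enters.
from dataclasses import dataclass

@dataclass
class Story:
    headline: str
    predicted_like_prob: float  # from a model trained on human-generated clicks
    source_quality: float       # scored against a human-written rubric
    age_hours: float

# Weights picked by engineers and product managers, i.e. by people.
WEIGHTS = {"like": 0.6, "quality": 0.3, "freshness": 0.1}

def score(story: Story) -> float:
    """Collapse several human-chosen signals into one ranking number."""
    freshness = 1.0 / (1.0 + story.age_hours)
    return (WEIGHTS["like"] * story.predicted_like_prob
            + WEIGHTS["quality"] * story.source_quality
            + WEIGHTS["freshness"] * freshness)

def rank_feed(stories: list[Story]) -> list[Story]:
    """What a reader sees first is whatever the chosen weights happen to favour."""
    return sorted(stories, key=score, reverse=True)

Change any weight and a different story tops the feed. That is an editorial decision, whoever, or whatever, appears to be making it.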

Everything you see on Facebook is therefore the product of these people’s expertise and considered judgment, as well as their conscious and unconscious biases, quite apart from any possible malfeasance or corruption. It’s often hard to know which is at work, because Facebook’s editorial sensibilities are secret. So are its personalities: Most of the engineers, designers and others who decide what people see on Facebook will remain forever unknown to its audience.

Facebook also has an unmistakable corporate ethos and point of view. The company is staffed mostly by wealthy coastal Americans who tend to support Democrats, and it is wholly controlled by a young billionaire who has expressed policy preferences that many people find objectionable. Zuckerberg favours free trade, more open immigration and a certain controversial brand of education reform. Instead of “building walls,” he supports a “connected world and a global community.”

You could argue that none of this is unusual. Many large media outlets are powerful, somewhat opaque, operated for profit, and controlled by wealthy people who aren’t shy about their policy agendas — Bloomberg News, The Washington Post, Fox News and The New York Times, to name a few.

But there are some reasons to be even more wary of Facebook’s bias. One is institutional. Many mainstream outlets have a rigorous set of rules and norms about what’s acceptable and what’s not in the news business.

“The New York Times contains within it a long history of ethics and the role that media is supposed to be playing in democracies and the public,” Caplan said. “These technology companies have not been engaged in that conversation.”

According to a statement from Tom Stocky, who is in charge of the trending topics list, Facebook has policies “for the review team to ensure consistency and neutrality” of the items that appear in the trending list.

But Facebook declined to discuss whether any editorial guidelines governed its algorithms, including the system that determines what people see in News Feed. Those algorithms could have profound implications for society. For instance, one persistent worry about algorithmically selected news is that it might reinforce people’s previously held points of view.

If News Feed shows news that we’re each likely to Like, it could trap us in echo chambers and contribute to rising political polarisation. In a study last year, Facebook’s scientists asserted that the echo chamber effect was muted.
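
The echo-chamber worry is easy to see in a toy simulation. The sketch below is hypothetical and deliberately crude, one user, one ranking rule (show whatever the user is most likely to Like); it is not Facebook’s algorithm. Even so, the feedback loop it captures, agreeable stories rank higher, which makes the user’s tastes still more one-sided, is the dynamic in dispute.

# Hypothetical toy simulation, not Facebook's algorithm: rank stories purely by a
# user's predicted chance of clicking Like and watch the viewpoint mix narrow.
import random

def simulate_feed(rounds: int = 20, feed_size: int = 10, seed: int = 0) -> list[float]:
    """Return, per round, the share of 'side A' stories shown to one user."""
    rng = random.Random(seed)
    affinity = 0.55  # the user starts only slightly partial to side A
    shares = []
    for _ in range(rounds):
        # Candidate pool: half side-A stories (side=1), half side-B (side=0).
        pool = [{"side": i % 2, "appeal": rng.random()} for i in range(100)]

        def like_prob(story):
            # Higher when the story matches the user's current leaning.
            match = affinity if story["side"] == 1 else 1 - affinity
            return 0.5 * match + 0.5 * story["appeal"]

        feed = sorted(pool, key=like_prob, reverse=True)[:feed_size]
        share_a = sum(s["side"] for s in feed) / feed_size
        shares.append(share_a)
        # Feedback loop: a one-sided feed nudges the user further toward side A.
        affinity = min(0.95, affinity + 0.02 * (share_a - 0.5))
    return shares

print(simulate_feed())  # the side-A share tends to drift upward round by round

Facebook’s own study argued the effect is muted in practice; the sketch only shows why the question has to be put to whoever sets the ranking rule.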

But when Facebook changes its algorithm — which it does routinely — does it have guidelines to make sure the changes aren’t furthering an echo chamber? Or that the changes aren’t inadvertently favouring one candidate or ideology over another? In other words, are Facebook’s engineering decisions subject to ethical review? Nobody knows.

The other reason to be wary of Facebook’s bias has to do with sheer size. Caplan notes that when studying bias in traditional media, scholars try to make comparisons across different news outlets.

To determine whether The New York Times is unfairly ignoring a certain story, you can look at competitors like The Washington Post and The Wall Street Journal. If those outlets are covering a story and The Times isn’t, there could be something amiss with the Times’ news judgment.

Such comparative studies are nearly impossible for Facebook. Facebook is personalised, in that what you see on your News Feed is different from what I see on mine, so the only entity in a position to look for systemic bias across all of Facebook is Facebook itself. Even if you could determine the spread of stories across all of Facebook’s readers, what would you compare it to?

“Facebook has achieved saturation,” Caplan said. No other social network is as large or as popular, or used in the same way, so there is no good rival against which to compare Facebook’s algorithmic output when looking for bias.

What we’re left with is a very powerful black box. In a 2010 study, Facebook’s data scientists proved that simply by showing some users that their friends had voted, Facebook could encourage people to go to the polls.

That study was randomised — Facebook wasn’t selectively showing messages to supporters of a particular candidate. But could it? Sure. And if it did, you might never know.
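
The distinction matters mechanically, too. A hypothetical sketch, with invented user records and an invented “support” score, shows how small the gap is between the randomised version of that experiment and a selectively targeted one; nothing here describes Facebook’s actual systems.

# Hypothetical sketch with invented data: randomised versus targeted delivery of a
# "your friends voted" banner. Only the second version tilts turnout toward one side.
import random

def banner_randomised(users, rate=0.5, seed=0):
    """Random assignment: every user has the same chance of seeing the banner."""
    rng = random.Random(seed)
    return {u["id"] for u in users if rng.random() < rate}

def banner_targeted(users, threshold=0.6):
    """Targeted assignment: only users inferred to back one candidate see it."""
    return {u["id"] for u in users if u["support_for_candidate_a"] > threshold}

voters = [
    {"id": "u1", "support_for_candidate_a": 0.9},
    {"id": "u2", "support_for_candidate_a": 0.2},
    {"id": "u3", "support_for_candidate_a": 0.7},
]
print(banner_randomised(voters))  # a politically arbitrary subset
print(banner_targeted(voters))    # only the candidate's likely supporters (u1, u3)

From the outside, the two feeds would look identical; only Facebook could tell you which rule was in force.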

(Published 15 May 2016, 16:32 IST)
