
Facebook denies sidelining research on site's 'divisiveness'

Last Updated 28 May 2020, 04:46 IST

Facebook on Wednesday defended itself against a report that it shelved internal research indicating that it was dividing people instead of bringing them together.

The social media network's algorithms are aimed at getting users to spend more time on the site.

But they "exploit the human brain's attraction to divisiveness," a slide from a 2018 presentation by a Facebook research team stated, according to the report in the Wall Street Journal.

It warned that, if left unchecked, Facebook would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."

Facebook chief Mark Zuckerberg and other executives sidelined the research, however, based on concerns that acting on it would be too paternalistic or would result in product changes that would rankle politically conservative users, the Journal reported.

The company's integrity vice president, Guy Rosen, slammed the Journal story, saying the newspaper "willfully ignored critical facts that undermined its narrative".

"The piece uses a couple of isolated initiatives we decided against as evidence that we don't care about the underlying issues and it ignored the significant efforts we did make," Rosen said in an online post.

"As a result, readers were left with the impression we are ignoring an issue that in fact we have invested heavily in."

The Journal report also cited a 2016 study at Facebook which showed that, among German political groups, "64% of all extremist group joins are due to our recommendation tools."

"Our recommendation systems grow the problem," the report said.

For years Facebook has faced criticism for allowing hatred to flourish on the network globally, with posts stoking divisions during the coronavirus pandemic being only the most recent example.

One of the most notorious examples is in Myanmar, where the tech giant has been accused of being slow to respond to abusive posts portraying the country's Rohingya Muslims in sub-human terms. Those posts helped drum up support for a military crackdown that forced more than 720,000 of the stateless minority to flee the country in 2017.

Rosen did not deny the existence of the study, but pointed out moves Facebook has made since 2016 to fight misinformation, harassment, threats and other abusive behavior.

"We've taken a number of important steps to reduce the amount of content that could drive polarization on our platform, sometimes at the expense of revenues," Rosen said.

"This job won't ever be complete because at the end of the day, online discourse is an extension of society and ours is highly polarized."

(Published 28 May 2020, 04:44 IST)
