Do Facebook humans, not algorithms, decide what is news?

A screen shot of The Guardian's article, 12 May 2016. Photo: AFIS

Facebook's news selection is in the hands of human editors, not algorithms, according to an 'exclusive' published on 12 May 2016 by The Guardian newspaper in London. The report cites 'leaked' internal Facebook guidelines showing human intervention at almost every stage of the company's news operation, just as in a conventional media organization.

If confirmed independently, that would make Facebook the world's largest news organisation, putting immense power in the hands of a clutch of people managing the so-called social media portal. More than a billion individuals, a seventh of the world's population, post daily their 'loved' and 'liked' items of mainly personal but sometimes community interest. This activity by potential 'customers' of all sorts has drawn business to Facebook, enriched its stock and, as it increasingly appears, given Facebook political clout.

According to Facebook's guidelines leaked by The Guardian, a team of news editors is instructed on how to 'inject' stories into the trending topics module.

Those documents show how Facebook, “now the biggest news distributor on the planet, relies on old-fashioned news values on top of its algorithms to determine what the hottest stories will be for the 1 billion people who visit the social network every day.”

The documents come “amid growing concerns over how Facebook decides what is news for its users,” The Guardian said. Recently the company was accused of an editorial bias against conservative news organizations, prompting calls for a congressional inquiry from the US Senate commerce committee chair, John Thune.

Such overall control would place Facebook at the top of the world's opinion makers in the event of further key developments in the Middle East and North Africa, to mention only one area of great sensitivity in terms of how events are reported and interpreted.

The Guardian quotes from what Facebook tells its visitors about its news operations. "The topics you see are based on a number of factors including engagement, timeliness, Pages you've liked and your location," says a page devoted to the question 'How does Facebook determine what topics are trending?'

But, The Guardian adds, the documents also show that Facebook relies heavily on the intervention of a small ("as few as 12 people") editorial team to determine what makes its "trending module" headlines – the list of news topics that shows up on the side of the browser window in Facebook's desktop version.

According to the newspaper, the company backed away from a pure-algorithm approach in 2014 after criticism that it had not included enough coverage of unrest in Ferguson, Missouri, in users' feeds. The Ferguson unrest began after the fatal shooting of Michael Brown, 18, by police officer Darren Wilson on 9 August 2014, and led to injuries to at least 10 protesters and six police officers, and more than 320 arrests.

In contrast to these U.S. events involving black citizens, news management and its impact in areas poorly covered by independent media are less closely monitored or documented across Europe and North America. Key regional news providers still struggle to be represented in content picked by news aggregators.

News seen to demonise individuals or countries at the centre of certain events can be switched on or off virtually overnight, as witnessed in the case of Iran since the signing of the nuclear agreement on 14 July 2015.

Author: Editor
