Facebook Algorithms Promoting Extremism and Making You Addicted to Social Media: Study

Research at the social media giant in 2016 and 2018 unearthed a worrying trend linking the platform’s recommendations to extremist views on the site. The 2016 research found that 64 percent of the time users joined extremist groups, the groups had been recommended by the site’s algorithms. Findings in 2018 then showed the social media platform fueled conflict among its users and increased extremist views.

Facebook researchers learned as far back as 2016 that 64 percent of all extremist group joins were due to its own recommendations, but executives including Joel Kaplan killed any efforts to fix the problem, according to sources.

Research at the social media giant in 2016 and again in 2018 unearthed a worrying trend linking the platform’s recommendations to extremist views on the site.  

But despite researchers coming up with several different solutions to tackle the problem of extremism, no action was taken. 

People familiar with the matter have told The Wall Street Journal that the move to dismiss the recommendations was largely down to Facebook VP for policy and former George W. Bush administration official Joel Kaplan, who famously threw Brett Kavanaugh a party when he was appointed Supreme Court Justice in the middle of sexual assault allegations in 2018.

The sources said executives including Kaplan and Mark Zuckerberg chose not to act on the concerning findings because they were already facing criticism for being biased against the right and were worried about being ‘paternalistic’.


In 2016, the company carried out research that found there was a worryingly high proportion of extremist content and groups on the platform.  

Facebook researcher and sociologist Monica Lee wrote in a presentation at the time that there was an abundance of extremist and racist content in over a third of large German political Facebook groups. 

The presentation states ‘64% of all extremist group joins are due to our recommendation tools.’

Most of the joining activity came from the platform’s ‘Groups You Should Join’ and ‘Discover’ algorithms, she found, meaning: ‘Our recommendation systems grow the problem.’ 

Facebook then launched new research in 2017 looking at how its social media platform polarized the views of its users. 

The project was headed up by Facebook’s then-chief product officer Chris Cox who led the task force known as ‘Common Ground’. 

It revealed the social media platform was fueling conflict among its users and increasing extremist views. 

It also showed that bad behavior among users came from the small groups of people with the most extreme views, with more accounts on the far-right than far-left in the US.  

A page showing Facebook recommending other extremist groups to users who were already in one such group.

The concerning findings were released in an internal presentation the following year.    

‘Our algorithms exploit the human brain’s attraction to divisiveness,’ a slide from the 2018 presentation read. 

‘If left unchecked,’ it warned, Facebook would feed users ‘more and more divisive content in an effort to gain user attention and increase time on the platform.’ 

Cox and his team offered up several solutions to the problem, including building a system for digging out extreme content and suppressing clickbait around politics. 

Another initiative called ‘Sparing Sharing’ involved reducing the spread of content by what it called ‘hyperactive users’ – who are highly active on the platform and show extreme views on either the left or the right, the sources told the Journal. 

But the efforts – and the research – were reportedly blocked by senior executives including founder Mark Zuckerberg and Kaplan.

According to sources, Kaplan killed any attempts to change the platform, branding the moves ‘paternalistic’ and citing concerns that they would mainly impact right-wing social media users, the Journal reported.  


‘We’re explicitly not going to build products that attempt to change people’s beliefs,’ a 2018 document reads, according to the Journal.

‘We’re focused on products that increase empathy, understanding, and humanization of the “other side.”’ 

This came at a time when the company was already under fire over allegations it was politically biased against the right.

However, it also came at a time when Kaplan publicly rallied behind Kavanaugh – who was sworn in by President Trump in 2018. 

In August 2018, while the discussions around extremist content were fresh, Facebook employees were said to be outraged to learn that the executive, who had also worked for the Republican George W. Bush administration, held a party for the conservative Supreme Court Justice to congratulate him on getting the role. 

This came after Kaplan supported the judge throughout the allegations leveled at him by Christine Blasey Ford that he sexually assaulted her in high school.  

Questions continue to be leveled at the social media giant over potential political bias.

In May, it emerged that Trump was considering creating a commission to review complaints of anti-conservative bias and censorship on social media, including Facebook, Instagram, Twitter and Google.

Facebook hit back at the news, with a spokesperson telling The Wall Street Journal: ‘People on both sides of the aisle disagree with some of the positions we’ve taken, but we remain committed to seeking outside perspectives and communicating clearly about why we make the decisions we do.’

Originally published at: https://www.dailymail.co.uk/