Facebook shut down efforts to make the site less divisive

Facebook has long been aware that its recommendation engine tends to stoke disagreement and polarization, according to a report from The Wall Street Journal. An internal Facebook report presented to executives in 2018 warned of this effect, but "despite warnings about the effect this may have on society, Facebook leadership ignored the findings and has largely tried to absolve itself of responsibility concerning partisan and other areas of polarization it directly contributed to". The internal report also stated that Facebook's algorithms exploit the human brain's attraction to divisiveness, and found that, left unchecked, the recommendation engine would keep serving users more and more divisive content in order to hold their attention.

An earlier internal presentation from 2016 found that 64 per cent of all joins to extremist groups on the platform came from Facebook's own recommendation tools. Joel Kaplan, Facebook's Vice President of global public policy and a former deputy chief of staff under George W. Bush, is a controversial figure in part because of his perceived loyalty to right-wing politics and his public support of Justice Brett Kavanaugh throughout his confirmation. Kaplan was also behind Facebook's controversial political-advertising policy, under which the company declined to fact-check misinformation in politicians' ads.

One major project reviewed under Kaplan, called Common Ground, aimed to let people share politically neutral content that would bring together users with the same interests and hobbies. The team building it said Facebook would need to take a "moral stance" in some cases by blocking certain kinds of polarizing content, and acknowledged that the effort could hurt overall engagement, the WSJ reports. For those reasons, the team was disbanded.

"We have learned a lot since 2016 and are not the same company today. We have built a robust integrity team, strengthened our policies to limit harmful content, and used research to understand our platform's impact on society so that we can improve. This past February, we announced $2 million in funding to support independent research proposals on polarization", a Facebook spokesperson said.