Facebook, the social media conglomerate, runs one of the world’s most influential and widely used platforms. Social media was originally meant to connect people across the world and break down barriers between communities, and to a large extent it did. However, there are real downsides to this kind of monopoly over the industry, and the most significant one is the promotion of ‘extremism’ and ‘polarization’.
According to researchers at New York University’s Stern Center for Business and Human Rights, Facebook and other social media platforms have played a major role in fueling political polarization, which can eventually lead to extremist violence.
It is natural for the company to deny these claims, and according to Facebook’s own latest research, social media is not the primary driver of harmful political polarization, as mentioned in a report by Engadget. So we have two different conclusions from two different studies on the same question. The New York University researchers, on the other hand, argue that widespread divisiveness has been building since 2016 and has led up to this political polarization.
As mentioned in the Engadget report, a research scientist notes that social media is not the original cause of this polarization, but over the years it has certainly intensified division and extremism, knowingly or unknowingly. Reports claim that Facebook is well aware of its active role in rising polarization, especially after the U.S. Capitol Hill incident, which made it clear that the platform is a significant part of the problem. Although the company may not own up to that ‘significant role’, it is taking measures such as de-emphasizing political content in the News Feed, citing user complaints about seeing too much political content there.
Having said that, even if Facebook and other social media platforms are not the primary source of political polarization, they intensify it, and that needs to be taken seriously. Facebook says it works to proactively detect such content on its platform and remove it if it violates its community standards. The company has also been actively involved in stopping the spread of misinformation by reducing the reach of content that may push polarization on the platform.
What do you think: should Facebook and other social media companies share more detailed data on how their algorithms work, so that researchers can study them and connect the dots? Should Facebook be required to make its inner workings a matter of public record?
Do let us know what you think because your opinions are the voice of TechStory.