YouTube, YouTube, YouTube! What began as a mainstream video-streaming application has become a central part of everyday entertainment and a powerful tool for sharing content with the world. Beyond being known for monetizing its creators, the platform sees millions of people from around the globe upload video content every day, and many times that number watching it.
That reach makes spreading information on YouTube a piece of cake, and spreading misinformation just as easy. Amid the global COVID-19 pandemic, misinformation carries real harm, and the company, acting responsibly as it should, is taking strict measures to minimize the spread of COVID-19 misinformation on its platform.
According to recent reports, YouTube has removed more than 1 million videos since February 2020 for violating its policies on COVID-19 misinformation, a figure shared by Neal Mohan, YouTube’s Chief Product Officer.
Company executives have been vocal about the spread of COVID-19 misinformation. Some argue that bad content makes up only a small fraction of YouTube’s overall content, while others contend that misinformation has moved from the sidelines into the mainstream. In either case, the video-streaming service is taking strict measures to comply with government norms and its own policies on COVID-19 misinformation.
A YouTube executive stated that the platform removes over 10 million videos each quarter, the majority of which never reach 10 views. Misinformation about COVID-19, however, is dangerous to the community, and given YouTube’s user base and reach, it is all the more concerning.
As mentioned in a report by Engadget, companies that play in the big leagues, including Facebook, YouTube, and Google, have taken their policies very seriously in order to comply with government regulations and norms. Each of these companies has over a billion users, so even a small fraction of that audience being exposed to COVID-19 misinformation is a loss for the community.
YouTube is actively working to reduce the amount of misinformation on the platform and removing content that violates its policies. Facebook is doing the same across its core app and Instagram. Ultimately, the shared goal is to minimize the spread of COVID-19-related misinformation, and each company is pursuing it under its own norms, in its own way.