More than 30,000 videos containing vaccine misinformation have been removed from YouTube. However, experts point out that automated filters cannot catch every such video, and potentially dangerous content still slips through.

Since October 2020, Google moderators have removed over 30,000 videos from YouTube containing misinformation about vaccines, including coronavirus vaccines. The videos "included claims about COVID-19 vaccinations that contradict expert consensus from local health authorities or the World Health Organization," the company said.

Since February 2020, more than 800,000 such videos have been removed from the platform. Google noted that misinformation about the coronavirus and vaccines poses a serious threat to social networks and their users.

Facebook and Twitter have implemented similar policies to remove misinformation about COVID-19 vaccines, which experts say could sow doubt about vaccination.

Although YouTube's removals have been extensive and have bolstered the platform's credibility, some potentially dangerous content cannot be caught automatically by filters.

A recent investigation by Media Matters for America found that videos falsely claiming vaccines are being used to implant microchips in patients have been viewed hundreds of thousands of times.