Google's video platform YouTube plays an important role in this. The company has responded, claiming to have deleted thousands of videos containing dangerous or misleading information about Covid-19.
A video by the German conspiracy theorist Ken Jebsen was not deleted. His video "Gates is hijacking Germany!" apparently does not violate YouTube's guidelines.
Nine years ago, a radio host wrote a confused email to a listener. In it he denied the Holocaust and called it a PR campaign. The email became public, and the host lost his job. This is Ken Jebsen's story, and it could have ended there. YouTube, the world's largest video platform and part of Google, ensured that it continued. There the dismissed host found a new platform. He regularly publishes videos on YouTube that remain faithful to the content of that email.
Over the following years, largely unnoticed by the general public, he built up 478,000 subscribers on his channel. There he positions himself as a counterpoint to what he sees as the established media. There he spreads conspiracy theories and anti-Semitism. And there, on May 4, Jebsen released a video claiming that a law compelling Germans to be vaccinated against the coronavirus had been passed in silence, driven by Bill Gates.
Conspiracy theories like this are getting a lot of attention in the corona crisis. In the UK, 77 cell towers were set on fire after a theory spread that the new 5G technology was responsible for the outbreak. YouTube responded and banned videos on the subject.
On Monday, the German federal government made an unusually clear statement: "From the federal government's point of view, there is no place for extremist ideas, for false information, for myths, for misleading rumors. Whoever deliberately spreads false stories about the corona pandemic wants to divide our country and turn people against each other," said deputy government spokeswoman Ulrike Demmer.
The founder of the research organization Correctiv, David Schraven, sees the corona crisis as a "particularly confusing stage for incorrect information". Almost 45 percent of the misinformation his team identified "originated on YouTube," said Schraven, whose team works as a fact checker for Facebook.
An analysis by "Addendum" comes to a similar conclusion. The Austrian non-profit organization examined the content and distribution of over 1,000 coronavirus videos. 177 of these videos contain conspiratorial content. Together, these videos have 18 million views. YouTube itself has said that it has deleted thousands of videos related to dangerous or misleading information about Covid-19.
YouTube has published unusually strict rules about what content it will not tolerate in connection with Covid-19:
- Content that prevents someone from seeking medical treatment
- Content that recommends medically unsound methods to prevent serious illnesses instead of medical treatment
- Content that expressly disputes the effectiveness of the guidelines of the WHO or the respective health authority, which could lead to people violating these guidelines, e.g. guidelines on social distancing and self-isolation
- Content that claims that the corona virus does not exist
- Content that contains medically unfounded diagnostic information for COVID-19
- Content that contests the existence or transmission of COVID-19 as described by WHO and local health authorities
- Claims that the COVID-19 test leads to infection with the virus
Ken Jebsen's video apparently does not violate these rules, according to YouTube. The only addition is a note: "You can find up-to-date, scientific information at the Federal Center for Health Education". YouTube places this note under every video that deals with Covid-19 – including those from reputable news sites.
Ken Jebsen's video has not been deleted. That YouTube nonetheless has a problem with it can be seen in one detail: the platform has apparently excluded the video from monetization through advertising.
YouTube does not delete the video, but it no longer shows advertising alongside it. In itself, this is not unusual. YouTube treats monetization, i.e. sharing advertising revenue with the creator, as a privilege granted only to selected channels and videos. The platform wants to motivate creators to produce the right kind of content.
Such content consists of videos that essentially meet two criteria: first, they entertain users, and second, they offer an environment in which advertisers feel comfortable. Jebsen's videos fail at least the latter. But YouTubers like him often open up other sources of income, for example by asking their users for direct financial support. That makes them financially independent – and YouTube's sanction comes to nothing.
Time is money
For some years now, large tech platforms have been optimizing primarily for one metric: "time spent". It measures the time users spend on the platform. It is so important because time is money: the more time a user spends on a platform, the more advertising they see, and the more money the platform earns.
If a user goes to YouTube, watches a video and sees one, maybe two advertising clips, that is good. But it can be done better: as soon as the video ends, YouTube proposes a new one. Then another, and another, and another – each interrupted by a short ad break. The result: users stay on the platform longer, watch more videos – and YouTube earns more money.
YouTube's core product is not the video player but this recommendation algorithm in the background. And that product has one vulnerability: it works best when it repeatedly reinforces the user's existing opinion.
Critics therefore accuse YouTube of spreading conspiracy theories through its recommendation algorithm. The criticism grew so strong that YouTube announced in January 2019 that it would severely limit the visibility of such videos. The following June, the company said that views of such videos coming from its recommendations had dropped by 50 percent.
At the same time, the company now confronts its users with opposing positions in its recommendations. "Addendum" concludes in its analysis that a conspiratorial video is followed by another conspiracy video in only 12 percent of cases. Far more often, YouTube recommends a serious counter-position.
After Jebsen's video, YouTube recommends a public-service fact check of that very video. And anyone who explicitly searches for the title of Jebsen's video finds the public-service production ranked first in the results.
YouTube gives less space to conspiracy theories, but that does not make the problem go away. This is the conclusion of a study published by the campaign organization Avaaz in January. It found that 16 percent of the 100 most recommended videos about global warming contained disinformation. For the search term "climate change" the share of misinformation is 8 percent; for "climate manipulation" it rises to 21 percent. Avaaz also points out that YouTube's recommendation algorithm is responsible for about 70 percent of the total time users spend on the platform.
This last number is important because YouTube has found a remarkable way to deal with problematic content on its own platform: it makes it largely invisible. "Borderline content" is what the company calls material it continues to tolerate but whose visibility it sharply limits. Such videos are banished from recommendations and from the homepage into the large, non-recommended rest of YouTube – which, according to Avaaz, accounts for only 30 percent of user time.
Something else is important: in contrast to the number of deleted videos, which reportedly account for less than one percent of the content on the platform, YouTube gives no information about how much "borderline content" it hosts.
Jebsen's video has been viewed more than three million times. Perhaps YouTube would call it borderline content. In any case, it was never invisible – it was just a little harder to find. Anyone who received the link and clicked on it, or who heard of the video and searched for it on YouTube, could watch it.
This is also important because YouTube is not alone. Facebook and its subsidiary WhatsApp are also struggling with conspiracy theories and misinformation about Covid-19 spreading on their platforms. In April alone, Facebook flagged 50 million posts about the coronavirus with warnings. Something else unites the large platforms: they shy away from speaking publicly about the problem.
As the tech portal "The Verge" aptly describes, we know that YouTube has made more than 30 changes to its recommendation system since January 2019, because the company said so in a blog post. What exactly those changes were, or how often such videos were recommended before and after them, we do not know – the platform says nothing about this.
More than 500 hours of video material are uploaded to YouTube every minute. How much of it is so-called "borderline content"? We do not know. YouTube could say. In December, the company blogged, somewhat nebulously, that meditation videos account for more daily watch time than borderline content and dangerous misinformation combined. Two caveats: YouTube speaks not of views but of time spent – and meditation videos tend to be quite long. Without context, this information is not very helpful.
We can only speculate that Jebsen's video spread virally via group chats on Facebook, Telegram and WhatsApp – that a large share of its views came about without YouTube giving conspiracy theories a boost. Much speaks for this; we simply cannot know.
What we do know is that YouTube largely excludes conspiracy theorists from its core product, the recommendation algorithm, but not from its video player. YouTube never stopped offering them a stage; it has just become a little harder to find.