How to Curb Disinformation in Your YouTube Feed
The Benefits of Studying Disinformation on YouTube
YouTube is one of the most popular social media platforms in Poland and Ukraine: two-thirds of the people in each country have an active account and use the platform to watch entertainment and informational content. The recommendation algorithm is the heart of YouTube: personalized, high-quality selections of similar videos keep viewers on the platform to maximize advertising revenue and help users find videos of interest more easily.
However, according to a 2019 study by the Mozilla Foundation, this business model may encourage the creation of ever more emotional and manipulative videos, while the recommendation algorithms themselves can quickly pull a user into a bubble of conspiracy and disinformation. The study found that YouTube not only recommends videos that violate its own policies but, in doing so, directly affects the lives and mental health of its users. Moreover, non-English-speaking users were found to be the most vulnerable.
This raises the questions that became our study's focus. What role does YouTube play in the Russo-Ukrainian war today? To what extent do its recommendation algorithms promote the aggressor's discourse and help spread Russian propaganda among Ukrainian audiences during the war? Are the moderation algorithms and the measures taken to block the main propaganda channels and speakers on the platform sufficient, or are there still gaps that allow YouTube to be used as a tool of informational influence? Finally, are Ukraine and Poland exposed to the same extent? Poland upholds the EU rule of law, is one of Ukraine's leading partners, and hosts the largest number of Ukrainian refugees, who may be targets of Russian informational influence.
Given the sheer scale of informational influence in the online video format, Texty.org.ua studied how YouTube's recommendation algorithm works in Ukraine and Poland. Our dataset was powered by 205 Ukrainians and 122 Poles who installed a dedicated app and agreed to share information about everything they watched on the platform.
The Info Sapiens research agency recruited the Ukrainian participants, while MASMI recruited the Polish ones. Participation was limited to adults who watch video content on YouTube at least several times a week on a personal desktop or laptop computer running Google Chrome.
The results were then analyzed using descriptive statistics, topic modeling, and named-entity recognition (NER), implemented with the pandas, yt-dlp, and BERTopic Python libraries.
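For illustration, here is a minimal sketch of such a pipeline, assuming the logged views have been exported to a CSV file; the file name and column name are hypothetical placeholders:

```python
# Minimal sketch of the analysis pipeline described above.
# Assumption: viewing logs exported to "views.csv" with a
# "video_id" column (both names are hypothetical).
import pandas as pd
import yt_dlp
from bertopic import BERTopic

views = pd.read_csv("views.csv")

# Fetch video metadata (titles) without downloading the videos.
titles = []
with yt_dlp.YoutubeDL({"quiet": True, "skip_download": True}) as ydl:
    for video_id in views["video_id"].unique():
        info = ydl.extract_info(
            f"https://www.youtube.com/watch?v={video_id}", download=False
        )
        titles.append(info.get("title", ""))

# Cluster titles into topics; the multilingual model handles
# Ukrainian, Russian, and Polish text.
topic_model = BERTopic(language="multilingual")
topics, probs = topic_model.fit_transform(titles)
print(topic_model.get_topic_info().head())
```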
The Findings
We found that YouTube’s recommendation algorithm has flaws that allow the dissemination of propaganda and disinformation. Moreover, the platform's moderation policy is geared toward quickly resolving individual user complaints; it does not provide for the systematic removal or containment of propagandist material that authoritarian regimes spread through YouTube. Indeed, our study revealed that YouTube's recommendation algorithm promotes content that endangers both individual viewers and the national security of Ukraine.
When recommending videos related to the Russo-Ukrainian war, the platform's algorithm fails to recognize Russia as the aggressor: it does not detect pro-Russian views, i.e., those that support or justify the policies of Russia's leadership. As a result, the viewer of a video posted by a pro-Russian blogger is recommended similar bloggers and channels, including those banned in Ukraine.
Compared with the Ukrainian segment of YouTube, the Polish segment contained fewer videos with pro-Russian or anti-Ukrainian narratives, which in the context of the war are often one and the same. However, even Polish viewers proved likely to be pulled into a propaganda funnel by the recommendation algorithm: a single view of a questionable video triggers a flurry of similar recommendations.
Why Does It Happen?
YouTube tries to follow the viewer’s interests. The platform's content recommendations are often based on what other users with similar interests like to watch.
In other words, if you watch cooking shows and several other users who also watch cooking shows have watched videos with Russian propaganda, the system is likely to extend the propaganda offerings to you.
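A toy model of this co-view logic, with invented data, shows how a single propaganda video in one user's history can leak into the recommendations of another user who merely shares innocuous interests:

```python
# Toy illustration of co-view-based recommendation ("users who
# watched X also watched Y"). All viewing data here is invented.
from collections import Counter

histories = {
    "user_a": {"borscht_recipe", "sourdough_basics", "kremlin_propaganda"},
    "user_b": {"borscht_recipe", "sourdough_basics"},  # cooking fan only
    "user_c": {"guitar_lesson", "music_theory"},
}

def recommend(target, histories, k=3):
    """Suggest unseen videos watched by users with overlapping histories."""
    seen = histories[target]
    scores = Counter()
    for user, videos in histories.items():
        if user == target:
            continue
        overlap = len(seen & videos)  # strength of shared interests
        if overlap == 0:
            continue
        for video in videos - seen:
            scores[video] += overlap
    return [video for video, _ in scores.most_common(k)]

# user_b watches only cooking videos, yet the propaganda video surfaces
# because user_a shares two cooking videos with them.
print(recommend("user_b", histories))  # ['kremlin_propaganda']
```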
⦁ Each video is assigned to a specific category based on the information it contains: the title, description, keywords, and the results of the platform's own analysis of the video. The category influences how the recommendation system promotes the video. However, judging by the frequent mistakes made when categorizing videos with ambiguous content, YouTube's categorization algorithms are rather imprecise. For example, videos with barefaced propaganda were categorized as gaming or entertainment content. It appears that the Russian state propaganda machine has adapted to YouTube's rules and is attempting to circumvent them to reach the widest audience possible.
We found evidence that miscategorizing a propagandist video as a “Video game,” or labeling a video containing conspiracy theories as “Education,” directly harms users who come to the platform in search of gaming or educational content: as pointed out earlier, once one such user watches the video, it is recommended to other users with similar interests.
⦁ The recommendation system relies heavily on users' browsing history. If a user watches a new video from a completely different category, YouTube still promotes videos from previously watched channels. Even after a complete change of preferences, it takes the system quite some time to stop serving the same recommendations “for old times’ sake.” This creates an information bubble, or cycling effect (a toy sketch of this effect follows after this list).
After Russia’s full-scale invasion of Ukraine on 24 February 2022, many Ukrainians changed their attitude towards Russia and decided to stop watching Russian content. But this very feature of YouTube kept them “caught up in the past”: the platform continued to recommend videos based on their pre-war interests, creating an opening for Russian propaganda.
⦁ Personalization. The recommendation mechanism further reinforces the bubble effect. If a user watches a video featuring a particular person, the platform recommends much more content with that person, slashing the chances of new faces appearing on the recommendations list. On the one hand, the selection is driven by the user's interests: if somebody enjoys videos featuring a specific person, further interest is a safe assumption. On the other hand, this is a vivid example of how an information bubble is built.
We also observed cases of YouTube promoting persons and organizations whose media impact the platform itself had attempted to ban or curb.
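The history and personalization effects described above can be illustrated with a toy scoring function. Assuming, purely for illustration, that the recommender sums a match score over the whole watch history, old interests keep outweighing new ones unless older views are discounted:

```python
# Sketch of the "trap of the past": scoring candidates against the
# whole watch history lets old interests dominate long after a user's
# preferences change. Categories and weights are invented.
history = ["ru_politics"] * 50 + ["ua_news"] * 5  # pre-war vs post-war views

def score(candidate_category, history, decay=1.0):
    """Weight each past view; decay < 1 discounts older views."""
    total = 0.0
    for age, category in enumerate(reversed(history)):  # age 0 = newest
        if category == candidate_category:
            total += decay ** age
    return total

# Without decay, 50 old views still outweigh 5 recent ones: 50.0 vs 5.0.
print(score("ru_politics", history), score("ua_news", history))
# With decay, recent views dominate (about 1.64 vs 3.36), so the
# recommendations adapt to the changed preference much faster.
print(round(score("ru_politics", history, 0.8), 2),
      round(score("ua_news", history, 0.8), 2))
```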
The language issue interested us in particular, because the Ukrainian public is deeply concerned about the dominance of the Russian language on YouTube. We found that the language of recommended videos is directly related to the language of previously watched ones: for example, users who watch videos in Ukrainian are recommended content in Ukrainian.
According to numerous polls (for example, studies by the Ilko Kucheriv Democratic Initiatives Foundation and the Sociological Group “Rating”), a significant share of Ukrainians decided to communicate exclusively in Ukrainian after 24 February 2022. But because YouTube’s algorithms rely on the history of watched videos, these viewers were still offered Russian-language content despite their change of heart. Since users have no control over the recommendation algorithms, these mechanisms create additional obstacles for those who decide to abandon the Russian language, part of their colonial heritage, and switch to another one. This runs counter to people's desire for linguistic immersion, which is known to ease the transition to another language in daily communication.
It is also worth noting that most of the Russian propagandist videos YouTube’s recommendation system suggests to Ukrainian users are in Russian, not Ukrainian or Polish. Our study of the Polish segment revealed that the language barrier also stops the spread of Kremlin propaganda: since most Poles do not speak Russian, they do not watch the Russian-language videos carrying Russian propaganda or the conspiracy theories so popular in the Russian-language segment of YouTube.
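This dependence between watched and recommended languages can be measured directly from logs like ours; below is a minimal sketch with invented session data and hypothetical column names:

```python
# Sketch of quantifying the watched-language / recommended-language
# link. The data and column names are invented placeholders.
import pandas as pd

log = pd.DataFrame({
    "watched_lang":     ["uk", "uk", "uk", "ru", "ru", "pl", "pl", "ru"],
    "recommended_lang": ["uk", "uk", "ru", "ru", "ru", "pl", "pl", "ru"],
})

# Share of recommended-video languages, conditioned on the language
# of the video being watched (each row sums to 1).
table = pd.crosstab(log["watched_lang"], log["recommended_lang"],
                    normalize="index")
print(table.round(2))
```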
Recommendations for YouTube
Instead of blocking individual videos or producers, efforts to counter Russian propaganda should target the entire narratives promoted by authoritarian regimes. To this end, YouTube ought to shift its moderation policy from a stopgap approach to the systematic blocking of content commissioned by anti-democratic governments, content that attacks democracy and the values of the Western world and justifies war against independent countries.
While this strategic issue is being addressed, the campaign against propaganda and disinformation should focus on specific steps to improve YouTube's information environment.
1. Limiting the content financed by totalitarian regimes
YouTube ought to filter videos from countries with totalitarian regimes more carefully. This is especially true for propagandist content that glorifies such regimes, misleads citizens about the actual state of affairs, and calls for violence and interstate hostility. These efforts should take into account that propagandist content can be created by authors and legal entities that are formally independent of the respective governments. For the Ukrainian audience, this recommendation primarily concerns content produced in Russia.
2. Sanctions not only against channels but also against people
YouTube’s blocking mechanisms allow individual channels to be blocked for violating the community rules, for example, for spreading hate speech or outright fakes. This, however, does little to prevent the authors of blocked channels from appearing as guests on other channels, while bot networks or social engineers republish fragments of their videos. As a result, content featuring such propagandists continues to appear in YouTube recommendations.
YouTube ought to develop algorithms that remove content featuring such persons or their narratives from the platform: not only their own uploads, but any content that carries their narratives, thus eradicating such persons from the social network.
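One conceivable building block for such an algorithm is to scan titles and descriptions for mentions of blocked persons instead of matching only their own uploads. A minimal sketch follows; the register lists two sanctioned Russian propagandists purely as examples, and the matching itself is deliberately naive:

```python
# Sketch: flag videos that *feature* a blocked person, not only
# videos uploaded by them. The register of names is illustrative.
import re

BLOCKED_PERSONS = ["Vladimir Solovyov", "Margarita Simonyan"]

def mentions_blocked_person(title: str, description: str) -> bool:
    """Naive substring match over title and description text."""
    text = f"{title} {description}".lower()
    return any(re.search(re.escape(name.lower()), text)
               for name in BLOCKED_PERSONS)

video = {
    "title": "EXCLUSIVE interview fragment",
    "description": "Best moments with Vladimir Solovyov, reuploaded",
}
print(mentions_blocked_person(video["title"], video["description"]))  # True
```

In practice this would need NER and alias resolution (transliterations, nicknames), but it shows why matching only a propagandist's copyrighted uploads is not enough.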
3. Categorization
YouTube ought to categorize content more accurately, relying more on the content itself than on the description provided by the authors. This would prevent Russian propagandists from appearing in the “Games” or “Movies” categories, as we observed in both the Ukrainian and Polish segments.
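A content-first categorizer could, for instance, classify the title or transcript text itself rather than trust author-supplied metadata. The sketch below uses a publicly available zero-shot classifier and a simplified stand-in for YouTube's category taxonomy; it is an illustration, not YouTube's actual method:

```python
# Sketch: categorize by what the video actually says, not by the
# author-chosen category. The candidate labels are a simplified
# stand-in for YouTube's real taxonomy.
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

title = "Why the West is collapsing and only one leader can save it"
labels = ["gaming", "education", "political propaganda", "entertainment"]

result = classifier(title, candidate_labels=labels)
print(result["labels"][0])  # top label reflects content, not metadata
```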
4. The Ukrainian language
At this stage, the Ukrainian language should be treated as one of the ranking factors for recommendations in the Ukrainian segment of YouTube. After all, the Ukrainian-language segment is much less exposed to Russian propaganda and disinformation. Such a move would also meet the expectations of those members of Ukrainian society who strive to break free from the colonial past and restore the Ukrainian language. Many opinion polls have registered people's desire to use Ukrainian more often and watch more content in this language. This innovation would help YouTube avoid the “trap of the past,” in which people are offered videos based on their viewing history despite having changed their views and preferences.
5. Consideration of the list of individuals sanctioned by Ukraine, the EU, and the US
YouTube ought to block video content distributed on behalf of sanctioned individuals, as well as content justifying the crimes of individuals subject to international sanctions, internationally wanted persons, or those named in arrest warrants issued by the International Criminal Court. The platform should restrict original content from such persons and extend the restrictions to any content that justifies them in any way.
6. Countering manipulative (i.e., clickbait) headlines
Propagandists and authors of fakes often use emotion-provoking, sensationalized headlines, usually manipulative or outright false, to maximize video traffic. YouTube’s algorithms ought to moderate such titles and, in cases of outright lies or hate speech, ban the title or the entire piece of content posted under it. The platform should also prioritize verified media that adhere to journalistic ethics and standards.
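Even simple surface features separate many manipulative titles from neutral ones. The heuristic below is a rough sketch with invented thresholds; at best it could feed a human review queue, not replace one:

```python
# Heuristic sketch for flagging manipulative headlines. Feature
# thresholds are invented and would need tuning on labeled data.
def looks_like_clickbait(title: str) -> bool:
    letters = [c for c in title if c.isalpha()]
    caps_ratio = sum(c.isupper() for c in letters) / max(len(letters), 1)
    punctuation = title.count("!") + title.count("?")
    sensational = any(phrase in title.lower() for phrase in
                      ("shocking", "they hide", "the whole truth", "secret"))
    return caps_ratio > 0.5 or punctuation >= 2 or sensational

print(looks_like_clickbait("SHOCKING!!! The whole TRUTH they hide!"))  # True
print(looks_like_clickbait("Weekly news review, 12 May"))              # False
```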
7. Improving algorithms for moderating pseudo-medical content in the Russian and Ukrainian languages
The dissemination of false medical content, hidden advertising of pseudo-medicines, or the promotion of everyday household products as treatments for serious diseases can be particularly dangerous. Such videos should be properly moderated and removed from the platform. Furthermore, any content that contains medical information should be labeled accordingly and accompanied by an appropriate warning (e.g., “Choose evidence-based treatment,” “Consult a doctor first,” or “Self-medication can be dangerous”). Although YouTube has been working on ways to moderate such content for quite some time, our research showed that the platform’s algorithms still cannot reliably detect pseudo-medical and potentially harmful videos in Russian and Ukrainian. Here are some examples of headlines the participants saw among their recommended videos: “BAKING SODA: an anti-aging, CANCER, and chronic inflammation treatment! How much and what kind of soda should you take?” and “Superfood spirulina: treats everything from cancer to runny nose.”
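A first-pass filter for headlines like those quoted above could look for disease terms co-occurring with household “cures” in Russian and Ukrainian text. The keyword lists here are a small illustrative sample, not a production vocabulary:

```python
# Sketch: first-pass flagging of pseudo-medical headlines in Russian
# and Ukrainian. Keyword lists are a tiny illustrative sample.
DISEASES = ("рак", "онколог", "діабет", "диабет")         # cancer, diabetes
FAKE_CURES = ("сода", "спіруліна", "спирулина", "уксус")  # soda, spirulina, vinegar

def flag_pseudo_medical(title: str) -> bool:
    """Flag titles pairing a disease term with a household 'cure'."""
    text = title.lower()
    return (any(d in text for d in DISEASES)
            and any(c in text for c in FAKE_CURES))

# "SODA: a treatment for CANCER and chronic inflammation!"
title = "СОДА: лечение РАКА и хронического воспаления!"
print(flag_pseudo_medical(title))  # True
```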
Recommendations for the Ukrainian Government
Verkhovna Rada
Since Ukraine's accession to the EU may be a matter of the distant future, Ukraine should proceed with adopting the norms and approaches of the EU Digital Services Act (DSA) and related legislation to achieve the following goals:
⦁ create more effective mechanisms of public control over online platforms in Ukraine;
⦁ require major platforms and search engines to strictly monitor posted content, promptly remove illegal content, and prohibit targeted advertising based on sexual orientation, religion, ethnicity, or political beliefs;
⦁ simplify the mechanism for filing user complaints and improve response procedures and the transparency of the moderation process, including mandatory notification of authors about the reasons for blocking;
⦁ introduce mandatory norms requiring online platforms to make their algorithms more transparent, inform users why particular content is recommended to them, and offer an opt-out from personalized recommendations.
Cabinet of Ministers
Since the Ukrainian language does act as a barrier to Russian propaganda, the Cabinet of Ministers of Ukraine ought to establish a system of grants for creating Ukrainian-language content through the Ukrainian Cultural Foundation.
The Cabinet of Ministers should also adequately staff the Center for Strategic Communication and Information Security under the Ministry of Culture and Information Policy of Ukraine so that it can monitor disinformation content and liaise with major technology platforms on improving their moderation and recommendation algorithms.
Communities that Counter Russian Disinformation
Following the example of the Polish organization “Never Again,” such communities ought to maintain a register of propagandists and of YouTube channels that spread Russian messaging, and approach YouTube with demands to block them.
Citizens of Ukraine
- As our study of the Polish segment of YouTube suggests, the best way to shield yourself from videos with Russian propaganda is to stop watching Russian-language content altogether.
- Do not watch content from Russian propagandists, not even out of curiosity. Remember that if you watch such a video, YouTube will recommend it to people with interests similar to yours, and the system will also suggest you other propagandists and videos from the gray zone.
- Finally, do not hesitate to report every propagandist video you encounter to YouTube.