Why YouTube Is No Longer Removing Videos Spreading US Election Misinformation
YouTube has announced changes to its Elections Misinformation Policy. As of June 2, 2023, the website will not remove content claiming “widespread fraud, errors or glitches” in previous elections, including the 2020 US Presidential election.
This change seeks to balance YouTube’s commitment to safety and open political discourse. With another US Presidential Election approaching in 2024, you may wonder what that actually means and why YouTube would make this change now.

How YouTube’s Elections Misinformation Policy Started
YouTube’s Elections Misinformation Policy covers the platform’s standards for publishers and creators covering or commenting on elections. In December 2020, YouTube prohibited content making false claims about “widespread fraud, errors or glitches” in the 2020 US Presidential election.
The platform implemented this policy following the December 8 “Safe Harbor Deadline” for certifying that year’s election results. Per US federal law, the Safe Harbor Deadline falls six days before the Electoral College votes; it is each state’s deadline to certify election results or resolve contested results.

YouTube is more than crowdsourced entertainment. According to the Pew Research Center, it was also a news source for more than a quarter of all Americans in 2020, spanning content from mainstream media outlets as well as independent creators.
Additionally, according to the US Census Bureau, the 2020 election had the highest voter turnout of the 21st century, with over 68% of eligible citizens voting. Between that higher-than-usual turnout and YouTube’s extensive reach, the platform attracted creators with a full range of beliefs and theories about the election.

In a YouTube Blog post, the platform said it removed “tens of thousands” of videos that violated its Elections Misinformation Policy in the two years following the 2020 election. In a separate YouTube Blog post, YouTube noted that more than 77% of these videos were removed before they reached 100 views.
Of course, YouTube wasn’t the only platform that did this. For instance, TikTok removed over 300,000 videos for election misinformation.

While the US Presidential Election was the catalyst for the 2020 Elections Misinformation Policy update, YouTube’s standards also apply to elections outside the US. For example, the policy guidelines specifically mention the 2021 German federal election and the 2014, 2018, and 2022 Brazilian presidential elections.
Why Did YouTube Change Its Misinformation Policy?
YouTube said its interest in allowing political discourse prompted the June 2023 update. The platform’s blog post states:
“While removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm.”

This doesn’t mean that YouTube has rolled back all restrictions. According to its updated Elections Misinformation Policy, “misleading or deceptive content with serious risk of egregious harm” is still not allowed.
YouTube also maintains its restrictions against harassment, hate speech, and incitement to violence, which apply to all content, including election commentary. In the broader media landscape, YouTube rolling back its restrictions concerning the 2020 US Presidential Election is similar to the changes Twitter made earlier in 2023.
CNBC reported that tech company layoffs impacted Trust and Safety teams at Meta, Google, and Twitter. With smaller teams, enforcing the previous policy would be even more difficult than it was in 2020. If you see any content that you believe violates YouTube’s policies, follow the process outlined by YouTube’s support team.
What Will YouTube’s Change Mean for the 2024 US Presidential Election?
Most of YouTube’s Elections Misinformation Policy remains unchanged; the update doesn’t stop YouTube from removing content that the platform determines spreads misinformation about future elections in the US or anywhere else in the world.
Like its parent company Google, YouTube attracts viewers and content creators with a full range of interests, making it difficult to police all misinformation consistently. Still, YouTube will continue to monitor content that goes against its policies, and you can always report any content you think is inappropriate.