Time is up for YouTubers spreading conspiracy theories about the outcome of the 2020 U.S. presidential election.
On Wednesday, YouTube announced that it will begin removing content that alleges fraud interfered with the results of the election in November. YouTube's policy change comes around five weeks after election day. That gave disinformation peddlers a considerable amount of time to spread conspiracy theories about the election results and unproven claims of voter fraud.
What took them so long? According to the Google-owned company, it was waiting for enough states to certify the election results.
“Yesterday was the safe harbor deadline for the U.S. Presidential election and enough states have certified their election results to determine a President-elect,” reads the announcement on YouTube’s blog. “Given that, we will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 U.S. Presidential election, in line with our approach towards historical U.S. Presidential elections.”
While Facebook and Twitter both amped up their policies against misinformation in preparation for the election, YouTube was often criticized for its more hands-off approach to stopping the spread of potentially dangerous falsehoods on the platform.
For example, in the days following the election, YouTube allowed a video by the right-wing One America News Network (OANN) that falsely claimed President Trump won the election to spread on the site. In fact, it wasn't until OANN broke the company's COVID-19 misinformation policy that YouTube took action against the right-wing news organization's channel.
According to YouTube, the company had previously terminated thousands of channels and videos that misled voters about "where and how to vote." The company also removed a number of conspiratorial channels, such as those spreading QAnon-related falsehoods, in the weeks leading up to the election.
The new policy update means that YouTube will now also remove videos that claim “a Presidential candidate won the election due to widespread software glitches or counting errors.”
There will be some exceptions to this rule, such as content that discusses these topics in an educational, scientific, or artistic way. But, if your intent is to spread these unsubstantiated claims about the election, YouTube is no longer the place to do it.