YouTube will stop taking down content that promotes false claims about the 2020 US presidential election, marking an about-face by the world’s largest video site as the next US presidential contest heats up.
The video service, a division of Alphabet Inc.’s Google, said the new policy will take effect on Friday.
“In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm,” the company said in a blog post. “With that in mind, and with 2024 campaigns well underway, we will stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US presidential elections.”
YouTube and other social media companies have been under pressure since the 2016 US presidential election to ensure their platforms are not manipulated to spread political misinformation. YouTube said in December that it had taken down more than 10,000 videos related to the US midterm elections for violating its policies on election integrity and other guidelines. Some of those videos were removed for promoting the falsehood that the 2020 US presidential election was stolen.
YouTube said Friday that it would continue to ban content that misleads people about how to vote, advances false information that could dissuade people from voting, or promotes interference with elections. Axios earlier reported the company’s policy shift.