YouTube has announced that it will soon allow previously banned creators to apply for reinstatement, rolling back a policy that had treated certain violations as permanent. The change, confirmed in a letter from Alphabet lawyer Daniel Donovan to US House Judiciary Chair Jim Jordan, applies specifically to accounts removed for posting misinformation related to Covid-19 or elections. Until now, such offences carried lifetime bans.
“Today, YouTube’s Community Guidelines allow for a wider range of content regarding COVID and election integrity,” Donovan wrote in the letter.
The Google-owned platform said on X that the reinstatement scheme will begin as a limited pilot, open to a subset of creators as well as channels removed under policies that have since been retired. The company did not specify which accounts might be reinstated, but said the programme would launch soon.
We’ve had a lot of questions about a pathway back to YouTube for some terminated creators to set up a new channel. This will be a limited pilot project that will be available to a subset of creators in addition to those channels terminated for policies that have been deprecated.…
— Updates From YouTube (@UpdatesFromYT) September 23, 2025
Political pressure and past enforcement
Channels linked to figures such as Deputy FBI Director Dan Bongino, former White House strategist Steve Bannon and Health and Human Services Secretary Robert F. Kennedy Jr were among those previously banned. It remains unclear whether these channels will be restored under the new policy.
The decision follows increased Republican pressure on major technology companies to reverse restrictions introduced during the Biden administration on vaccine and political misinformation. In March, Congressman Jordan subpoenaed Alphabet CEO Sundar Pichai, accusing YouTube of being a “direct participant in the federal government’s censorship regime.”
Back in 2021, YouTube introduced measures to remove content spreading misinformation about all approved vaccines. Donovan noted in his letter that during the pandemic, senior Biden administration officials urged the company to remove certain Covid-related videos, even when they did not formally violate YouTube’s policies. He described this as “unacceptable and wrong.”
End of Covid misinformation rules and fact-checking approach
According to Donovan, YouTube ended its stand-alone Covid misinformation rules in December 2024. He also stressed that the platform “will not empower third-party fact-checkers” to decide which videos remain online and will continue to prioritise “free expression.”
While Donovan stressed that the platform will not hand such decisions to fact-checkers, YouTube does display information panels beneath videos. These panels contain links to third-party sources and are intended to give viewers more context about the content they are watching.
Other technology companies have made similar moves. In January, Meta announced it was scrapping its fact-checking programme across Facebook and Instagram. Google, YouTube’s parent company, introduced a fact-checking feature in 2017 that placed labels on search and news results. However, this tool was focused on providing additional context rather than removing material.
The upcoming reinstatement programme marks a significant change in YouTube’s approach, reflecting broader debates over online speech, misinformation, and the balance between safety and free expression.