The rapid rise of generative artificial intelligence (AI) assistants such as ChatGPT is having a profound impact on traditional online search and, by extension, the already fragile digital news industry. With AI-powered tools now summarising articles directly within search results, users are less inclined to visit the original news sources, threatening both advertising revenue and subscription-based models that publishers rely on for survival.
“The next three or four years will be incredibly challenging for publishers everywhere. No one is immune to the AI summaries storm gathering on the horizon,” said Matt Karolian, vice-president of research and development at Boston Globe Media. “Publishers need to build their shelters or risk being swept away.”
Although comprehensive data is still emerging, a recent study from the Pew Research Center found that users click on links in search results containing AI-generated summaries about half as often as they do in traditional search results. For publishers, this sharp drop in clicks equates to a significant reduction in traffic, with direct consequences for ad revenue and the ability to convert visitors into paying subscribers.
John Wihbey, a professor at Northeastern University, warned that the shift is likely to intensify. “These trends will accelerate, and pretty soon we will have an entirely different web,” he said.
Shifting strategies and uncertain gains from AI
In response to declining advertising returns due to the dominance of platforms like Google and Meta, many news outlets have transitioned to subscription-based models. However, Wihbey stressed that traffic remains essential even for these strategies to work. Without sufficient visitors, there are simply not enough conversions to sustain major media operations.
Some signs of adaptation are emerging. According to Karolian, the Boston Globe has observed a small number of new subscribers coming via ChatGPT, offering a new — albeit limited — opportunity to engage readers. Other tools, such as Perplexity, are also generating subscriptions, but the numbers remain small compared with those from traditional platforms, and even from smaller search engines.
As media companies seek to adapt, many are now embracing Generative Engine Optimisation (GEO), a strategy designed to improve how AI models interpret and cite their content. GEO involves ensuring that articles are well-structured, easy to understand, and highly visible on platforms like Reddit, which are frequently crawled by AI tools.
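One commonly cited GEO tactic is embedding structured metadata so that crawlers can read an article's headline, author and publication date without having to parse the page layout. The sketch below, in Python, generates a schema.org NewsArticle block of this kind; it is an illustration of the general technique, not a description of any particular publisher's setup, and all field values are placeholders.

```python
import json

# Minimal sketch: build a schema.org NewsArticle JSON-LD block, one common
# way to make article metadata machine-readable for search and AI crawlers.
# All values below are placeholders, not a real article.
article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline",
    "datePublished": "2025-08-01",
    "author": [{"@type": "Person", "name": "Example Reporter"}],
    "publisher": {"@type": "Organization", "name": "Example News"},
}

# The JSON-LD is typically embedded in a <script> tag in the page's <head>.
json_ld = json.dumps(article, indent=2)
print(f'<script type="application/ld+json">\n{json_ld}\n</script>')
```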
Still, the question of how to handle AI access to original content remains a contentious issue. “Should you allow OpenAI crawlers to crawl your website and your content basically?” asked Thomas Peham, CEO of optimisation firm OtterlyAI.
In light of concerns about unauthorised data use, some publishers have responded by blocking AI crawlers altogether. Others, however, are opting for cautious engagement in the hopes of receiving fair compensation.
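In practice, blocking is usually done through robots.txt directives aimed at specific crawler user agents, such as OpenAI's published GPTBot. As a rough illustration, the Python sketch below uses the standard library's urllib.robotparser to check which of a handful of AI-related crawlers a site's current robots.txt turns away; the domain is a placeholder and the agent list is an assumption about which crawlers a publisher might care about.

```python
from urllib.robotparser import RobotFileParser

# Minimal sketch: check which AI crawler user agents a site's robots.txt
# currently blocks. The site URL is a placeholder; GPTBot is OpenAI's
# published crawler name, the others are commonly cited AI-related crawlers.
SITE = "https://www.example-news-site.com"
AI_AGENTS = ["GPTBot", "Google-Extended", "CCBot", "PerplexityBot"]

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # fetch and parse the live robots.txt

for agent in AI_AGENTS:
    allowed = parser.can_fetch(agent, f"{SITE}/any-article")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```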
“We just need to ensure that companies using our content are paying fair market value,” said Danielle Coffey, chief executive of the News/Media Alliance, a trade group representing publishers.
Licensing deals and legal battles offer mixed outcomes
Some progress is being made through licensing agreements. Major publishers, including The New York Times, the Associated Press, and Agence France-Presse, have struck deals with tech and AI firms such as Amazon, Google, and Mistral that allow those companies to use their content legally. However, legal tensions persist, most notably with The New York Times pursuing a landmark lawsuit against OpenAI and Microsoft over the unauthorised use of its journalism.
This leaves publishers in a difficult position. Blocking access to AI crawlers protects their intellectual property, but it also limits the visibility of their content. As a result, Peham noted, “media leaders are increasingly choosing to reopen access.”
Still, greater exposure does not always lead to greater impact. According to OtterlyAI, media outlets account for only 29 per cent of the citations offered by ChatGPT, while corporate websites account for 36 per cent. Unlike traditional Google search, which tends to favour well-established media sources, ChatGPT’s citation patterns are more varied and less predictable.
Beyond the business implications, there are also questions of trust and transparency. The Reuters Institute’s Digital News Report for 2025 revealed that around 15 per cent of people under 25 are now turning to generative AI for their news. As these tools often obscure the source of information, there is concern that this could further erode trust in journalism, similar to what occurred during the rise of social media platforms.
“At some point, someone has to do the reporting,” Karolian emphasised. “Without original journalism, none of these AI platforms would have anything to summarise.”
Some tech companies appear to be recognising the value of original reporting. Google, for example, is working on partnerships with news organisations to enhance its generative AI offerings in a more structured and collaborative way.
“I think the platforms will realise how much they need the press,” said Wihbey. Whether that realisation will come in time to preserve what remains of the struggling journalism sector, however, remains uncertain.