A Nobel economist models how AI rots the information environment
May 10, 2026

Most Australians already sense something is wrong with our information environment. An Australian National University survey of 20,000 Australians found that disinformation consistently ranked among their top national security concerns. Participants rated it as more serious than the prospect of a foreign military attack.
That gut feeling – participants described feeling overwhelmed by the volume of information, unable to distinguish truth from falsehood, and manipulated by algorithms – now has economic modelling behind it.
In a September 2025 report, Nobel Prize-winning economist Joseph Stiglitz and his Columbia University colleague Maxim Ventura-Bolet use economic modelling to show why the information environment is deteriorating and explain why AI will make it worse unless governments intervene.
Their main conclusion is that without regulation, markets will systematically produce more disinformation and less truth, and AI will compound the problem. Today’s disinformation challenge cannot be solved by asking people to behave better online; it can only be solved by fixing the incentive structures that produce it.
If, like me, you don’t have a Nobel prize in economic modelling, here’s the simplified version of how the rot is happening.
Information exists in a market. Newsmakers – newspapers, researchers, journalists, broadcasters – have long been the established information producers. Readers have been the information consumers.
For much of the 20th century, advertising revenue kept quality journalism, which is expensive to produce, just financially viable enough to function.
Enter social media. As new information producers such as citizen journalists and influencers emerged, so too did a new middle layer: digital platforms.
Do you remember the internet before Google search? Finding good information was genuinely hard work. Platforms solved this: they aggregated everything into one place, made it searchable and surfaced what seemed most relevant – free for consumers.
The problem, Stiglitz and Ventura-Bolet show, is not that you got lazy and stopped going to the original source. It’s that platforms turned the difficulty of doing so to their advantage, and now their entire business model depends on your attention.
Platform owners make money from engagement through ads and data collection. Every second on their platforms creates revenue. Every second on the original news site creates revenue for someone else.
Platforms discovered that outrage and emotion kept people engaged longer than calm, accurate information. So they reward provocative content, whether true or not, because it drives more clicks. Nuanced, verified sources, public interest journalism and your long-term understanding – these aren’t profitable.
When the algorithm is optimised to keep you coming back for more, consumers no longer go directly to news sites; they go to an AI overview or their social media feed. The original content producer loses the visit and therefore the revenue. Google search and Facebook feeds became the habit before anyone noticed the cost.
Enter AI. It produces legitimate content by scraping journalism and summarising it without paying the original producer. And it produces convincing fake content, cheap and fast. Stiglitz points out that AI is not interested in whether content is truthful or untruthful; it is optimised for efficiency.
Stiglitz and Ventura-Bolet’s model shows that the damage compounds as AI floods the information ecosystem with disinformation and low-quality content – both cheap and infinitely scalable – and as producers are no longer incentivised to produce new, quality truth.
Since AI is limited by the quality of its training data, feeding it distorted or unreliable information leads it to produce distorted content.
Stiglitz uses an old but relevant adage: garbage in, garbage out.
The paper also models polarisation, which gets locked in once disinformation begins gaining ground. Polarised audiences seek out content that confirms their views, which creates demand for more disinformation, which earns revenue for producers of disinformation, which drives quality news organisations further out of business. The market economics of disinformation actively reward more disinformation.
And so, a downward spiral begins.
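The spiral the authors describe can be sketched as a toy feedback loop. To be clear, this is an illustrative simplification, not Stiglitz and Ventura-Bolet’s actual model: the cost figures, the engagement multiplier and the update rule are all assumptions chosen to show the direction of the dynamic, nothing more. Disinformation is cheap to make and rewarded by engagement-optimised ranking, so it captures a growing share of revenue, which in turn pulls more production toward it.

```python
# Toy sketch of the incentive spiral described above.
# NOT the paper's model: all parameters and the update rule are
# illustrative assumptions, chosen only to show the feedback direction.

def simulate(steps=10, quality_cost=1.0, disinfo_cost=0.1,
             engagement_bias=1.5, initial_disinfo_share=0.2):
    """Track the share of revenue captured by disinformation over time."""
    share = initial_disinfo_share
    history = [share]
    for _ in range(steps):
        # Engagement-optimised ranking amplifies provocative content,
        # and disinformation is far cheaper per unit of attention...
        disinfo_revenue = share * engagement_bias / disinfo_cost
        quality_revenue = (1 - share) * 1.0 / quality_cost
        # ...so producers reallocate toward whatever pays better.
        share = disinfo_revenue / (disinfo_revenue + quality_revenue)
        history.append(share)
    return history

if __name__ == "__main__":
    for step, s in enumerate(simulate()):
        print(f"step {step}: disinformation share = {s:.2f}")
```

Under these assumed numbers the disinformation share climbs toward saturation within a handful of steps: each round of engagement-driven revenue makes the next round worse, which is the self-reinforcing character of the spiral the paper formalises.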
Who wins? The platforms, along with the disinformation producers that flood the internet with cheap AI-generated content and profit from engagement.
Who loses? Quality news, public interest journalism and the public that depends on it.
Stiglitz and Ventura-Bolet say we don’t have to accept this situation, but the market will not self-correct, because nobody is incentivised to act differently. Individuals opting out of social media will not change the incentives facing online media platforms or AI companies.
Only government intervention can stop the spiral. Stiglitz and Ventura-Bolet argue this should include regulated platform accountability for content amplification, enforced obligations to address coordinated disinformation campaigns, and intellectual-property protection for news producers.
Australia has a good record of leading global efforts to make digital platforms pay for news content. In April, the government signed a memorandum of understanding with Anthropic. This signals that at least one frontier AI company is willing to engage on responsible development. But a memorandum of understanding is not regulation, and goodwill is not an incentive structure. The challenge now is to lock in these efforts before the damage becomes irreversible.
For the average Australian, disinformation is not abstract. The information landscape is polluted, fragmented and manipulative. Stiglitz and Ventura-Bolet have mathematically shown us that, unless the government steps in, things are about to get a whole lot worse.