Blame It on the Algorithm — Political Leaders Lean on AI as the New ‘Fake News’


Illustration: political leaders point at the algorithm, casting AI as the new ‘fake news’.

By Stuart Kerr, Technology Correspondent

Published: 08/09/2025 | Updated: 08/09/2025 
Contact: [email protected] | @LiveAIWire

AI as the New Scapegoat

When Donald Trump dismissed compromising audio as artificially generated, headlines marked the moment as a turning point. “The president blamed AI and embraced doing so,” reported AP News, capturing how claims of synthetic fakery are becoming a political weapon. Just as “fake news” once became a rhetorical shield, AI is now the latest scapegoat.

The strategy has a name: the “liar’s dividend.” By sowing doubt about whether evidence is real, leaders gain political cover, even when the evidence is authentic. Researchers warn that this emerging tactic could undermine democratic accountability, granting politicians an escape route from legitimate scrutiny.

The Liar’s Dividend in Action

Academic studies confirm the power of this approach. A Purdue University report summarises findings from political scientists showing that when voters are told scandalous material might be fake, they are more forgiving. Similarly, a Yale ISPS paper highlights how misinformation about misinformation becomes its own strategy, compounding public confusion.

The most robust evidence comes from controlled experiments. In the American Political Science Review, scholars tested whether politicians benefit from claiming damaging reports were fake. They found the “liar’s dividend” effect was strongest for textual scandals: when leaders labelled incriminating documents as fabricated, voters’ trust in them eroded less than when no denial was offered.

Weaponising Doubt

The tactic is particularly dangerous in a media environment already strained by AI-generated misinformation. As LiveAIWire explored in “AI and Emotional Manipulation”, generative models can shape not just what people read but how they feel. When politicians invoke AI to dismiss accountability, they weaponise that same uncertainty, exploiting public fears about the technology.

It’s not just theory. A Brookings brief warns that the liar’s dividend could corrode trust in institutions, particularly during elections. If every damaging revelation can be brushed aside as an algorithmic fabrication, the ability of journalism to hold power to account is fundamentally weakened.

The Accountability Crisis

This problem resonates across the information ecosystem. At LiveAIWire, we’ve examined how AI alters content flows in contexts as diverse as “Beyond Algorithms — Hidden Carbon & Water” and the survival of publishers in the “Zero‑Click Era”. But the liar’s dividend adds a uniquely political dimension: it enables those in power to rewrite the rules of credibility itself.

By invoking AI as a catch‑all excuse, leaders can dodge accountability while appearing tech‑savvy. Yet the cost is profound: erosion of truth as a shared foundation for democratic debate.

Towards a Response

What can be done? Scholars suggest pre‑emptive education campaigns to raise awareness of the liar’s dividend. Others argue for watermarking AI content or mandating provenance disclosures to help voters distinguish genuine material from fabrications. But no technical fix can fully blunt the political incentives to cry “AI!” when caught in scandal.
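
To make the provenance idea concrete, here is a minimal sketch in Python of how a disclosure check might work. It uses a shared-key HMAC over a recording’s bytes as a stand-in for the public-key signatures that real provenance standards such as C2PA rely on; the key, file contents, and function names are illustrative assumptions, not any standard’s actual API.

```python
import hashlib
import hmac

# Toy stand-in for a provenance signature. Real schemes (e.g. C2PA)
# use public-key signatures over a signed manifest, not a shared key.
SECRET_KEY = b"publisher-signing-key"  # illustrative assumption


def sign_content(content: bytes) -> str:
    """Publisher side: produce a tag binding the content to its origin."""
    return hmac.new(SECRET_KEY, content, hashlib.sha256).hexdigest()


def verify_content(content: bytes, tag: str) -> bool:
    """Verifier side: recompute the tag and compare in constant time."""
    expected = hmac.new(SECRET_KEY, content, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)


if __name__ == "__main__":
    audio = b"...raw bytes of a recording..."
    tag = sign_content(audio)
    print(verify_content(audio, tag))            # True: untampered
    print(verify_content(audio + b"x", tag))     # False: any edit breaks it
```

The property that matters is on the last line: alter even one byte and verification fails, which is what would let a newsroom answer a blanket “that’s AI” denial with a checkable record. As the scholars above note, though, the hard part is not the cryptography but the political incentive to cry “AI!” regardless.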

Ultimately, the liar’s dividend forces society to confront an uncomfortable truth: accountability depends not just on evidence, but on whether citizens believe evidence when they see it. And if politicians can casually dismiss reality as synthetic, democracy itself risks becoming collateral damage of the algorithmic age.

About the Author
Stuart Kerr is the Technology Correspondent for LiveAIWire. He writes about artificial intelligence, ethics, and how technology is reshaping everyday life.
