Professor Konstantin Sonin

Two new working papers by Konstantin Sonin, the John Dewey Distinguished Service Professor at the University of Chicago Harris School of Public Policy, offer insights into how authoritarian regimes deploy propaganda: not to convince citizens that official claims are true, but to weaponize lies strategically. Part of the Becker Friedman Institute working paper series, the studies examine state-sponsored misinformation, tracing the logic behind implausible state messaging and the subtle architecture of modern propaganda systems.

Lies That Everyone Knows Are Lies

In “The Reverse Cargo Cult: Why Authoritarian Governments Lie to Their People,” Sonin revisits a puzzle from his childhood in 1980s Moscow: why did the Soviet Union hold elections in which a single candidate ran unopposed and won with 99.9% of the vote? Why stage a democratic ritual that fooled no one? And, just as importantly, why did citizens bother to cast their ballots at all?

“In 1984, the dying General Secretary of the Communist Party and Chairman of the Presidium of the Supreme Soviet of the USSR, Konstantin Chernenko, was this one candidate in our local electoral district,” Sonin explains in the introduction. “The ballot did not suggest any way to vote ‘no.’ With a single candidate on every ballot and the turnout and vote for this single candidate reported as 99.9%, was there a chance that Soviet citizens did not know this was not an election?”

Sonin’s answer is both counterintuitive and illuminating. These sham elections weren’t intended to create a façade of legitimacy; instead, they served to manipulate public perception through a shared lie. Drawing on the metaphor of the “Reverse Cargo Cult,” Sonin argues that authoritarians benefit not when citizens believe the lie, but when citizens know the lie is official and unavoidable. The shared recognition of deception reinforces the idea that all politics everywhere must be equally deceptive. In this way, the farce of Soviet elections seeded cynicism about democracy itself.

Sonin connects the phenomenon to “whataboutism,” a rhetorical tactic in which regimes deflect criticism by highlighting supposed hypocrisies in others. If every government is corrupt, the thinking goes, then it’s not worth trying to replace your own. This psychological sleight of hand dulls the appetite for political change, even in the face of obvious authoritarian control.

The Science of Influence

Another new paper, “Authoritarian Propaganda and Social Networks,” builds a formal model of how dictatorships manipulate information through state-controlled media and social connectivity. Drawing on historical campaigns in the USSR and China, from encouraging abstinence to eliminating sparrows, Sonin shifts the question from what is said to how propaganda travels.

His model reveals a surprising dynamic: the impact of propaganda is not highest in moderately connected societies, but rather at the extremes. When citizens are either completely isolated (atomized) or perfectly connected (saturated), propaganda has its greatest reach. In an isolated society, people are more likely to subscribe directly to state messaging, having no reliable peers to consult. In a fully connected society, information, however slanted, spreads like wildfire.

Ironically, it is in moderately connected societies that propaganda is least effective, the paper finds. Here, social networks dilute the slant of official messaging, and citizens are more discerning about what they consume. The model’s implication is stark: from a dictator’s perspective, either isolate everyone or connect them entirely; anything in between is dangerous for the regime.
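
To make the two channels concrete, here is a minimal toy simulation in Python. It is an illustration of the qualitative argument above, not Sonin’s formal model: the random-graph setup, the function name, and the parameters q0 (baseline receptiveness) and edge_prob (peer-to-peer transmission probability) are assumptions introduced for this sketch. Isolated agents adopt the state broadcast directly because they have no peers to consult; in dense networks, the broadcast also cascades from neighbor to neighbor.

```python
import random
from collections import deque


def propaganda_reach(n, avg_degree, q0=0.5, edge_prob=0.2, seed=0):
    """Fraction of agents ending up with the state message in a toy
    two-channel model on a random (Erdos-Renyi-style) network.

    Channel 1 (direct): an agent with k peers adopts the broadcast directly
    with probability q0 / (1 + k) -- isolated agents, with no one else to
    consult, are the most receptive.
    Channel 2 (social): adopters pass the message to each neighbor with
    probability edge_prob (a simple independent-cascade step) -- in a dense
    network the message spreads like wildfire.
    """
    rng = random.Random(seed)
    p = min(avg_degree / (n - 1), 1.0)

    # Random graph stored as adjacency lists.
    nbrs = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                nbrs[i].append(j)
                nbrs[j].append(i)

    # Channel 1: direct adoption, likelier the fewer peers an agent has.
    adopted = [rng.random() < q0 / (1 + len(nbrs[i])) for i in range(n)]

    # Channel 2: peer-to-peer cascade seeded by the direct adopters.
    queue = deque(i for i in range(n) if adopted[i])
    while queue:
        i = queue.popleft()
        for j in nbrs[i]:
            if not adopted[j] and rng.random() < edge_prob:
                adopted[j] = True
                queue.append(j)

    return sum(adopted) / n


if __name__ == "__main__":
    n = 500
    for d in (0, 1, 2, 4, 8, 16, 32):
        reach = sum(propaganda_reach(n, d, seed=s) for s in range(5)) / 5
        print(f"average degree {d:>2}: reach {reach:.2f}")
```

Run as written, the printed reach is high when agents are atomized (average degree 0), dips at moderate connectivity, and climbs back toward saturation as the network densifies; the U-shape described above emerges by construction of the two channels in the sketch rather than being derived from the paper’s model.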

This also explains why regimes may target the periphery of the social graph – those citizens with few connections – rather than major influencers. Isolated individuals are both more vulnerable to misinformation and more likely to act on it uncritically.

As Hannah Arendt wrote in The Origins of Totalitarianism, authoritarian loyalty is easiest to extract from someone “without any other social ties to family, friends, comrades, or even mere acquaintances,” Sonin notes.

“If the regime can influence the social network architecture – as the Soviet regime tried – it gets the maximum propaganda impact when citizens are isolated from each other – as Soviet citizens arguably were,” Sonin concludes.

The Politics of Isolation and Control

Together, these two papers form a framework for understanding propaganda in modern authoritarian states. Far from relying solely on brute force or digital surveillance, these regimes curate entire “epistemic environments,” as the paper calls them: systems of belief shaped by lies, comparisons, and carefully managed networks.

In Sonin’s view, manipulation doesn’t require mass belief in state messages. It only requires enough confusion, cynicism, or disconnection to prevent coordinated opposition.