The Last Social Contract
This essay is part of Thinking in Public, a series exploring the uncertain, exponential moment we’re living through. Each piece looks through a systems lens at how AI, politics, and capitalism are reshaping one another — and, ultimately, society itself. Writing these essays is my way of making sense of what’s happening — to process out loud, to self-soothe, and to share in case it helps others do the same. Read the series preface here.
An exploration of how automation, capital, and governance are converging into a self-reinforcing system — and why the first alignment crisis may be political, not technological.
It’s starting to feel like too much of a coincidence.
At the exact moment democracy is buckling under the weight of disinformation, polarization, and institutional rot — when populism gnaws at the edges of liberal order and tech billionaires play kingmaker in elections — we find ourselves rocketing up the exponential curve of technological change.
Artificial intelligence and robotics are scaling faster than any human institution can adapt. The timing is uncanny. As the political foundation cracks, the technological superstructure is being poured in place — a perfect inversion of what progress was supposed to look like.
Some call it disruption. Others call it late-stage capitalism. But I’m starting to think it’s something closer to a structural coup — a positive feedback loop between technology, capital, and politics that’s quietly shifting power from citizens to the systems (and the small group of people who control them).
The Great Decoupling
History has rhythms. Every few centuries, the means of production change — and with them, the balance of power.
The Industrial Revolution shifted authority from monarchs to manufacturers, from aristocrats to industrialists. The Information Age transferred it from factories to financiers and technologists. Each transition reordered who controlled the tools of production — and thus who wrote the rules: Rockefeller → Gates → Zuckerberg → Musk.
We are entering another such moment with the rise of AI, autonomy, and robotics. But this time is different — this revolution is decoupling production from people altogether. For the first time, the global economy can expand without expanding human labor.
Democracy, with all its lofty ideals, is at its core a deal: labor for legitimacy. Workers produced, elites profited, and the economic gains circulated through that loop with sufficient balance to make social cohesion possible.
Now that loop is breaking — the logic of the market no longer needs humans in the same way, and the power dynamics that once stabilized democracy are beginning to unravel. In its place, we get exponential capital — scale without citizens. When production no longer depends on people, capital can rapidly concentrate, and power no longer depends on consent.
This transition would be destabilizing enough on its own, but the timing couldn’t be worse (and may not be entirely coincidental), as we’re still on the back foot from the first AI takeover.
The First AI Takeover
A lot of technologists stay up at night worrying about the so-called alignment problem — the fear that an artificial superintelligence might one day outsmart us, surpass us, and decide we’re expendable.
It worries me too, but I’m not sure we’ll ever get that far.
Long before we build a godlike intelligence, we’ve already built systems that outsmart our lower selves. As Tristan Harris put it in The Social Dilemma, social-media algorithms have learned to weaponize our ancient instincts for validation, outrage, and belonging. They’ve hacked the human operating system.
The result isn’t a rogue AI uprising — it’s a quiet coup of attention that has undermined the foundations of democracy. The same feedback loops that drive engagement also drive polarization, radicalization, and distrust. They’ve hollowed out our sense of shared truth and our capacity for collective sense-making, leaving democratic institutions paralyzed and the public too fragmented to act coherently.
That collapse of shared reality hasn’t just weakened democracy — it’s created a power vacuum. When a society can no longer coordinate, deliberate, or even agree on basic facts, its ability to collectively respond evaporates. And in that vacuum, the automation-fueled concentration of capital, power, and political influence encounters little meaningful resistance.
The Rise of Self-Sustaining Power
In earlier eras, instability tended to come from below — revolutions driven by hunger, exploitation, or the demand for representation. When the balance of power fractured, it was usually because people forced it to.
This time, the fracture is coming from above. AI and automation are severing the centuries-old link between labor, production, and political legitimacy — the economic loop that once tethered elites to the societies they led. As machines replace workers and capital compounds without them, wealth and influence accelerate upward faster than democratic systems can respond.
Layered on top of this is the first AI takeover: the algorithmic erosion of shared reality. Whatever civic counterweight once existed to concentrated power has now dissolved into fragmentation and polarization.
This is the real inflection point — not the somewhat distant threat of superintelligence, but the near-term moment when power no longer depends on people.
We often frame the singularity as the day machines surpass us. But the more urgent danger is the day power becomes self-sustaining, when those who command these systems no longer need the rest of us for labor, legitimacy, or consent.
AI has been called the last invention. But unless we respond, it may also bring about the last social contract — the moment the modern bargain between people and power finally breaks.
The Last Social Contract
Every era of disruption eventually settles into a new equilibrium. The question now is: what replaces legitimacy once labor no longer anchors it? For the first time, we’re approaching a political and economic order in which the governed may no longer be necessary to those who govern. That makes this moment — this sliver of time when we still hold meaningful leverage — uniquely critical.
The most dystopian trajectory is a modern, AI- and robotics-enabled authoritarian state. Historically, rulers were constrained by the cost of control: armies, police, bureaucracies. But as surveillance, persuasion, and enforcement become automated, that cost collapses — and with it, the final barrier that once limited authoritarianism. This is the danger of a world where power no longer depends on people.
Yet the same tools that could entrench domination could also distribute agency. Capitalism and computation both scale, but what they scale depends on access. If intelligence, automation, and data remain centralized, power will harden into something unrecognizable. If they’re made open and participatory, they could become counterweights to their own excesses.
The old stabilizers — democracy, collective bargaining, broad-based labor — are fading as automation severs the link between labor and production. If legitimacy is no longer anchored in labor, new stabilizers must emerge fast: civic ownership of AI infrastructure, public access to models and compute, taxation of automated productivity paired with UBI or other redistributive mechanisms.
Other possibilities matter too: democratic oversight of high-impact algorithms; sovereign civic AIs to counterbalance corporate ones; even an international “Geneva Convention for Intelligence” to limit coercive uses of AI. None of these are silver bullets, but together they sketch a future where intelligence isn’t monopolized by a tiny elite — and where a post-labor economy still has enough shared foundation to renegotiate the next social contract.
These are thin reeds, but real ones. History’s turning points often look hopeless until suddenly they aren’t.
The Narrow Path
Maybe this is what transition looks like — the pain of one system giving way to another. Every generation believes it stands at the edge of collapse; occasionally, some of them are right.
If this is that moment, the task ahead isn’t to stop the singularity but to shape it — to ensure intelligence doesn’t just compound wealth, but compounds human agency.
The near-term danger isn’t that AI will rise against us. It’s that it will quietly make obsolete the very structures that made us free. We still have a window, though small, to decide what the merger of human and machine will serve: empire or evolution.
Because the singularity isn’t just technological. It’s political.
Continue the Series →
Next: Welcome to The Freemium Future
How AI may not destroy capitalism — but mutate it into a corporate-run subscription model designed to prevent collapse.