The Lasting Creak of Legacy Code
January 26, 2025
Why US Tech Companies Are the Last Companies That Should Be Building A.I.
We complain endlessly about technical debt. Legacy code is the bane of every software engineer’s existence—those ancient COBOL systems running critical infrastructure, the spaghetti code no one dares touch because nobody remembers why it works, the countless hours lost to backwards compatibility with systems designed when Reagan was president.
“We should just rewrite it from scratch,” says every developer who’s never actually tried to rewrite something from scratch.
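The pattern is familiar enough to sketch. Everything below is invented for illustration — the function, the backstory, the partner feed — but every long-lived codebase has something shaped like it: code that looks redundant right up until you delete it.

```python
def normalize_order_id(raw: str) -> str:
    """Clean up an order ID before it hits the database.

    The zfill(8) looks redundant today, but a decades-old batch
    importer still matches IDs by fixed width; drop the padding and
    those records silently stop reconciling. Nobody remembers this
    until the from-scratch rewrite ships without it.
    """
    raw = raw.strip().upper()
    # "Dead" branch? No: two partner feeds still send the legacy prefix.
    if raw.startswith("ORD-"):
        raw = raw[4:]
    return raw.zfill(8)

print(normalize_order_id("ord-42"))  # -> 00000042
```

The rewrite-from-scratch team sees three lines of cruft; the system sees two load-bearing walls.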
But here’s the thing: humans are running on legacy code too. We’ve got biological legacy code (circadian rhythms hardwired to ~24-hour cycles you can’t just patch), cultural legacy code (languages with irregular verbs, QWERTY keyboards designed to prevent mechanical typewriter jams), and infrastructural legacy code (measurement systems, time-keeping systems, calendars with month lengths that make absolutely no mathematical sense).
The difference is we can’t schedule downtime for humanity to deploy the update.
In 1793, France tried to refactor time itself. The Revolutionary calendar was mathematically elegant: 10-hour days, 100-minute hours, 100-second minutes. Twelve months of exactly 30 days each, plus 5-6 festival days. Perfectly decimal. Beautifully rational.
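The conversion itself is trivial arithmetic — a standard day has 86,400 seconds while a decimal day has 100,000 (10 × 100 × 100), so one decimal second is 0.864 standard seconds. A minimal sketch (the function name and shape are mine, not historical):

```python
def to_decimal_time(h: int, m: int, s: int) -> tuple[int, int, int]:
    """Convert a 24-hour clock reading to French Revolutionary decimal time.

    Both systems measure the same fraction of the day; only the
    subdivisions differ: 86,400 standard seconds vs. 100,000 decimal
    seconds (10 hours x 100 minutes x 100 seconds).
    """
    day_fraction = (h * 3600 + m * 60 + s) / 86_400
    total = round(day_fraction * 100_000)  # decimal seconds since midnight
    dh, rem = divmod(total, 10_000)        # 10,000 decimal seconds per decimal hour
    dm, ds = divmod(rem, 100)
    return dh, dm, ds

print(to_decimal_time(12, 0, 0))   # noon -> (5, 0, 0)
print(to_decimal_time(18, 0, 0))   # 6 pm -> (7, 50, 0)
```

Ten lines of math. The hard part was never the math.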
It failed anyway. Mandatory decimal time was quietly suspended within two years, and Napoleon abolished the calendar itself after twelve. Why? Because you can’t just rewrite the runtime while it’s executing. Every clock in France needed replacing. All existing schedules became gibberish. The traditional 7-day week with one rest day became a 10-day week with one rest day—workers hated losing weekly breaks. And the Earth stubbornly continued rotating once every 24 traditional hours regardless of what the French government declared.
The network effects were insurmountable. France couldn’t force other countries to adopt it. Trade required constant conversion. Even within France, people kept using traditional time informally. The “technically superior” system failed because it required too much coordination and disrupted too many established patterns.
Sometimes “good enough and universal” beats “perfect but incompatible.”
Here’s where it gets interesting for those of us watching the current attempt to “optimize” government: law itself is legacy code.
Every statute, every regulation, every bureaucratic process represents an accumulated solution to a real problem someone encountered. That “redundant” office? It exists because in 1973 someone died when two departments failed to coordinate. That “inefficient” form? It’s capturing information needed to prevent the fraud that cost taxpayers millions in 1989. That “overstaffed” division? Those people know where the bodies are buried. Sometimes literally.
When DOGE announced plans to cancel VA contracts, burial services for veterans were on the chopping block. The people managing our national cemeteries, tracking remains, ensuring families can find their loved ones’ graves—that’s “waste” to be eliminated in the great rewrite.
The same pattern appears everywhere: fire the institutional knowledge, delete the “bloat,” assume complexity equals inefficiency. Then watch the cascading failures nobody predicted. The “inefficiency” was often the cost of maintaining coordination across 330 million people. Those legacy systems weren’t bugs—they were features that evolved to solve real problems.
But the rewrite-from-scratch fantasy persists because dealing with legacy systems is exhausting. Democracy is slow. Consensus is hard. People are irrational. The dream of just sweeping it all away and starting fresh is seductive.
Which brings us to the really troubling part.
The tech industry’s track record on managing legacy systems is, to put it charitably, mixed: beloved products killed mid-lifecycle, migrations that break the workflows users depended on, “deprecations” announced by blog post, and from-scratch rewrites that ship years late with half the original features.
Yet somehow, these same people confidently announce they’re going to bootstrap recursive self-improvement and align a superintelligence with human values. What could go wrong?
The cognitive dissonance is spectacular. These are people who’ve lived the nightmare of legacy systems, who know intimately how hard it is to change anything once it’s embedded in infrastructure, who’ve watched “just rewrite it” projects fail catastrophically.
And they look at human civilization—which makes COBOL look simple—and think “AGI will sort this out, no problem.”
It’s the same mentality that says “we’ll fix government by deleting it” but applied to the entire human condition. Both assume legacy systems are bugs rather than features. Both assume complexity is waste rather than evolved solutions to coordination problems. Both assume you can just declare new systems into existence without breaking everything that depends on the old ones.
Both are catastrophically wrong in exactly the same way.
Here’s what the “rewrite from scratch” people always miss: the friction in legacy systems isn’t just inefficiency to be optimized away. It’s the record of how we’ve negotiated living together.
The French Revolutionary calendar failed not because humans were too stupid to understand decimal time, but because coordination is the point. You can’t unilaterally change the substrate of social cooperation. The “waste” in the system is often the price of getting people to agree on anything at all.
This is why the superintelligence narrative is so revealing. It’s partly a fantasy of escape from coordination costs. We’re exhausted by human legacy systems, so we dream of an external optimizer that could just… fix everything without all that messy negotiation.
But wiping the slate clean isn’t optimization. It’s authoritarianism with extra steps.
We can’t fix our existing systems, but we’re confident we can build new ones of vastly greater complexity and power. We struggle for hours getting a web form to submit properly, then announce we’re ready to perform brain surgery on civilization itself.
The evidence suggests otherwise.
When you can’t coordinate a calendar change across one country, building a superintelligence to coordinate everything seems… ambitious. When cutting VA contracts leads to accidentally eliminating burial services, trusting that same approach with recursively self-improving AI seems… unwise.
The technical debt in human systems isn’t a bug that superintelligence will fix. It’s a feature of what it means to have multiple agents with different goals trying to coordinate. The legacy code is us—our biology, our culture, our evolved methods of cooperation.
You can’t refactor that. You can only work with it.
But there’s an even deeper problem here, one that should terrify anyone paying attention: the corporations building AI are themselves legacy code badly in need of an update.
These aren’t neutral technical organizations guided by engineering principles. They’re legal structures optimized for one thing: shareholder returns. When no one from the community sits on their boards, when unions have no voice in their governance, when quarterly earnings reports matter more than long-term consequences, corporations act like psychopaths. They pollute. They injure. They externalize costs onto society while privatizing gains.
Remember Google’s “Don’t Be Evil”? They quietly dropped that. Because it turns out “Don’t Be Evil” is incompatible with the actual legacy code running inside modern corporations—the fiduciary duty to maximize shareholder value above all else.
These are the institutions building our AI future. Not democratically accountable organizations. Not transparent public institutions. Not structures designed to balance multiple stakeholder interests. Corporations. The same corporations that can’t stop dumping toxic waste in communities without Environmental Protection Agency enforcement. The same corporations that fight basic worker protections. The same corporations that optimize for engagement over human wellbeing.
We’re handing the most powerful technology in human history to entities whose governing logic is “make line go up,” and hoping they’ll somehow align superintelligence with human values when they can barely align with basic human decency.
The people building AI can’t coordinate across their own organizations without creating toxic work cultures. They can’t ship products without breaking things they promised to keep working. They can’t manage their own technical debt. Corporate psychopaths are drawn to senior positions precisely because modern corporate structures reward the traits that make them dangerous—and research shows they destroy shareholder value while poisoning organizational culture. As psychologists Paul Babiak and Robert Hare document in Snakes in Suits: When Psychopaths Go to Work, these individuals are attracted to exactly the kind of fast-moving, high-stakes environments that characterize AI development.
But sure, let’s trust them with recursively self-improving artificial general intelligence.
The patient, unglamorous labor of understanding why legacy systems exist and carefully improving them without breaking dependencies—that’s the real work. But it’s slow. And boring. And doesn’t make you a visionary.
So instead we get attempts to delete government, and tech billionaires trying to bootstrap AGI, and corporations optimizing for everything except human flourishing, all convinced that this time the rewrite-from-scratch approach will work.
It won’t.
Because humans are legacy code all the way down. Corporations are legacy code with psychopathic tendencies. Law is legacy code encoding centuries of painful lessons. And the runtime is still executing.
We can’t refactor ourselves. We can only work with what we’ve got—carefully, humbly, with deep respect for why things are the way they are.
That’s not the vision that launches a thousand think pieces. But it’s the one that might actually keep us alive.