The Slowdown

Practical technology is slowing down.

Not the flashy stuff — somebody’s always making a thinner phone or a faster GPU. I mean the kind of technology that makes daily life work better. The rate at which that improves is decelerating, and the reasons are structural. At least five forces push in the same direction, and they feed on each other.

the complexity tax

The biggest factor is straightforward: our technology is complicated now. Complicated technology is inherently harder to advance.

This is just diminishing returns under constant investment. It takes disproportionately more effort to achieve the same calibre of breakthrough as the years go by. Antibiotics are the textbook case — penicillin basically fell into Fleming’s lap. Now discovering a novel class takes billions of dollars and a decade. The early gains were cheap because the problem space was fresh. Low-hanging fruit gets picked first because it’s low-hanging.

But here’s the part that gets less attention. It’s not just that each breakthrough costs more. The base gets more complicated too. Every solution becomes part of the problem space for the next solution. You’re not climbing a hill that levels off — the hill is growing underneath you. At some point the investment costs more than the invention is worth, and that point keeps arriving sooner.

This isn’t a crisis. It’s arithmetic. And it would be manageable if it were the only force in play.

technology against itself

My freezer has a computer in it. A smart brushless DC compressor, networked sensors, the works. And it’s always failing.

The computer decides, based on some sensor ghost, that it can’t run. It stops. The food thaws. I reset it. It runs for a while. It stops again. I’m this close to ripping that board out and replacing it with a little microcontroller that does the dumb thing: keeps the box cold.
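The dumb thing is easy to state precisely. What a thermostat-and-relay freezer does is bang-bang control with hysteresis, and a minimal sketch of that loop fits in a dozen lines. The setpoint and dead band below are made-up illustrative values, not real freezer specs.

```python
SETPOINT = -18.0   # target temperature, degrees C (illustrative)
HYSTERESIS = 2.0   # dead band, so the relay doesn't chatter on and off

def compressor_should_run(temp_c: float, currently_running: bool) -> bool:
    """Bang-bang control: turn on above the upper bound, off below the
    lower bound, and inside the dead band keep doing whatever we were
    doing. No sensor fusion, no network, no opinions."""
    if temp_c > SETPOINT + HYSTERESIS:
        return True
    if temp_c < SETPOINT - HYSTERESIS:
        return False
    return currently_running
```

The dead band is the whole trick: it prevents rapid cycling near the setpoint, which is what kills compressors. That is the entire control problem a freezer needs solved.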

Another one. I’ve got a pump pulling water out of a low spot on the property. Put a check valve on it so I wouldn’t have to keep re-priming. The valve keeps failing on me — cheap metal, bad seals, something. So I took the valve out and started kinking the hose instead. Haven’t had a problem since.

These aren’t edge cases. This is the pattern.

A freezer from the ’80s had a compressor, a thermostat, and a relay. It ran for thirty years. Mine has a computer and lasts maybe five, assuming the firmware doesn’t develop opinions about sensor readings. Each layer of technology promises improvement, and in isolation it delivers — a brushless DC motor is more efficient. But bolted into a system with enough other clever parts, the aggregate reliability drops. You trade a simple thing that works for a complex thing that works better when it works. Which is less often.

Complicated systems have more failure points. And the failure modes don’t just add up — they multiply. The computer can’t account for the sensor it doesn’t know is drifting. The valve can’t compensate for the metallurgy it was never designed to withstand. Nobody at the design stage modelled the interaction between the smart compressor and the ambient humidity sensor in a room that gets to minus thirty in January. They modelled pieces. The pieces were fine. The whole thing isn’t.
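The multiplication is worth making concrete. For parts in series — where any single failure takes the whole system down — reliabilities multiply rather than average, so adding parts erodes the total even when every individual part is excellent. The component counts and percentages below are illustrative, not measurements from any real appliance.

```python
from math import prod

def system_reliability(component_reliabilities):
    # In a series system the whole works only if every part works,
    # so the probabilities of working multiply.
    return prod(component_reliabilities)

# Three parts, each 99% reliable: compressor, thermostat, relay.
simple_freezer = system_reliability([0.99] * 3)    # still about 97%

# Twenty parts at the same 99%: add sensors, firmware, networking.
smart_freezer = system_reliability([0.99] * 20)    # drops to about 82%
```

No individual part got worse. The system did, purely by having more of them.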

the cultural turn

Something is shifting, and I think it’s going to be big.

I’ve been watching people around me choose technologies not because they’re the newest or most capable, but because they understand them. Because they’re comfortable maintaining them. Because the supply chain is short enough to trust.

A neighbour runs an old tractor he can rebuild with hand tools. He’s not a Luddite — he runs GPS guidance on his combine. He just knows which complexity is worth it and which isn’t. That calculation is changing for a lot of people.

Deep supply chains and tight interdependence are fragile. People are learning this from experience, not from theory. Your new truck can’t be fixed at the local shop because it needs a proprietary diagnostic computer. Your appliance needs a part from a factory overseas and the lead time is four months. Your phone stops getting security updates after three years because the manufacturer has moved on to the next model.

I expect this to become a major cultural shift as the realization sinks in. Not a rejection of technology — more of a renegotiation. People will increasingly choose tools they can understand, maintain, and source parts for. Repairability. Legibility. Short supply chains. The appeal isn’t nostalgia. It’s risk management.

And part of what’s driving it runs deeper.

the trust deficit

Technology requires trust. You trust that the food is safe, that the car’s software won’t kill you, that your phone isn’t spying on you. That trust is eroding fast, and for good reason.

Look at how far Apple goes to present their devices as secure against government actors. That marketing works because the underlying fear is real — people don’t trust that their devices serve their interests anymore, and they have reason not to.

Biotech gets used to make crops resistant to herbicide so that commodity growers can spray poison indiscriminately to save a buck on labour. The technology isn’t serving the person eating the food. It’s serving the economics of the person growing it. That gap — between who technology serves and who it affects — keeps widening.

Major companies and governments have demonstrated often enough that their customers’ interests come after other priorities. Once the benefit of the doubt is spent, every new technology meets suspicion rather than enthusiasm. Enthusiasm has to be earned now, and most of what ships isn’t earning it.

When people don’t trust that new technology is for them, they stop adopting it eagerly. That alone slows everything down. But there’s a deeper version of this problem.

the veil

This one cuts deepest.

Technology hides the truth from us.

Solar panels cost energy to manufacture — mining, smelting, transport, fabrication — such that it’s genuinely unclear whether some installations ever pay back their own energy debt over their operational lifetime. Electric cars run on electricity that, in plenty of jurisdictions, comes from burning coal. The thing that feels clean at point of use carries costs that are real but invisible to the user.
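The payback question is simple arithmetic once the numbers are on the table; the hard part is that the numbers rarely are. A sketch under invented assumptions — the embodied-energy and annual-yield figures below are illustrative placeholders, not data for any real panel or site:

```python
def energy_payback_years(embodied_kwh, annual_output_kwh, degradation=0.0):
    """Years until cumulative output repays the energy spent making the
    panel. All inputs are assumptions the user must supply honestly."""
    produced, years = 0.0, 0
    output = annual_output_kwh
    while produced < embodied_kwh:
        produced += output
        output *= (1.0 - degradation)  # panels lose output over time
        years += 1
        if years > 100:
            return float("inf")        # never pays back
    return years

# A sunny site: the energy debt clears in a handful of years.
sunny = energy_payback_years(embodied_kwh=2500, annual_output_kwh=400)

# A poor site with steady degradation: payback stretches toward
# the panel's whole operational lifetime.
shaded = energy_payback_years(embodied_kwh=2500, annual_output_kwh=120,
                              degradation=0.01)
```

The point is not the specific answers — it’s that the answer swings from “obviously worth it” to “maybe never” on inputs the buyer can’t see from the point of use.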

This is the reductionist nature of technology at work. We can look at one link in the chain and feel good about it without seeing the whole system. Every intermediary, every supply chain link, every layer of abstraction between action and consequence makes it easier not to see what’s actually happening.

People are hiding — or have hidden — the real costs of their activities from their own moral reasoning. Not always deliberately. Often the system itself does the hiding. You don’t have to lie to anyone if the truth is buried under seven layers of indirection. You just have to not look.

The other forces slow technology down: complexity makes breakthroughs expensive, fragility makes adoption costly, distrust makes people cautious. Those are brakes. This one is different. A society that systematically deceives itself about what its technology actually does isn’t just decelerating. It’s steering blind. And a vehicle that’s steering blind doesn’t slow down gracefully — it hits something.

I don’t know what that collision looks like. Maybe we’ve already made the mistake and just can’t see it yet through all the layers. That’s rather the point.

the compound

These five forces aren’t independent. They reinforce each other in a cycle that tightens over time.

Complexity makes technology fragile. Fragility erodes trust. Eroded trust drives people toward simpler, more legible tools. The veil — the systematic truth-hiding — prevents the feedback that would let us correct course. And all of it runs on top of the basic arithmetic of diminishing returns, which means even if we solved the trust problem and tore down the veil, each step forward would still cost more than the last.

I don’t think this means technology stops. Humans are stubborn, and some problems are worth throwing resources at no matter the cost. But the era of broad-based, cheap, reliable technological improvement in everyday life? I suspect we’re past the peak of that, and that most people can feel it even if they haven’t named it yet.

The question I keep turning over: is there a version of technological progress that accounts for all five of these forces? That builds in transparency, manages complexity honestly, earns trust, and stays robust? Or does the nature of complex systems mean that past a certain scale, the whole project starts working against itself?