> Eventually it was easier just to quit fighting it and let it do things the way it wanted.
I wouldn't have believed it a few years ago if you told me the industry would one day, in lockstep, decide that shipping more tech-debt is awesome. If the unstated bet doesn't pay off, that is, if AI development doesn't outpace the rate at which it generates cruft, there will be hell to pay.
Don't worry. This will create the demand for even more powerful models that are able to untangle the mess created by previous models.
Once we realize the kind of mess _those_ models created, well, we'll need even more capable models.
It's a variation on Kernighan's insight that the more "clever" you are while writing code, the harder it will be to debug.
EDIT: Simplicity is a way out, but it's hard even under normal circumstances. Now, with this kind of pressure to ship fast because the colleague with the AI chimp can outperform you, aiming for simplicity will require some widespread understanding.
As someone who's been commissioned many times before to work on or salvage "rescue projects" with huge amounts of tech debt, I welcome that day. We're not there yet, though I am starting to feel the vibes shifting.
This isn't anything new, of course. Previously it was projects built by finding the cheapest bidder and letting them loose on an ill-defined problem, and you can just imagine what kind of code that produced. Except now the scale is much larger.
My favorite example of this was a project that simply stopped working due to the sheer number of bugs generated by layers upon layers of bad code that was never addressed. That took around 2 years of work to undo: roughly 6 months to un-break all the functionality, 6 more to clean up the core, and only then could we start building on top.
Are you not worried that the sibling comment is right and the solution to this will be "more AI" in the future? So instead of hiring a team of human experts to clean up, management might just dump more money into some specialized AI refactoring platform, or hire a single AI coordinator... Or maybe they skip straight to rebuilding with AI, because AI is good at greenfield. Then they only need a specialized migration AI to automate the regular switchovers.
I used to be unconcerned, but I admit to being a little frightened of the future now.
Well, in general worrying about the future is not useful. Regardless of what you think, it is always uncertain. I specifically stay away from taking part in such speculative threads here on HN.
What's interesting to me though is that very similar promises were being made about AI in the 80s. Then came the "AI Winter" after the hype cycle and promises got very far from reality. Generative AI is the current cycle and who knows, maybe it can fulfill all the promises and hype. Or maybe not.
There's a lot of irrationality currently and until that settles down, it is difficult to see what is real and useful and what is smoke and mirrors.
I'm aware of that particular chapter of history, my master's thesis was on conversational interfaces. I don't think the potential of the algorithms (and hardware) back then was in any way comparable to what's currently going on. There is definitely a hype cycle going on right now, but I'm nearly convinced it will actually leave many things changed even after it plays out.
Funny thing is that meanwhile (today) I've actually been on an emergency consulting project where a PO/PM kind of guy vibecoded an app that made it into production. The thing works, but a cursory audit laid bare the expected flaws (logic duplication, dead code, missing branches). So that's another point for our profession still being required in the near future.
The industry decided that decades ago. We may like to talk about quality and forethought, but when you actually go to work, you quickly discover it doesn't matter. Small companies tell you "we gotta go fast"; large companies demand clear OKRs and a focus on actually delivering impact. Either way, no one cares about tech debt, because they see it as an unavoidable fact of life. Even more so now that ZIRP went away and no one can afford to pay devs to polish the turd ad infinitum. The mantra is: ship it and do the next thing, clean up the old thing if it ever becomes a problem.
And guess what, I'm finally convinced they're right.
Consider: it's been that way for decades. We may tell ourselves good developers write quality code given the chance, but the truth is, the median programmer is a junior with <5 years of experience, and they cannot write quality code to save their life. That's purely a consequence of the rapid growth of the software industry itself. ~all production code in the past few decades was written by juniors, and it continues to be so today; those who advance to senior level end up mostly tutoring new juniors instead of coding.
Or, to put all that another way: tech debt is not wrong. It's a tool, a trade-off. It's perfectly fine to be loaded with it, if taking it on lets you move forward and earn enough to afford the installments when they're due. Like with housing: you'd be better off buying with a lump-sum payment, or out of savings in treasury bonds, but few have that money on hand and life is finite, so people just get a mortgage and move on.
--
Edited to add: There's a silver lining, though. LLMs make tech debt legible and quantifiable.
LLMs are affected by tech debt even more than human devs are, because (currently) they're dumber, they have less cognitive capability around abstractions and generalizations[0]. They make up for it by working much faster - which is a curse in terms of amplifying tech debt, but also a blessing, because you can literally see them slowing down.
Developer productivity is hard to measure in large part because the process is invisible (it happens in people's heads and notes), and cause-and-effect chains play out over weeks or months. LLM agents compress that to hours or days, and the process itself is laid bare in the chat transcript, easy to inspect and analyze.
The way I see it, LLMs will finally allow us to turn software development at tactical level from art into an engineering process. Though it might be too late for it to be of any use to human devs.
--
[0] - At least the out-of-distribution ones - quirks unique to particular codebase and people behind it.
Sure, but we have high growth on top of that - meaning all those "perpetual intermediates" are always the minority and gravitate upwards in the org chain, while ~all the coding work is done by people who just started working in the field and haven't yet learned enough to become mediocre.
I have yet to encounter an AI bull who admits to the LLM tendency toward creating tech debt, outside of footnotes stating it can be fixed by better prompting (with no examples) or solved by whatever tool they are selling.
> I wouldn't have believed it a few years ago if you told me the industry would one day, in lockstep, decide that shipping more tech-debt is awesome.
It's not debt if you never have to pay it back. If a model can regenerate a whole reliable codebase in minutes from a spec, then your assessment of "tech debt" in that output becomes meaningless.