We are currently living through the greatest inflation of software in history.
With the AI tools we have available in 2026, a Junior Developer can...
I decided to refactor my code: 5,327 lines in main.py + 344 + 665 in additional scripts + 7 + 358 + 226 + 378 lines of shaders. I feel like I haven't slept in a week. Seriously, I just tried using __init__.py and I don't even understand its purpose, other than burning my patience...?!

Refactoring is an art. And apparently, judging by the way I structure things, it's safe to say I'm a bad architect. No. As Paolo Veronese said, "I'm an artist, I see it that way." And I'll say, "I'm a vibe coder, and I don't need architectural skills, I just need it to work." Maybe that's bad, but at least it gives me impetus to create something bigger (I hope).
5,327 lines in main.py? That is not a script, that is a novel. Respect.
I love the term "Vibe Coder". Honestly, getting it to work is the hard part. Architecture is just what we do later when we are tired of scrolling up and down 5,000 lines to find a variable.
Don't let __init__.py break you. Python packaging confuses people with 10 years of experience. Go get some sleep.
Thanks!
@embernoglow Vibe coding builds prototypes. Deletion builds cathedrals.
The arc:
You're living the post exactly, and it's great to see you finally pruned your codebase. BTW your sdf repo is just gold :) Love to see more open-source contributions like this.
Thank you!
This is not a 2026 skill; writing as little code as possible has always been the goal of programming.
Libraries and frameworks should be a considered solution in the application, not a starting point.
With AI, libraries and frameworks could become a thing of the past, because it can produce custom code faster than you can set up a framework.
In the AI experiments I'm doing, I'm using a router as a front controller, the base of most web frameworks, to check how far I can push AI.
A router can come with one or more ways to identify the routes; config, a builder pattern, attributes. But an application will have a single way of router identification to make it predictable.
So you don't need the other options, which means you can remove abstractions that are made to make the other options possible.
Because the base isn't a framework anymore, the application doesn't force you to use the opinion of the framework if it has one.
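To make that concrete, here is a minimal sketch of what such a single-strategy front controller can look like (written in Python for illustration; the Router class and its method names are mine, not from the actual experiment). Because routes are registered one way only, through explicit calls, there is no config loader, builder, or attribute scanner left to maintain.

```python
# Minimal front-controller router: one way to register routes (explicit calls),
# so no config loaders, builder patterns, or attribute scanning are needed.
from typing import Callable, Dict, Tuple

Handler = Callable[..., str]

class Router:
    def __init__(self) -> None:
        # (method, path) -> handler; a dict lookup is the whole "framework"
        self._routes: Dict[Tuple[str, str], Handler] = {}

    def add(self, method: str, path: str, handler: Handler) -> None:
        self._routes[(method.upper(), path)] = handler

    def dispatch(self, method: str, path: str) -> str:
        handler = self._routes.get((method.upper(), path))
        if handler is None:
            return "404 Not Found"
        return handler()

router = Router()
router.add("GET", "/health", lambda: "ok")

if __name__ == "__main__":
    print(router.dispatch("GET", "/health"))   # -> ok
    print(router.dispatch("GET", "/missing"))  # -> 404 Not Found
```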
The question is not whether you can delete lines; line count is the worst metric you can use to measure code quality.
The question is how much code are you willing to maintain.
You nailed it with your last sentence. How much code are we willing to maintain is the only metric that really matters.
Your point about AI replacing frameworks is incredibly interesting. We used to accept framework bloat because we needed the development speed. Now that AI can write a custom router just as fast, we do not need to import all that extra baggage anymore.
The challenge now is just making sure the AI does not invent its own bloated abstractions when we ask it to build those custom solutions. Keeping it minimal is definitely a timeless skill.
You can instruct AI to write the simplest code, and the output always has to be reviewed.
The reason AI could spell the death of frameworks is that frameworks are opinionated, and everyone has their own opinion.
Like you mentioned we accepted the framework opinion, because it brought us speed.
Now we can start with the libraries and custom code, and let AI create the glue code in the same time it takes to set up a framework.
That is why I took the router component as a base to experiment with AI. Does it reliably alter the router? Does it do weird stuff like using FFI code for the route matching? Does it create the base for a maintainable application? Does it write tests and documentation that require only minor changes? And so on.
I'm late starting with the experiments, but the tools we have today are much easier to use and more diverse, and we also understand the pros and cons of AI use better. Most of the time it pays off not to dive in from the start.
The maintenance cost point is the one that doesn't get enough attention. We benchmarked AI-generated code recently and found that 65-75% of functions had security vulnerabilities — so it's not just that AI creates more code to maintain, it creates more risk to maintain. Every line you don't delete is a line you're implicitly agreeing to secure, and most teams aren't doing that audit. The "code janitor" framing is spot on. @the_nortern_dev
That 65-75% stat is terrifying, but honestly not surprising.
I absolutely love how you phrased this: "Every line you don't delete is a line you're implicitly agreeing to secure." That should be printed on every CTO's wall.
It reframes the "Code Janitor" from someone who just cleans up messes to someone who is actively reducing the attack surface. In 2026, deleting code is arguably the most effective security patch you can apply.
Thanks for adding that data point to the discussion.
Honestly, I don’t really like the vibe of "vibe coding". It’s great for scaffolding and getting something off the ground fast. But once you start leaning on it too heavily, it almost feels like the AI senses the dependency and starts churning out slop.
That is a perfect description. "Vibe coding" feels like borrowing time from the future at a predatory interest rate.
I have noticed that exact degradation too. Once the context gets complex, the AI starts guessing. If you do not have the deep knowledge to audit and delete that "slop" immediately, you end up with a codebase that is technically working but impossible to maintain.
This is an excellent consideration to bring forward this year. The ability to simplify and create lean code that does more with less signifies a deeper understanding of the codebase and should be rewarded as such.
Spot on, Julien. I think the challenge for engineering leaders in 2026 is exactly that: figuring out how to reward the "negative space" in a project.
It’s easy to measure a feature launch, but it's much harder to measure the value of a bug that never happened because the code was simplified. We need to move away from "lines of code" as a metric and start looking at "reduced complexity" as the true sign of seniority.
Have you seen any teams successfully implementing metrics or cultures that actually incentivize this kind of subtraction?
In one of my previous teams, simplifying or removing code whenever possible was seen as a positive. It was never implemented as an actual metric, though; it would be interesting to explore that further.
Making it a formal KPI is definitely risky. Goodhart's law kicks in fast and people might start deleting safety checks just to hit a number.
We just highlighted negative lines of code during sprint reviews. It was enough to signal that cleanup mattered without creating perverse incentives.
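For anyone who wants to surface that number in a review, here is a rough sketch (my own illustration, not the commenter's tooling) that tallies the net line delta for a date range from git log --numstat:

```python
# Rough sketch: sum lines added/removed over a date range using `git log --numstat`.
# The repo path and date range are placeholders; adjust them to your sprint window.
import subprocess

def net_lines(since: str, until: str, repo: str = ".") -> int:
    out = subprocess.run(
        ["git", "-C", repo, "log", f"--since={since}", f"--until={until}",
         "--numstat", "--pretty=format:"],
        capture_output=True, text=True, check=True,
    ).stdout
    added = removed = 0
    for line in out.splitlines():
        parts = line.split("\t")
        # numstat lines look like "12\t3\tpath"; binary files show "-" and are skipped
        if len(parts) == 3 and parts[0].isdigit() and parts[1].isdigit():
            added += int(parts[0])
            removed += int(parts[1])
    return added - removed

if __name__ == "__main__":
    delta = net_lines("2 weeks ago", "now")
    print(f"Net lines this sprint: {delta:+d}")  # negative means more was deleted than added
```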
Good point! Goodhart's law indeed!
This resonates a lot with me.
As someone building AI-powered and full-stack products, I’ve realized that shipping fast is easy in 2026 — especially with AI — but maintaining clarity is the real engineering skill. Every abstraction, every dependency, every “future-proof” config adds cognitive load to the system.
In one of my recent projects, removing an unnecessary state library and simplifying the data flow improved performance more than any “new feature” I added. The code became easier to reason about — and that’s real productivity.
Writing code is creation.
Deleting code is judgment.
And judgment is what separates developers from engineers.
"Writing code is creation. Deleting code is judgment." That is a brilliant way to frame it. I might have to quote you on that in the future.
State management is the absolute perfect example. We always reach for the heavy libraries on day one, convinced the app will be massive. Ripping that global state out later and realizing standard data flow works fine is the best feeling.
Keystrokes are cheap now. Judgment is the actual bottleneck.
That really means a lot — feel free to quote it 😄
You’re spot on: premature architecture is just overengineering in disguise. AI made keystrokes cheap, but judgment, restraint, and knowing when not to abstract — that’s the real senior skill.
Consider the quote officially stolen.
Fighting the urge to build for scale on day one is definitely the hardest habit to break. Thanks for a great back-and-forth. See you in the next thread.
Spot on for 2026!
Pruning AI-generated cruft exposes the real system. (I once refactored a 10k-line LLM service to 2k—same perf, zero tech debt)
The new interview: "Delete 50% of this codebase. Explain why" :)
Codebase value concentrates as code shrinks! 🚀
I bet the maintainability of that service went through the roof.
I absolutely love that interview concept. We spend so much time testing candidates on their ability to write new algorithms, but almost zero time testing their ability to read and simplify existing ones.
If a candidate can look at a module and confidently say "we can delete this half because it is just legacy bloat," that is an instant hire for me.
True! I am starting to think that from now on I am gonna keep this as an evaluation parameter for candidates when hiring.
YES. I wrote about this same topic this week because it's so underappreciated.
The hardest part isn't finding code to delete — it's convincing the team. Nobody wants to be responsible for deleting something that "might be needed someday." I've found that framing it as risk reduction works better than "code cleanup":
"This unused auth module has 3 unpatched CVEs in its dependencies. Deleting it eliminates the attack surface and saves us from maintaining code nobody uses."
Security risk + maintenance cost > "it's messy." Managers respond to the first, not the second.
My biggest deletion win: 8,200 lines of a deprecated auth system. Saved $340/month in Redis costs for a cluster that only the dead code was using.
8,200 lines and killing a useless Redis cluster is the absolute dream.
You are completely right about the framing. "Code cleanup" sounds like a low-priority chore to management, but "attack surface reduction" sounds like an urgent necessity.
Tying deletion directly to infrastructure costs and CVEs is a brilliant way to get buy-in from the people holding the budget.
This is exactly the mental shift that separates seniors from juniors in 2026. Everyone can generate code now — the differentiator is knowing which generated code to keep and which to throw away.
I've seen teams drown in AI-generated boilerplate because they treated every suggestion as gospel. The real skill is asking "does this actually solve my problem, or did the model just throw its most common pattern at me?"
The "Code Janitor" framing is perfect. Deletion isn't just cleanup — it's curation. In a world of infinite generation, the engineer who can say "no, we don't need this" is more valuable than the one who can write it.
Your digital hoarding analogy hit hard. My "read later" list and my codebase have a lot in common — both are full of things I thought I'd need someday but never touched.
Great piece!
The parallel between a "read later" list and a codebase is spot on. We are just hoarding text in different formats.
You hit the nail on the head regarding AI boilerplate. The models are designed to be helpful, which usually means they over-deliver. They will hand you a massive factory pattern when all you needed was a simple function.
Being the person who can look at 50 lines of perfectly generated code and just say "no thanks" is absolutely the new senior skill.
I think the key is controlling that thin boundary between the layers where you make architectural decisions and where you hand things off to generative AI.
It’s really a system of abstraction levels. At the business level, you could say “build me a program that makes a million dollars” and delegate everything below. Or you can consciously design each layer yourself and use AI as a tool that translates your thinking into working, understandable components that you can combine into a product.
Right now, I’ve chosen to use AI to generate mostly isolated modules — things I could have written myself and fully understand. Then I treat them as reusable building blocks.
I was already working this way before AI, but now it feels like the optimal approach. Fixing, debugging, or vibe-checking a module is limited to its local context, so cognitive load doesn’t explode — and I don’t feel tempted to outsource all decisions to the model.
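As a toy illustration of that isolated-module style (entirely hypothetical, not from the commenter's project), here is the kind of building block that stays easy to vibe-check because its whole contract is one small, typed function:

```python
# slugify.py - a self-contained module with one narrow, typed surface.
# Everything the rest of the system needs to know about it fits in this signature,
# so reviewing or regenerating it never requires reading the callers.
import re
import unicodedata

def slugify(title: str, max_length: int = 80) -> str:
    """Turn an arbitrary title into a URL-safe slug."""
    normalized = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    slug = re.sub(r"[^a-z0-9]+", "-", normalized.lower()).strip("-")
    return slug[:max_length].rstrip("-")

if __name__ == "__main__":
    print(slugify("Deleting Code: The 2026 Skill!"))  # -> deleting-code-the-2026-skill
```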
Owning the boundaries is exactly the right approach.
If you let the model architect the entire system, you are basically just a passenger. Generating isolated modules that you actually understand is the absolute sweet spot right now. It keeps the cognitive load entirely local.
The moment you ask the AI to start wiring all those modules together is when the technical debt really starts compounding. Treating them as reusable, human-verified building blocks is the only way to stay sane.
If you want to be a Senior Engineer in this new era, stop asking "What can I add?" and start asking "What can I remove?"
Not if you have bad leadership that lacks technical skills and only cares about how many lines of code you have written versus the optimal way of getting things done. Then it becomes a “performance” issue based on some bogus metrics.
If leadership is still measuring lines of code in 2026, you don't have a performance problem. You have a resume problem.
Counting lines is like measuring an airplane's quality by its weight.
Honestly, if I saw that metric today, I would just start interviewing immediately.
The "digital hoarding" analogy really hit home. I spent a month building an elaborate plugin system for a personal project, convinced I would need it "someday." Eventually I ripped the whole thing out and replaced it with a 40-line script that did exactly what I needed. The relief was immediate — not just in the codebase, but mentally.
Your point about code being a liability rather than an asset is especially true now that AI can regenerate boilerplate in seconds. The real skill is knowing what NOT to build in the first place. I have started asking myself "can I solve this with configuration instead of code?" before writing anything, and it is surprising how often the answer is yes.
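As a hedged example of that configuration-over-code question (hypothetical names, not the plugin system described above), the behavior a plugin architecture might provide can often collapse into a declarative table plus a short dispatch loop:

```python
# Hypothetical "configuration instead of code": instead of a plugin registry,
# loaders, and hook discovery, the behavior lives in one declarative table.
NOTIFICATION_RULES = {
    "order_created":  ["email", "webhook"],
    "order_shipped":  ["email"],
    "order_refunded": ["email", "slack"],
}

SENDERS = {
    "email":   lambda event, data: print(f"[email]   {event}: {data}"),
    "webhook": lambda event, data: print(f"[webhook] {event}: {data}"),
    "slack":   lambda event, data: print(f"[slack]   {event}: {data}"),
}

def notify(event: str, data: dict) -> None:
    # The whole "plugin system": look up the channels and call them.
    for channel in NOTIFICATION_RULES.get(event, []):
        SENDERS[channel](event, data)

if __name__ == "__main__":
    notify("order_refunded", {"order_id": 42})
```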
Replacing a month of architecture with a 40 line script is painful but also the best feeling ever.
You nailed the psychological part. Code isn't just bytes, it is mental load. Every abstraction is just one more thing you have to remember later.
I love the configuration vs code rule. I am definitely stealing that idea.
This hit close to home. I built 80+ automation scripts in two days for a side project pipeline. Felt incredibly productive... until I realized half of them overlap or do things I could consolidate into 10 well-designed ones.
The irony: I automated the creation of automation scripts. Peak code hoarding.
Now I'm going through the painful but necessary process of deleting the ones that were 'just in case.' Your framing of code as liability rather than asset is exactly the mental shift that makes deletion feel like progress instead of loss.
One thing I'd add: the hardest code to delete is the code that works but isn't needed. Broken code is easy to kill. Working code that serves no real purpose? That's the hoarder's trap.
Automating the creation of automation scripts is the ultimate developer trap. I love that you shared that.
Your last point is the absolute truth. Broken code is just a bug, so deleting it is easy. But perfectly working code feels like an asset, even when it is just dead weight.
It takes actual discipline to throw away something that functions perfectly just because it solves a problem you no longer have.
This resonates deeply. The shift from "code quantity" to "code curation" is the real Senior leap.
But here's the harder truth: deletion requires more architectural understanding than creation. AI can generate that "future-proof" abstraction in seconds. Only experience knows it's premature optimization.
The crisis isn't just maintenance cost—it's cognitive load. Every "might need later" feature is a decision tree your brain keeps open. That's why your mental model cleared after deletion.
One addition: robustness matters more now. C#/.NET's strong typing catches AI hallucinations at compile time. Weakly-typed generated code? Production roulette at 3 AM.
The Janitor role is undervalued because we don't measure prevented complexity. Maybe that's the metric shift we need.
"Production roulette at 3 AM" is the best description of weakly-typed AI code I have heard. That is exactly why strict typing (like TypeScript for me) is completely non-negotiable now. The AI is simply too confident when it guesses.
You also nailed the part about the Janitor role. We have dashboards for lines added and PRs merged, but zero metrics for "complexity prevented."
It is a massive blind spot for the industry right now. Deleting code really does require more architectural vision than writing it.
This is a beautiful narrative. "The job is no longer to build the mountain but to carve the sculpture out of the rock."
Glad that metaphor landed with you. It is a hard mental shift when we are so used to measuring value by volume, but it feels inevitable now.
I like this - the KISS principle, "less is more" :-)
Exactly. The irony is that it usually takes more time to build something simple than something complex. Complexity is the path of least resistance.
Yes - keeping things simple is (often) HARD ...
That's very true!
That's right. Minimalism is the key
It's true!
interesting
"This is the most senior dev take I've read all year.
Junior devs think code is an asset. Senior devs know code is a liability.
Every line you write is:
Something that can break
Something someone has to maintain
Something that adds complexity
The best code I ever wrote? The code I didn't write.
10x developers aren't the ones writing 10x more code. They're the ones deleting 10x more code.
🛐🔥"
Asset vs liability is the perfect framing.
We spend years learning how to write code, but nobody teaches us when not to write it.
My favorite PRs are always the ones with more red lines than green.
This has always been true. The AI people say we no longer need to review code, we just need to update our workflow. I think the job has shifted from writing code to reviewing it. But the next step is definitely autonomous agents that won't need any code review. Coding will then be abstracted away. Fun times.
This hit close to home. Last month I inherited a Node.js service that had grown to ~15k lines over two years. Half of it was "flexible" config parsing that nobody actually used beyond the defaults. Ripped it out, hardcoded the three configs we actually run in production, and suddenly the whole team could reason about the service again.
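Purely as an illustration of that move (hypothetical names, and in Python rather than the commenter's Node.js), the "flexible" parsing collapses into a literal table of the configurations that actually run:

```python
# Hypothetical before/after of the "hardcode what you actually run" move.
# Instead of a generic loader that merges env vars, YAML files, and CLI flags,
# the three deployments that exist in production are written down literally.
from dataclasses import dataclass

@dataclass(frozen=True)
class ServiceConfig:
    db_url: str
    cache_ttl_seconds: int
    debug: bool

CONFIGS = {
    "dev":     ServiceConfig("postgres://localhost/dev",   cache_ttl_seconds=5,   debug=True),
    "staging": ServiceConfig("postgres://db-staging/app",  cache_ttl_seconds=60,  debug=False),
    "prod":    ServiceConfig("postgres://db-prod/app",     cache_ttl_seconds=300, debug=False),
}

def load_config(env: str) -> ServiceConfig:
    # Unknown environments fail loudly instead of falling back to clever defaults.
    return CONFIGS[env]
```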
One thing I'd add though - the hardest part isn't the deleting itself, it's convincing your team it's safe. I've found that good test coverage is what gives you the confidence to delete aggressively. Without tests, every deletion feels like defusing a bomb blindfolded. Do you have a process for validating that removals don't break things, or is it mostly gut feel + monitoring?
I’m not writing code as much as I used to — I’m writing prompts...and I like it
But that doesn’t remove the need to understand architecture, read the code, and do proper code reviews. If anything, it makes those skills more important
Working with generative AI feels like having a very capable junior developer: fast, productive, but still needing clear, well-scoped tasks and supervision.
The quality of the result depends heavily on how precisely the task is formulated.
I’m trying to design applications as isolated modules, so changes in one area don’t break stable parts of the system.
With AI-generated code, maintaining clear boundaries and responsibility separation becomes even more critical.