Doom
People love defaulting to doom and gloom. I get it. I've been there; it used to be my default view. It took a lot of work to reshape and reframe my views and my default ways of thinking.
A lot of discourse at the moment revolves around AI making software engineers obsolete.
But I don't believe that. Some? Maybe. For others? The opposite.
Context
I've been a software engineer for over 8 years now. Before that, I got an art degree from uni. Yes, yes, make all the jokes you want - 'Haha, 3 years to ask people if they want a cappuccino or a latte 🤓' (I was never a barista, props to them. You need skills and patience for that kind of job).
Before getting into software, I worked in retail and a variety of office jobs, and hated every minute of it. So I taught myself to code around my full-time job at the time - in the mornings before starting work, in the evenings after it, and on the weekends.
I interviewed a lot, and bombed a lot. I managed to get my foot in the door as a junior in an American scale up, and I've been growing my skills ever since.
So maybe I'm better placed than a lot of others in the SWE world to feel positive about the seismic changes AI is bringing. I had to work hard to get here in the first place - I'm not getting replaced now.
Views
I 100% agree that software engineering as we know it is changing massively; I would not argue against that point. What I would argue against, however, is the widespread opinion and fear I hear about how SWE as a career is dying/dead/doomed.
In its current form? Sure. But c'mon, folks. How long have no-code and low-code solutions been a thing? Instead of having to manually provision and set up infra, you can use Terraform (or any other infrastructure-as-code solution you might care to name).
Software's been following the trend of abstracting complexity away for my entire career. Remember when Squarespace replaced frontend devs?
Nah, me neither.
Glib as that is - I view agentic coding in a similar vein. Software engineering is potentially heading to another layer of abstraction. Instead of focusing on the syntax and structure of specific programming languages, we can now write in prose, translated to code.
I think of it similarly to writing JavaScript, which engines then compile down to machine code. I have no idea how to write Assembly. I haven't needed to, or honestly had much of a drive to learn it. JS engines have done that for me, and have 100% done a better job of it than I could. (Thanks to the wonderful Alistair Brown for that analogy).
Future
I mentioned earlier that some SWEs may be replaced. Sure. I don't particularly want them to be, but it seems like there's a self-selecting filter taking place in engineering. Many people are against gen AI, and I respect that. There are several aspects of it that I'm not a fan of (the environmental impact, for instance). But I do think SWEs who don't learn to use it competently - or at all - may leave themselves at a disadvantage in the future.
SWE is being reframed - rather than the syntax and quirks of particular languages and writing code being the main focus, it feels like it may wind up being closer to an architect/code reviewer/agent line manager hybrid.
You now need to be able to clearly explain your thinking for any given system in a structured, clear, precise way. Don't like the way an LLM's heading with a project? You need to know enough about system design and thinking to be able to step in and course correct. Understand best practices and industry standards - LLMs don't need to reinvent the wheel. There's an established pattern for dealing with this problem we're facing - let's research some implementation patterns and follow those.
I've seen a fair bit of discourse around AI coding assistance and how it affects dev skills. I 100% share that concern - we can't let LLMs replace our skills, our knowledge, and our THINKING.
Research
Thinking being the key point. There's some really interesting research emerging that backs this up.
Anthropic themselves (the folks that have given us access to lovely Claude. I'm biased.) ran a study - developers learning new skills with AI assistance scored 17% lower on assessments than those who learned without it. The biggest gaps? Debugging, and understanding when and why code is incorrect. Their main takeaway was blunt and I'm all about it - "cognitive effort — and even getting painfully stuck — is important for fostering mastery."
Then there's the METR study. They tracked experienced developers using AI coding tools across 140+ hours of screen recordings. The devs worked 19% slower with AI assistance - while believing they were 20% faster. A 39-point gap between perception and reality. Woof. Time saved generating code was eaten up by context switching, verifying AI suggestions, and integrating outputs with existing codebases.
I touched on the idea of some SWEs being impacted by this more than others at the start of this post. There's a divergence developing - a K-shaped split. Teams with experienced engineers who already understand systems deeply? They're integrating AI successfully and maintaining code quality. But teams with weaker foundational skills are seeing increasing technical debt, difficult-to-track-down bugs, and systems that work in common cases but fail on edge cases.
A Stanford Digital Economy Lab study reported that employment for developers aged 22-25 has fallen nearly 20% since 2022. If AI handles the grunt work that used to train junior developers, who becomes senior in 10 years? AWS CEO Matt Garman put it pretty clearly - if you stop hiring juniors today, you'll face a serious experience gap down the line.
Closing
So - in my view, SWE isn't dead, dying, or doomed. It's shifting. And that's scary, and I'm not clairvoyant - I don't know for sure this will pan out as I expect. But we all have agency. There are steps we can take to offset the risk of our own obsolescence.
It's a reminder and a call to stay sharp. The engineers who treat AI as a tool that requires their judgement, their systems thinking, their ability to say "no, that's wrong, here's why" - they'll be fine. Better than fine. The ones who let it replace their thinking? That's where the risk is.
It's on us to remain relevant.