Writing Code Was Never the Job
· by Michael Doornbos
The career panic is everywhere. AI writes code now, so what do developers do?
I keep having the same conversation. Someone with five or ten years of experience, mass-applying to jobs, wondering if they need to “pivot to AI.” Every time, I ask the same question. What did you actually do all day? If the honest answer is “I turned Jira tickets into React components,” then yes, you have a problem. But AI didn’t create that problem. It exposed it.
The industry spent twenty years hiring based on framework knowledge and algorithm puzzles. It built a workforce optimized for implementation. Now implementation is getting cheap, and everyone is acting like the ground shifted. The ground didn’t shift. We just built a lot of houses on sand and called it bedrock.
The cuts are real
I’m not going to pretend this is abstract. Oracle laid off up to 30,000 people today. Not struggling Oracle. Record-profit Oracle, with a 95% jump in net income last quarter. They sent termination emails at 6am with no warning from managers or HR. “Your role has been eliminated as part of a broader organizational change.” That was it. System access cut immediately. Final working day.
Oracle isn’t alone. Block cut 40% of its workforce in February. Jack Dorsey wrote in a shareholder letter that the cuts weren’t about financial difficulty but about “the growing capability of AI tools to perform a wider range of tasks.” The stock jumped 24%. Atlassian cut 1,600 people in March while posting 26% revenue growth. Snowflake eliminated its entire technical writing team and replaced them with an AI documentation system. About 60,000 tech workers lost their jobs in Q1 2026, and roughly one in five of those cuts explicitly cited AI.
So let’s not pretend this is a thought exercise about career strategy. People are getting fired.
But look at where the money is going. Oracle is redirecting that $8-10 billion in salary savings toward $156 billion in AI data center buildout. Someone has to design those data centers, run them, secure them, keep them operational. Block’s remaining engineers are expected to produce more with AI tools, which means the systems they build will be larger, more complex, and harder to operate. The headcount dropped. The complexity didn’t.
There’s also a less comfortable truth in the numbers. A Forrester survey found that 59% of companies admit they emphasize AI when explaining layoffs because “it sounds strategic and forward-looking.” Only 9% said AI has actually fully replaced any roles. Some of these cuts are genuine restructuring. Some are cost-cutting dressed up in an AI narrative because it makes the stock go up. Both things are true, and the people who got the 6am email don’t care which one applies to them.
The cuts are real. The pain is real. And telling someone who just got laid off to “learn systems thinking” is tone-deaf. But understanding where the value is moving isn’t career advice for people who just lost their jobs. It’s a map for everyone else.
The flood nobody is ready for
The conversation about AI and careers has mostly been about who loses their job. That’s the wrong frame. The real story is what happens to the software.
When building software gets cheap, you don’t get less software. You get a flood. Every business unit that used to submit a ticket and wait six months is now spinning up tools in an afternoon. Every internal process that was “too small to justify developer time” is getting automated by someone who doesn’t know what a SQL injection is. Marketing has a custom dashboard. HR built an internal portal. The intern wrote a Slack bot that queries production data.
None of this is hypothetical. It’s happening now.
And all of it has to run somewhere. On infrastructure someone has to manage. Behind authentication someone has to configure. Touching data that someone has to govern. Every one of these tools is a new row in an asset inventory, a new entry in a penetration test report, a new thing that pages someone when it breaks in a way its creator never anticipated, because its creator left the company six months ago and the tool has no documentation.
The problem isn’t fewer developers. The problem is a wave of software that nobody fully understands, built by people who didn’t need to understand it to build it.
The security tab is coming due
If you work in security, you already see it. If you don’t, you will.
Every AI-generated internal tool is a new endpoint. A new data flow. A new thing that needs to be secured, monitored, patched, and audited. The business unit that “just built a quick dashboard” connected it to a production database with a service account that has write access. The marketing team’s custom CRM integration is storing API keys in a config file committed to a public repo. The AI-generated code works. It also doesn’t validate input, doesn’t handle errors gracefully, and logs sensitive data to stdout.
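The failure modes above are mundane, which is exactly why they ship. A minimal sketch in Python, using a hypothetical sqlite3-backed dashboard query (table and column names are illustrative, not from any real incident), of what “doesn’t validate input” looks like next to the one-line fix:

```python
import sqlite3

# Hypothetical "quick dashboard" data; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
conn.execute("INSERT INTO orders VALUES (1, 'acme', 99.0)")

def orders_unsafe(customer: str):
    # The generated pattern: string interpolation straight into SQL.
    # Passing "x' OR '1'='1" returns every row in the table.
    return conn.execute(
        f"SELECT id, total FROM orders WHERE customer = '{customer}'"
    ).fetchall()

def orders_safe(customer: str):
    # Parameterized query: the driver treats the value as data,
    # so the same payload matches nothing.
    return conn.execute(
        "SELECT id, total FROM orders WHERE customer = ?", (customer,)
    ).fetchall()

print(orders_unsafe("x' OR '1'='1"))  # leaks all rows
print(orders_safe("x' OR '1'='1"))    # returns []
```

The two functions return the same results for normal input, which is why the unsafe version survives testing by its author and fails only when an attacker shows up.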
For MSSPs (managed security service providers), the addressable market just grew by an order of magnitude, and the work got harder. It’s no longer enough to scan and report. A client’s attack surface now includes dozens of tools that nobody in IT even knows about, built by people who have never heard of OWASP and don’t know they should have. The MSSP that can actually understand a client’s infrastructure at the systems level, map the shadow IT, and explain the risk in terms the CFO cares about, that’s not a commodity service. That’s the only thing standing between a mid-size company and a breach that makes the news.
Software companies selling security products face the same shift. You can’t compete on features when every competitor can ship features at the same speed. The moat is understanding the customer’s environment deeply enough to catch the threats their own developers introduced by accident. That’s domain expertise, not implementation speed.
Deployment is a human problem
The other thing that generic career advice gets wrong is treating deployment like a technical step. “Learn CI/CD” is on every career advice list, as if the hard part of getting software into production is the pipeline.
The hard part is the night shift supervisor at a hospital who has to change a workflow she’s been doing the same way for twelve years. The hard part is the warehouse floor manager who needs to trust that the new inventory system won’t lose a shipment during peak season. The hard part is the VP who approved the project but didn’t approve the timeline, and now someone has to sit in a room and explain why the cutover can’t happen on a Friday afternoon before a holiday weekend.
CI/CD is plumbing. Deployment is convincing twenty people across four departments that the change is safe, the training is adequate, and the rollback plan actually works. AI doesn’t do any of that. It can’t read the room during a go-live meeting. It can’t tell that the operations manager’s “sounds good” means “I don’t understand this and I’m going to resist it passively for six weeks.”
The people who are good at this, who can translate between technical reality and organizational politics, are the ones who actually get systems into production. They were never the ones writing the code. They were the ones making sure the code mattered.
What actually holds up
I’ve been running Linux for over thirty years. I’ve watched entire categories of jobs appear and disappear. The pattern isn’t mysterious.
The people who understood what was happening on the machine always landed fine. Not because they were smarter, but because they were solving a different kind of problem. The kind that doesn’t go away when the tools change. Understanding why a system is slow isn’t a framework skill. Debugging a network partition under pressure isn’t something you learn from a tutorial. Knowing that a “simple” schema change will lock a table for twenty minutes in production, that’s experience, and it’s the kind of experience that gets more valuable when more people are building things without it.
The Unix philosophy keeps winning for the same reason. Small, composable tools. Understanding your inputs and outputs. Knowing what’s actually happening instead of trusting the abstraction. The AI coding agents themselves are built on this. They shell out to grep, pipe text through parsers, read files one at a time. Under the hood, the most advanced AI tools in the industry are just calling the same Unix primitives that have been there since the 1970s. When everyone can generate code, the person who understands what the code is actually doing, and what it’s running on, has the leverage.
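The “shell out to grep” claim is concrete, not a metaphor. A sketch of the pattern (not any particular agent’s code; the tool name and helper are assumptions for illustration): the “search the codebase” tool a coding agent exposes is often a subprocess call to grep, with plain-text stdout piped back for parsing.

```python
import os
import subprocess
import tempfile

# Illustrative source file for the search tool to scan.
source = "def handler():\n    pass\n\ndef main():\n    handler()\n"
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(source)
    path = f.name

def search(pattern: str, filename: str) -> list[str]:
    # grep -n prints matching lines prefixed with line numbers; the
    # caller parses that plain text, the same interface grep has
    # offered since the 1970s.
    result = subprocess.run(
        ["grep", "-n", pattern, filename],
        capture_output=True, text=True,
    )
    return result.stdout.splitlines()

matches = search("def ", path)
os.unlink(path)
print(matches)  # ['1:def handler():', '4:def main():']
```

Small tool, text in, text out, composable with whatever runs next. The abstraction on top is new; the primitive underneath is not.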
Coding on credit cuts both ways. AI handles solved problems well. But the flood of AI-generated software is creating new, unsolved problems at a rate that outpaces the old ones getting automated away. Someone has to understand the systems that nobody designed on purpose, the ones that emerged from a dozen teams building independently with tools they didn’t fully understand.
That’s the job. It was always the job.
The tell
The panic tells on itself. The people doing it are the ones who describe their job in terms of tools. “I’m a React developer.” “I write Python.” “I do full-stack JavaScript.” When the tool gets commoditized, the identity crisis follows.
The people who describe their job in terms of problems don’t panic. “I keep the billing system running.” “I make sure patient data doesn’t leak.” “I figure out why the warehouse picks are wrong on Tuesdays.” Those jobs didn’t get easier. They got harder, because now there’s more software between them and the answer, and less of it was built by someone who understood the system it plugs into.
The career everyone is mourning, the “I implement features in whatever framework is current” career, was a temporary artifact of an industry growing faster than it could train people. AI didn’t kill it. Time did. AI just made the timeline shorter.
Writing code was never the job. Understanding the system was. More software doesn’t change that. It just raises the stakes.
What’s your job, really? Not your title. The problem you actually solve.