First DALL-E came for the artists, and we laughed, because artists are worthless.
Then ChatGPT came for the writers, and we laughed even harder, because those people are not only worthless but actively detrimental to society.
Then GitHub Copilot (and a thundercloud of bigger, better AI models on the horizon) came for programmers like us—and we were prepared, because automation doesn’t make US obsolete, oh no. It only makes us FASTER and BETTER and lets us focus on MORE IMPORTANT THINGS.
OK, that isn’t how every software developer thinks about AI. But … it’s a pretty common subtext, right?
As evidence I submit two Hacker News discussions.1 This one (until it was flagged) was about an article that bemoaned AI’s existential threat to human creatives. The top responses to this article were variants on:
Art is just personal expression anyway, like a child’s finger painting; humans don’t need or deserve commercial opportunities to create art.
Technology marches on, jobs become obsolete. Suck it up and deal, artists.
Haha, this article sounds like it was written by ChatGPT. (Implied: you’ve already been replaced, wordcel.)
Any mediating voices were heavily downvoted.
The second discussion was about an ACM op-ed titled “The End of Programming”. The article predicts that “classical computer science” (algorithms, programming language design, and so on) will become obsolete2 because future builders will tease nondeterministic AI models into producing software instead of writing actual code.
In this view, Copilot is not automating you out of a job, it’s doing something much more demoralizing: it’s automating you into a job, a strange and scary new job made up of prompt engineering and model-stirring that doesn’t feel much like slinging code at all.
The reactions to this article were… a little different. They were not dismissive; they were ENRAGED. Everybody agreed that “The End of Programming” was ridiculous, but no two programmers could converge on why. Here are some sample complaints:
The author of the article, Matt Welsh, cannot be taken seriously because he currently runs an AI-tooling startup, which gives him a financial motive to push his ideas. (No commenter seemed concerned that as a programmer, they might have an equally strong financial motive to suppress his ideas.)
Writing code is different and more exacting than other kinds of knowledge work, and AI will never be able to sit between humans and code in a trustworthy fashion. (“Would YOU trust the design of a life-support system to an LLM?”)
Actually, programming itself is often trivial; it’s the LEAST interesting and important part of software engineering, and giving that part of the job to an AI doesn’t change much about the need for a professional software engineer’s input.
Those last two complaints, you will note, apparently contradict each other—which, again, nobody seemed to realize.
So—on the one hand, we have a bunch of shape rotators feeling pretty cavalier about the destruction of human creative opportunities, if not actively welcoming it. On the other hand, they insist that AI won’t upend their own very different and special jobs.
I went to the take store to pick up my own take on this, but they were all out of takes because the threadbois had grabbed them up like shoppers buying all the milk and bread before a half-inch snowfall.
But it seems like maybe the two contradictory HN arguments can be resolved like this:
SOME software engineering jobs are extra-finicky and likely to involve human developers writing hands-on code for the foreseeable future.3
MANY software engineering jobs can have LOTS and LOTS of the “code” part generated by AIs. It probably won’t be great code at first. It will probably have lots of bugs. But it will be incredibly fast and increasingly cheap, and that’s gonna look appealing to both engineers and their bosses.
Hey, that’s automation for you. It Frees Us Up To Focus On More Important Things. We said the same thing about the cloud and about high-level programming languages and probably about abacuses.4 But not to worry; even the foretold AI developer-bots will need human engineers alongside them.5
That’s why Matt Welsh’s article isn’t called “The End of Software Engineering.” For all I know, the demand for people called “software engineers” will increase. The question is what those people will do. Welsh thinks they’ll be like educators, shouting examples at AI models the way substitutes teach unruly children. My friend Ben, in his own takedown of the “End of Programming” article, suggests that they might be something like product managers, translating user needs and business requirements for an AI dev team. (You know, wordcel stuff!)
But programming—well, programming is a different story. We still have mechanical engineers, but they don’t employ roomfuls of draftspeople drawing straight lines anymore; they use AutoCAD. Is programming—or, if you like, “hacking” in the “Hackers and Painters” sense6—headed the way of draftsmanship, an outdated craft replaced by AI assistance?
And that brings us to the existential question: If AI really lets you work faster and focus on more important things, but those things aren’t programming, will you be happy?7
Sure, we spend lots of time complaining about the soul-sucking slog of coding, the StackOverflow tabs and the tooling sprawl and all the rest. But just stop for a minute and think about what you love about programming.
Think about the first language you ever loved—Lisp or ANSI C or Haskell or Python or ASP.NET or Turbo Pascal—inscrutable to most people, beautiful to you. Imagine digging into an algorithm on a whiteboard or a scrap of copy paper, feeling yourself disappearing deeper and deeper into the abstractions until you too become almost a symbolic construct. Inhabit that feeling of flow after you get the boilerplate code laid out and things begin to happen: screens are populating and commands are parsing and now you see a better way to modularize everything so you abstract out some components and feel the design click into place like the little tumblers in a safe. All built on the bedrock confidence that beneath the layers of abstraction, when you tell a computer to do something, it will do EXACTLY what you said—and if it doesn’t, then you or somebody in your toolchain has made a mistake and with enough patience you can trace that bug right back to its source and mount its thorax on a little pin. That profound satisfaction when the bug is dead and logic triumphs again. The pride of creation. The joy of self-expression in the medium you love.
Yeah, you were lucky enough to live in that world, and if you were in the right place at the right time you even got paid unimaginable amounts of money to do it. How will you feel when it’s all over?
Because here come the business bois cranking out AI-assisted apps by the thousands. Breaking that fundamental link between input and output: if you don’t like what the five-quintillion-parameter model spit out, write another prompt and hope you held your tongue the right way this time. Of course it’s mediocre and buggy and unreliable, of course it’s worse than your handcrafted code. How could it not be? But it’s cheap. And it’s fast. So it’s winning.
You might still program by hand, the old-fashioned way, just for the sake of personal enjoyment. You might even open-source some stuff, like a child sticking its finger paintings on the refrigerator. But is that really enough? Will that really give you the joy and pride you felt when you knew that doing what you loved most really mattered in the world?
Well, now you know how the artists feel.
Anyway, the other thing that AI is going to do is turn a lot of people into shape rotators in title but wordcels in function—and probably in salary too. It might be good to start getting mad at that idea now, so you’re accustomed to it when it gets real.
In the meantime, here’s at least one game AI isn’t likely to beat us at anytime soon:
Links and events
I got pretty worked up on this podcast about how tech is failing entry-level applicants, particularly those from nontraditional backgrounds. TL;DL: the term "skills gap" is a rhetorical trick that shifts the blame onto newbies. The actual problem is an "experience gap", and that puts the blame right back on the industry for not providing onramps to experience for juniors.
On Wednesday I’m joining Lucy Wang for a chat about learning cloud in 2023, including a walkthrough of the Cloud Resume Challenge.
Also, my 30-minute “set” from DevOps Enterprise Summit is now online, featuring a grand piano, a one-man rap battle, and some actual serious points, too.
Just for fun
You wouldn’t be worried about AI at all if you were as hardcore as this:
1. Hacker News is a questionable source for a lot of things, but it’s a great window into how a certain subset of Silicon Valley-centric tech types think about reality.
2. “Obsolete” is relative, I guess. As someone with both a bachelor’s and a master’s degree in “classical computer science”, I should say that compiler design and microprocessor architecture rarely came up in my professional software engineering jobs, though I was always grateful for the background knowledge.
3. The term “data-driven” takes on a scary connotation if what’s being driven by the data is your Tesla.
4. Abaci?
5. And ops. Always ops.
6. I realize this is now the second footnote disclaiming something created by Paul Graham, but “Hackers and Painters” (the essay, not the book) has aged well and bears re-reading. I think it’s a glimpse of truth.
7. Would a painter be happy if you told them that using Midjourney or StableDiffusion would let them spend more time on, oh I don’t know, having meetings with their copyright lawyers?
Comments

Just wanted to say that I’m so thankful for this article. Even though I’m just a passionate amateur, by no means a professional in either knowledge or experience, I feel what was written here heavily. I tried Bing a few times, and while it’s a nice treat for a while, it sucks the love out of what you do in the long run. It’s just like the image generators: a toy built for dopamine hits, which can sometimes do something useful, if you lie to yourself that you’ve learned how to prompt it properly.
But what’s more important is the philosophical side of this issue, which so few people notice. Thankfully, besides programming, I also love making art and reading philosophy, and I’m involved in sociology. This whole AI hype is beyond terrifying, not because it will take jobs, but because it will render some of the most joyous things in the world lifeless. It will take away any incentive to do art or coding, to learn lifelong lessons of patience, and to communicate with the world, because who will choose the difficult road when the easy one is just two clicks and one “register” button away?
And even though the same could be said about high-level languages, visual coding, and so on, they are not the same. They lure you in; they are limited; they motivate you by making it easier to conceptualize things and bring them to life. Then you start diving deeper, because your ambitions outgrow that slow Python and Elixir code. So the easier route is really just a trap, laid for you, to make you fall in love with programming as a whole.
I don’t see this being the outcome of AI. At all.
Why would software engineering still be needed at all? Did KITT have an infotainment touchscreen? Did Jarvis render to Tony’s visor according to specs agreed on in last December’s meeting? The truth is, GPT-5 will spell the end of software entirely, and GPT-6 will design its own chips (Nvidia already did this just this month, albeit to a very limited extent; baby steps). It will be direct natural language <=> hardware. Companies that still have “engineers” and “sprints” will be driven out of business due to inefficiency.