Be careful what you wish for.
Just wanted to say that I'm so thankful for this article. Even though I'm mostly a hobbyist programmer, by no means a professional in either knowledge or experience, I feel what was written here deeply. I tried Bing a few times, and while it's a nice treat for a while, it sucks the love out of what you do in the longer run. It's just like image generators: a toy built for dopamine hits, which can sometimes do something useful, if you lie to yourself that you've learned how to prompt it properly.
But what's more important is the philosophical part of this issue that so few people notice. Thankfully, besides programming, I also love making art, reading philosophy, and being involved in sociology. The whole AI hype is beyond terrifying, not because it will take jobs, but because it will drain the life out of some of the most joyous things in the world. It will take away any incentive to do art or coding, to learn lifelong lessons of patience and to communicate with the world, because who will choose the difficult road when the easy one is just two clicks and one "register" button away?
And even though the same can be said about high-level languages, visual coding, and so on, they are not the same. They lure you in, they are limited, and they give you motivation by making it easier to conceptualize and bring things to life. Then you start diving deeper, because your ambitions outgrow slow Python and Elixir code. And so the easier route is actually just a trap, designed to make you fall in love with programming as a whole.
I don't see this being the outcome of AI. At all.
Why would software engineering still be needed at all? Did KITT have an infotainment touchscreen? Did Jarvis draw to Tony's visor according to specs agreed on at last December's meeting? The truth is, GPT-5 will completely end software as we know it, and GPT-6 will design its own chips (Nvidia already did this just this month, albeit to a very limited extent; baby steps). It would be direct natural language <=> hardware. Companies that still have "engineers" and "sprints" will be driven out of business by their inefficiencies.
As a professional programmer (and therefore lying sleepless at night over the doomsday promised by ChatGPT), I also wonder: who will run all these quintillion-parameter number-crunching setups? Do we really think that garage-based, short-of-funds companies will be able to compete with such Goliaths? Eventually, the capability to spit out answers to everything will be concentrated in a few hands. Is that a good thing?
I agree with the idea that automation (supported by AI) is, in general, a way to make tech more humane by eliminating the dull elements of human labour. Generative AI is a great way to arrive at results or ideas you could not have planned for, e.g. in manufacturing, where AI finds more efficient product designs by reducing surface areas through weird shapes and interconnections. And so on: the AI just goes through many, many hops to present something a human can improve and put to practical use, which is not (yet) something AI itself is necessarily good at. We will see how well this statement ages in the coming years, but hey, it's January 2023 right now. And all of this applies to code as well, or to the act of coding.
"Our Robots, Ourselves" by David A. Mindell might be worth checking out. I think the drive for profit alone leads to crap code ("The Craftsman" by Richard Sennett), so it makes sense that it would also lead to AI. But as Mindell shows in his book and work, "automation" doesn't actually automate: because of (fight me all you want) Gödel's incompleteness theorem, there will always be gaps, gaps that can only be filled by humans. Thus, automation "creates" jobs; it doesn't replace them.
Regulation, legal concerns around AI-generated code, and cost will be big factors in all of this. The simplest point is that ChatGPT will not be free in the long term. When it costs actual money to run the thousands of queries it would take to build real software, human devs might start to seem cheap.