I’d believe AI will replace human programmers when I can tell it to produce the code for an entire video game from a single prompt: one that can stand up to the likes of New Vegas, has zero bugs, and offers hundreds of hours of content on a first playthrough thanks to vast exploration.
In other words, I doubt we’ll see human programmers going anywhere any time soon.
Edit:
Reading other replies reminded me how I once, for fun, tried using a jailbroken Copilot program to do Python stuff slightly above my already-basic coding skill, and it gave me code that tried importing something that absolutely doesn’t exist. I don’t remember what it was called since I deleted the file while cleaning up my laptop the other day, but I sure as hell looked it up before deleting it and found nothing.
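It looked roughly like this, though the module name below is made up since I can't remember the real one:

```python
# Rough reconstruction of the failure mode. "fastcsvkit" is a placeholder,
# NOT the actual package the tool hallucinated; the real one was just as
# plausible-sounding and just as nonexistent.
import fastcsvkit  # pip install fastcsvkit -> "No matching distribution found"

def summarize(path):
    # The generated code called the phantom library with total confidence.
    table = fastcsvkit.read(path)
    return table.describe()
```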
Honestly, GPT has strengthened my coding skills… for the simple reason that, the handful of times I’ve asked it to do something, the response I get back is so outlandish that I go “That CAN’T be right” and figure out how to do it myself…
Research with extra steps… I get it, but still…
I feel like it’s whispering bad advice at me while I’m typing. It’s good at autocompleting the most rudimentary stuff, but I have a hard time imagining it completing even one file without injecting dangerous bugs, let alone handling a large refactor.
The best Copilot can do is autofill lines that everyone’s written a million times. That’s not nothing, but it ain’t replacing a human brain any time soon.
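Stuff like this, basically (just an illustrative example I'm making up, not any specific project): generic boilerplate that needs zero project context.

```python
# The kind of line-level boilerplate autocomplete handles fine:
# written a million times, no project-specific knowledge required.
import json

def load_config(path: str) -> dict:
    """Load a JSON config file into a dict."""
    with open(path) as f:
        return json.load(f)

if __name__ == "__main__":
    config = load_config("config.json")
    print(config)
```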
Could you imagine Microsoft replacing Windows engineers with a ChatGPT prompt? What would that prompt even look like?
To be honest, this could be an example of where AI could do marginally better. I don’t mean that because of code quality or functionality; I mean it in the sense that MS software got absolutely fucked by the internal competition and stack-ranking fostered during the Ballmer years. The APIs are inconsistent, and there’s a ton of partially implemented stuff that will never be pushed to completion because everyone who worked on it was fired.
An AI might be able to implement things without intentionally sabotaging itself, but since LLMs are at the forefront of what would actually be used, and they have no capacity for intention or for understanding context, I’m a bit pessimistic.
No matter the human expense, the green line must go up.