zlacker

1. noname (OP) 2022-12-15 14:10:48
The entire history of computer programming is one of using code-generation tools to raise the level of abstraction most programmers work at. Having yet another one of those doesn't present any realistic chance of replacing all of the development, testing, maintenance, and refinement of that entire stack. If your job is literally just being handed a paragraph-long written requirement for a single function or short script, giving back that function/script, and being done, then sure, worry.

But every job I've had so far has also entailed understanding the entire system, the surrounding ecosystem, upstream and downstream dependencies and interactions, the overall goal being worked toward, and playing some role in coming up with the requirements in the first place.

ChatGPT currently can't even update its fixed-in-time knowledge state, which is based entirely on public information. That means it can't write a conforming component of a software system that relies on any internal APIs: it won't know your codebase if that codebase wasn't in its training set. You can include the API in the prompt, but then that's still a job for a human with some understanding of how software works, isn't it?
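To make that last point concrete, here's a minimal sketch of the "paste the internal API into the prompt" workflow. The internal function, its doc string, and the helper names are all hypothetical, and the actual model call is left out; the point is only that a human has to select and supply the relevant internal signatures.

```python
# Sketch: giving a model an internal API it never saw in training by
# pasting the signature into the prompt. All names here are hypothetical.

INTERNAL_API_DOC = """\
def get_invoice(invoice_id: str) -> dict:
    \"\"\"Fetch an invoice record from our internal billing service.\"\"\"
"""

def build_prompt(task: str, api_doc: str) -> str:
    # A human still has to decide WHICH internal APIs are relevant and
    # paste their signatures in -- that selection is the judgment step.
    return (
        "You are writing code against the following internal API:\n"
        f"{api_doc}\n"
        f"Task: {task}\n"
    )

prompt = build_prompt("Sum the totals of a list of invoice ids.", INTERNAL_API_DOC)
print(prompt)  # the model only "knows" whatever we chose to paste in
```

The model's output then still has to be checked against the real codebase, which is exactly the kind of work the parent comment is describing.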
