zlacker

[return to "Cloudflare builds OAuth with Claude and publishes all the prompts"]
1. paxys+A6[view] [source] 2025-06-02 15:04:53
>>gregor+(OP)
This is exactly the direction I expect AI-assisted coding to go in. Not software engineers being kicked out and some business person pressing a few buttons to have a fully functional app (as is playing out in a lot of fantasies on LinkedIn & X), but rather experienced engineers using AI to generate bits of code and then meticulously reviewing and testing them.

The million-dollar (perhaps literally) question is – could @kentonv have written this library quicker by himself without any AI help?

◧◩
2. gokhan+na[view] [source] 2025-06-02 15:23:43
>>paxys+A6
> Not software engineers being kicked out ... but rather experienced engineers using AI to generate bits of code and then meticulously reviewing and testing them.

But what if you only need 2 kentonvs instead of 20 at the end? Do you assume we'll find enough new tasks to occupy the other 18? I think that's the question.

And the author is implementing a fairly technical project in this case. How about routine LoB app development?

◧◩◪
3. paxys+5b[view] [source] 2025-06-02 15:27:13
>>gokhan+na
Increased productivity means increased opportunity. There isn't going to be a time (at least not anytime soon) when we can all sit back and say "yup, we have accomplished everything there is to do with software and don't need more engineers".
◧◩◪◨
4. spider+4c[view] [source] 2025-06-02 15:34:23
>>paxys+5b
But there may well be a time, very soon, when humans no longer offer economic value to the software engineering process. If you could (and currently you can't) pay an AI $10k/year to do what a human could do in a year, why would you pay the human six figures? Or even $20k?

Nobody is claiming that humans won't have jobs simply because "we have accomplished everything there is to do". It's that humans will offer zero economic value compared to AI, because AI gets so good and so cheap.

◧◩◪◨⬒
5. paxys+Dc[view] [source] 2025-06-02 15:37:23
>>spider+4c
And there might be a giant asteroid that strikes the earth a few years down the line ending human civilization.

If there is some magic $10k AI that can fully replace a $200k software engineer then I'd love to see it. Until that happens this entire discussion is science fiction.

◧◩◪◨⬒⬓
6. spider+2f[view] [source] 2025-06-02 15:51:20
>>paxys+Dc
If experts were saying the asteroid would hit earth in the next 5 years, would it still be science fiction?

Acting like those two scenarios are the same is disingenuous. Fuck that.

◧◩◪◨⬒⬓⬔
7. paxys+Yf[view] [source] 2025-06-02 15:55:09
>>spider+2f
Remove all the "experts" who have a major conflict of interest (running AI startups, selling AI courses, wanting to pump their company's stock price by associating with AI) and you'll find that very few actual experts in the field hold this view.
◧◩◪◨⬒⬓⬔⧯
8. motore+312[view] [source] 2025-06-03 06:17:14
>>paxys+Yf
> Remove all the "experts" who have a major conflict of interest (...) and you'll find that very few actual experts in the field hold this view.

You might seek comfort in your conspiracy theories, but back in the real world the likes of me were already quite capable of creating complete and fully working projects from scratch using yesterday's LLMs.

We are talking about afternoons where you grab your coffee, say to yourself "let's see what this vibecoding thing is all about", and challenge yourself to create projects from scratch using nothing but a definition of done, LLM prompts, and a free-tier LLM configured to run in agent mode.

What, then?

You can then proceed to nitpick about code quality and bugs, but I can say the same thing about your work, which takes far longer to deliver.

[go to top]