zlacker

[return to "Cloudflare builds OAuth with Claude and publishes all the prompts"]
1. paxys+A6[view] [source] 2025-06-02 15:04:53
>>gregor+(OP)
This is exactly the direction I expect AI-assisted coding to go in. Not software engineers being kicked out and some business person pressing a few buttons to have a fully functional app (as is playing out in a lot of fantasies on LinkedIn & X), but rather experienced engineers using AI to generate bits of code and then meticulously reviewing and testing them.

The million-dollar (perhaps literally) question is: could @kentonv have written this library quicker by himself, without any AI help?

◧◩
2. bigstr+N9[view] [source] 2025-06-02 15:21:24
>>paxys+A6
> but rather experienced engineers using AI to generate bits of code and then meticulously testing and reviewing them.

My problem is that (in my experience, anyway) this is slower than just writing the code myself. That's why AI is not a useful tool right now. These tools only get it right some of the time, so it winds up being easier to just do it yourself in the first place. As the saying goes: bad help is worse than no help at all, and AI is bad help right now.

◧◩◪
3. motore+mR1[view] [source] 2025-06-03 04:33:12
>>bigstr+N9
> My problem is that (in my experience anyways) this is slower than me just writing the code myself.

In my experience, the only times LLMs slow down your task are when you don't use them effectively. For example, if you provide barely any context or feedback and you prompt an LLM to write you the world, of course it will output unusable results, primarily because it will be forced to interpolate and extrapolate across the missing context.

If you take the time to learn how to gently prompt an LLM into doing what you need, you'll find it makes you far more productive.
