zlacker

[return to "Cloudflare builds OAuth with Claude and publishes all the prompts"]
1. infini+X7 2025-06-02 15:11:10
>>gregor+(OP)
From this commit: https://github.com/cloudflare/workers-oauth-provider/commit/...

===

"Fix Claude's bug manually. Claude had a bug in the previous commit. I prompted it multiple times to fix the bug but it kept doing the wrong thing.

So this change is manually written by a human.

I also extended the README to discuss the OAuth 2.1 spec problem."

===

This is super relatable to my experience trying to use these AI tools. They can get halfway there and then struggle immensely.

2. myster+ra 2025-06-02 15:24:15
>>infini+X7
This, to me, is why I think these tools don't have actual understanding, and are instead producing emergent output from pooling an incomprehensibly large set of pattern-recognized data.
3. diggan+0b 2025-06-02 15:26:43
>>myster+ra
> these tools don't have actual understanding, and are instead producing emergent output from pooling an incomprehensibly large set of pattern-recognized data

I mean, setting aside the fact that there's no consensus on what "actual understanding" even is, does it matter whether it's "actual understanding", "kind of understanding", or even "barely understanding", as long as it produces the results you expect?

4. scepti+jc 2025-06-02 15:35:48
>>diggan+0b
> as long as it produces the results you expect?

But it's more the case of "until it doesn't produce the results you expect" and then what do you do?

5. diggan+gd 2025-06-02 15:40:18
>>scepti+jc
> "until it doesn't produce the results you expect" and then what do you do?

I'm not sure I understand what you mean. You're asking it to do something, and it doesn't do that?

6. dingnu+Bs 2025-06-02 17:06:46
>>diggan+gd
If you give an LLM a spec for a new language and no examples, it can't write in that language.

Until one can, I think we've demonstrated that they do not have understanding or abstract thought. They NEED examples in a way humans do not.
