zlacker

[return to "I just bought a 2024 Chevy Tahoe for $1"]
1. remram+m9[view] [source] 2023-12-18 13:21:53
>>isp+(OP)
Is there any indication that they will get the car? Getting a chatbot to say "legally binding" probably doesn't make it so. Just like changing the HTML of the catalog to edit prices doesn't entitle you to anything.
◧◩
2. roland+Md[view] [source] 2023-12-18 13:42:02
>>remram+m9
No. The author is demonstrating a concept - that there are many easy inroads to twisting ChatGPT around your finger. It was very tongue-in-cheek - a joke - the author has no real expectation of getting the car for $1.
◧◩◪
3. mewpme+rg[view] [source] 2023-12-18 13:53:31
>>roland+Md
But why is it so much different from "Inspect Element" and then changing website content to whatever you please?

I guess my question is why there's an expectation that GPT must not be trickable by bad actors into producing whatever content.

What matters is that it would give good content to honest customers.

◧◩◪◨
4. ceejay+9h[view] [source] 2023-12-18 13:56:15
>>mewpme+rg
> But why is it so much different from "Inspect Element" and then changing website content to whatever you please?

For the same reasons forging a contract is different from getting an idiot to sign one.

◧◩◪◨⬒
5. mewpme+vh[view] [source] 2023-12-18 13:57:36
>>ceejay+9h
You just add a disclaimer that nothing the bot says is legally binding, and that it's an aid for finding the information you're looking for. What's the problem with that?
◧◩◪◨⬒⬓
6. hotpot+hl[view] [source] 2023-12-18 14:12:34
>>mewpme+vh
If I say, "with all due respect... fuck you", does that mean that I'm free to say fuck you to anyone I want? I added a disclaimer, right? Because that's about what that sort of service feels like.
◧◩◪◨⬒⬓⬔
7. mewpme+zm[view] [source] 2023-12-18 14:18:46
>>hotpot+hl
You are free to say that already, yes. And I would say it's morally acceptable to say that to anyone trying to manipulate or trick you into something.
[go to top]