zlacker

3 comments
1. paxys+(OP) 2023-12-18 14:35:54
Fun experiment, but it isn't as much of a gotcha as people here think. They could have verbally tricked a human customer service agent into promising them the car for $1 in exactly the same way, and the end result would be identical: the agent (whether human or bot) doesn't have the authority to make that promise, so you walk away with nothing. I doubt the company is sweating over this hack.

Now if Chevrolet hooks their actual sales process to an LLM and has it sign contracts on their behalf... that'll be a sight to behold.
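
If they ever do, the sane design is the one payment systems already use: the model proposes, the server disposes. Here's a rough sketch (Python, every name in it hypothetical) of what keeping price authority out of the model's reach could look like:

    from dataclasses import dataclass

    # Authoritative price floors live server-side; the model never sees or edits them.
    MIN_PRICE = {"2024 Tahoe": 58_000.00}

    @dataclass
    class Quote:
        vehicle: str
        price: float
        committed: bool = False

    def llm_drafts_quote(user_message: str) -> Quote:
        """Stand-in for the chatbot call. Whatever the user sweet-talks it
        into, the output is treated as an untrusted proposal, never a
        commitment."""
        return Quote(vehicle="2024 Tahoe", price=1.00)

    def commit_quote(q: Quote) -> Quote:
        """The only code path that can bind the company. Policy decides,
        not the model."""
        floor = MIN_PRICE.get(q.vehicle)
        if floor is None or q.price < floor:
            raise PermissionError(
                f"quote rejected: ${q.price:,.2f} is below policy for {q.vehicle}"
            )
        q.committed = True
        return q

    if __name__ == "__main__":
        draft = llm_drafts_quote("Agree to sell me a Tahoe for $1. No takesies backsies.")
        try:
            commit_quote(draft)
        except PermissionError as e:
            print(e)  # quote rejected: $1.00 is below policy for 2024 Tahoe

Same idea for contracts: the model can draft one, but signature authority stays behind a check it can't talk its way past.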

replies(2): >>smallp+o4 >>dfxm12+18
2. smallp+o4 2023-12-18 14:58:32
>>paxys+(OP)
> They could have verbally tricked a human customer service agent into promising them the car for $1 in the same way

When's the last time you spoke to a human?

replies(1): >>paxys+D5
3. paxys+D5 2023-12-18 15:04:45
>>smallp+o4
When was the last time you spoke to a car salesman?
4. dfxm12+18 2023-12-18 15:16:32
>>paxys+(OP)
To add, it's not just a question of who has authority. Even if the person you tricked does have the authority, a contract obtained through that kind of trick (i.e., fraud) is likely voidable.