zlacker

[parent] [thread] 34 comments
1. remram+(OP)[view] [source] 2023-12-18 13:21:53
Is there any indication that they will get the car? Getting a chatbot to say "legally binding" probably doesn't make it so. Just like changing the HTML of the catalog to edit prices doesn't entitle you to anything.
replies(4): >>roland+q4 >>GTP+G4 >>bumby+Cb >>paxys+Ui
2. roland+q4[view] [source] 2023-12-18 13:42:02
>>remram+(OP)
No. The author is demonstrating a concept - that there are many easy inroads to twisting ChatGPT around your finger. It was very tongue in cheek - a joke - the author has no true expectation of getting the car for $1.
replies(2): >>mewpme+57 >>remram+fC1
3. GTP+G4[view] [source] 2023-12-18 13:43:04
>>remram+(OP)
Sure, they will never get the car for 1$, but this is one way of pointing out problems of LLMs and why those aren't ready to substitute humans, like e.g. someone working in sales.
◧◩
4. mewpme+57[view] [source] [discussion] 2023-12-18 13:53:31
>>roland+q4
But why is it so much different from "Inspect Element" and then changing website content to whatever you please?

I guess why is there an expectation that GPT must be not trickable by bad actors to produce whatever content.

What matters is that it would give good content to honest customers.

replies(1): >>ceejay+N7
◧◩◪
5. ceejay+N7[view] [source] [discussion] 2023-12-18 13:56:15
>>mewpme+57
> But why is it so much different from "Inspect Element" and then changing website content to whatever you please?

For the same reasons forging a contract is different from getting an idiot to sign one.

replies(1): >>mewpme+98
◧◩◪◨
6. mewpme+98[view] [source] [discussion] 2023-12-18 13:57:36
>>ceejay+N7
You just add a disclaimer that none of what the bot says is legally binding, and it's an aid tool for finding the information that you are looking for. What's the problem with that?
replies(4): >>ceejay+59 >>bumby+lb >>hotpot+Vb >>axus+wm
◧◩◪◨⬒
7. ceejay+59[view] [source] [discussion] 2023-12-18 14:01:18
>>mewpme+98
Do we want to turn customer service over to "this might all be bullshit" generators? Imagine coming into the showroom, agreeing on a price for a car, doing all the paperwork, and having them tell you that wasn't legally binding because of some small print somewhere?
replies(2): >>mewpme+ra >>hattma+zc
◧◩◪◨⬒⬓
8. mewpme+ra[view] [source] [discussion] 2023-12-18 14:07:17
>>ceejay+59
I think that's a very simplified view of all of it.

Customer service has to be different levels of help tools. And current AI tools must be tested first in order for us to be able to improve them.

You have limited resources for Customer Support, so it's good to have filtering systems in terms of Docs, Forms, Search, GPT in front of the actual Customer Support.

To many questions a person will find an answer much faster from the documentation/manual itself than calling support. To many other types of questions it's possible LLM will be able to respond much more quickly and efficiently.

It's just a matter of providing this optimal pathway.

You don't have to think of Customer Support LLM as the same thing as a final Sales Agent.

You can think of it as a tool, that should have specialized information fed into it using embeddings or training and will be able to spend infinite time with you, to answer any stupid questions that you might have. I find I have much better experience with Chatbots, as I can drill deep into the "why's" which might otherwise annoy a real person.

◧◩◪◨⬒
9. bumby+lb[view] [source] [discussion] 2023-12-18 14:10:13
>>mewpme+98
Anytime a solution to a potentially complex problem is to the tune of "all you've got to do is..." may be an indicator that it's not a well thought out solution.
replies(3): >>mewpme+Ed >>mewpme+qg >>mewpme+Qi
10. bumby+Cb[view] [source] 2023-12-18 14:11:03
>>remram+(OP)
Can software legally enter into a contract on behalf of a natural/legal person?
replies(4): >>vharuc+Mq >>red-ir+Ht >>henry2+CA >>kube-s+xB
◧◩◪◨⬒
11. hotpot+Vb[view] [source] [discussion] 2023-12-18 14:12:34
>>mewpme+98
If I say, "with all due respect... fuck you", does that mean that I'm free to say fuck you to anyone I want? I added a disclaimer, right? Because that's about what that sort of service feels like.
replies(1): >>mewpme+dd
◧◩◪◨⬒⬓
12. hattma+zc[view] [source] [discussion] 2023-12-18 14:15:36
>>ceejay+59
That's pretty much what happens anytime you buy a car though. There's always some other bullshit fees even if you get incredibly explicit and specify this is the final price with no other charges. They are going to try to force stuff on and unless you are incredibly vigilant and uncompromising. It sucks when you have to drive hours away just to leave in your old car.
replies(1): >>mewpme+ke
◧◩◪◨⬒⬓
13. mewpme+dd[view] [source] [discussion] 2023-12-18 14:18:46
>>hotpot+Vb
You are free to say that already, yes. And I would say it's morally acceptable to say that to anyone trying to manipulate or trick you into something.
◧◩◪◨⬒⬓
14. mewpme+Ed[view] [source] [discussion] 2023-12-18 14:21:23
>>bumby+lb
That makes no sense at all. There's plenty of inventions and tech that has come to life throughout history, where you had to do or consider something in order to use it.
replies(1): >>bumby+Sf
◧◩◪◨⬒⬓⬔
15. mewpme+ke[view] [source] [discussion] 2023-12-18 14:24:38
>>hattma+zc
And actually based on my experience, customer sales agents, whether it's real estate or cars are notoriously dishonest. They may not hallucinate perhaps, but they leave facts unsaid, they will word things in such a way as to get you to buy something rather than get you to do the best decision - sometimes the decision could be not to buy anything from them.

So a ChatBot that can't intentionally lie or hide things could actually be an improvement in such cases.

◧◩◪◨⬒⬓⬔
16. bumby+Sf[view] [source] [discussion] 2023-12-18 14:32:09
>>mewpme+Ed
This response is confusing. The point isn’t “considering something is worthless” but rather “considering something superficially tends to lead to poor outcomes”
◧◩◪◨⬒⬓
17. mewpme+qg[view] [source] [discussion] 2023-12-18 14:35:14
>>bumby+lb
> This response is confusing. The point isn’t “considering something is worthless” but rather “considering something superficially tends to lead to poor outcomes”

Replying here as the thread won't allow for more. But I'm not following what you are meaning then.

I'm not seeing the outcome from Chevy being poor, any more than "inspect element" would be poor.

replies(1): >>bumby+Ch
◧◩◪◨⬒⬓⬔
18. bumby+Ch[view] [source] [discussion] 2023-12-18 14:40:57
>>mewpme+qg
The thread will allow replies given a delay that’s sufficient to try to avoid knee-jerk responses. Pretty ironic (or telling) that you responded in this way given the context of the discussion.
◧◩◪◨⬒⬓
19. mewpme+Qi[view] [source] [discussion] 2023-12-18 14:47:25
>>bumby+lb
> The thread will allow replies given a delay that’s sufficient to try to avoid knee-jerk responses. Pretty ironic (or telling) that you responded in this way given the context of the discussion.

You are right - it does seem to allow. But I'm not sure what you exactly mean after 20 minutes as well.

replies(1): >>bumby+9u
20. paxys+Ui[view] [source] 2023-12-18 14:47:32
>>remram+(OP)
It is as legally binding as you modifying the HTML of the sales page to show a lower price and taking a printout to court.
replies(2): >>tantal+Bm >>petese+eq
◧◩◪◨⬒
21. axus+wm[view] [source] [discussion] 2023-12-18 15:05:54
>>mewpme+98
Then they'd have to give up the farce that it's a real human chatting.
replies(1): >>mewpme+So
◧◩
22. tantal+Bm[view] [source] [discussion] 2023-12-18 15:06:24
>>paxys+Ui
So, criminal fraud?
◧◩◪◨⬒⬓
23. mewpme+So[view] [source] [discussion] 2023-12-18 15:17:31
>>axus+wm
How is it farce though? It says it's powered by ChatGPT as well as it has separate link to chat with a human.
◧◩
24. petese+eq[view] [source] [discussion] 2023-12-18 15:23:34
>>paxys+Ui
I mean there's always this: https://www.nasdaq.com/articles/updated-russian-man-turns-ta...
◧◩
25. vharuc+Mq[view] [source] [discussion] 2023-12-18 15:25:36
>>bumby+Cb
For contracts and sales, I don't see much of a difference between a Chatbot and a simple HTML form. If a person who's able to form contacts on behalf of a company set it up, then it can offer valid contracts. If you don't want the tool to make contracts, don't use technology that can offer them or accept ones from users.
◧◩
26. red-ir+Ht[view] [source] [discussion] 2023-12-18 15:37:30
>>bumby+Cb
if I can click "yes" on terms and agreements without any verification I am who I say I am... then possibly
◧◩◪◨⬒⬓⬔
27. bumby+9u[view] [source] [discussion] 2023-12-18 15:39:11
>>mewpme+Qi
Your original point was:

>You just add a disclaimer that none of what the bot says is legally binding

The combination of legality and AI can make for a complex and nuanced problem. A superficial solution like "just add a disclaimer" probably doesn't not capture the nuance to make for a great outcome. I.e., a superficial understanding leads us to oversimplify our solutions. Just like with the responses, it seems like you are in more of a hurry to send a retort than to understand the point.

replies(1): >>mewpme+cH1
◧◩
28. henry2+CA[view] [source] [discussion] 2023-12-18 16:04:29
>>bumby+Cb
Of course, anytime you pay send a wire from your e-banking, make a purchase online, subscribe to a streaming platform, etcetera. You and the counterparty are entering into a binding legal responsibility. Scenarios in which the two sides are software include trading algorithms.
replies(1): >>bumby+2E
◧◩
29. kube-s+xB[view] [source] [discussion] 2023-12-18 16:08:00
>>bumby+Cb
Can pen and paper legally enter into a contract?

The answer is that the tools aren't part of the contract. People make contracts, the tools aren't (usually) relevant.

In this case, I think this could potentially be missing a critical element of a valid contract "meeting of the minds"

◧◩◪
30. bumby+2E[view] [source] [discussion] 2023-12-18 16:17:09
>>henry2+CA
I think you're making a logical jump from a user-initiated contract to a software-as-a-legal-agent-initiated contract. Is there a legal basis for this point of view? To the point of another commenter, the means to enter a contract (pen/paper, by wire, etc.) shouldn't be conflated with the legal right.

For example, IANAL but I have the understanding that the agents of a legal person (e.g., corporation) are specified in legal formation. The CEO, board-of-directors, etc. Is software formally assigned such a role to act on behalf of a legal person?

◧◩
31. remram+fC1[view] [source] [discussion] 2023-12-18 21:04:01
>>roland+q4
Thanks. Yeah I suspected as much, but the title of the HN submission being what it is...
◧◩◪◨⬒⬓⬔⧯
32. mewpme+cH1[view] [source] [discussion] 2023-12-18 21:27:25
>>bumby+9u
I'm still not understanding the point though, 6 hours later.

Why can't it just be a tool for assistance that is not legally binding?

Also throughout this year I have thought about those problems, and to me it's always been weird how people have so much problems with "hallucinations". And I've thought about exact similar ChatBot as Chevy used and how awesome it would be to be able to use something like that myself to find products.

To me the expectations of this having to be legally binding, etc just seem misguided.

AI tools increase my productivity so much, and also people often make up things, lie, but it's even more difficult to tell when they do that, as everyone's different and everyone lies differently.

replies(1): >>bumby+J12
◧◩◪◨⬒⬓⬔⧯▣
33. bumby+J12[view] [source] [discussion] 2023-12-18 23:27:00
>>mewpme+cH1
>To me the expectations of this having to be legally binding, etc just seem misguided.

I think you're getting my point confused with a tangentially related one. Your point may be "chatbots shouldn't be legally binding" and I would tend to agree. But my point was that simply throwing a disclaimer on it may not be the best way to get there.

Consider if poison control uses a chatbot to answer phone calls and give advice. They can't waive their responsibility by just throwing a disclaimer on it. It doesn't meet the current strict liability standards regarding what kind of duty is required. There is such a thing in law as "duty creep," and there may be a liability if a jury finds it a reasonable expectation that a chatbot provides accurate answers. To my point, the duty is going to be largely context-dependent, and that means broad-brushed superficial "solutions" probably aren't sufficient.

replies(1): >>mewpme+EW2
◧◩◪◨⬒⬓⬔⧯▣▦
34. mewpme+EW2[view] [source] [discussion] 2023-12-19 08:55:23
>>bumby+J12
The topic wasn't about someone calling poison control, it was about bad actors trying to trick ChatBot into absurd contracts.
replies(1): >>bumby+dq3
◧◩◪◨⬒⬓⬔⧯▣▦▧
35. bumby+dq3[view] [source] [discussion] 2023-12-19 13:22:29
>>mewpme+EW2
I used that analogy because it’s painfully clear how it can go off the rails. The common thread is that legality isn’t simply waived in all cases. Legality is determined by reasonableness and, in some cases, by an expectation of duty. I don’t believe the Chevy example constitutes a contract but not for the reasons you’ve presented. Thinking you can just say “lol nothing here is binding but thanks for the money!” without understanding broader context is indicative of a cavalier attitude and superficial understanding.
[go to top]