zlacker

[return to ""I just bought a 2024 Chevy Tahoe for $1""]
1. Michae+kd[view] [source] 2023-12-18 13:40:04
>>isp+(OP)
I never understand people who engage with chat bots as customer service.

I find them deeply upsetting, not one step above the phone robot on Vodafone support: "press 1 for internet problems" ... "press 2 to be transferred to a human representative". Only problem is going through like 7 steps until I can reach that human, then waiting some 30 minutes until the line is free.

But it's the only approach that gets anything done. Talking to a human.

Robots are a cruel joke on customers.

◧◩
2. bradfa+Dd[view] [source] 2023-12-18 13:41:20
>>Michae+kd
I chatted with a chat bot this morning for getting reimbursed for a recalled product. It went fine. It was quick and easy. Chat bots type a lot faster than call center pay-grade humans.
◧◩◪
3. robvir+3g[view] [source] 2023-12-18 13:51:52
>>bradfa+Dd
I'll take a human any day. The number of times I've had a person say "Oh, I see. The system always does this," and suddenly my previously intractable problem disappeared, is staggering. Granted, experienced people are hard to find, but when false positives occur, a human is the only thing I've seen fix them. I need that.
◧◩◪◨
4. hiAndr+vi[view] [source] 2023-12-18 14:01:30
>>robvir+3g
If only there was a way to speak to a chat bot first, in order to filter out the 90/99/99.9/99.99% of problems that can be handled efficiently by the automaton, and then transfer to a human being for the most difficult tasks!
◧◩◪◨⬒
5. Animal+vj[view] [source] 2023-12-18 14:05:36
>>hiAndr+vi
If only there was a way to quickly bypass the chatbot when you knew you had a problem that needed a human.

But it was almost the same before chatbots. You got a human, but it was a human that had a script, and didn't have authority to depart from it. You had to get that human to get to the end of their script (where they were allowed to actually think), or else you had to get them to transfer you to someone who could. It was almost exactly like a chatbot, except with humans.

◧◩◪◨⬒⬓
6. bluGil+6n[view] [source] 2023-12-18 14:21:41
>>Animal+vj
Some of those humans had a script that was useful and thus worth going through - 99% of the time your issue is the same as the one everyone else is having. Maybe you check the obvious things before calling, like whether it is plugged in, but even then there are many common problems, and since you don't have the checklist, they need to go through it to see which item you forgot.

What humans do well, though, is listen - the 1 minute explanation often gives enough clues to skip 75% of the checklist. Every chatbot I've worked with ends up failing because I use some word or phrasing in my description that wasn't in their script, and so they make me check things on the checklist that are obviously not the issue (the lights are on, so that means it is plugged in).

◧◩◪◨⬒⬓⬔
7. bumby+iq[view] [source] 2023-12-18 14:38:02
>>bluGil+6n
>Every chatbot I've worked ends up failing because I use some word or phrasing in my description that wasn't in their script

This is an interesting insight I’ve experienced as well. It makes me wonder if chatbots becoming more and more prevalent will eventually habituate humans into specific speech patterns. Kinda like the homogenization of suburban America by capitalism, where most medium-sized towns seem to have the same chain stores.

◧◩◪◨⬒⬓⬔⧯
8. Animal+Dr[view] [source] 2023-12-18 14:44:35
>>bumby+iq
So the chatbots are going to program us to work with them, since we can't program them to work with us?

I for one do not welcome our new robot overlords.

◧◩◪◨⬒⬓⬔⧯▣
9. bluGil+fW[view] [source] 2023-12-18 16:58:52
>>Animal+Dr
In this case I support them - language variation like this eventually leads to a new language that isn't mutually understandable. Anything that forces people to speak more alike improves communication. Ever try to understand someone from places like Mississippi, Scotland, or Australia? They all speak English, but it is not always mutually intelligible. There are also cases where words mean different or opposite things in different areas, leading to confusion.

There are lots of other reasons to hate chatbots, but if they can force people to speak the same language that would be good.

[go to top]