zlacker

[parent] [thread] 6 comments
1. Toucan+(OP)[view] [source] 2024-05-15 15:09:50
The use of LLMs as pseudo-friends or girlfriends, marketed as a solution for loneliness, is so incredibly sad and dystopian. Genuinely one of the most unsettling goddamn things I've seen gain traction since I've been in this industry.

And so many otherwise perfectly normal products are now employing addiction mechanics to drive engagement, but somehow this one is just even further over the line for me in a way I can't articulate. I'm so sick of startups taking advantage of people. So, so fucking gross.

replies(2): >>swatco+v8 >>Intral+kr1
2. swatco+v8[view] [source] 2024-05-15 15:48:51
>>Toucan+(OP)
It's a technological salve that gives individuals a minor and imperfect remedy for a profound failure in modern society. It's of a kind with pharmaceutical treatments for depression or anxiety or obesity -- best seen as a temporary "bridge" towards wellness (achieved, perhaps, through other interventions) -- but altogether just trying to help troubled individuals navigate a society that failed to enable their deeper wellness in the first place.
replies(1): >>reduce+wb
3. reduce+wb[view] [source] [discussion] 2024-05-15 16:01:40
>>swatco+v8
These types of techno-solutions are part of the root cause of that "profound failure of modern society"! The technological salve is itself a further extreme of the thing causing these people's problems! It's much like alcohol: society has real problems, alcohol offers a tiny relief but can further exacerbate those problems, and you're advocating that people drink even more alcohol because society has issues and they should escape it.

Idk how we’ve gotten away from such a natural human experience, but everyone knows damn well that the happiest children are out playing soccer with their friends in a field or eating lunch together at a park bench, and not holed up in their room watching endless YouTube.

replies(2): >>swatco+gf >>Intral+zr1
4. swatco+gf[view] [source] [discussion] 2024-05-15 16:16:12
>>reduce+wb
I don't disagree in the least. I'm just saying it's in the same bucket as many commercial products that are designated as therapeutic, and that they should all be looked at with a similar mix of celebration and skepticism.
replies(1): >>Toucan+j51
5. Toucan+j51[view] [source] [discussion] 2024-05-15 20:46:25
>>swatco+gf
I think celebration of any sort should be held off until we have actual evidence of these things having positive effects on people. Like, this is just me reacting as a human to a human issue, but: a fake friend in an LLM is not a friend. It's never going to crawl out of the phone and help you put the donut on your car when you get a flat tire. It's not going to take you out for a drink if you go through a rough breakup. It's not going to have difficult conversations with you and call you out on your bullshit because it cares about you.

LLM friends have the same energy to me as video game progression: each is a homeopathic version of a real thing you need, social connection and achievement respectively. But like homeopathy, you don't actually get anything out of it. The placebo effect will make the symptoms of your lack feel better for a while, but the lack itself will never be solved, and because of that, whoever is selling you your LLM girlfriend or phony achievement structure will never lose you as a customer. I'm suspicious of that.

6. Intral+kr1[view] [source] 2024-05-15 23:20:14
>>Toucan+(OP)
Idk man, I'm too busy being terrified of the use of LLMs as propaganda agents, micro-targeting adtech vectors, mass gaslighters, and cultural homogenizers.

I mean, these things are literally designed to statelessly yet convincingly talk about events they can't see, experiences they can't understand, emotions they can't feel… If a human acted like that, we'd call them a psychopath.

We already know that our social structures tend to be quite vulnerable to dark triad type personalities. And yet, while human psychopaths are limited by genetics to a small percentage of the population, there's no limit on the number of spambot instances you can instruct to attack your political rivals, Alexa 2.0 updates that could be pushed to sound 5% sadder when talking about a competitor's products, LLM moderators that can be deployed to subtly correct "organic" interactions that stray from a known-profitable state space… And that's just the obvious next step from where we're already at today. I'm sure the real use cases for automated lying machines will be more horrifying than most of us could imagine today, just as nobody could have predicted in 2010 that Twitter and Facebook would enable ISIS, Trump, nonconsensual mass human experimentation, the Rohingya genocide…

Which is to say, selling LLM "friends" or "girlfriends" as a way to addictively exploit people's loneliness seems like one of the least harmful things that could come out of the current "AI" push. Sad, yes, but compared to where I think this is headed, that seems like dodging a bullet.

> I'm so sick of startups taking advantage of people. So, so fucking gross.

Silicon Valley was a mistake. An entire industry controlled largely by people who decided they like predictable, programmable machines more than they like free and equal persons. What was the expected outcome?

7. Intral+zr1[view] [source] [discussion] 2024-05-15 23:23:23
>>reduce+wb
> Idk how we’ve gotten away from such a natural human experience, but everyone knows damn well that the happiest children are out playing soccer with their friends in a field or eating lunch together at a park bench, and not holed up in their room watching endless YouTube.

A soccer ball can't (usually) spy on you to sell you stuff, though, is the thing…
