So many otherwise perfectly normal products are now employing addiction mechanics to drive engagement, but somehow this one is even further over the line for me, in a way I can't articulate. I'm so sick of startups taking advantage of people. So, so fucking gross.
Idk how we’ve gotten away from such a natural human experience, but everyone knows damn well that the happiest children are out playing soccer with their friends in a field or eating lunch together at a park bench, and not holed up in their room watching endless YouTube.
LLM friends have the same energy to me as video game progression: each is a homeopathic version of a real thing you need, social connection and achievement respectively. But like homeopathy, you don't actually get anything out of it. The placebo effect will make the symptoms of your lack feel better, for a while, but it will never solve the underlying problem, and because of that, whoever is selling you your LLM girlfriend or phony achievement structure will never lose you as a customer. I'm suspicious of that.
I mean, these things are literally designed to statelessly yet convincingly talk about events they can't see, experiences they can't understand, emotions they can't feel… If a human acted like that, we'd call them a psychopath.
We already know that our social structures tend to be quite vulnerable to dark triad type personalities. And yet, while human psychopaths are limited by genetics to a small percentage of the population, there's no limit on the number of spambot instances you can instruct to attack your political rivals, Alexa 2.0 updates that could be pushed to sound 5% sadder when talking about a competitor's products, LLM moderators that can be deployed to subtly correct "organic" interactions that stray outside a known-profitable state space… And that's just the obvious next step from where we already are today. I'm sure the real use cases for automated lying machines will be more horrifying than most of us could imagine, just as nobody could have predicted in 2010 that Twitter and Facebook would enable ISIS, Trump, nonconsensual mass human experimentation, the Rohingya genocide…
Which is to say, selling LLM "friends" or "girlfriends" as a way to addictively exploit people's loneliness seems like one of the least harmful things that could come out of the current "AI" push. Sad, yes, but compared to where I think this is headed, that seems like dodging a bullet.
> I'm so sick of startups taking advantage of people. So, so fucking gross.
Silicon Valley was a mistake. An entire industry controlled largely by humans that decided they like predictable programmable machines more than they like free and equal persons. What was the expected outcome?
A soccer ball can't (usually) spy on you to sell you stuff, though. That's the thing…