To my mind the bigger issue is how much of it is a total scam. OF models offshoring their DM responses so their clients think they’re having conversations with the model when it’s actually some dude half the world away. Or using AI for the same, which I’m sure is increasing exponentially.
It’s going to be interesting to see what happens when AI is able to generate on demand video/photo and chat that’s realistic enough to satisfy an online client. If people are specifically told it’s AI will they be content with that? Or will they still want an actual real human? We're not exactly rational creatures at the best of times so it’ll be fascinating to see. We’ll have gone from the phone sex lines of yore, where you are interacting with a real human even though they’re definitely not the human you’re imagining in your head, to an AI video chat where you’re seeing exactly what you want but there’s nothing behind it.
This seems like OF's Etsy trap moment.
On the one hand, scaling creator:individual_fan multiples via AI assisted messaging = $$$ (to creators and OF)
On the other hand, it canabalizes their core business value tenet -- authenticity.
It'll be curious to see which path they choose, and if it ends up playing out similar to Etsy. I.e. temporarily increasing their revenue while erroding their brand, then having to tack back once they realize how dire things have gotten in customers' eyes.
...
OF models offshoring their DM responses
I mean this sounds to me like the toxic middlemen have changed form, rather than gone away. Now the toxic middlemen work for the performer, rather than the other way around. But they're still toxic and their toxicity is now directed at the buyer instead.
That said, people only need to _believe_ it's real.
Wait, are you intentionally ignoring the fact that OF is the middleman? Because it definitely is, making about 1 billion dollars off of 5 billion dollars of transactions. Or are you saying OF is a "good non-toxic middleman".
When it's that easy to screw up, it's easier and cheaper to pay real humans $1k a month for sexting than to build an LLM-based system that never makes mistakes and is 100% secured against prompt injection.
If it's not done, then creators have a fundamental time cap to the amount of personalized content they can create.
If it's done, but users don't know about it, then creators increase their revenue several times.
If it's done, but users do know about it, then creators lose several multiples of revenue.
If people are going to a porn site to spend relatively small amounts of money to get "genuine human to human interaction," there are more than a few flaws in their strategy. Unless they're spending many thousands of dollars a month, there could be no reasonable expectation they're getting anything but extremely superficial interactions. If they get mad because they think they should get an e-girlfriend for $10 a month or whatever, I'd say that's on them because of unreasonable expectations.
Honestly, I think gen AI is pretty much inevitable for these kinds of parasocial services, but it will be clandestinely used because otherwise it makes perfect sense for the "content creator." Whatever relationship they think they have is an illusion in their head anyway, and they're probably expending a fair amount of energy to maintain it.
I've never done business with them and am not interested in buying that kind of content, but it certainly seems like an improvement over any more traditional sex-related work for those who are interested in being in that market.