zlacker

[parent] [thread] 6 comments
1. xcamba+(OP)[view] [source] 2022-12-12 06:14:34
From the perspective of the receiver of the message, there is no such thing as the story being true or not.

If it's byte for byte the same story and I don't know whether the author is a human or a bot and I believe the story, the same reaction will be triggered at every level. The emotions, the symbolism, the empathy: all the same, whether the author is one or the other.

As a matter of fact, none of us knows whether the others are human, or even whether dang is (!), because it is orthogonal to the content and the discussion.

What is it that you don't like? That the story is made up, or that it is made up (possibly) by a bot? In the first case, what is your opinion on stories made up by humans, such as novels? In the second case, what is your opinion on objects made by robots, such as your car or phone?

Unless I can tell whether you are flesh and bone or not, my acceptance of your story depends only on the story itself, not on whether it happened to a human.

replies(3): >>Mistle+z >>midori+X1 >>roenxi+R7
2. Mistle+z[view] [source] 2022-12-12 06:18:40
>>xcamba+(OP)
A made up story likely bears no resemblance to the reality we inhabit, since it doesn't obey the same physical laws of cause and effect as our universe. I'm surprised we even have to explain why a made up story is not useful.
replies(2): >>xcamba+g3 >>jodrel+09
3. midori+X1[view] [source] 2022-12-12 06:32:24
>>xcamba+(OP)
>As a matter of fact, none of us know whether the other is a human or even if dang is (!), because it is orthogonal to the contents and discussion.

Dang always seems able to respond on the many HN threads much too quickly. I suspect he's really an advanced AI.

4. xcamba+g3[view] [source] [discussion] 2022-12-12 06:44:53
>>Mistle+z
Made up stories face no constraint, whether they are creating reality (e.g., capitalism, Santa for kids, religion) or mapping to reality (e.g., science, Zola's Germinal).
replies(1): >>Mistle+x4
5. Mistle+x4[view] [source] [discussion] 2022-12-12 06:58:45
>>xcamba+g3
Ok man, you are being obtuse on purpose. I'm talking about anecdotes shared by an AI about its life that people might find useful. If it is made up, it can be as (un)useful as the bogus code ChatGPT sometimes produces that looks good and authentic but doesn't work. The intersection of the real world and the story is what makes it useful to others on HN. We aren't talking about writing fiction.

https://www.vice.com/en/article/wxnaem/stack-overflow-bans-c...

6. roenxi+R7[view] [source] 2022-12-12 07:35:27
>>xcamba+(OP)
The fact that the nature of the storyteller matters more than the nature of the story is a bias. One of the more compelling robot-takeover scenarios is that they turn out to be much better at making decisions, because a machine can be programmed to weight strong evidence more heavily than an emotionally compelling story.

It is visible even in this thread. im3w1l cares about the teller of the story because that is the medium for relating to another human's experience. Which is fine, but that relating is probably part of the decision-making process. And that is a terrible way to make decisions when good alternatives (like poverty statistics, crime statistics, measures of economic success, measures of health & wellbeing) exist.

A fake story out of a chatbot that leads people to make good decisions is more valuable than the typical punter's well-told life experiences. People wouldn't like that, though.

7. jodrel+09[view] [source] [discussion] 2022-12-12 07:47:21
>>Mistle+z
The parent constrained the question to "byte for byte identical stories".