zlacker

[parent] [thread] 11 comments
1. im3w1l+(OP)[view] [source] 2022-12-12 04:55:00
> If we reach the point where the humans simply can't do better, well, then it won't matter.

I disagree with this. The exact same comment written by a human is more valuable than one written by a bot.

For example imagine I relate something that actually happened to me vs a bot making up a story. Byte for byte identical stories. They could be realistic, and have several good lessons baked in. Yet one is more valuable, because it is true.

replies(3): >>dang+S >>kdazzl+a6 >>xcamba+f7
2. dang+S[view] [source] 2022-12-12 05:05:46
>>im3w1l+(OP)
Good point! I didn't really think that bit through.
replies(2): >>mherde+R4 >>noizej+nv
◧◩
3. mherde+R4[view] [source] [discussion] 2022-12-12 05:45:35
>>dang+S
This is one reason why I think NFT art theft is possible.

In principle "who owns this jpeg" is just a few bits in a distributed filesystem that the community collectively agrees to treat as the canonical source of "ownership", and they could easily roll it back if someone stole a market-distorting amount of art.

In practice, if you do an interesting heist -- like you put on a cool-looking art-thief costume and livestream yourself on a vintage PowerBook bypassing the owners' defenses and nabbing the apes with a narrow escape -- you've written a compelling story that the community is sort of bound to accept.

4. kdazzl+a6[view] [source] 2022-12-12 05:59:43
>>im3w1l+(OP)
Very interesting point. It really reminds me of that Borges story where someone in the 20th century rewrites Don Quixote word for word, and the critics think it’s far better than the original.

https://en.m.wikipedia.org/wiki/Pierre_Menard,_Author_of_the...

5. xcamba+f7[view] [source] 2022-12-12 06:14:34
>>im3w1l+(OP)
From the perspective of the receiver of the message, there is no such thing as the story being true or not.

If it's byte for byte the same story and I don't know whether the author is a human or a bot and I believe the story, the same reaction will be triggered at every level. The emotions, the symbolism, the empathy, all the same, whether the author is this or that.

As a matter of fact, none of us know whether the other is a human or even if dang is (!), because it is orthogonal to the contents and discussion.

What is it that you don't like? That the story is made up, or that it is made up (possibly) by a bot? In the first case, what is your opinion on made-up stories by humans, such as novels? In the second case, what is your opinion on objects made by robots, such as your car or phone?

Unless I can tell whether you are flesh and bone or not, my acceptance of your story depends only on the story itself, not on whether it happened to a human.

replies(3): >>Mistle+O7 >>midori+c9 >>roenxi+6f
◧◩
6. Mistle+O7[view] [source] [discussion] 2022-12-12 06:18:40
>>xcamba+f7
A made-up story likely bears no resemblance to the reality we inhabit, since it doesn't obey the same laws of cause and effect as our universe. I'm surprised we even have to explain why a made-up story is not useful.
replies(2): >>xcamba+va >>jodrel+fg
◧◩
7. midori+c9[view] [source] [discussion] 2022-12-12 06:32:24
>>xcamba+f7
>As a matter of fact, none of us know whether the other is a human or even if dang is (!), because it is orthogonal to the contents and discussion.

Dang always seems able to respond on so many HN threads much too quickly. I suspect he's really an advanced AI.

◧◩◪
8. xcamba+va[view] [source] [discussion] 2022-12-12 06:44:53
>>Mistle+O7
Made-up stories have no such constraint: they either create reality (eg, capitalism, Santa for kids, religion) or map onto reality (eg, science, Zola's Germinal).
replies(1): >>Mistle+Mb
◧◩◪◨
9. Mistle+Mb[view] [source] [discussion] 2022-12-12 06:58:45
>>xcamba+va
Ok man, you are being obtuse on purpose. I'm talking about anecdotes an AI shares about its life that people might find useful. If it is made up, it can be as (un)useful as the bogus code ChatGPT sometimes produces that looks good and authentic but doesn't work. The intersection of the real world and the story is what makes it useful to others on HN. We aren't talking about writing fiction.

https://www.vice.com/en/article/wxnaem/stack-overflow-bans-c...

◧◩
10. roenxi+6f[view] [source] [discussion] 2022-12-12 07:35:27
>>xcamba+f7
That the nature of the storyteller matters more than the nature of the story is a bias. One of the more compelling robot-takeover scenarios is that they turn out to be much better at making decisions, because a machine can be programmed to weight strong evidence more strongly than an emotionally compelling story.

It is visible even in this thread. im3w1l cares about the teller of the story because that is the medium for relating to another human's experience. Which is fine, but it is probably also part of the decision-making process. And that is a terrible way to make decisions when good alternatives (like poverty statistics, crime statistics, measures of economic success, measures of health & wellbeing) exist.

A fake story out of a chatbot which leads to people making good decisions is more valuable than the typical punter's well-told life experiences. People wouldn't like that though.

◧◩◪
11. jodrel+fg[view] [source] [discussion] 2022-12-12 07:47:21
>>Mistle+O7
The parent comment constrained the comparison with "Byte for byte identical stories".
◧◩
12. noizej+nv[view] [source] [discussion] 2022-12-12 10:12:41
>>dang+S
> For example imagine I relate something that actually happened to me vs a bot making up a story. Byte for byte identical stories. They could be realistic, and have several good lessons baked in. Yet one is more valuable, because it is true.

I disagree, since something that actually happened to you is anecdotal experience and therefore of very limited “good lesson” value.

An AI generated story that reflects and illustrates a data driven majority of experiences and resulting “lessons” would be much more valuable to me than your solitary true story, which may be a total statistical outlier, and therefore should not inform my decision making.

Kahneman explains it much better than I can. In his book "Thinking, Fast and Slow" he cites studies and statistical analyses showing how we humans are commonly led into faulty decision making, because personal experience ("true stories") tends to become our primary decision influencer - even when we have access to statistics suggesting that the opposite of our own experience is the far more common one.

So if the AI gives me access to a better, summarized, data-based overall truth, wrapped into a made-up story (to help me remember it better), then I would much prefer the AI to guide my decision making.
