zlacker

[parent] [thread] 1 comments
1. ZetaZe+(OP)[view] [source] 2022-12-12 13:55:53
"Sure it might produce convincing examples of human speech, but it fundamentally lacks an internal point of view that it can express..."

Sounds just like the chess experts from 30 years ago. Their belief at the time was that computers were good at tactical chess but had no idea how to make a plan. And Go would be impossible for computers due to the branching factor. Humans would always be better, because they could plan.

GPT (or a future successor) might not be able to have "an internal point of view". But it might not matter.

replies(1): >>contra+Wf
2. contra+Wf[view] [source] 2022-12-12 15:25:56
>>ZetaZe+(OP)
Having some internal point of view matters inasmuch as not having one means it's not really trying to communicate anything. A text-generation AI would be a much more useful interface if it could form a view and express it, rather than just figuring it all out from context.