zlacker

1. Zababa (OP) 2026-01-20 14:11:15
>My digital thermometer doesn't think. Imbuing LLMs with thought will start leading to some absurd conclusions.

What kind of absurd conclusions? And what kind of non-absurd conclusions can you draw when you follow your, let's call it, "mechanistic" view?

>It's an algorithm and a completely mechanical process which you can quite literally copy time and time again. Unless, of course, you think 'physical' computers have magical powers that a pen-and-paper Turing machine doesn't?

I don't, just as I don't think a human or animal brain has any magical power that imbues it with "intelligence" and "reasoning".

>A cursory read of basic philosophy would help elucidate why casually saying LLMs think, reason, etc. is not good enough.

I'm not saying they do or they don't. I'm saying that, from what I've seen, having a strong opinion about whether or not they think seems to lead people to weird places.

>What is thinking? What is intelligence? What is consciousness? These questions are difficult to answer. There is NO clear definition.

You seem pretty certain that, whatever those three things are, an LLM isn't doing them, a paper and pencil aren't doing them even when manipulated by a human, and the system of a human manipulating paper and pencil isn't doing them either.
