
[return to "A non-anthropomorphized view of LLMs"]
1. elliot+qh 2025-07-07 01:17:16
>>zdw+(OP)
To claim that LLMs do not experience consciousness requires a model of how consciousness works. The author has not presented a model, and has instead relied on emotive language that leans on the apparent absurdity of the claim. I would say that any model of consciousness one presents tends to come off as just as absurd as the claim that LLMs experience it. It's a great exercise to sit down and write out your own perspective on how consciousness works, to feel out where the holes are.

The author also claims that a function (R^n)^c -> (R^n)^c is dramatically different from the human experience of consciousness. Yet the author's text I am reading, and any information they can communicate to me, exists entirely in (R^n)^c.
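For concreteness, here is a minimal NumPy sketch of the framing being debated: an LLM's forward pass viewed as a pure function taking c vectors in R^n and returning c vectors in R^n. The context length, embedding dimension, and the toy_llm body are my own illustrative stand-ins, not the article's actual model.

    import numpy as np

    c, n = 8, 16                                  # context length, embedding dimension (illustrative)
    rng = np.random.default_rng(0)
    W = rng.standard_normal((n, n)) / np.sqrt(n)  # stand-in for the learned weights

    def toy_llm(x: np.ndarray) -> np.ndarray:
        # map a context of c vectors in R^n to c output vectors in R^n
        assert x.shape == (c, n)
        return np.tanh(x @ W)                     # placeholder for the attention/MLP stack

    context = rng.standard_normal((c, n))
    output = toy_llm(context)                     # the last row would be read out as next-token logits
    print(output.shape)                           # (8, 16), i.e. another point in (R^n)^c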

2. tdulli+oq1 2025-07-07 13:25:56
>>elliot+qh
Author here. What's the difference, in your perception, between an LLM and a large-scale meteorological simulation, if there is any?

If you're willing to ascribe the possibility of consciousness to any complex-enough computation of a recurrence equation (and hence to something like ... "earth"), I'm willing to agree that under that definition LLMs might be conscious. :)
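To make the analogy concrete, a minimal sketch (with toy step functions of my own invention, not anything from the article) of how both an autoregressive LLM and a weather model iterate a recurrence of the form state_{t+1} = f(state_t):

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((16, 16)) / 4.0

    def weather_step(state: np.ndarray) -> np.ndarray:
        # stand-in for one timestep of a grid-based atmospheric model
        return np.tanh(A @ state)

    def llm_step(tokens: list[int]) -> list[int]:
        # stand-in for "sample the next token, append it to the context"
        next_token = (sum(tokens) * 2654435761) % 50257
        return tokens + [next_token]

    state = rng.standard_normal(16)
    tokens = [464, 3290]
    for _ in range(5):
        state = weather_step(state)   # weather model: iterate the recurrence
        tokens = llm_step(tokens)     # LLM: iterate the same shape of recurrence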
