joseph (OP) | 2025-04-11 00:54:01
I'm the person who asked about the definition of a chair upthread.

Just to make a very obvious point: nobody thinks of the definition of a chair as a particularly controversial idea. But clearly:

- We don't all agree on what a chair is (is a stump a chair or not?).

- Nobody in this thread has been able to give a widely accepted definition of the word "chair".

- It seems like we can't even agree on what criteria are admissible in the definition. (E.g., does it matter that I can sit on it? Does it matter that I intend to sit on it? Does it matter that my dog can sit on it?)

If even defining what the word "chair" means is beyond us, I hold little hope that we can ever manually explain the concept to a computer. Returning to my original point above, this is why I think expert-system-style approaches are a dead end. Likewise, I think any AI system that uses formal or symbolic logic in its internal definitions will always be limited in its capacity.
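
To make that concrete, here's a toy contrast (my own illustration; the feature names and prototype vector are made up, and this isn't how Cyc or any real system is built): a hand-written symbolic predicate for "chair" versus a fuzzy similarity score against a learned prototype. Every clause in the symbolic version invites a counterexample; the fuzzy version just degrades gracefully.

    # Toy illustration only: hand-written rules vs. a fuzzy similarity score.

    def is_chair_symbolic(obj: dict) -> bool:
        # Each clause invites a counterexample: a stump has no legs,
        # a beanbag has no back, a dollhouse chair isn't sittable by me.
        return (
            obj.get("legs", 0) >= 3
            and obj.get("has_back", False)
            and obj.get("sittable_by_human", False)
        )

    def chair_score_fuzzy(features: list[float], prototype: list[float]) -> float:
        # Cosine similarity to a learned "chair" prototype vector.
        # No hard cutoff: a stump can come out 0.6 chair-like.
        dot = sum(a * b for a, b in zip(features, prototype))
        norm = (sum(a * a for a in features) ** 0.5) * \
               (sum(b * b for b in prototype) ** 0.5)
        return dot / norm if norm else 0.0

    stump = {"legs": 0, "has_back": False, "sittable_by_human": True}
    print(is_chair_symbolic(stump))                             # False, full stop
    print(chair_score_fuzzy([0.9, 0.1, 0.8], [1.0, 0.7, 0.9]))  # ~0.92, a matter of degree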

And yet, I suspect ChatGPT will understand all of the nuance in this conversation just fine. Like everyone else, I'm surprised by how "smart" transformer-based neural nets have become. But if anything has a hope of achieving AGI, I'm not surprised that:

- It's something that uses fuzzy, non-symbolic logic internally.

- The "internal language" for its own thoughts is an emergent result of the training process rather than being explicitly and manually programmed in.

- It translates its internal language of thought into words only at the end of the thinking / inference process. As this "chair" example shows, our internal sense of what a chair is seems clear to us, but that doesn't mean we can translate it into a symbolic definition (i.e., into words). A toy sketch of this follows the list.
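
As a rough sketch of that last point (toy numbers, nothing like a real model): inside a transformer the "thoughts" are dense vectors the whole way through, and word identities only show up when the final hidden state is projected onto the vocabulary at the very end.

    # Toy sketch: continuous hidden state -> words only at the final projection.
    import numpy as np

    rng = np.random.default_rng(0)

    d_model = 8
    vocabulary = ["chair", "stump", "sit", "wood", "dog"]   # made-up tiny vocab

    # The "internal language": just a dense vector, no words in it anywhere.
    hidden_state = rng.normal(size=d_model)

    # Unembedding matrix: the only place word identities enter the picture.
    unembedding = rng.normal(size=(d_model, len(vocabulary)))

    logits = hidden_state @ unembedding        # project thought-space onto word-space
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                       # softmax over the vocabulary

    print(dict(zip(vocabulary, probs.round(3))))

The lossy step is that final projection: a perfectly clear internal representation doesn't have to survive being squeezed into a handful of tokens, which is exactly the trouble we're having with "chair".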

I'm not convinced that current transformer architectures will get us all the way to AGI / ASI. But I think that to have a hope of achieving human-level AI, you'll always want to build a system that has those elements of thought. Cyc, as far as I can tell, does not. So of course, I'm not at all surprised it's being dumped.
