zlacker

1. pasaba (OP) 2023-11-19 22:29:04
Tbh, I always thought the whole 'intelligence' framing was just marketing garbage. There's no really rigorous definition of intelligence, so asking whether a product exhibits it is basically nonsense. There are two questions about LLMs that are worth asking, though:

1. Are they useful?

2. Are they going to become more useful in the foreseeable future?

On 1, I would say: maybe? Like, somewhere between Microsoft Word and Excel? On 2, I would say: sure, an 'AGI' would be tremendously useful. But it's also tremendously unlikely to somehow grow out of the current state of the art. People disagree on that point, but I don't think there are even compelling reasons to believe that LLMs can evolve beyond their current status as bullshit generators.
