zlacker

1. buz11+(OP) 2025-07-07 11:04:42
The most useful analogy I've heard is that LLMs are to the internet what lossy JPEGs are to images. The more you drill in, the more compression artifacts you get.
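The analogy can be made concrete with a toy stand-in for lossy compression: quantizing a signal to fewer levels (all names and numbers here are illustrative, not anything from an actual JPEG codec). The harder you compress, the larger the reconstruction error, which is the analogue of artifacts becoming visible as you zoom in.

```python
# Toy "lossy compression": snap each value to the nearest of `levels` buckets.
# Fewer levels = more compression = bigger reconstruction error ("artifacts").

def quantize(signal, levels):
    """Quantize values in [0, 1] to `levels` evenly spaced levels."""
    step = 1.0 / (levels - 1)
    return [round(x / step) * step for x in signal]

signal = [i / 99 for i in range(100)]  # a smooth ramp from 0.0 to 1.0

for levels in (64, 8, 2):
    recon = quantize(signal, levels)
    err = max(abs(a - b) for a, b in zip(signal, recon))
    print(f"{levels:2d} levels -> max reconstruction error {err:.3f}")
```

The error grows as the level count shrinks; the drilled-in detail simply isn't stored, it gets reconstructed from the nearest coarse bucket.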
replies(1): >>Feepin+s1
2. Feepin+s1 2025-07-07 11:17:42
>>buz11+(OP)
(This is, of course, also true of the human brain.)