zlacker

[parent] [thread] 3 comments
1. shaneb+(OP)[view] [source] 2023-05-16 15:12:28
'If you have some data and continuously update it with a function, we usually call that data state. That's what happens when you keep adding tokens to the output. The "story so far" is the state of an LLM-based AI.'

You're conflating the UX with the LLM itself.

replies(2): >>chpatr+23 >>danena+Tq
2. chpatr+23[view] [source] 2023-05-16 15:25:58
>>shaneb+(OP)
I never said LLMs are stateful.
3. danena+Tq[view] [source] 2023-05-16 16:58:44
>>shaneb+(OP)
You're being pedantic. The core token-generation function is stateless, but that function is not, by a long shot, the only component of an LLM AI. Every LLM system in wide use today is stateful, and it's not only 'UX'. State is fundamental to how these models produce coherent output.
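
To make the distinction concrete, here's a toy sketch (pure illustration; next_token and generate are hypothetical stand-ins with placeholder logic, not any real model API): the token-generation function itself is stateless, while the loop around it accumulates the growing token sequence, which is the state being argued about.

    from typing import Sequence

    def next_token(tokens: Sequence[int]) -> int:
        # Stateless stand-in for a model forward pass: the output depends
        # only on the input sequence; nothing is remembered between calls.
        return (sum(tokens) * 31 + len(tokens)) % 50257  # placeholder, not a real model

    def generate(prompt: Sequence[int], n_new: int) -> list[int]:
        history = list(prompt)  # the "story so far" lives here, outside the model
        for _ in range(n_new):
            history.append(next_token(history))  # state updated with the function's output
        return history

    print(generate([101, 2009, 2003], 5))  # the system as a whole behaves statefully

Whether you call that state part of "the LLM" or part of the application around it is exactly the terminological disagreement here.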
replies(1): >>shaneb+sE
4. shaneb+sE[view] [source] [discussion] 2023-05-16 18:02:14
>>danena+Tq
"State is fundamental to how these models produce coherent output."

Incorrect.
