zlacker

[parent] [thread] 13 comments
1. chpatr+(OP)[view] [source] 2023-05-16 14:23:24
Neither is text generation as you continue generating text.
replies(1): >>shaneb+l3
2. shaneb+l3[view] [source] 2023-05-16 14:39:31
>>chpatr+(OP)
"Neither is text generation as you continue generating text."

LLM is stateless.

replies(1): >>chpatr+j5
3. chpatr+j5[view] [source] [discussion] 2023-05-16 14:48:08
>>shaneb+l3
On a very fundamental level the LLM is a function from context to the next token but when you generate text there is a state as the context gets updated with what has been generated so far.
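A toy sketch of the distinction being drawn here (illustrative names only, not any real model API): the token function itself is pure, while the generation loop carries the growing context as state.

```python
# Stand-in for a frozen model: a pure function from context to next token.
# It has no internal state; all variation comes from its input.
def next_token(context: list[str]) -> str:
    return f"tok{len(context)}"

# The generation loop, by contrast, maintains state: the context is
# the "story so far" and is updated on every step.
def generate(prompt: list[str], n: int) -> list[str]:
    context = list(prompt)
    for _ in range(n):
        context.append(next_token(context))  # state update
    return context
```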
replies(2): >>shaneb+17 >>jazzyj+R8
4. shaneb+17[view] [source] [discussion] 2023-05-16 14:56:46
>>chpatr+j5
"On a very fundamental level the LLM is a function from context to the next token but when you generate text there is a state as the context gets updated with what has been generated so far."

Its output is predicated upon its training data, not user defined prompts.

replies(2): >>chpatr+l8 >>alpaca+7d
5. chpatr+l8[view] [source] [discussion] 2023-05-16 15:03:23
>>shaneb+17
If you have some data and continuously update it with a function, we usually call that data state. That's what happens when you keep adding tokens to the output. The "story so far" is the state of an LLM-based AI.
replies(1): >>shaneb+ja
6. jazzyj+R8[view] [source] [discussion] 2023-05-16 15:06:01
>>chpatr+j5
the model is not affected by its inputs over time

it's essentially a function that is called recursively on its result, no need to represent state
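The same toy loop can be written the way this comment frames it (hypothetical names, not a real model): pure recursion on the result, with nothing mutated anywhere.

```python
# Stand-in for the fixed model function; tuples keep everything immutable.
def next_token(context: tuple[str, ...]) -> str:
    return f"tok{len(context)}"

# Pure recursion: each call builds a new context from the previous result.
# No variable is ever updated in place.
def generate(context: tuple[str, ...], n: int) -> tuple[str, ...]:
    if n == 0:
        return context
    return generate(context + (next_token(context),), n - 1)
```

Whether the accumulating argument counts as "state" is exactly the disagreement in this thread: the function is referentially transparent, but the recursion still threads the story-so-far through every call.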

replies(1): >>chpatr+Cd
7. shaneb+ja[view] [source] [discussion] 2023-05-16 15:12:28
>>chpatr+l8
'If you have some data and continuously update it with a function, we usually call that data state. That's what happens when you keep adding tokens to the output. The "story so far" is the state of an LLM-based AI.'

You're conflating UX and LLM.

replies(2): >>chpatr+ld >>danena+cB
8. alpaca+7d[view] [source] [discussion] 2023-05-16 15:25:21
>>shaneb+17
> Its output is predicated upon its training data, not user defined prompts.

Prompts very obviously have influence on the output.

replies(1): >>shaneb+ck
9. chpatr+ld[view] [source] [discussion] 2023-05-16 15:25:58
>>shaneb+ja
I never said LLMs are stateful.
10. chpatr+Cd[view] [source] [discussion] 2023-05-16 15:27:13
>>jazzyj+R8
Being called recursively on a result is state.
replies(1): >>jazzyj+vG
11. shaneb+ck[view] [source] [discussion] 2023-05-16 15:52:42
>>alpaca+7d
"Prompts very obviously have influence on the output."

The LLM is also discrete.

12. danena+cB[view] [source] [discussion] 2023-05-16 16:58:44
>>shaneb+ja
You're being pedantic. While the core token generation function is stateless, that function is not, by a long shot, the only component of an LLM AI. Every LLM system being widely used today is stateful. And it's not only 'UX'. State is fundamental to how these models produce coherent output.
replies(1): >>shaneb+LO
13. jazzyj+vG[view] [source] [discussion] 2023-05-16 17:21:23
>>chpatr+Cd
if you say so, but the model itself is not updated by user input, it is the same function every time, hence, stateless.
14. shaneb+LO[view] [source] [discussion] 2023-05-16 18:02:14
>>danena+cB
"State is fundamental to how these models produce coherent output."

Incorrect.
