zlacker

Sam Altman goes before US Congress to propose licenses for building AI
1. srslac+I7 2023-05-16 12:00:15
>>vforgi+(OP)
Imagine thinking that regression-based function approximators are capable of anything other than fitting the data you give them. Then imagine willfully hyping them up and scaring people who don't understand them: because the model can predict words, you exploit the human tendency to anthropomorphize, and people conclude it must be something capable of generalized, adaptable intelligence.

Shame on all of the people involved in this: the people in these companies, the journalists who shovel shit (hope they get replaced real soon), researchers who should know better, and dementia-ridden legislators.

So utterly predictable and slimy. All of you who are so gravely concerned about "alignment" in this context: give yourselves a pat on the back for hyping up science fiction stories and enabling regulatory capture.

2. Yajiro+Y8 2023-05-16 12:08:33
>>srslac+I7
Who is to say that brains aren't just regression-based function approximators?

3. shaneb+V9 2023-05-16 12:13:37
>>Yajiro+Y8
Humanity isn't stateless.

4. chpatr+ez 2023-05-16 14:23:24
>>shaneb+V9
Neither is text generation: the context grows as you keep generating.

5. shaneb+zC 2023-05-16 14:39:31
>>chpatr+ez
"Neither is text generation as you continue generating text."

The LLM itself is stateless.

6. chpatr+xE 2023-05-16 14:48:08
>>shaneb+zC
On a very fundamental level, the LLM is a function from context to the next token, but when you generate text there is state: the context gets updated with what has been generated so far.
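
A minimal sketch of that loop (the next_token stand-in below is a hypothetical toy, just so it runs; the real function is a neural net, but the shape is the same):

    # Toy stand-in for the model: a pure function from context to next token.
    # Hypothetical bigram table, only to make the sketch runnable.
    BIGRAMS = {"the": "cat", "cat": "sat", "sat": "down"}

    def next_token(context: list[str]) -> str:
        # Stateless: the output depends only on the context passed in.
        return BIGRAMS.get(context[-1], "<eos>")

    def generate(prompt: list[str], max_tokens: int) -> list[str]:
        context = list(prompt)  # the "state" lives here, outside the model
        for _ in range(max_tokens):
            tok = next_token(context)
            if tok == "<eos>":
                break
            context.append(tok)  # feed the output back in: the context is updated
        return context

    print(generate(["the"], 5))  # ['the', 'cat', 'sat', 'down']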

7. shaneb+fG 2023-05-16 14:56:46
>>chpatr+xE
"On a very fundamental level the LLM is a function from context to the next token but when you generate text there is a state as the context gets updated with what has been generated so far."

Its output is predicated upon its training data, not user-defined prompts.

8. chpatr+zH 2023-05-16 15:03:23
>>shaneb+fG
If you have some data and continuously update it with a function, we usually call that data state. That's what happens when you keep adding tokens to the output. The "story so far" is the state of an LLM-based AI.
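
A sketch of that framing, assuming some pure update step (the constant stand-in below is hypothetical, only to keep it runnable): repeatedly folding a function's output back into the data is exactly what "state" means.

    from functools import reduce

    # Pure update step: append whatever the model would emit next.
    # A real step would be: context + [next_token(context)].
    def step(context: list[str], _: int) -> list[str]:
        return context + ["la"]  # constant stand-in for the next token

    # The evolving context (the "story so far") is the state.
    story_so_far = reduce(step, range(3), ["tra"])
    print(story_so_far)  # ['tra', 'la', 'la', 'la']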