zlacker

[parent] [thread] 3 comments
1. august+(OP)[view] [source] 2023-11-18 12:23:58
nothing I’ve seen from OpenAI gives any indication that they’re close to AGI. GPT models are basically a special matrix transformation on top of a traditional neural network, running on extremely powerful hardware and trained on a massive dataset. that’s possibly more like “thinking” than a lot of people give it credit for, but it’s not an AGI, and it’s not an AGI precursor either. it’s just the best applied neural network we currently have
replies(2): >>keepam+x >>calf+22
2. keepam+x[view] [source] 2023-11-18 12:27:21
>>august+(OP)
I'm not saying OpenAI is close. Collectively we are, though. The train is rolling with unstoppable momentum. We just have to wait.
3. calf+22[view] [source] 2023-11-18 12:36:58
>>august+(OP)
As a layperson, what does the special matrix transformation do? Is that the embeddings thing, or something else entirely? Something to do with the transformer architecture, I guess?
replies(1): >>august+o6q
4. august+o6q[view] [source] [discussion] 2023-11-26 04:07:09
>>calf+22
https://nlp.seas.harvard.edu/2018/04/03/attention.html
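that link (the Annotated Transformer) walks through it properly, but the short answer: the "special matrix transformation" is presumably the attention step, not the embeddings. embeddings just map tokens to vectors; attention is the part that mixes information across positions, and it's literally a few matrix multiplies plus a softmax. here's a rough single-head sketch in NumPy, with toy sizes and random weights purely for illustration (the real thing adds multiple heads, an MLP, layer norm, and learned embeddings):

    import numpy as np

    def causal_self_attention(x, Wq, Wk, Wv):
        # x: (seq_len, d_model) -- one vector per token
        q, k, v = x @ Wq, x @ Wk, x @ Wv          # queries, keys, values (all matrix multiplies)
        scores = q @ k.T / np.sqrt(k.shape[-1])   # (seq_len, seq_len) pairwise similarities
        # causal mask: a token may only look at itself and earlier tokens
        scores = np.where(np.triu(np.ones(scores.shape, dtype=bool), k=1), -1e9, scores)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
        return weights @ v                        # each output is a weighted mix of value vectors

    rng = np.random.default_rng(0)
    x = rng.normal(size=(5, 16))                  # 5 tokens, 16-dim vectors (made-up sizes)
    Wq, Wk, Wv = (rng.normal(size=(16, 16)) for _ in range(3))
    print(causal_self_attention(x, Wq, Wk, Wv).shape)   # (5, 16)

each row of weights says how much every earlier token contributes to the current position's output. stack a few dozen of these blocks, interleaved with ordinary feed-forward layers, and that's more or less the whole forward pass.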