nothing I’ve seen from OpenAI is any indication that they’re close to AGI. gpt models are basically a special matrix transformation stacked on top of a traditional neural network, trained on a massive dataset on extremely powerful hardware. this is possibly more like “thinking” than a lot of people give it credit for, but it’s not an AGI, and it’s not an AGI precursor either. it’s just the best applied neural networks that we currently have
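for anyone curious what that “special matrix transformation” looks like concretely: assuming the parent means the attention mechanism from the transformer architecture, the core of it really is just a few matrix multiplies and a softmax. a minimal numpy sketch (toy sizes, random weights, no claim this matches any actual gpt internals):

```python
import numpy as np

def softmax(x, axis=-1):
    # subtract the max for numerical stability before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # scaled dot-product attention: each row of the score matrix says
    # how much one token position "attends to" every other position
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # weighted mix of the value vectors

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))  # 4 tokens, 8-dim embeddings (toy numbers)
# hypothetical learned projection matrices for queries, keys, values
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out = attention(x @ Wq, x @ Wk, x @ Wv)
print(out.shape)  # one output vector per input token
```

the embeddings are a separate piece (they turn tokens into the vectors `x` above); attention is what mixes information between token positions. real models stack many of these layers with feed-forward networks in between.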
>>august+(OP)
As a layperson, what does the special matrix transformation do? Is that the embeddings thing, or something else entirely? Something about transformer architecture I guess?