zlacker

[return to "Ilya Sutskever to leave OpenAI"]
1. zoogen+Ix[view] [source] 2024-05-15 04:50:43
>>wavela+(OP)
Interesting, both Karpathy and Sutskever are gone from OpenAI now. Looks like it is now the Sam Altman and Greg Brockman show.

I have to admit, of the four, Karpathy and Sutskever were the two I was most impressed with. I hope they go on to do great things.

◧◩
2. nabla9+pH[view] [source] 2024-05-15 06:45:38
>>zoogen+Ix
The top science people are long gone. OpenAI is run by marketing, business, software, and productization people.

When the next wave of deep learning innovations sweeps the world, Microsoft eats what's left of them. They make lots of money, but they don't have a future unless they replace what they lost.

◧◩◪
3. fsloth+O21[view] [source] 2024-05-15 10:40:27
>>nabla9+pH
If we look at the history of innovation and invention, it's very typical that the original discovery and the final productization are done by different people. For many reasons, but a lot of them are universal, I would say.

E.g. Oppenheimer's team created the bomb, then later experts fine-tuned the subsequent weapon systems and payload designs. Etc.

◧◩◪◨
4. fprog+I51[view] [source] 2024-05-15 11:12:12
>>fsloth+O21
Except OpenAI hasn’t yet finished discovery on its true goal: AGI. I wonder if they risk plateauing at a local maximum.
◧◩◪◨⬒
5. Zambyt+hc1[view] [source] 2024-05-15 11:58:23
>>fprog+I51
I'm genuinely curious: what do you expect an "AGI" system to be able to do that we can't do with today's technology?
◧◩◪◨⬒⬓
6. Symmet+Rn1[view] [source] 2024-05-15 13:11:39
>>Zambyt+hc1
A working memory that can preserve information indefinitely outside a particular context window and which can engage in multi-step reasoning that doesn't show up in its outputs.

GPT-4o's context window is 128k tokens, which is somewhere on the order of 128kB. Your brain's context window, all the subliminal activations from the nerves in your gut and the parts of your visual field you aren't necessarily paying attention to, is on the order of 2MB. So a similar order of magnitude, though GPT has a sliding window and your brain has more of an exponential decay in activations. That LLMs can accomplish everything they do with what seems analogous to human reflex rather than human reasoning is astounding and more than a bit scary.

◧◩◪◨⬒⬓⬔
7. datame+7F1[view] [source] 2024-05-15 14:36:17
>>Symmet+Rn1
I'm curious what resources led you to calculate a 2MB context window, I'd like to learn more.
◧◩◪◨⬒⬓⬔⧯
8. Symmet+FM1[view] [source] 2024-05-15 15:10:19
>>datame+7F1
Looking up an estimate of the brain's input bandwidth at 10 million bits per second and multiplying by the second or two a subliminal stimulus can continue to affect a person's behavior. This is a very crude estimate and probably an order of magnitude off, but I don't think it's many orders of magnitude off.
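
The arithmetic in this estimate can be sketched as a quick sanity check. The bandwidth and persistence figures below are the commenter's assumptions, and the ~1 byte per token equivalence is the thread's rough shorthand (real tokenizers vary widely), so treat this as a back-of-envelope reproduction rather than a measurement:

```python
# Back-of-envelope reproduction of the thread's estimate.
# All input figures are assumptions from the comments above, not measured values.
BRAIN_INPUT_BITS_PER_S = 10_000_000  # assumed sensory input bandwidth, bits/s
PERSISTENCE_S = 2                    # assumed seconds a subliminal stimulus lingers

# bits -> bytes, then scale by persistence window
brain_window_bytes = BRAIN_INPUT_BITS_PER_S * PERSISTENCE_S // 8
print(f"brain 'context window' ~ {brain_window_bytes / 1e6:.1f} MB")  # ~2.5 MB

# For comparison: a 128k-token LLM window, using the thread's rough
# token ~ byte equivalence.
llm_window_bytes = 128_000
print(f"128k-token window ~ {llm_window_bytes / 1e3:.0f} kB")
print(f"ratio ~ {brain_window_bytes / llm_window_bytes:.0f}x")  # ~20x, same rough scale
```

So the two "context windows" land within a couple of orders of magnitude of each other, which is all this crude estimate claims.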