zlacker

[return to "My AI skeptic friends are all nuts"]
1. gdubs+Z[view] [source] 2025-06-02 21:18:21
>>tablet+(OP)
One thing that I find truly amazing is just the simple fact that you can now be fuzzy with the input you give a computer, and get something meaningful in return. Like, as someone who grew up learning to code in the 90s, it always seemed like science fiction that we'd get to a point where you could give a computer some vague human-level instructions and have it more or less do what you want.
◧◩
2. csalle+z1[view] [source] 2025-06-02 21:22:05
>>gdubs+Z
It's mind-blowing. At least 1-2x/week I find myself shocked that this is the reality we live in.
◧◩◪
3. mentos+W4[view] [source] 2025-06-02 21:39:26
>>csalle+z1
It’s surreal to me. I’ve been using ChatGPT every day for 2 years, and it makes me question reality sometimes, like ‘howtf did I live to see this in my lifetime’

I’m only 39, really thought this was something reserved for the news on my hospital tv deathbed.

◧◩◪◨
4. hattma+CS[view] [source] 2025-06-03 04:58:36
>>mentos+W4
Ok, but do you not remember IBM Watson beating the human players on Jeopardy in 2011? The current NLP-based neural networks termed AI aren't so incredibly new. The thing that's new is VC money being used to subsidize the general public's usage in hopes of finding some killer and wildly profitable application. Right now, everyone is mostly using AI in the ways that major corporations have generally determined not to be profitable.
◧◩◪◨⬒
5. epicco+mA4[view] [source] 2025-06-04 13:45:44
>>hattma+CS
That's not entirely true, though: the "Attention Is All You Need" paper that first came up with the transformer architecture, which would go on to drive all the popular LLMs of today, came out in 2017. From there, advancement has largely been in scaling the central idea up (though there are 'sidequest' tech level-ups too, like RAG, training for tool use, the agent loop, etc). It seems like we really hit a stride around GPT-3, especially with the RLHF post-training stuff.

So there was at least some technical advancement mixed in with all the VC money between 2011 and today - it's not all just tossing dollars around. (Though of course we can't ignore that all this scaling of transformers did cost a ton of money).
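For what it's worth, the core of that 2017 transformer idea, scaled dot-product attention, fits in a few lines. Here is a minimal NumPy sketch with toy shapes and random inputs (not any particular model's weights), just to show the mechanism:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, per the 2017 paper."""
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled to keep softmax well-behaved
    scores = Q @ K.T / np.sqrt(d_k)            # shape: (seq_len, seq_len)
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted mix of the value vectors
    return weights @ V                          # shape: (seq_len, d_k)

# Toy example: 4 tokens, 8-dimensional head
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Real models stack many of these heads with learned projections, masking, and feed-forward layers, but the "scaling the central idea up" part is largely making this matrix math bigger and running it over more data.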

[go to top]