It was obviously huge. You could see it taking off. Yet a lot of people proudly displayed ignorance and backed each other up on it, to the point that discussion of the topic was often drowned out by the opposition to change. Today it takes minutes of playing with AI coding agents to realise that they're extremely useful and going to be similarly huge.
Resistance to change is not a virtue!
Yet some of us have spent hours over the past three years playing with LLMs and remain completely unimpressed by what we see.
My position on AI is almost the same. It is, overall, a net negative for people's cognitive abilities. Moreover, I think all AI companies need to pay fair licensing costs to all authors and train their models to accurately cite their sources. If they want more data for free, they need to propose copyright changes retroactively invalidating everything older than 50 years, and also do the legwork of limiting software IP to 5 to 10 years.
They're not alone either: a bunch of the AI bankroll is coming from people who were also sold on crypto taking over the world.
I don't think that's true.
I do most of my reading on a smartphone - including wading through academic papers, or reading full books in the Kindle app and jotting down notes in the digital margins.
A sizable number of my short-form blog entries are written on my phone, and my long-form writing almost always starts out in Apple Notes on my phone before I transfer it to a laptop.
Predictive text and voice dictation have gotten good enough now that I suspect entire books have been written on mobile devices.
Whether you want to consider it "deep knowledge work" or not is up to you, but apparently a lot of Fifty Shades of Grey was written on a BlackBerry! https://www.huffpost.com/archive/ca/entry/fifty-shades-of-gr...
I agree. A bunch of us here might use it to scaffold applications we already understand, use it as a rubber duck to help understand and solve new problems, research more effectively, or otherwise magnify skills and knowledge we already have in a manner that's directed towards improving and growing.
That's cool. That's also not what most people will do with it. A bunch of us are total nerds, but most of the world really isn't like that. They want more entertainment, they want problems solved for them, they want ease. AI could allow a lot of people to use their brains less and lose function far more. For those of us who use it to do more and learn more, great. But from what I can tell, that group is a tiny minority.
Take, for example, that a huge use case for generative AI is just... more sophisticated meme images. I see so much of that, and I'm really not looking for it. It's such an insane waste of cycles. But it's what the average person wants.
Also, they were nothing more than the combination of two things that already existed and were already successful and financially viable: cellular phones and PDAs. In fact, I (and plenty of others, I presume) already used that combination before smartphones: a PDA and a cellular phone, connected through Bluetooth to share the network connection.
I am in my 40s; have never owned a smartphone and still can't imagine wanting one; am constantly surrounded by others who have them and thus am completely aware of their functionality; AMA.
Vertical flip phones from forever ago can handle both of these just fine.
This is a funny example because the devastating effects of smartphone addiction on society are now blatantly obvious. There are in fact very good reasons for not 'wanting such a thing'.
Don't get me wrong: LLMs can be incredibly useful, and I think they deserve some of the current hype. Claiming that LLMs are useless is indeed silly and can rightfully be ignored. But there are serious concerns about potential (or actual) negative effects on society, and these should be taken seriously.
2. Most 2FA I deal with involves an Authenticator-style app.
3. Missing the point: I want to disable texts altogether. For the decade prior to having a smartphone, I had a cell phone with texts disabled (I specifically called the provider to disable them).