This thread has hundreds of comments where people are screaming that everyone needs to learn AI coding.
If it were such an edge, wouldn't they keep quiet about it instead?
Imagine there was a serum that gives you superhuman strength, but only under specific conditions you're supposed to discover for yourself. Half the room screams that it should be banned because it's cheating, fake, or doesn't work. The other half swears by it, because they've figured out how to use it properly.
You know it works, and you don't want to give up your secret sauce or make the other half of the room stronger.
Let's all just muse some and imagine what the next cycle of this wheel will look like.
However, real life does have illicit drugs that many people hype up and claim they need.
Real life also has performance-enhancing drugs that cause a host of medical issues.
Even medically necessary drugs come with a list of side effects.
You could say it's a lack of imagination or a failure to connect the dots, but I think there's a more human reason: a lot of people don't want the disruption and are happy with the status quo. I'm a software engineer, so I know how problematic AI may be for my job, but I think anyone who looks at our current state and the recent pace of improvement should be able to see the writing on the wall.
I for one am more curious than afraid of AI, because I have always felt that writing code was the worst part of being a programmer. I am much happier building product or solving interesting problems than tracking down elusive bugs or refactoring old codebases.
And it seems pretty obvious why. The benefits were clear and palpable. Communication was going to become a heck of a lot easier, faster, cheaper, barriers were being lowered.
There's no such qualitative advantage offered by GenAI, compared to the way we did things before. Web vs. pre-Web, the benefits were clear.
GenAI? Some execs claim it's making things cheaper, but that claim ignores quality and long-term effects, and it's spouted by people with no technological knowledge whose reputations will have let them cash out and move on long before their actions crash a company. Plus, still nobody seems to have figured out how to make money (real money, not VC money) off of this. Faster? Again, at what price to quality?
Then there are the predictions. For about three years now we've been told to expect an explosive rise in the quality of GenAI output. I'm still waiting. The predictions of wider reach, higher speed, and lower cost for the web sounded plausible, and they materialised. By comparison, I see a lot of well-reasoned arguments for the hypothesis that GenAI has peaked (for now) and that this is pretty much as good as it gets, with source data sets exhausted and increasingly polluted by poor GenAI slop. So far, the trajectory makes me believe that scenario is a lot more likely.
None of this seems remotely comparable to the Internet or web cases to me. The web certainly didn't feel like hype to me in the 90s, and I don't remember anyone holding that view.