zlacker

[return to "Cloudlflare builds OAuth with Claude and publishes all the prompts"]
1. stego-+6b[view] [source] 2025-06-02 15:27:21
>>gregor+(OP)
On the one hand, I would expect LLMs to be able to crank out such code when prompted by skilled engineers who also understand how to prompt these tools correctly. OAuth isn’t new, has tons of working examples in public projects to steal as training data, and exists in a variety of languages to suit most use cases or needs.

On the other hand, where I remain a skeptic is this constant banging-on that somehow this will translate into entirely new things - research, materials science, economies, inventions, etc - because that requires learning “in real time” from information sources you’re literally generating in that moment, not decades of Stack Overflow responses without context. That has been bandied about for years, with no evidence to show for it beyond specifically cherry-picked examples, often from highly-controlled environments.

I never doubted that, with competent engineers, these tools could be used to generate “new” code from past datasets. What I continue to doubt is the utility of these tools given their immense costs, both environmentally and socially.

◧◩
2. btown+Qf[view] [source] 2025-06-02 15:54:41
>>stego-+6b
It's said that much of research is data janitorial work, and from my experience that's not just limited to the machine learning space. Every research scientist wishes that they had an army of engineers to build bespoke tooling for their niche, so they could get back to trying ideas at the speed of thought rather than needing to spend a day writing utility functions for those tools and poring over tables to spot anomalies. Giving every researcher a priceless level of leverage is a tremendous social good.

Of course, we won't be able to tell the real effects, now, because every longitudinal study of researchers will now be corrupted by the ongoing evisceration of academic research in the current environment. Vibe-coding won't be a net creativity gain to a researcher affected by vibe-immigration-policy, vibe-grant-availability, and vibe-firings, for all of which the unpredictability is a punitive design goal.

Whether fear of LLMs taking jobs has contributed to a larger culture of fear and tribalism that has emboldened anti-intellectual movements worldwide, and what the attributable net effect on research and development will be... it's incredibly hard to quantify.

◧◩◪
3. stego-+Yh[view] [source] 2025-06-02 16:07:22
>>btown+Qf
> Vibe-coding won't be a net creativity gain to a researcher affected by vibe-immigration-policy, vibe-grant-availability, and vibe-firings, for all of which the unpredictability is a punitive design goal.

Quite literally, this is what I’m trying to get at with my resistance to LLM adoption in the current environment. We’re not using it to do hard work; we’re throwing it everywhere in an intentional decision to dumb down more people and funnel resources and control into fewer hands.

Current AI isn’t democratizing anything; it’s just a shinier marketing ploy to get people to abandon skilled professions and leave the bulk of the populace suitable only for McJobs. The benefits of its use are seen by vanishingly few, while its harms are felt by distressingly many.

At present, it is a tool designed to improve existing neoliberal policies and wealth pumps by reducing the demand for skilled labor without properly compensating those affected by its use or allowing an exit from their walled gardens (because that is literally what all these XaaS AI firms are - walled gardens of pattern matchers masquerading as intelligence).

◧◩◪◨
4. cpursl+4P[view] [source] 2025-06-02 19:49:08
>>stego-+Yh
That’s one perspective, but it’s wrong and typical gatekeeping (do you have a software degree, by any chance?). People had the same attitude towards open-source tooling and low-code frameworks - god forbid someone not certified and ordained build a solution in something other than Java...

AI code tools are allowing people to build things they couldn't before due to a lack of skills, time, or budget. I’ve seen all sorts of problems solved by semi-technical and even non-technical people. My brother, for example, built a thing with Microsoft Copilot that helped automate more of the processes in his manufacturing facility (they used to be paper-based).

But yeah, keep yelling at that cloud - the rest of us will keep shipping cool things that we couldn’t before, and faster.

◧◩◪◨⬒
5. sithar+511[view] [source] 2025-06-02 21:12:21
>>cpursl+4P
The problem isn't that people can quickly prototype an idea they've had without contracting an expensive professional; I think this is great. It will give ideas that would never see the light of day a chance. Plus, it gives a much better talking point if they do choose to get a professional on board.

The problem is that it's sold as a complete solution: use the LLM and you'll get a fully working product. However, if you're not an experienced programmer, you won't know what's missing, whether it's using outdated and insecure options, or whether it's just badly written. This still needs a professional.

The technology is great and it has real potential to change how things are made, but it's being marketed as something it isn't (yet).

◧◩◪◨⬒⬓
6. kenton+A21[view] [source] 2025-06-02 21:22:53
>>sithar+511
> However, if you're not an experienced programmer, you won't know what's missing, whether it's using outdated and insecure options, or whether it's just badly written. This still needs a professional.

I think a lot of this could be solved by a platform that implements appropriate guardrails so that the application code literally cannot screw up the security. Not every conceivable type of software would fit in such a platform, but a lot of what people want to do to automate their day-to-day lives could.
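A minimal sketch of that kind of guardrail in TypeScript, assuming a hypothetical Workers-style platform (none of these names are a real API): the platform keeps the credentials and hands application code only a scoped fetcher, so even carelessly generated code can't leak a token it never receives.

    // Hypothetical platform-side guardrail: app code never touches raw credentials.
    interface ScopedFetcher {
      // Only allows requests to a host the platform has pre-approved.
      fetch(path: string, init?: RequestInit): Promise<Response>;
    }

    // The application author (or the LLM) writes only this handler.
    type AppHandler = (req: Request, api: ScopedFetcher) => Promise<Response>;

    // Platform side: wraps the handler, injects credentials, enforces the allowlist.
    function createPlatformRuntime(
      handler: AppHandler,
      opts: { allowedHost: string; getToken: () => Promise<string> }
    ): (req: Request) => Promise<Response> {
      const api: ScopedFetcher = {
        async fetch(path, init) {
          const url = new URL(path, `https://${opts.allowedHost}`);
          if (url.host !== opts.allowedHost) {
            throw new Error(`host ${url.host} is not on the allowlist`);
          }
          const token = await opts.getToken(); // token stays on the platform side
          const headers = new Headers(init?.headers);
          headers.set("Authorization", `Bearer ${token}`);
          return fetch(url.toString(), { ...init, headers });
        },
      };
      return (req) => handler(req, api);
    }

    // Usage: the generated part is just the handler body.
    const run = createPlatformRuntime(
      async (req, api) => api.fetch("/v1/orders?status=open"),
      { allowedHost: "api.example.com", getToken: async () => "platform-managed-token" }
    );

In a setup like this, the sandbox boundary carries the security guarantee rather than the quality of the generated handler code.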
