1. phkahl+(OP) 2025-06-03 01:02:47
>> Blaming LLM hallucinations on the programming language?

My favorite was suggesting that people select a programming language based on which ones LLMs are best at. People who need an LLM to write code might do that, but no experienced developer would. There are too many other legitimate considerations.

replies(3): >>mediam+Y3 >>stevek+Z4 >>SoftTa+O6
2. mediam+Y3 2025-06-03 01:40:57
>>phkahl+(OP)
If an LLM improves coding productivity, and it is better at one language than another, then at the margin it will affect which language you may choose.

"At the margin" means that both languages, or frameworks or whatever, are reasonably appropriate for the task at hand. If you are writing firmware for a robot, then the LLM will be less helpful, and a language such as Python or JS, which the LLM is good at, is useless.

But Thomas's point is that arguing that LLMs are not useful for all languages is not the same as saying they are not useful for any language.

If you believe that LLM competencies are not actually becoming drivers in what web frameworks people are using, for example, you need to open your eyes and recognize what is happening instead of what you think should be happening.

(I write this as someone who prefers SvelteJS over React - but LLMs' React output is much better. This has become kind of an issue over the last few years.)

replies(1): >>rapind+n5
3. stevek+Z4 2025-06-03 01:49:23
>>phkahl+(OP)
People make productivity arguments for using various languages all the time. Let's use an example near and dear to my heart: "Rust is not as productive as X, therefore, you should use X unless you must use Rust." If using LLMs makes Rust more productive than X, that changes this equation.

Feel free to substitute Y instead of Rust if you want; it's just that I know many people argue Rust is hard to use, so I feel a concrete example is a good place to start.

4. rapind+n5 2025-06-03 01:52:36
>>mediam+Y3
I'm a little (not a lot) concerned that this will accelerate the adoption of languages and frameworks based on their popularity and bury away interesting new abstractions and approaches from unknown languages and frameworks.

Taking your React example, if we were a couple of years ahead on LLMs, jQuery might now be the preferred tool due to AI adoption through consumption.

You can apply this to other fields too. It's quite possible that AIs will make movies, but the only reliably well-produced ones will be superhero movies... (I'm exaggerating for effect.)

Could AI be the next Cavendish banana? I'm probably being a bit silly though...

replies(1): >>simonc+xs
5. SoftTa+O6 2025-06-03 02:09:53
>>phkahl+(OP)
Maybe they don't today, or didn't until recently, but I'd believe it will be a consideration for new projects.

It's certainly true that at least some projects choose languages based on, or at least influenced by, how easy it is to hire developers fluent in that language.

6. simonc+xs 2025-06-03 06:16:04
>>rapind+n5
> I'm a little ... concerned that this will accelerate the adoption of languages and frameworks based on their popularity and bury away interesting new abstractions and approaches...

I'd argue that the Web development world has been choosing tooling based largely on popularity for like at least a decade now. I can't see how tooling selection could possibly get any worse for that section of the profession.

replies(1): >>rapind+X01
7. rapind+X01 2025-06-03 12:01:42
>>simonc+xs
I disagree. There’s a ton of diversity in web development currently. I don’t think there’s ever been so many language and framework choices to build a web app.

The argument is that we lose this diversity as more people rely on AI and choose what AI prefers.

replies(3): >>svacha+222 >>jhatem+Is2 >>simonc+MKa
8. svacha+222 2025-06-03 18:16:45
>>rapind+X01
In the relatively near future this is going to be like arguing what CPU to buy based on how you like the assembly code. Human readability is going to matter less and less and eventually we will likely standardize on what the LLMs work with best.
9. jhatem+Is2 2025-06-03 20:53:08
>>rapind+X01
You raise a valid concern, but you presume that we will stay under the OpenAI/Anthropic/etc oligopoly forever. I don't think this is going to be the status quo in the long term. There is demand for different types of LLMs trained on different data. And there is demand for hardware. For example, the new Mac Studio has 512GB of VRAM, which can run the 600B-parameter DeepSeek model locally. So in the future I could see people training their own LLMs to be experts at their language/framework of choice.

Of course, you could disagree with my prediction and argue that these big tech companies are going to build MASSIVE GPU farms the size of the Tesla Gigafactory that can run godlike AI nobody can compete with, but if we get to that point I feel like we will have bigger problems than "AI React code is better than AI SolidJS code".

replies(1): >>rapind+LO2
10. rapind+LO2 2025-06-03 23:49:19
>>jhatem+Is2
I suspect we’ll plateau at some point and the gigafactories won’t produce a massive advantage. So running your own models could very well be a thing.
replies(1): >>jhatem+Ha3
11. jhatem+Ha3 2025-06-04 04:52:42
>>rapind+LO2
Yeah, probably... I wonder when the plateau is. Is it right around the corner, or 10 years from now? Seems like they can just keep growing it forever, based on what Sam Altman is saying. I'm botching the quote, but either he or George Hotz said something to the effect of: every time you add an order of magnitude to the size of the data, there is a noticeable qualitative difference in the output. But maybe past a certain size you get diminishing returns. Or maybe it's like Moore's Law, where they thought it would just go on forever, but it turned out it's extremely difficult to get the distance between two transistors smaller than 7nm.
replies(1): >>mwarke+NI3
12. mwarke+NI3 2025-06-04 11:16:46
>>jhatem+Ha3
Yes, some think it’s happening now: https://www.aisnakeoil.com/p/is-ai-progress-slowing-down
13. simonc+MKa 2025-06-07 00:48:03
>>rapind+X01
> There’s a ton of diversity in web development currently.

You misunderstand me. It's not incompatible for a culture to choose options based largely on popularity (rather than other properties that one would expect to be more salient when making a highly-technical choice), and for there to also be many options to choose from.
