zlacker

[return to "Cursor IDE support hallucinates lockout policy, causes user cancellations"]
1. jgb198+Wg4[view] [source] 2025-04-15 23:01:44
>>scared+(OP)
LLM anything makes me queasy. Why would any self-respecting software developer use this tripe? Learn how to write good software. Become an expert in the trade. AI anything will only dig a hole for software to die in. It cheapens the product, butchers the process and absolutely decimates any hope of skill development for future junior developers.

I'll just keep chugging along with debian, python and vim, as I always have. No LLM, no LSP, heck, not even autocompletion. But I'm damn proud of every hand-crafted, easy-to-maintain and fully understood line of code I'll write.

◧◩
2. callc+Qj4[view] [source] 2025-04-15 23:24:07
>>jgb198+Wg4
I’m pretty much in the same boat as you, but here’s one place that LLMs helped me:

In Python I was scanning thousands of files, each for thousands of keywords. A naive implementation took around 10 seconds, which instrumentation showed was by far the largest share of execution time. A quick ChatGPT query led me to Aho-Corasick and string-searching algorithms, which I had never used before. Plug in a library and bam, a 30x speed-up for that part of the code.
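For the curious: the idea behind Aho-Corasick is a trie of all the keywords with "failure" links between states, so a single pass over the text matches every keyword at once instead of scanning once per keyword. A minimal pure-Python sketch of the idea (toy keywords for illustration; in practice you'd plug in a library such as pyahocorasick, as I did):

```python
from collections import deque

def build_automaton(keywords):
    """Build an Aho-Corasick automaton: keyword trie + failure links + outputs."""
    trie = [{}]      # node -> {char: child node}
    fail = [0]       # node -> failure link (state for longest proper suffix)
    out = [[]]       # node -> keywords that end at this state
    for word in keywords:
        node = 0
        for ch in word:
            if ch not in trie[node]:
                trie[node][ch] = len(trie)
                trie.append({})
                fail.append(0)
                out.append([])
            node = trie[node][ch]
        out[node].append(word)
    # Breadth-first pass computes each node's failure link.
    queue = deque(trie[0].values())
    while queue:
        node = queue.popleft()
        for ch, child in trie[node].items():
            queue.append(child)
            f = fail[node]
            while f and ch not in trie[f]:
                f = fail[f]
            fail[child] = trie[f].get(ch, 0)
            # Matches ending at the failure state also end here.
            out[child] = out[child] + out[fail[child]]
    return trie, fail, out

def find_all(text, keywords):
    """Return (start index, keyword) for every match, in one pass over text."""
    trie, fail, out = build_automaton(keywords)
    node, hits = 0, []
    for i, ch in enumerate(text):
        while node and ch not in trie[node]:
            node = fail[node]
        node = trie[node].get(ch, 0)
        for word in out[node]:
            hits.append((i - len(word) + 1, word))
    return hits

# Classic example: all three overlapping matches found in a single scan.
print(find_all("ushers", ["he", "she", "his", "hers"]))
# → [(1, 'she'), (2, 'he'), (2, 'hers')]
```

The win is that search time depends on the text length plus the number of matches, not on the number of keywords, which is why thousands of keywords stop being a multiplier.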

I could have asked my knowledgeable friends and coworkers, but not at 11PM on a Saturday.

I could have searched the web and probably found it out.

But the LLM basically auto completed the web, which I appreciate.

◧◩◪
3. valent+No4[view] [source] 2025-04-16 00:09:05
>>callc+Qj4
I mean, even without knowing that text-searching algorithms exist (where I'm from we learn that in university), a simple web search would have gotten you there as well, no? It might have taken a few minutes longer, though.
◧◩◪◨
4. callc+za6[view] [source] 2025-04-16 15:32:08
>>valent+No4
Extremely likely, yes. In this case, since it was an unknown unknown at the time, having the LLM explain that this class of algorithms exists was helpful; I could then immediately switch to Wikipedia to learn more (and be sure of the underlying information).

I think of LLMs as an autocomplete of the web, plus hallucinations. Sometimes it’s faster to start with the LLM rather than scour a bunch of sites first.

[go to top]