zlacker

[return to "Why software stocks are getting pummelled"]
1. latefo+G02 2026-02-02 19:39:18
>>peteth+(OP)
> The fear is that these [AI] tools are allowing companies to create much of the software they need themselves.

AI-generated code still requires software engineers to build, test, debug, deploy, secure, monitor, be on-call, support, handle incidents, and so on. That's very expensive. It is much cheaper to pay a small monthly fee to a SaaS company.

2. mattma+072 2026-02-02 20:04:23
>>latefo+G02
A lot of these companies do not charge small monthly fees. And if you’ve ever worked with them, you’ll know that many of the tools they sell are an exact match for almost nobody’s needs.

So what happens is that a corporation ends up spending a lot of money on a square tool that it has to hammer into a round hole. They do it because the alternative is worse.

AI coding does not yet allow you to build anything even mildly complex with no programmers. But it does reduce, by an order of magnitude, the amount of money you need to spend on programming a solution that would work better.

Another thing AI enables is significantly lower switching costs. A friend of mine owned an in-person and online retail business that was early to the game, having come online in the late 90s. I remember asking him, sometime around 2010, when his store had become very difficult to use, why he didn’t switch to a more modern selling platform, and the answer was that it would have taken him years to get his inventory moved from one system to another. Modern AI probably could’ve done almost all of that work for him.

I can’t even imagine what would happen if somebody like Ford wanted to get off of their SAP or Oracle solution. A lot of these products don’t withhold access to your data, but they also won’t provide it in any format that can be used without a ton of work, and until recently that work would’ve required a huge number of man-hours.
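
To make that concrete, the bulk of such a migration is schema-mapping glue code of roughly this shape. This is only a sketch in Python with made-up column names (ITEM_NO, QTY_ON_HAND, etc.), not any vendor’s real export format; discovering the mapping and handling the edge cases is where the man-hours used to go, and that is exactly the part an LLM can now draft for you:

    # Hypothetical sketch: re-mapping a legacy inventory export into a new
    # platform's import format. Column names are invented for illustration;
    # a real migration has far more fields and edge cases.
    import csv

    # Assumed mapping from legacy export columns to the new platform's
    # import columns -- working this table out is usually the hard part.
    COLUMN_MAP = {
        "ITEM_NO": "sku",
        "DESCR": "title",
        "QTY_ON_HAND": "quantity",
        "UNIT_PRICE": "price",
    }

    def migrate(legacy_path: str, new_path: str) -> None:
        """Copy rows from the legacy CSV export into the new import format."""
        with open(legacy_path, newline="") as src, open(new_path, "w", newline="") as dst:
            reader = csv.DictReader(src)
            writer = csv.DictWriter(dst, fieldnames=list(COLUMN_MAP.values()))
            writer.writeheader()
            for row in reader:
                # Missing legacy columns become empty strings rather than crashing.
                writer.writerow({new: row.get(old, "").strip() for old, new in COLUMN_MAP.items()})

    if __name__ == "__main__":
        migrate("legacy_inventory_export.csv", "new_platform_import.csv")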

3. datsci+Bj2 2026-02-02 20:57:35
>>mattma+072
Our company just went through an ERP transition and AI of all kinds was 0% helpful for the same reason it’s difficult for humans to execute: little to no documentation and data model mismatches.
4. dehugg+Im2 2026-02-02 21:14:09
>>datsci+Bj2
Surprising, considering you just listed two primary use cases (exploring codebases/data models + creating documentation).
5. heavys+mt3 2026-02-03 03:01:06
>>dehugg+Im2
Please don't feed people LLM-generated docs.
6. dehugg+Jd6 2026-02-03 20:02:37
>>heavys+mt3
I love the default assumption that "AI generated" automatically excludes "human verified".

See, I actually read and monitor the outputs. I check them against my own internal knowledge. I trial the results with real troubleshooting and real bug fixes/feature requests.

When it's wrong, I fix it. When it's right, great: we now have documentation where none existed before.

Dogfood the documentation and you'll know if it's worth using or not.
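
And dogfooding can be partly automated. As a rough sketch (assuming the generated docs are Markdown files containing runnable >>> examples and a made-up docs/ layout, which is my setup, not a universal one), something like this fails fast on the blatantly wrong ones:

    # Hypothetical sketch of dogfooding generated docs: if the docs embed
    # runnable >>> examples, execute them so broken ones are caught early.
    # The docs/ directory layout is an assumption for illustration only.
    import doctest
    import pathlib

    def check_docs(docs_dir: str = "docs") -> int:
        """Run doctest over every Markdown file under docs_dir; return total failures."""
        failures = 0
        for path in pathlib.Path(docs_dir).glob("**/*.md"):
            result = doctest.testfile(str(path), module_relative=False)
            failures += result.failed
        return failures

    if __name__ == "__main__":
        raise SystemExit(1 if check_docs() else 0)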

7. heavys+Bj7 2026-02-04 02:25:27
>>dehugg+Jd6
Literally several times a week, I have to close PRs with docs that clearly no one read, because they are blatantly wrong. This started after LLMs. If what you're claiming is happening, I'm not seeing it anywhere.

AI is incapable of capturing the human context that 99.999% of the time lives in people's brains, not in the code. This is why it is crucial that humans write for humans, rather than having an LLM put out docs that merely have the aesthetics of looking acceptable.
