zlacker

[return to "AI agents are starting to eat SaaS"]
1. jwr+rQ[view] [source] 2025-12-15 08:17:55
>>jnord+(OP)
I am the founder of a niche SaaS (https://partsbox.com/ — software for managing electronic parts inventory and production). While I am somewhat worried about AI capabilities, I'm not losing too much sleep over it.

The worry is that customers who do not realize the full depth of the problem will implement their own app using AI. But that happens today, too: people use spreadsheets to manage their electronic parts (please don't) and BOMs (bills of materials). The spreadsheet is my biggest competitor.

I've been designing and building the software for 10 years now, and most of the difficulty and complexity is not in the code. Coding is the last part, and the easiest one. The real value is in understanding the world (the processes involved) and modeling it in a way that strikes a good compromise between ease of use and complexity.
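To make that modeling depth concrete: even a basic BOM resists a flat spreadsheet, because assemblies nest, so quantities multiply down a tree rather than summing down a column. Here's a minimal sketch; all names are hypothetical and illustrative only, not PartsBox's actual data model.

```python
# Hypothetical sketch: why a BOM is awkward in a flat spreadsheet.
# An assembly's parts list is recursive (sub-assemblies contain parts),
# so "how many resistors for 50 units?" is a tree walk, not a column sum.
from dataclasses import dataclass, field

@dataclass
class Part:
    name: str
    bom: list[tuple["Part", int]] = field(default_factory=list)  # (child, qty)

def total_parts(part: Part, qty: int = 1) -> dict[str, int]:
    """Flatten a nested BOM into total quantities per leaf part."""
    if not part.bom:
        return {part.name: qty}
    totals: dict[str, int] = {}
    for child, n in part.bom:
        for name, count in total_parts(child, qty * n).items():
            totals[name] = totals.get(name, 0) + count
    return totals

resistor = Part("10k resistor")
led_board = Part("LED board", [(resistor, 4)])
device = Part("Device", [(led_board, 2), (resistor, 1)])

# 2 boards x 4 resistors + 1 spare = 9 per device; 9 x 50 = 450
print(total_parts(device, 50))
```

Even this toy version hints at the problem: a spreadsheet row can sum a column, but it can't walk a nested structure, and a real system also needs stock levels, lot tracking, substitutes, and so on.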

Sadly, as I found out, once you spend a lot of time thinking and come up with a model, copycats will clone it (as well as they can; superficially, at least, it will look similar).

2. TeMPOr+H91[view] [source] 2025-12-15 11:05:28
>>jwr+rQ
The problem IMO is simpler.

You have a product, which sits between your users and what your users want. That product has a UI for users to operate. Many (most, I imagine) users would prefer to hire an assistant to operate that UI for them, since the UI is not the actual value your service provides. Now, s/assistant/AI agent/ and you can see that your product turns into a tool call.

So the simpler problem is that your product now becomes merely a tool call for AI agents. That's what users want. Many SaaS companies won't like that, because it removes their advertising channel and commoditizes their product.

It's the same reason API access to SaaS is usually restricted, or not available at all except to the biggest customers. LLMs defeat that by turning the entire human experience into an API, without explicit coding.
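Concretely, "becoming a tool call" looks something like this: the SaaS action gets described in the JSON-schema style that current LLM tool-calling APIs accept, and an agent, not a human in the UI, invokes it. This is a hedged sketch; the tool name, fields, and dispatch logic are all hypothetical, not any vendor's real API.

```python
# Hypothetical sketch of a SaaS action exposed as an LLM tool call.
# The schema below is the JSON-schema style most tool-calling APIs accept;
# the names and fields are illustrative only.
import json

reserve_stock_tool = {
    "name": "reserve_stock",
    "description": "Reserve inventory of a part for a production run.",
    "parameters": {
        "type": "object",
        "properties": {
            "part_number": {"type": "string"},
            "quantity": {"type": "integer", "minimum": 1},
        },
        "required": ["part_number", "quantity"],
    },
}

def handle_tool_call(call: dict) -> str:
    """Dispatch an agent's tool call to the underlying service logic."""
    if call["name"] == "reserve_stock":
        args = call["arguments"]
        # ... the actual SaaS business logic would run here ...
        return json.dumps({"status": "reserved", **args})
    raise ValueError(f"unknown tool: {call['name']}")

result = handle_tool_call(
    {"name": "reserve_stock",
     "arguments": {"part_number": "R-10K-0603", "quantity": 450}})
print(result)
```

Once a service is exposed this way, the UI is no longer the surface users touch, which is exactly the commoditization worry above.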

3. MangoT+va1[view] [source] 2025-12-15 11:11:31
>>TeMPOr+H91
> Many (most, I imagine) users would prefer to hire an assistant to operate that UI for them, since UI is not the actual value your service provides

That's ridiculous. A good UI will improve on an assistant in every way.

Do assistants have some use? Sure—querying.

4. ben_w+Rd1[view] [source] 2025-12-15 11:37:13
>>MangoT+va1
> A good UI will improve on an assistant in every way.

True.

"Good" UI seems to be in short supply these days, even from trillion dollar corporations.

But even with that, it is still not "ridiculous" for many to prefer to "hire an assistant to operate that UI for them". A lot of the complexity in UI design is balancing two goals: keeping common tasks highly visible without burying the occasional-use stuff, and letting users explore and learn what can be done without overwhelming them.

If I want a spaceship in Blender and don't care which one I get (right now the spaceship models any GenAI would give you are "pick your poison" between Diffusion models' weirdness and the 3D equivalent of pelican-on-a-bike weirdness), the easiest UI is to say (or type) "give me a spaceship", not to do all the steps by hand.

If you have some unformatted time series data and want to use it to forecast the next quarter, you could manually enter it into a spreadsheet, or you could say/type "here's a JPG of some time series data, use it to forecast the next quarter".

Again, just to be clear, I agree with everyone saying current AI is only mediocre: it makes mistakes and shouldn't be relied upon yet. But the error rates are going down, and the task horizons they don't suck at are getting longer. I expect the money to run out before they get good enough to take on all of SaaS, but at the same time they're already good enough to be interesting.
