zlacker

[return to "Show HN: NanoClaw – “Clawdbot” in 500 lines of TS with Apple container isolation"]
1. popcor+lb[view] [source] 2026-02-02 00:24:58
>>jimmin+(OP)
> running it scares the crap out of me

A hundred times this. It's fine until it isn't. And jacking these Claws into shared conversation spaces is quite literally pushing the afterburners to max on simonw's lethal trifecta. A lot of people are going to get burned hard by this. Every blackhat is eyes-on this right now - we're literally giving a drunk robot the keys to everything.

◧◩
2. Tactic+qf[view] [source] 2026-02-02 01:04:18
>>popcor+lb
I understand that things can go wrong and there can be security issues, but I see at least two other problems:

1. what if, ChadGPT style, ads are added to the answers (like OpenAI said it'd do, hence the new "ChadGPT" name)?

2. what if the current prices really are unsustainable and the thing goes 10x?

Are we living in some golden age where we can both query LLMs on the cheap and not get ad-infested answers?

I've read several comments in different threads from people saying: "I use AI because search results are too polluted and the Web is unusable"

And I now do the same:

"Gemini, compare me the HP Z640 and HP Z840 workstations, list the features in a table" / "Find me which Xeon CPU they support, list me the date and price of these CPU when they were new and typical price used now".

How long before I get twelve ads along with paid vendor recommendations?

◧◩◪
3. spider+kk[view] [source] 2026-02-02 01:49:18
>>Tactic+qf
> what if the current prices really are unsustainable and the thing goes 10x?

Where does this idea come from? We know how much it costs to run LLMs; it's not like we're waiting to find out. AI companies aren't losing money on API tokens. What could possibly happen to make prices go 10x when they're already running at a profit? Claude Max might be a different story, but AI is going to get cheaper to run, not randomly 10x more expensive for the same models.

◧◩◪◨
4. overga+7p[view] [source] 2026-02-02 02:35:00
>>spider+kk
From what I've read, every major AI player is losing a lot of money running LLMs, even just on inference. It's hard to say for sure because they don't publish the financials (or when they do, it tends to be obfuscated), but if the screws start being turned on investment dollars, they'd not only have to raise the price of their current offerings (a 2x jump wouldn't shock me), but some of them would also need a massive influx of capital to handle things like datacenter build obligations (tens of billions of dollars). So I don't think it's crazy to think that prices might go up quite a bit. We've already seen waves of it, like last summer when Cursor suddenly became a lot more expensive (or less functional, depending on your perspective).
◧◩◪◨⬒
5. rainco+JL[view] [source] 2026-02-02 06:46:41
>>overga+7p
> From what I've read, every major AI player is losing a lot of money running LLMs, even just on inference.

> It's hard to say for sure because they don't publish the financials (or when they do, it tends to be obfuscated)

Yeah, exactly. So how the hell do the bloggers you read know the AI players are losing money? Are they whistleblowers? Or are they pulling numbers out of their asses? Your choice.
