zlacker

[return to "Show HN: NanoClaw – “Clawdbot” in 500 lines of TS with Apple container isolation"]
1. popcor+lb[view] [source] 2026-02-02 00:24:58
>>jimmin+(OP)
> running it scares the crap out of me

A hundred times this. It's fine until it isn't. And jacking these Claws into shared conversation spaces is pushing the afterburners to max on simonw's lethal trifecta. A lot of people are going to get burned hard by this. Every blackhat has eyes on this right now - we're literally giving a drunk robot the keys to everything.
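To make the failure mode concrete, here's a minimal TypeScript sketch of the trifecta check (the capability names are hypothetical, not NanoClaw's actual API): private data + untrusted input + a way to send data out. Wire an agent into a shared chat and it usually has all three at once.

    // Hypothetical capability flags for an agent instance (not NanoClaw's real API).
    interface AgentCapabilities {
      readsPrivateData: boolean;   // files, tokens, past conversations
      seesUntrustedInput: boolean; // e.g. a shared chat anyone can post into
      canExfiltrate: boolean;      // outbound HTTP, email, or just replying in that chat
    }

    // The "lethal trifecta": all three together means a prompt injection in the
    // shared channel can read your secrets and send them somewhere else.
    function isLethalTrifecta(caps: AgentCapabilities): boolean {
      return caps.readsPrivateData && caps.seesUntrustedInput && caps.canExfiltrate;
    }

    const groupChatClaw: AgentCapabilities = {
      readsPrivateData: true,
      seesUntrustedInput: true, // wired into a shared conversation space
      canExfiltrate: true,      // it posts replies into that same space
    };

    if (isLethalTrifecta(groupChatClaw)) {
      throw new Error("Refusing to start: drop at least one of the three capabilities.");
    }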

◧◩
2. Tactic+qf[view] [source] 2026-02-02 01:04:18
>>popcor+lb
I understand that things can go wrong and there can be security issues, but I see at least two other issues:

1. What if, ChadGPT-style, ads get added to the answers (like OpenAI said it would do, hence the new "ChadGPT" name)?

2. what if the current prices really are unsustainable and the thing goes 10x?

Are we living in some golden age where we can both query LLMs on the cheap and not get ad-infected answers?

I've read several comments in different threads from people saying: "I use AI because search results are too polluted and the Web is unusable."

And I now do the same:

"Gemini, compare me the HP Z640 and HP Z840 workstations, list the features in a table" / "Find me which Xeon CPU they support, list me the date and price of these CPU when they were new and typical price used now".

How long before I get twelve ads along with paid vendor recommendations?

◧◩◪
3. spider+kk[view] [source] 2026-02-02 01:49:18
>>Tactic+qf
> what if the current prices really are unsustainable and the thing goes 10x?

Where does this idea come from? We know how much it costs to run LLMs. It's not like we're waiting to find out. AI companies aren't losing money on API tokens. What could possibly happen to make prices go 10x when they're already running at a profit? Claude Max might be a different story, but AI is going to get cheaper to run. Not randomly 10x for the same models.

◧◩◪◨
4. up-n-a+xz[view] [source] 2026-02-02 04:29:08
>>spider+kk
Where did you get this notion from? You must not be old enough to know how subscription services play out. Ask your parents about their internet or mobile bills. Or at the very least, check Azure's, AWS's, and Netflix's historical pricing.

Heck, we were spoiled by "memory is cheap," but here we are today wasting it at every turn while prices keep skyrocketing (ps: they ain't coming back down). If you can't see the shift to forced subscriptions via technologies guised as "security" (i.e. secure boot) and the monopolistic distribution channels (Apple, Google, Amazon) or the OEMs, you're running with blinders on. Computing's future, the way it's heading, will be closed ecosystems that are subscription-serviced and mobile-only. They'll nickel-and-dime users for every nuanced freedom of expression they can.

Is it crazy to correlate the price of memory with our ability to run LLMs locally?

◧◩◪◨⬒
5. rainco+NK[view] [source] 2026-02-02 06:36:23
>>up-n-a+xz
> Ask your parents about their internet or mobile billings. Or the very least check Azures, AWS, Netflix historical pricing.

None of these went 10x. Actually, internet access went to something like 0.0001~0.001x for me in terms of price per bit. I lived through the dial-up era.

[go to top]