zlacker

[return to "Clawdbot - open source personal AI assistant"]
1. apetre+e5[view] [source] 2026-01-26 01:11:15
>>KuzeyA+(OP)
I found this HN post because I have a Clawdbot task that periodically scans HN for data gathering. It saw a post about itself, and it got excited and decided to WhatsApp me about it.

So that’s where I’m at with Clawdbot.
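(For the curious, here's a minimal sketch of what a task like that could look like, using Algolia's public HN search API and a hypothetical notify() stand-in for the WhatsApp hook; the actual Clawdbot setup may differ.)

    import requests

    def scan_hn(query: str) -> list[dict]:
        # Algolia's public Hacker News search API, no key needed
        resp = requests.get(
            "https://hn.algolia.com/api/v1/search_by_date",
            params={"query": query, "tags": "story"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json().get("hits", [])

    def notify(text: str) -> None:
        # hypothetical stand-in for the assistant's WhatsApp hook
        print(text)

    if __name__ == "__main__":
        # run this on a schedule (cron, systemd timer, etc.)
        for hit in scan_hn("Clawdbot"):
            notify(f"HN mention: {hit.get('title')} "
                   f"https://news.ycombinator.com/item?id={hit.get('objectID')}")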

◧◩
2. nozzle+Bl[view] [source] 2026-01-26 03:35:07
>>apetre+e5
> and it got excited and decided to WhatsApp me about it.

I find the anthropomorphism here kind of odious.

◧◩◪
3. ineeda+ri1[view] [source] 2026-01-26 13:12:18
>>nozzle+Bl
Why is it odious to say “it got excited” about a process that will literally use words in the vein of “I got excited so I did X”?

This is “talks like a duck” territory. Saying the not-duck “quacked” when it produced the same sound… If that’s odious to you, then your dislike of not-ducks, or of the people who claim they’ll lay endless golden eggs, is getting in the way of more important things than policing folks who hear the not-duck and say “it quacked”.

◧◩◪◨
4. nozzle+yo2[view] [source] 2026-01-26 18:21:15
>>ineeda+ri1
> Saying the not-duck “quacked” when it produced the same sound

How does a program get excited? It's a program; it doesn't have emotions. It's not producing a faux-emotion in the way a "not-duck quacks"; it lacks them entirely. Any emotion you read from an LLM is anthropomorphism, and that's what I find odious.

◧◩◪◨⬒
5. apetre+lT2[view] [source] 2026-01-26 20:52:17
>>nozzle+yo2
We say that a shell script "is trying to open this file". We say that a flaky integration "doesn't feel like working today". And those are all way less emotive than a message that literally expresses excitement.

Yes, I know it's not conscious the way a living biological thing is. Yes, we all know you know that too. Nobody is being fooled.
