zlacker

[return to "OpenClaw – Moltbot Renamed Again"]
1. voodoo+7l[view] [source] 2026-01-30 08:59:45
>>ed+(OP)
So I feel like this might be the most overhyped project in a long time.

I'm not saying it doesn't "work" or serve a purpose - but I've read so much about this being an "actual intelligence" and so on that I had to look into the source.

As someone who actually spends a definitely-too-big portion of his free time researching thought-process replication and related topics in the realm of "AI", this is not really any more "AI" than anything else so far.

Just my 3 cents.

2. xnorsw+xA[view] [source] 2026-01-30 11:14:34
>>voodoo+7l
I've long said that the next big jump in "AI" will be proactivity.

So far everything has been reactive. You need to enter a prompt, you need to ask Siri or ask Claude to do something. It can be very powerful once prompted, but it still requires prompting.

You always need to ask. Having something always waiting in the background that can proactively take actions and get your attention is a genuine game-changer.

Whether this particular project delivers on that promise I don't know, but I wouldn't write off "getting proactivity right" as the next big thing just because under the hood it's agents and LLMs.

3. Charli+L41[view] [source] 2026-01-30 14:38:25
>>xnorsw+xA
> ...delivers on that promise

Incidentally, there's a key word here: "promise" as in "futures".

This is the core of a system I'm working on at the moment. It's underutilized in the agent space and a simple way to get "proactivity" rather than "reactivity".

Have the LLM evaluate whether an output requires a future follow-up, is a repeating pattern, or is something that should happen cyclically, and give it a tool to generate a "promise" that will resolve at some future time.
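
As a rough sketch (not my actual code, and assuming an OpenAI-style function-calling schema), the tools handed to the agent might look something like this; the names create_promise / cancel_promise and their fields are just illustrative:

    # Illustrative tool definitions in an OpenAI-style function-calling schema.
    # Field names (resolve_at, payload, recurring) are assumptions, not a real interface.
    PROMISE_TOOLS = [
        {
            "type": "function",
            "function": {
                "name": "create_promise",
                "description": "Schedule a follow-up: payload is handed back to you at resolve_at.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "resolve_at": {"type": "string", "description": "ISO-8601 timestamp"},
                        "payload": {"type": "string", "description": "Serialized message to re-inject later"},
                        "recurring": {"type": "boolean", "description": "Re-schedule after resolving"},
                    },
                    "required": ["resolve_at", "payload"],
                },
            },
        },
        {
            "type": "function",
            "function": {
                "name": "cancel_promise",
                "description": "Cancel a pending promise whose condition no longer applies.",
                "parameters": {
                    "type": "object",
                    "properties": {"promise_id": {"type": "string"}},
                    "required": ["promise_id"],
                },
            },
        },
    ]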

We give the agent a mechanism to produce and cancel (if the condition for a promise changes) futures. The system that resolves promises is just a simple loop that iterates over a list of promises by date. Each promise is just a serialized message/payload that we hand back to the LLM in the future.
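
In pseudo-Python, the resolver side could look roughly like this; call_llm and the Promise fields are placeholders, not the real implementation:

    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class Promise:
        promise_id: str
        resolve_at: datetime
        payload: str            # serialized message handed back to the LLM later
        cancelled: bool = False

    def resolve_due_promises(promises: list[Promise], call_llm) -> None:
        """One scheduler tick: hand every due, uncancelled promise back to the LLM."""
        now = datetime.now(timezone.utc)
        for p in sorted(promises, key=lambda p: p.resolve_at):
            if p.cancelled or p.resolve_at > now:
                continue
            call_llm(p.payload)   # re-inject the payload as a fresh "prompt"
            promises.remove(p)

    # Run this on a timer (e.g. once a minute) and you get the "proactive" behavior:
    # while True:
    #     resolve_due_promises(promises, call_llm)
    #     time.sleep(60)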
