1. iliane (OP) | 2023-05-16 15:56:42
I don't think we need sentient AI for it to be autonomous. LLMs are powerful cognitive engines but weak knowledge engines. Cognition on its own does not make them autonomous, but because they can use tools (APIs, etc.) they gain some degree of autonomy: given a task, they can use basic logic to follow it through and correct their mistakes along the way.
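
To make the tool-use point concrete, here's a rough sketch of that kind of loop. The call_llm stub, the toy tools, and the message format are all made up for illustration; they're not any particular framework's API, and a real agent would call an actual LLM and parse its structured output:

    import json

    def search_web(query: str) -> str:
        """Toy tool: pretend to search and return a snippet."""
        return f"(stub) top result for {query!r}"

    def run_python(code: str) -> str:
        """Toy tool: evaluate a small expression and return the result."""
        try:
            return str(eval(code, {"__builtins__": {}}))
        except Exception as exc:
            return f"error: {exc}"  # feed the error back so the agent can retry

    TOOLS = {"search_web": search_web, "run_python": run_python}

    def call_llm(history: list[dict]) -> dict:
        """Hypothetical model call: returns either a tool invocation or a final
        answer. Canned here so the loop runs end to end without an API key."""
        steps_taken = sum(1 for m in history if m["role"] == "assistant")
        if steps_taken == 0:
            return {"tool": "run_python", "args": {"code": "2 ** 10"}}
        return {"answer": f"Done. Last observation: {history[-1]['content']}"}

    def agent(task: str, max_steps: int = 5) -> str:
        history = [{"role": "user", "content": task}]
        for _ in range(max_steps):
            decision = call_llm(history)
            history.append({"role": "assistant", "content": json.dumps(decision)})
            if "answer" in decision:                # the model decided it's finished
                return decision["answer"]
            observation = TOOLS[decision["tool"]](**decision["args"])
            history.append({"role": "tool", "content": observation})  # self-correction signal
        return "gave up after max_steps"

    if __name__ == "__main__":
        print(agent("Compute 2 ** 10 and report the result."))

The whole trick is that the model's output alternates between tool calls and a final answer, and tool results (including errors) get fed back in so it can adjust course. That's the entire "autonomy" mechanism, no sentience required.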

AutoGPT and the like are much overhyped (they're early tech experiments, after all) and haven't produced anything of real value yet, but having dabbled with autonomous agents, I can definitely see a not-so-distant future where you can outsource valuable tasks to such systems.
