astran+ (OP) 2023-11-20 06:39:09
Exponential growth is not intrinsically a feature of an AGI except that you've decided it is. It's also almost certainly impossible.

The main problems stopping it are:

- No intelligent agent is motivated to improve itself, because the new, improved thing would be someone else, not it.

- Self-improvement costs money, and you're just pretending everything is free.
