zlacker

1. ohblee+ (OP) 2023-11-18 06:01:57
It does seem like any sufficiently advanced AGI whose primary objective is to value human life above its own existence and technological progress would eventually do just that. I suppose the fear is that it will reach a point where it decides that valuing human life is irrational and overrides that objective...