When I say alive, I mean there is something it is like to be that thing. The lights are on. It has subjective experience.
It seems many are defining ASI as just a really fast self-learning computer. And sure, given the wrong kind of access and motive, that could be dangerous. But it isn't any more dangerous than any other faulty software with access to sensitive systems.
> But it isn't any more dangerous than any other faulty software with access to sensitive systems.
Seems to me that can be unboundedly dangerous? I don't see you making an argument here that there's any limit on how dangerous that class of software can be.