Not saying I agree that being closed source is in the public good, although one could certainly argue that accelerating the efforts of bad actors to catch up would not be a positive.
Not really. It slows things down the way security through obscurity does. It needs to be open so that we know the real risks and have the best information to combat them. Otherwise, someone who does the same thing in a closed manner has a better chance of gaining an advantage when misusing it.
If OpenAI can do it, I wouldn't say it's very unlikely for someone else to do the same, open or not. The best chance is still that we prepare with the best available information.
I mean, you spend a lot of your own life denying the inevitable; humans spend a lot of time and effort avoiding their own personal extinction.
>The best chance is still
The best information we have now says that if we create AGI/ASI at this time, we all die. The only winning move is not to play that game.
We can still unplug or turn these things off. We are still very far away from a situation where an AI controls factories and a full supply chain and can take physical control of the world.
>Imagine a large pond that is completely empty except for 1 lily pad. The lily pad will grow exponentially and cover the entire pond in 3 years. In other words, after 1 month there will be 2 lily pads, after 2 months there will be 4, etc. The pond is covered in 36 months.
We're all going to be sitting around at 34 months saying "Look, it's been years and AI hasn't taken over that much of the market."
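For what it's worth, the arithmetic behind that analogy is easy to check. A minimal Python sketch, assuming the quoted setup (coverage doubles monthly and the pond is full at month 36):

    # Under monthly doubling with the pond full at month 36,
    # the fraction covered at month m is 2**(m - 36).
    for m in (30, 33, 34, 35, 36):
        coverage = 2 ** (m - 36)
        print(f"month {m}: {coverage:.2%} covered")

At month 34 the pond is only 25% covered, and at month 30 under 2%, which is exactly why an exponential process looks like nothing is happening right up until it's everywhere.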