You're painting a perfect analogy for biotech/nuclear/AI. Catastrophe and culture-shifting benefits go hand in hand with all of them. It's about figuring out where the lines are. But claiming there's minimal or negligible risk ("so let's just run with it," as some say, maybe not you) feels very cavalier to me.
But you're not alone if you feel that way. I feel like I'm taking crazy pills with how the software dev field talks about sharing AI openly.
And I've been an open culture advocate for over a decade, and have helped hundreds of people start open community projects. If there's anyone who'd be excited for open collaboration, it's me! :)