I appreciate the idea of being a "not-greedy typical company," but there's a reason you separate, e.g., university-type research and non-profits from private companies.
Trying to make up something in the middle is the exact sort of naivete you can ALWAYS expect from Silicon Valley.
I'd say yes, Sutskever is... naive? though very smart. Or just utopian. Seems he couldn't get the scale he needed/wanted out of a university (or Google) research lab. But the former at least would have bounded things better in the way he would have preferred, from an ethics POV.
Jumping into bed with Musk and Altman and hoping for ethical non-profit "betterment of humanity" behaviour is laughable. Getting access to capital was obviously tempting, but ...
As for Altman. No, he's not naive. Amoral, and likely proud of it. JFC ... Worldcoin... I can't even...
I don't want either of these people in charge of the future, frankly.
It does point to the general lack of funding for this type of R&D. Or maybe it's still too early to be doing this kind of thing at scale. I dunno.
Bleak.
So I would say ChatGPT exists because its creators specifically transgressed the traditional division of universities vs industry. The fact that this transgressive structure is unstable is not surprising, at least in retrospect.
Indeed, the only other approach I can think of is a massive government project. But again, with government bureaucracy, a researcher would be limited by the legal issues of big data vs. copyright, etc., which, as many have pointed out, OpenAI was able to circumvent when it basically used the entire Internet and all of humanity's books as its training source.
I think it at least remains to be seen whether "rampant copyright infringement" is necessarily a good thing here.
Honestly, this seems like a pretty good outcome.
Which is to say, I think the fearmongering sentient AI stuff is silly -- but I think we are all DEFINITELY better off with an ugly rough-and-tumble visible rocky start to the AI revolution.
Weed out the BS; equalize who actually has access to the best stuff BEFORE some jerk company can scale up fast and dominate market share; let a de facto "open source" market have a go at the whole thing.
And bleak because there doesn't seem to be an alternative where the people making these decisions are accountable to an electorate or the public in some democratic fashion. Just a bunch of people with $$ and influence who set themselves up to be arbiters ... and it's just might makes right.
And bleak because in this case the "mighty" are often the very people who made fun of the arts students taking the philosophy and ethics classes in school that could at least offer some insight into these issues.
Now it's laughable, but OpenAI was founded in 2015. I don't know about Altman, but Musk was very respected at the time. He didn't start going off the deep end until 2017. "I'm motivated by... a desire to think about the future and not be sad," was something he said during a TED interview in 2017, and people mostly believed him.