To expound, this just seems like a power grab to me: a way to "lock in" the lead and keep AI controlled by a small number of corporations that can afford to license and operate the technologies. Obviously, this would create a critical nexus of control for a small number of well-connected and well-heeled investors, and it's to be avoided at all costs.
It's also deeply troubling how much of an issue regulatory capture has become, so putting a government entity in front of the use and existence of this technology is a double whammy. This isn't simply about innovation.
The current generation of AIs are "scary" to the uninitiated because they are uncanny valley material, but beyond impersonation they don't show the novel intelligence of a GPI... yet. It seems like OpenAI/Microsoft is doing a LOT of theater to try to build regulatory lock-in around their short-term technology advantage. It's a smart strategy, and I think Congress will fall for it.
But goodness gracious, we need to be going in the EXACT OPPOSITE direction: open-source, "core inspectable" AIs that millions of people can examine and tear apart, including and ESPECIALLY the training data and processes that create them.
And if you think this isn't an issue: I wrote this post an hour or two before I could actually take it live, because Comcast went out at my house and we have no viable competing alternative in my area. We're about to do the same thing with AI, except instead of Internet access it's future digital brains that can control every aspect of a society.
True open-source AI also strikes me as a prerequisite for fair use of original works in training data. I hope Congress asks ClosedAI to explain what's up with all that profiting off copyrighted material before even considering anything else they propose.
What people also fail to understand is that the military-industrial complex largely sees AI as a weapon for controlling culture and influence. The most obvious risk of AI, manipulating human behavior towards favored ends, has already proven quite effective right out of the gate. So the back-channel conversation has to be about putting it under regulation because of its weaponization potential, especially given the difficulty of identifying anyone online (which, of course, is exactly what Elon is doing with X 2.0: it's a KYC/ID platform to deal with this exact issue, with a 220M-user, 40B head start).
I mean, the dead internet theory is coming true: half the traffic on the Web is already bot-driven. Imagine when it's 99%, which the proliferation of this technology will inevitably produce, simply because of the economics.
Starting with open source is the only way to get enough people looking at these products to create any meaningful oversight, but I fear the weaponization argument will mean everything gets locked away in licensed clouds behind politically influential regulatory boards, purely on proliferation grounds. Think, too, of all the AI technologists who won't be versed in this technology unless they work at a "licensed company"; that's going to make the smaller population of the West much less influential in the AI arms race, which is already underway.
To me, it's clear that nobody in Silicon Valley or on the Hill has learned a damn thing from the prosecution of hackers, and the subsequent bloodbath in cybersecurity that resulted from these exact same kinds of behavior back in the early-to-mid 2000s. We ended up driving our best and brightest into the grey and black areas of infosec and security, instead of out in the open running companies where they belong. This move would do almost exactly the same thing to AI, though I think you have to be a tad of an Asimov or Bradbury fan to see it right now.
I don't know, that's just how I see it, but I'm still forming my opinions. LOVE LOVE LOVE your comment though. Spot on.
Relevant articles:
https://www.independent.co.uk/tech/internet-bots-web-traffic...
https://theconversation.com/ai-can-now-learn-to-manipulate-h....
Could you share the minutes from the Military-Industrial Complex strategy meetings where this was discussed? Thanks.
[pause]
"No? Ok, I'll tell him."