zlacker

[return to "Sam Altman goes before US Congress to propose licenses for building AI"]
1. kypro+Ig[view] [source] 2023-05-16 12:51:14
>>vforgi+(OP)
While I'd agree with the sentiment in this thread that GPT-4 and current AI models are not yet dangerous, I guess what I don't understand is why so many people here believe we should allow private companies to continue developing the technology until someone builds something dangerous.

Those here who don't believe AI should be regulated: do you not believe AI can be dangerous? Or is it that you believe a dangerous AI is so far away that we don't need to start regulating now?

Do you accept that if someone develops a dangerous AI tomorrow, there's no way to travel back in time and retroactively regulate its development?

It just seems so obvious to me that there should be oversight of the development of a potentially dangerous technology that I can't understand why people would be against it, especially on arguments as weak as "it's not dangerous yet".

◧◩
2. sledge+Mh[view] [source] 2023-05-16 12:57:18
>>kypro+Ig
They are already dangerous in the way they cause global anxiety and fear in people, and also because their effects on the economy and on people's real lives are unpredictable.

AI needs to be regulated and controlled, the alternative is chaos.

Unfortunately, the current system, led by demented fossils and greedy monopolists, is most likely incapable of creating a sane and fair environment for the development of AI. I can only hope I'm wrong.

◧◩◪
3. wkat42+iK[view] [source] 2023-05-16 15:16:15
>>sledge+Mh
People have always been afraid of change. There are some religious villages in the Netherlands where the train station is way outside town because they didn't want this "devil's invention" there :P And remember how much people bitched about mobile phones. Or earlier, how manual labourers were angry about industrialisation. Now we're happy that we don't have to do that kind of crap anymore.

Very soon they'll be addicted to AI, just like with every other major change.

[go to top]