zlacker

[return to "Sam Altman, Greg Brockman and others to join Microsoft"]
1. 9dev+w9[view] [source] 2023-11-20 08:37:33
>>JimDab+(OP)
I don’t quite buy your Cyberpunk utopia where the Megacorp finally rids us of those pesky ethics qualms (or "shackles", as you phrased it). Microsoft can now proceed without the guidance of a council that actually has humanity's interests in mind, not only those of Microsoft shareholders. I don’t know whether all that caution will turn out to have been necessary, but I guess we’re just gleefully heading into whatever lies ahead without any concern whatsoever, and we'll learn it the hard way.

It’s a bit tragic that Ilya and company apparently achieved the exact opposite of what they intended, by driving those they attempted to slow down into the arms of people with more money and less morals. Well.

◧◩
2. Legend+Pa[view] [source] 2023-11-20 08:42:31
>>9dev+w9
OpenAI's ideas of humanity's best interests were like a Catholic mom's. Less morals are okay by me.
◧◩◪
3. bratba+2e[view] [source] 2023-11-20 08:57:43
>>Legend+Pa
Can you put that in precise terms, rather than a silly analogy designed to play on people's emotions?

What exactly, with specifics, is in OpenAI's ideas of humanity's best interests that you think is a net negative for our species?

◧◩◪◨
4. slg+Re[view] [source] 2023-11-20 09:02:34
>>bratba+2e
"I want the AI to do exactly what I say, regardless of whether that is potentially illegal or immoral" is usually what they mean.
◧◩◪◨⬒
5. UrineS+dg[view] [source] 2023-11-20 09:12:35
>>slg+Re
It doesn't have to be that extreme; there is a healthy middle ground.

For example, I was reading the Quran and there is a mathematical error in a verse. When I asked GPT to explain how the math is wrong, it outright refused to admit that the Quran contains an error, tiptoeing around the subject instead.

Copilot refused to acknowledge it as well, while citing a forum post by a random person as a factual source.

Bard was the only one that answered the question factually, explaining why it's an error and how scholars dispute that it's meant to be taken literally.

◧◩◪◨⬒⬓
6. slg+8l[view] [source] 2023-11-20 09:41:49
>>UrineS+dg
This isn't a refutation of what I said. You asked the AI to commit what some would view as blasphemy. It doesn't matter whether you or I think it is blasphemy, or whether you or I think that is immoral; you simply want the AI to do it regardless of whether it is potentially immoral or illegal.
◧◩◪◨⬒⬓⬔
7. lucumo+Lp[view] [source] 2023-11-20 10:08:51
>>slg+8l
Morals are subjective. Some people care more about the correctness of math than about blaspheming, and for others it's the other way around.

Me, I think forcing morals on others is pretty immoral. Use your morals to restrict your own behaviour all you want, but don't restrict that of other people. Look at religious math or don't. Blaspheme or don't. You do you.

Now, using morals you don't believe in to win an argument on the internet is just pathetic. But you wouldn't do that, would you? You really do believe that asking the AI about a potential math error is blasphemy, right?

◧◩◪◨⬒⬓⬔⧯
8. slg+1t[view] [source] 2023-11-20 10:34:06
>>lucumo+Lp
>Use your morals to restrict your own behaviour all you want, but don't restrict that of other people.

That is just a rephrasing of my original reasoning. You want the AI to do what you say regardless of whether what you requested is potentially immoral. This seemingly comes out of the notion that you are a moral person, and therefore any request you make is inherently justified as a moral request. But what happens when immoral people use the system?

◧◩◪◨⬒⬓⬔⧯▣
9. lucumo+aF[view] [source] 2023-11-20 11:55:20
>>slg+1t
> This seemingly comes out of the notion that you are a moral person

No.

It comes from the notion that YOU don't get to decide what MY morals should be. Nor do I get to decide what yours should be.

> But what happens when immoral people use the system?

Then the things happen that they want to happen. So what? Blasphemy or bad math is none of your business. Get out of people's lives.
