zlacker

1. ben_w+(OP)[view] [source] 2023-07-07 12:54:50
> I mean, as pointed out by a sibling comment, the reason it's so hard to shut those things down is that they benefit a lot of people and there's huge organic demand. Even the morality is hotly debated; there's no absolute consensus on the badness of those things

And my point is that this can also be true of a misaligned AI.

It doesn't have to be like Terminator; it can be an AI slowly doing something we like, where we overlook the downsides until it's too late.

It doesn't matter whether that's a cure for cancer with a worse-than-cancer side effect that only manifests 10 years later, or a mere design for a fusion reactor that we have to build ourselves and that leads to weapons proliferation, or A/B testing the design of a social media website to make it more engaging, until it gets so engaging that people choose not to hook up IRL and start families.

> But, what's new? That's the history of humanity. The communists, the Jacobins, the Nazis, all thought they were building a better world and had to have their "off switch" thrown at great cost in lives.

Indeed.

I would agree that this is both more likely and less costly than "everyone dies".

But I'd still say it's really bad, and we should try to figure out in advance how to minimise the chance of this outcome.

> except the misaligned intelligences are called academics

Well, that's novel; normally at this point I see people saying "corporations", and very rarely "governments".

I've not seen academics get stick before, except in history books.

replies(1): >>reveli+sp3
2. reveli+sp3[view] [source] 2023-07-08 13:15:52
>>ben_w+(OP)
> But I'd still say it's really bad and we should try to figure out in advance how to minimise this outcome.

For sure. But I don't see what's AI-specific about it. If the AI doom scenario is a super-smart AI tricking people into doing self-destructive things with clever words, then everything you need to do to vaccinate people against that is the same as if humans were doing the tricking: teaching critical thinking, self-reliance, and how to judge arguments on their merits rather than on surface-level attributes like complexity of language or the titles of the speakers. All of these are things our society objectively sucks at today, and we have a ruling class - including many of the sorts of people who work at AI companies - hell-bent on attacking these healthy mental habits, and the people who engage in them!

> Not seen academics get stick before, except in history books.

For academics you could also read intellectuals. Marx wasn't an academic, but he very much wanted to be; if he lived in today's world, he'd certainly be one of the most famous academics.

I'm of the view that corporations are very tame compared to the damage caused by runaway academia. It wasn't corporations that locked me in my apartment for months at a time on the back of pseudoscientific modelling and lies about vaccines. It wasn't even politicians, really. It was governments doing what they were told by the supposedly intellectually superior academic class. And it isn't corporations trying to get rid of cheap energy and travel. And it's not governments convincing people that having children is immoral because of climate change. All of these things come from academics, primarily in universities but also from those who work inside government agencies.

When I look at the major threats to my way of life today, academic pseudo-science sits clearly at number 1 by a mile. To the extent corporations and governments are a threat, it's because they blindly trust academics. If you replace Professor of Whateverology at Harvard with ChatGPT, what changes? The underlying sources of mental and cultural weakness are the same.
