zlacker

1. jvande+(OP)[view] [source] 2024-05-17 15:45:53
Honestly, having a "long-term AI risk" team is a great idea for an early-stage startup claiming to build general AI. It looks like they are taking the mission and the risks seriously.

But for a product-focused LLM shop trying to infuse its models into everything, it makes sense to tone down the hype.

replies(1): >>nprate+U4
2. nprate+U4[view] [source] 2024-05-17 16:14:37
>>jvande+(OP)
It makes it look like the tech is so rad it's dangerous. Total bollocks, but great marketing.
replies(1): >>reduce+a7
3. reduce+a7[view] [source] [discussion] 2024-05-17 16:27:35
>>nprate+U4
Ilya and Jan Leike[0] resigned (or were fired) because they believed their jobs were a temporary marketing expense? Or maybe you think you understand the risks of AGI better than they do, the creators of the frontier models?

Do you think that is a coherent worldview, compared to the one staring you in the face? I'll leave it to the reader whether they want to believe this conspiratorial take, conveniently aligned with the profit motive, instead of the scientists saying:

“Currently, we don't have a solution for steering or controlling a potentially superintelligent AI, and preventing it from going rogue.”

[0] https://scholar.google.co.uk/citations?user=beiWcokAAAAJ&hl=...

replies(3): >>nprate+ma >>jvande+Vi >>tim333+RQ
4. nprate+ma[view] [source] [discussion] 2024-05-17 16:45:39
>>reduce+a7
People can believe whatever they like. It doesn't make them right.

The flaw is in your quote: there is no "superintelligent AI". We don't have AGI, and given they were already saying this a few years ago (around GPT-2?), it's laughable.

They're getting way ahead of themselves.

replies(1): >>reduce+Nd
5. reduce+Nd[view] [source] [discussion] 2024-05-17 17:04:45
>>nprate+ma
"We don't have AGI"

We don't have 2 degrees Celsius of warming either. Should we do nothing to change course or prepare? Any thinker worth their salt knows you need to plan ahead, not react to things as they come and leave to chance whether you'll still be able to act by then.

replies(1): >>nprate+hj
6. jvande+Vi[view] [source] [discussion] 2024-05-17 17:30:28
>>reduce+a7
They resigned (or were fired) because the business no longer needs their unit, which puts a damper on their impact and usefulness. It also makes them a cost center in a business that is striving to become profitable.

That is the simplest explanation; it's a tale as old as time. And it's fundamentally explained by a very plausible pivot from "world-changing general-purpose AI - believe me, it's real" to "world-changing LLM integration and innovation shop".

7. nprate+hj[view] [source] [discussion] 2024-05-17 17:33:11
>>reduce+Nd
Exactly, which is why this shows they don't genuinely believe their own hype and fear-mongering. More than anything, it says that people at the cutting edge don't really think AGI is on the horizon.
replies(1): >>SpicyL+Xk
8. SpicyL+Xk[view] [source] [discussion] 2024-05-17 17:45:28
>>nprate+hj
They're heavily incentivized not to! Exxon executives in the '80s didn't believe in climate change either, despite reports from their internal "safety" teams that it was going to be a big problem.
replies(1): >>nprate+eB1
9. tim333+RQ[view] [source] [discussion] 2024-05-17 21:34:48
>>reduce+a7
>“Currently, we don't have a solution for steering or controlling a potentially superintelligent AI, and preventing it from going rogue.”

We could always stop paying for the servers, or their electricity.

I think we'll have AGI soon, but it won't be that much of a threat to the world.

replies(1): >>reduce+e51
10. reduce+e51[view] [source] [discussion] 2024-05-17 23:55:10
>>tim333+RQ
> We could always stop paying for the servers, or their electricity.

This is satire, right? No one who says this, or proposes an "off button", has thought this difficult problem through for longer than 30 minutes.

https://youtu.be/_8q9bjNHeSo?si=a7PAHtiuDIAL2uQD&t=4817

"Can we just turn it off?"

"It has thought of that. It will not give you a sign that makes you want to turn it off before it is too late to do that."

11. nprate+eB1[view] [source] [discussion] 2024-05-18 08:14:38
>>SpicyL+Xk
The company that develops real AGI will become a trillion-dollar company overnight (and destroy all the competition). They have a massive, live-or-die incentive.