zlacker

1. skepti+(OP) 2024-02-14 02:35:28
>>mfigui+M3
Frankly, OpenAI seems to be losing its luster, and fast.

Plugins were a failure. GPTs are a little better, but I still don't see the product market fit. GPT-4 is still king, but not by that much any more. It's not even clear that they're doing great research, because they don't publish.

GPT-5 has to be incredibly good at this point, and I'm not sure that it will be.

2. mfigui+M3 2024-02-14 03:08:18
3. al_bor+Po 2024-02-14 06:56:06
>>skepti+(OP)
I know things keep moving faster and faster, especially in this space, but GPT-4 is less than a year old. Claiming they are losing their luster because they aren’t shaking the earth with new models every quarter seems a little ridiculous.

As its popularity has exploded and ethical questions have become increasingly relevant, it is probably worth taking some time to nail certain aspects down before releasing everything to the public for the sake of being first.

4. bayind+ZK 2024-02-14 11:19:50
>>al_bor+Po
You don't lose your luster only by failing to innovate.

The Altman saga, allowing military use, and other small things tarnish your reputation step by step and push you toward mediocrity or worse.

Microsoft has many great development stories (read Raymond Chen's blog to be awed), but what they ultimately did to competitors and how they behave removed their luster, permanently for some people.

5. inglor+6N 2024-02-14 11:42:36
>>bayind+ZK
"allowing military use"

That would actually increase their standing in my eyes.

Not too far from where I live, Russian bombing is destroying homes of people whose language is similar to mine and whose "fault" is that they don't want to submit to rule from Moscow, direct or indirect.

If OpenAI can somehow help stop that, I am all for it.

6. bayind+UN 2024-02-14 11:48:43
>>inglor+6N
On the other hand, Israel is using AI to generate its bombing targets and pound the Gaza Strip with bombs non-stop [0].

And, according to the UN, Turkey has used AI-powered, autonomous loitering drones to hit military convoys in Libya [1].

Regardless of us vs. them, AI shouldn't be a part of warfare, IMHO.

[0]: https://www.theguardian.com/world/2023/dec/01/the-gospel-how...

[1]: https://www.voanews.com/a/africa_possible-first-use-ai-armed...

7. kj99+tS 2024-02-14 12:30:41
>>bayind+UN
> AI shouldn't be a part of warfare, IMHO.

Nor should nuclear weapons, guns, knives, or cudgels.

But we don’t have a way to stop them being used.

8. foolof+471 2024-02-14 14:15:06
>>kj99+tS
This is literally the only thing that matters in this debate. Everything else is useless hand-wringing from people who don't want to be associated with the negative externalities of their work.

The second this tech was developed, it became literally impossible to stop this from happening. It was a totally foreseeable consequence, but the researchers involved didn't care: they wanted to be successful and figured they could just blame others for the consequences of their actions.

9. qetern+Wa1 2024-02-14 14:37:34
>>foolof+471
> the researchers involved didn't care because they wanted to be successful and figured they could just try to blame others for the consequences of their actions

Such an absurdly reductive take. Or how about: just like nuclear energy and knives, they are incredibly useful, society-advancing tools that can also be used to cause harm. It's not as if AI can only be used for warfare. And like pretty much every technology, it ends up being used 99.9% for good and 0.1% for evil.
