zlacker

1. skepti+(OP)[view] [source] 2024-02-14 02:35:28
>>mfigui+M3
Frankly, OpenAI seems to be losing its luster, and fast.

Plugins were a failure. GPTs are a little better, but I still don't see the product market fit. GPT-4 is still king, but not by that much any more. It's not even clear that they're doing great research, because they don't publish.

GPT-5 has to be incredibly good at this point, and I'm not sure that it will be.

2. mfigui+M3[view] [source] 2024-02-14 03:08:18
3. al_bor+Po[view] [source] 2024-02-14 06:56:06
>>skepti+(OP)
I know things keep moving faster and faster, especially in this space, but GPT-4 is less than a year old. Claiming they are losing their luster because they aren’t shaking the earth with new models every quarter seems a little ridiculous.

As the popularity has exploded, and ethical questions have become increasingly relevant, it is probably worth taking some time to nail certain aspects down before releasing everything to the public for the sake of being first.

◧◩
4. bayind+ZK[view] [source] 2024-02-14 11:19:50
>>al_bor+Po
You don't lose your luster only by not innovating.

The Altman saga, allowing military use, and other small things step by step tarnish your reputation and push you toward mediocrity or worse.

Microsoft has many great development stories (read Raymond Chen's blog to be awed), but what they ultimately did to their competitors and how they behaved removed their luster, permanently for some people.

◧◩◪
5. inglor+6N[view] [source] 2024-02-14 11:42:36
>>bayind+ZK
"allowing military use"

That would actually increase their standing in my eyes.

Not too far from where I live, Russian bombing is destroying homes of people whose language is similar to mine and whose "fault" is that they don't want to submit to rule from Moscow, direct or indirect.

If OpenAI can somehow help stop that, I am all for it.

◧◩◪◨
6. bayind+UN[view] [source] 2024-02-14 11:48:43
>>inglor+6N
On the other hand, Israel is using AI to generate its bombing targets and pound the Gaza Strip with bombs non-stop [0].

And, according to the UN, Turkey has used AI-powered, autonomous loitering drones to hit military convoys in Libya [1].

Regardless of us vs. them, AI shouldn't be a part of warfare, IMHO.

[0]: https://www.theguardian.com/world/2023/dec/01/the-gospel-how...

[1]: https://www.voanews.com/a/africa_possible-first-use-ai-armed...

◧◩◪◨⬒
7. kj99+tS[view] [source] 2024-02-14 12:30:41
>>bayind+UN
> AI shouldn't be a part of warfare, IMHO.

Nor should nuclear weapons, guns, knives, or cudgels.

But we don’t have a way to stop them being used.

◧◩◪◨⬒⬓
8. fwip+1m1[view] [source] 2024-02-14 15:27:16
>>kj99+tS
Sure we do. We enforce it through the threat of warfare and subsequent prosecution, the same way we enforce the bans on chemical weapons and other war crimes.

We may lack the motivation and agreement to ban particular methods of warfare, but the means to enforce such a ban exist, and enforcement drastically reduces their use.
