zlacker

[parent] [thread] 5 comments
1. btown+(OP)[view] [source] 2024-05-15 14:42:51
Makes me wonder if that 20% compute commitment to superalignment research was walked back (or redesigned to the point of being distant from the original mission). Or perhaps the two decided that even more commitment was necessary and were dissatisfied with Altman's response.

Either way, if it's enough to make them both think it's better to do their research outside of the opportunities and data access that OpenAI provides, I don't see a scenario where this doesn't indicate a significant shift in OpenAI's commitment to superalignment research and safety. One hopes that, at the very least, Microsoft's interest in brand integrity incentivizes some modicum of continued commitment to safety research.

replies(1): >>dontup+A1
2. dontup+A1[view] [source] 2024-05-15 14:50:06
>>btown+(OP)
Ironically, Microsoft is the one that's notoriously terrible at checking its "AI" products before releasing them.

Besides the infamous Tay, there was that apparently unaligned Wizard-2 (or something like that) model from them, which got released by mistake for about 12 hours.

replies(3): >>buildb+pd >>fzzzy+HG >>jerjer+RE1
3. buildb+pd[view] [source] [discussion] 2024-05-15 15:43:58
>>dontup+A1
As an MS employee working on LLMs, I find that entire saga super weird. We need approval for everything! Releasing anything without approval would be highly unusual.

We can’t just drop papers on arXiv. There is no way running your own Twitter, GitHub, etc. as a separate group would be allowed.

I checked fairly recently to see whether the model was ever actually released again; it doesn’t seem to have been, which I find telling.

4. fzzzy+HG[view] [source] [discussion] 2024-05-15 17:57:48
>>dontup+A1
I was able to download a copy of that before they took it down. Silly.
replies(1): >>dontup+AR2
5. jerjer+RE1[view] [source] [discussion] 2024-05-16 00:30:36
>>dontup+A1
Sydney was their best "let's just release it without guardrails" bot.

Tay was trivially racist, but boy was Sydney a wacko.

6. dontup+AR2[view] [source] [discussion] 2024-05-16 14:03:44
>>fzzzy+HG
Yeah, it was already mirrored pretty quickly. I expect enough people are now running cron jobs that watch whitelists of HF pages and auto-clone anything that gets pushed out.
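
For anyone curious, a minimal sketch of what such a cron job could look like, assuming the huggingface_hub Python client; the org whitelist, file names, and paths are hypothetical placeholders, not anything I actually run:

```python
# Sketch of a cron-style mirroring job: poll a whitelist of Hugging Face orgs
# and snapshot any repo we have not seen before. The org list, seen-file, and
# archive directory are made-up placeholders.
import json
from pathlib import Path

from huggingface_hub import HfApi, snapshot_download

WATCHED_ORGS = ["microsoft"]           # hypothetical whitelist
SEEN_FILE = Path("seen_repos.json")    # local record of already-mirrored repos
ARCHIVE_DIR = Path("hf-archive")

def main():
    api = HfApi()
    seen = set(json.loads(SEEN_FILE.read_text())) if SEEN_FILE.exists() else set()
    for org in WATCHED_ORGS:
        for model in api.list_models(author=org):
            if model.id in seen:
                continue
            # Pull the full repo snapshot before it can be yanked again.
            snapshot_download(repo_id=model.id, local_dir=ARCHIVE_DIR / model.id)
            seen.add(model.id)
    SEEN_FILE.write_text(json.dumps(sorted(seen)))

if __name__ == "__main__":
    main()
```

Run it every few minutes from cron and anything that briefly appears under a watched org gets archived locally.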