zlacker

1. dontup+(OP)[view] [source] 2024-05-15 14:50:06
Ironically, Microsoft is the one that's notoriously terrible at checking their "AI" products before releasing them.

Besides the infamous Tay, there was that apparently un-aligned Wizard-2 (or something like that) model from them, which got released by mistake for about 12 hours.

replies(3): >>buildb+Pb >>fzzzy+7F >>jerjer+hD1
2. buildb+Pb[view] [source] 2024-05-15 15:43:58
>>dontup+(OP)
As an MS employee working on LLMs, I find that entire saga super weird. We need approval for everything! Releasing anything without approval just doesn't happen.

We can’t just drop papers on arXiv. There’s no way running your own Twitter, GitHub, etc. as a separate group would be allowed.

I checked fairly recently to see if the model was ever actually re-released; it doesn’t seem to have been, and I find that telling.

3. fzzzy+7F[view] [source] 2024-05-15 17:57:48
>>dontup+(OP)
I was able to download a copy of that before they took it down. Silly.
replies(1): >>dontup+0Q2
4. jerjer+hD1[view] [source] 2024-05-16 00:30:36
>>dontup+(OP)
Sydney was their best "let's just release it without guardrails" bot.

Tay was trivially racist, but boy was Sydney a wacko.

5. dontup+0Q2[view] [source] [discussion] 2024-05-16 14:03:44
>>fzzzy+7F
Yeah, it was already mirrored pretty quickly. I expect enough people are now running cron jobs that watch whitelists of HF pages and auto-clone anything that gets pushed out.
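
For reference, a minimal sketch of what such an auto-cloning job might look like, using the real huggingface_hub client. The watchlist entries, paths, and schedule are hypothetical, not anyone's actual setup:

```python
from huggingface_hub import snapshot_download

# Hypothetical whitelist of repos to mirror on every run.
WATCHLIST = [
    "microsoft/WizardLM-2-7B",  # example entry; any watched repo goes here
]

def mirror(repo_id: str, base_dir: str = "/srv/hf-mirror") -> None:
    """Fetch a full snapshot of the repo into a local directory.
    Files already present locally are reused, so repeat runs only
    download whatever is new."""
    local_dir = f"{base_dir}/{repo_id.replace('/', '__')}"
    try:
        snapshot_download(repo_id=repo_id, local_dir=local_dir)
        print(f"mirrored {repo_id} -> {local_dir}")
    except Exception as exc:  # repo pulled, gated, or network error
        print(f"skipped {repo_id}: {exc}")

if __name__ == "__main__":
    # Run from cron, e.g.: */15 * * * * python mirror_hf.py
    for repo in WATCHLIST:
        mirror(repo)
```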