zlacker

[parent] [thread] 5 comments
1. Yurgen+(OP)[view] [source] 2024-05-17 23:19:06
I believe a better solution would be to spread the following sentiment: "Since it's already illegal to tell disparaging lies, the mere existence of such a clause implies some disparaging truths of which the party is aware." Always assuming the worst about hidden information provides a strong incentive to be transparent.
replies(4): >>lupire+c1 >>bernie+xh >>jiggaw+ck >>d0mine+ao
2. lupire+c1[view] [source] 2024-05-17 23:30:37
>>Yurgen+(OP)
Humans respond better to concrete details than to abstractions.

It's a lot of mental work to summon revulsion over the evil a company might be doing when it's kept secret.

replies(1): >>hi-v-r+r4
3. hi-v-r+r4[view] [source] [discussion] 2024-05-18 00:01:34
>>lupire+c1
This is true.

I was once fired, ghosting style, for merely being in the same meeting room as a racist corporate ass-clown who muted the conference call to make slights against Asians and monkey gesticulations. There was no lawsuit or payday, because "how would I ever work again?" made it a Hobson's choice between letting it go and mounting a moral crusade with no way to pay rent.

If instead I were upset that "not enough N are in tech," there wouldn't be a specific incident or person to blame, because it'd be a multifaceted situation.

4. bernie+xh[view] [source] 2024-05-18 02:47:38
>>Yurgen+(OP)
That’s a really good point. A variation of the Streisand Effect.

Makes you wonder what misdeeds they’re trying so hard to hide.

5. jiggaw+ck[view] [source] 2024-05-18 03:44:41
>>Yurgen+(OP)
This is an important mode of thinking in many adversarial or competitive contexts.

Cryptography is a prime example. Any time a company is even the tiniest bit cagey or obfuscates any aspect of its product, I default to assuming they're either selling snake oil or have installed NSA back doors. I'll claim this openly, as fact, until proven otherwise.

6. d0mine+ao[view] [source] 2024-05-18 05:06:14
>>Yurgen+(OP)
I hope the prohibition on telling the truth covers something banal, like "fake it until you make it" in some of OpenAI's demos, where the technology looks like magic but is plausible to implement within a few months or years.

It would be worse if it relates to training a future superintelligence to kill people. Killer drones are possible even with today's technology, no AGI required.
