zlacker

[return to "Tracking the Fake GitHub Star Black Market"]
1. perihe+ca[view] [source] 2023-03-18 09:48:20
>>kaeruc+(OP)
Goodhart's law: if you rely on a social signal to tell you what's good, you'll break that signal.

Very soon, the domain of bullshit will extend to actual text. We'll be able to buy HN comments by the thousand -- expertly wordsmithed, lucid AI comments -- and you can get them to say "this GitHub repo is the best", or "this startup is the real deal". Won't that be fun?

◧◩
2. Alex39+Dp[view] [source] 2023-03-18 12:39:54
>>perihe+ca
> We'll be able to buy HN comments by the thousand -- expertly wordsmithed, lucid AI comments

You're forgetting the millions of additional comments that will be written by humans to trick the AI into promoting their content.

Even worse: currently, if you ask ChatGPT to write you some code, it will make up an API endpoint that doesn't exist, and then make up a URL that doesn't exist where you can supposedly register for an API key. People are already registering these domains and parking fake sites on them to scam people. ChatGPT is creating a huge market for fake companies that match the fake information it's generating.
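
To make this concrete, here's roughly what one of these hallucinated snippets looks like. Everything in it is invented for illustration -- the service, the endpoint, the signup URL -- and I've used the reserved .example TLD so nobody can actually register the domain:

    import requests

    # The service, endpoint, and signup URL below are all made up; a real
    # hallucination would use a plausible, registrable domain instead.
    API_KEY = "..."  # "get your key at https://developer.weatherly.example"

    try:
        resp = requests.get(
            "https://api.weatherly.example/v2/forecast",  # fabricated endpoint
            params={"city": "Berlin", "key": API_KEY},
            timeout=10,
        )
        print(resp.json())
    except requests.exceptions.ConnectionError:
        print("the 'API' never existed -- a scammer only has to register the domain")

The snippet reads as perfectly plausible, which is the whole problem: the only tell is that the domain doesn't resolve -- yet.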

The biggest risk may not be people using AI-generated comments to promote their own repos, but rather registering new repos to match the fake ones that the AI is already promoting.

◧◩◪
3. permo-+Vq[view] [source] 2023-03-18 12:53:03
>>Alex39+Dp
I feel like you’re overstating this as a long-term issue. Sure, it’s a problem now, but realistically, how long before code hallucinations are patched out?
◧◩◪◨
4. trippi+SI[view] [source] 2023-03-18 15:22:46
>>permo-+Vq
An aside: what do people generally mean when they say “hallucinations”? Is it something more refined than just “wrong”?

As far as I can tell, most people just use it as shorthand for “wow, that was weird” -- but is there any difference as far as the model is concerned?

◧◩◪◨⬒
5. bombca+Ba1[view] [source] 2023-03-18 18:03:46
>>trippi+SI
Wrong is saying that 2 + 2 is 5.

Wrong is saying that the sun rises in the west.

By “hallucinating” they’re trying to imply that it didn’t just get something wrong, but instead dreamed up an alternate world where the thing you wanted existed, and then described that world.

Or, another way to look at it: it gave an answer that looks right enough that you can’t immediately tell it’s wrong.
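
Code is where this shows up most clearly. A toy example (I made up the method name to mimic the pattern; the real one is astimezone()):

    from datetime import datetime

    # "astimezone_to" doesn't exist -- the real method is astimezone() --
    # but it reads plausibly enough to pass a quick skim of the code.
    try:
        now = datetime.now().astimezone_to("US/Pacific")
        print(now)
    except AttributeError as err:
        print(f"looks right, isn't: {err}")

Nothing about that line screams “wrong” the way 2 + 2 = 5 does; you only find out when you run it.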
