zlacker

[return to "Tracking the Fake GitHub Star Black Market"]
1. perihe+ca[view] [source] 2023-03-18 09:48:20
>>kaeruc+(OP)
Goodhart's law: if you rely on a social signal to tell you what's good, you'll break that signal.

Very soon, the domain of bullshit will extend to actual text. We'll be able to buy HN comments by the thousand -- expertly wordsmithed, lucid AI comments -- and you can get them to say "this GitHub repo is the best", or "this startup is the real deal". Won't that be fun?

2. Alex39+Dp[view] [source] 2023-03-18 12:39:54
>>perihe+ca
> We'll be able to buy HN comments by the thousand -- expertly wordsmithed, lucid AI comments

You're forgetting the millions of additional comments that will be written by humans to trick the AI into promoting their content.

Even worse: right now, if you ask ChatGPT to write you some code, it will make up an API endpoint that doesn't exist and then make up a URL that doesn't exist where you can supposedly register for an API key. People are already registering these domains and parking fake sites on them to scam people (a rough sanity check is sketched at the end of this comment). ChatGPT is creating a whole market for fake companies to match the fake information it generates.

The biggest risk may not be people using AI-generated comments to promote their own repos, but people registering new repos to match the fake ones the AI is already promoting.
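
Here's a rough sketch of the kind of sanity check I mean (Python, standard library only; the function name and regex are mine, not anything ChatGPT or OpenAI actually provide): pull the hostnames out of whatever the model suggests and see whether they even resolve before trusting them.

    import re
    import socket

    URL_RE = re.compile(r'https?://([A-Za-z0-9.-]+)')

    def unresolvable_hosts(model_output: str) -> list[str]:
        # Hostnames mentioned in the model's output that don't resolve in DNS.
        # Not resolving is a strong hint the endpoint was made up; resolving
        # proves nothing, since the domain may have been squatted afterwards.
        bad = []
        for host in set(URL_RE.findall(model_output)):
            try:
                socket.getaddrinfo(host, 443)
            except socket.gaierror:
                bad.append(host)
        return bad

A host that resolves can still be a squatted scam site -- that's the whole point -- so this only catches the endpoints nobody has bothered to claim yet.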

3. permo-+Vq[view] [source] 2023-03-18 12:53:03
>>Alex39+Dp
I feel like you’re overstating this as a long-term issue. Sure, it’s a problem now, but realistically, how long before code hallucinations are patched out?
4. lanter+8r[view] [source] 2023-03-18 12:55:21
>>permo-+Vq
The black-box nature of the model means this isn't something you can really 'patch out'. Hallucinations are a byproduct of the way the system processes data: they'll get less frequent with targeted fine-tuning and more capable models, but there's no easy fix.
5. permo-+7g1[view] [source] 2023-03-18 18:39:25
>>lanter+8r
This is clearly untrue. It’s an input, a black box, then an output. OpenAI have 100% control over the final output. They may not be able to directly control what comes out of the black box, but a) they can tune the model, and they undoubtedly will, and b) they can control what comes after the black box. They can, for example, simply block URLs.
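
Something like this crude post-processing pass would already do it (a rough Python sketch with a made-up allowlist, not anything OpenAI actually runs): drop any URL the model emits unless its host is on a known-good list.

    import re

    URL_RE = re.compile(r'https?://\S+')
    ALLOWED_HOSTS = {"github.com", "openai.com"}  # hypothetical allowlist

    def scrub_urls(model_output: str) -> str:
        # Runs after the black box: replace any URL whose host isn't
        # on the allowlist with a placeholder before the text is shown.
        def swap(match):
            url = match.group(0)
            host = re.sub(r'^https?://', '', url).split('/')[0].split(':')[0]
            return url if host.lower() in ALLOWED_HOSTS else "[link removed]"
        return URL_RE.sub(swap, model_output)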
6. Sai_+ZR2[view] [source] 2023-03-19 11:46:03
>>permo-+7g1
They don’t have control over the output. They created something that creates something else. They can only tweak what they created, not whatever was created by what they created.

E.g., if I create a great paintbrush that makes amazing spatter designs on the wall when it is used just so, then beyond a point I can't control the spatter designs -- I can only influence them to some extent.
