zlacker

[parent] [thread] 7 comments
1. criley+(OP)[view] [source] 2023-11-18 14:07:28
All AI and all humanity hallucinates, and AI that doesn't hallucinate will functionally obsolete human intelligence. Be careful what you wish for, as humans are biologically incapable of not "hallucinating".
replies(5): >>hgomer+96 >>dudein+Za >>howrar+2b >>killer+Af >>roguec+yJp
2. hgomer+96[view] [source] 2023-11-18 14:40:01
>>criley+(OP)
Not to assume we're on that trajectory, but humans no longer needing to focus on being productive is how we might finally be able to focus on being better humans.
3. dudein+Za[view] [source] 2023-11-18 15:09:53
>>criley+(OP)
Some humans hallucinate more than others
4. howrar+2b[view] [source] 2023-11-18 15:10:07
>>criley+(OP)
Well, that's the goal isn't it? Having AI take over everything that needs doing so that we can focus on doing things we want to do instead.
5. killer+Af[view] [source] 2023-11-18 15:39:39
>>criley+(OP)
GPT is better than an average human at coding. GPT is worse than an average human at recognizing bounds of its knowledge (i.e. it doesn't know that it doesn't know).

Is it fundamental? I don't think so. GPT was trained largely on random internet crap. One of the popular datasets is literally called The Pile.

If you just use The Pile as a training dataset, AI will learn very little reasoning, but it will learn to make some plausible shit up, because that's the training objective. Literally. It's trained to guess the Pile.
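To make "it's trained to guess the Pile" concrete, here's a toy bigram sketch (a hypothetical illustration, not GPT's actual training code): the model is scored on how well it predicts whatever token actually came next in the corpus, so it optimizes for plausibility, not truth.

```python
# Toy next-token model: fit bigram probabilities P(next | prev) by counting.
# Assumed example corpus; the false "cheese" claim is imitated just as
# readily as the true "rock" one, because matching the data IS the objective.
from collections import Counter, defaultdict

corpus = "the moon is made of cheese . the moon is made of rock .".split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def prob(prev, nxt):
    """Probability the 'model' assigns to nxt following prev."""
    total = sum(counts[prev].values())
    return counts[prev][nxt] / total if total else 0.0

print(prob("of", "cheese"))  # 0.5 -- reproduces the made-up claim
print(prob("of", "rock"))    # 0.5 -- exactly as confidently as the true one
```

A real LLM minimizes cross-entropy over billions of such next-token guesses, but the incentive is the same: sound like the training data.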

Is that the only way to train an AI? No. E.g. check "Textbooks Are All You Need" paper: https://arxiv.org/abs/2306.11644 A small model trained on high-quality dataset can beat much bigger models at code generation.

So why are you so eager to use a low-quality AI trained on crap? Can't you wait a few years until they develop better products?

replies(2): >>fennec+5b7 >>roguec+DJp
6. fennec+5b7[view] [source] [discussion] 2023-11-20 12:59:15
>>killer+Af
Because people here are into tech? That's pretty much the whole point of this site?

Just imagine if we all only used proven products and never tried out cool experimental or incomplete stuff.

7. roguec+yJp[view] [source] 2023-11-26 02:25:40
>>criley+(OP)
humanity is capable of taking feedback, citing its sources, and not outright lying

these models are built to sound like they know what they are talking about, whether they do or not. this violates our basic social coordination mechanisms in ways that usually only delusional or psychopathic people do, making the models worse than useless

8. roguec+DJp[view] [source] [discussion] 2023-11-26 02:27:26
>>killer+Af
Being better than the average human at coding is as easy as being better than the average human at surgery. Until it's better than actual skilled programmers, the people who are programming for a living are still responsible for learning to do the job well.