godels (OP) · 2023-05-22 22:37:39
> But this is exactly why average is boring.

Honestly, I think it is the lens. Personally I find it absolutely amazing. It's this incredibly complex thing that we've been trying to describe for thousands of years and have largely failed to (we've gotten better, of course). It's this thing right in front of us that looks simple, but only because not many try to peek behind the curtain. Looking behind it is like trying to describe a Lovecraftian monster, yet it's all in plain sight. That's pretty crazy imo. But hey, dig down the rabbit hole of any subject and you'll find this complex world. Most things are like a collage: from far away the shape looks clear and precise, but on close inspection you find that each tile is itself another beautiful piece. This is true even for seemingly simple things, and honestly I think that's even more beautiful. This complex and chaotic world is all around us, but we take it for granted. Finding it boring comes down to a choice.

> If you ask the average person what it's like to be in the US Navy, ChatGPT could plausibly give a better response.

There's also a bias. Does the human know the instructions are a creative exercise? It's hard to measure, because what you'd need to prompt a human with is "Supposing you were a conman trying to convince me you were in the Navy, how would you describe what it was like?" The average human response is going to default to not lying and fabricating things. You also need to remember that your interpretation (assuming you aren't/weren't in the Navy) is that of someone hearing a story, not of someone aligning that story with lived experience. You'd need to compare the average human making up a story to GPT, not an average human's response.

> It's barely even capable of it, but has access to training data that the average person lacks.

I do agree that GPT is great as a pseudo, noisy library, and I find it a wonderful and very useful tool. I often forget the specific words used to describe certain concepts, which is hard to google. GPT finds them pretty well, or returns something close enough that a quick iterative prompt gets me to the desired term. It's much faster than when I used to do this by googling. But yeah, I think we both agree that GPT is by no means sentient and likely not intelligent (a term that's ill-defined and means different things to different people). Still, we can each find many different things interesting. My main point is that I wanted to explain why I find intelligence so fascinating. Hell, it's a major part of why I got into ML research in the first place (Asimov probably helped a lot too).
