zlacker

[return to "Ask HN: Is anyone else getting AI fatigue?"]
1. Frustr+Sf 2023-02-09 12:53:17
>>grader+(OP)
Engineers are always building things that are incredible, then dismissing them as ordinary once the problem is solved: “oh, that’s so normal, it was just a little math, a little tweak, no big deal”.

AI has gone through a lot of stages of “only a human can do X” -> “AI does X” -> “oh, that’s just some engineering, that’s not really human”, or “X is no longer in the category of mystical things we can’t explain that a human can do”.

LLMs are just the latest iteration: “wow, it can do this amazing human-only thing X (write a paper indistinguishable from a human’s)” -> “doh, it’s just some engineering (it’s just a fancy autocomplete)”.

Just because AI is a bunch of linear algebra and statistics does not mean the brain isn’t doing something similar. You don’t like the terminology, but how is reinforcement “learning” not exactly the same as reading books to a toddler, pointing at a picture, and having them repeat what it is?

Start digging into the human with the same engineering view, and suddenly the human also just becomes a bunch of parts. Where is the human in the human once all the human parts are explained the way an engineer would explain them? What would be left? The human is computation too, unless you believe in souls or other-worldly mysticism. So why not think that AI, as computation, can eventually equal a human?

That GitHub Copilot can write bad code isn’t a knock on AI; it’s real, and a lot of humans write bad code too.

2. haswel+Sk 2023-02-09 13:22:07
>>Frustr+Sf
I generally agree that we quickly adjust to new tech and forget how impactful it is.

But I can’t fully get on board with this:

> but how is reinforcement “learning” not exactly the same as reading books to a toddler and pointing at a picture and having them repeat what it is? Start digging into the human with the same engineering view, and suddenly it also just becomes a bunch of parts. Where is the human in the human once all the human parts are explained like an engineer would?

The parent teaching a toddler bears some vague resemblance to machine learning, but the underlying results of that learning (and the process of learning itself) could not be any more different.

More problematically, while you may be correct that we will eventually be able to explain human biology with the precision of an engineer, these recent AI advances have not made meaningful progress toward that goal, and such an achievement is arguably many decades away.

It seems you are concluding that because we might eventually explain human biology, we can draw conclusions now about AI as if such an explanation had already happened.

This seems deeply problematic.

AI is “real” in the sense that we are making good progress on advancing the capabilities of AI software. This does not imply we’ve meaningfully closed the gap with human intelligence.

3. Frustr+pu 2023-02-09 14:11:38
>>haswel+Sk
I think the point is that we have been “meaningfully closing” the gap, and rapidly. At this point it is only a matter of time; the end can be seen, even if it isn’t yet completely written out in equations.

It does seem like on HN the audience is heavily weighted towards software developers who are not biologists, and who often cannot see the forest for the trees. They know enough about AI programming to dismiss the hype, but not enough about biology to see that this is pretty amazing.

The mystery of the human ‘parts’ is being chipped away just as quickly as we have had breakthroughs in AI. These fields are starting to converge and inform each other. I’m saying this is happening fast enough that the endgame is in sight: humans are just made of parts, an engineering problem that will be solved.

Free will and consciousness are overrated. We think of ourselves as having some mystically exceptional consciousness, and that clouds the credit we give advancements in AI. ‘AI will never be able to equal a human’, when humans just want lunch and our ‘free will’ depends on how much sleep we got. DNA is a program; it builds the brain, which is just responding to inputs. Read some Robert Sapolsky: human reactions are just hormones and chemicals responding to inputs. We will eventually have an AI that mimics a human, because humans aren’t that special. Even before the function of every single molecule in the body, or every equation in AI, is fully mapped out, we know enough to stop claiming ‘specialness’.
