I gave it a shitty harness and it almost one-shotted laying out outlets in a room based on a shitty PDF. I think if I gave it better control it could do a huge portion of my coworkers' jobs very soon.
"Ok, I guess it could wipe out the economic demand for digital art, but it could never do all the autonomous tasks of a project manager"
"Ok, I guess it could automate most of that away but there will always be a need for a human engineer to steer it and deal with the nuances of code"
"Ok, well it could never automate blue collar work, how is it gonna wrench a pipe it doesn't have hands"
The goalposts will continue to move until we have no idea if the comments are real anymore.
Remember when the Turing test was a thing? No one seems to remember that it was still taken seriously in 2020.
> "the economic demand for digital art"
You twisted one "goalpost" into a tangential thing in your first "example", and it still wasn't true, so idk what you're going for. "Using a wrench vs. a preliminary layout draft" is even worse.
If one attempted to make a productive observation about the past few years of AI Discourse, it might be that "AI" capabilities are shaped in a very odd way that does not cleanly overlap with or occupy the conceptual spaces we normally think of as demonstrations of "human intelligence". It's like taking a 2-dimensional cross-section of the overlap of two twisty pool tubes and trying to prove a Point with it. Yet people continue to do so, because such myopic snapshots are a goldmine of contradictory Venn diagrams, and if Discourse in general for the past decade has proven anything, it's that nuance is for losers.