zlacker

[return to "Who knew the first AI battles would be fought by artists?"]
1. meebob+kc[view] [source] 2022-12-15 13:03:10
>>dredmo+(OP)
I've been finding that the strangest part of discussions around art AI among technical people is the complete lack of identification or empathy: it seems to me that most computer programmers should be just as afraid as artists in the face of technology like this!!! I am a failed artist (read: I studied painting in school and tried to make a go of being a commercial artist in animation and couldn't make the cut), and so I decided to do something easier and became a computer programmer, working for FAANG and other large companies and making absurd (to me!!) amounts of cash. In my humble estimation, making art is vastly more difficult than the huge majority of computer programming that is done. Art AI is terrifying if you want to make art for a living - and if AI is able to do these astonishingly difficult things, why shouldn't it, with some finagling, also be able to do the dumb, simple things most programmers do for their jobs?

The lack of empathy is incredibly depressing...

2. lordfr+O51[view] [source] 2022-12-15 16:54:57
>>meebob+kc
I want to apologize in advance if my response here seems callous considering your personal experience as an artist. I'm trying to talk about AI and labor in general, and don't mean to minimize your experience.

That said, I don't think AI's ability to generate art is a major milestone in the progress of things; I think it's more of the same: automating low value-add processes.

I agree that AI is/will be an incredibly disruptive technology, and that automation in general is putting more and more people out of jobs. Extrapolate forward and you end up in a world where most humans don't have any practical work to do other than breeding and consuming resources at ever-increasing rates.

As much as I'm impressed by AI art (it's gorgeous), at the end of the day it's mainly just copying/pasting/smoothing out objects it's seen before (the training set). We don't think of it as clipart, but that's essentially what it is underneath it all: just a new form of clipart. Amazing in its ability to reposition, adjust, and smooth images, with some sense of artistic placement, etc. It's light-years beyond where clipart started (small vector and bitmap libraries). But at the end of the day it's just automating the creation of images from clipart. Re-arranging images you've seen before is not going to make anyone big $$$. At the end of the day the quality of the output is entirely subjective; just about anything reasonable will do.

This reminds me a lot of GPT-3... it looks like it has substance, but not really. GPT-3 is great at making low-value clickbait articles of cut-and-paste information on your favorite band or celebrity. GPT-3 will never be able to do the job of a real journalist, pulling pieces together to identify and expose deeper truths - to, say, uncover the Theranos fraud. It's just Eliza [1] on steroids.

The AI parlor tricks started with Eliza, and have gotten quite elaborate as of late. But they're still just parlor tricks.

Comparing it to the challenges of programming, well yes I agree AI will automate portions of it, but with major caveats.

A lot of what people call "programming" today is really just plumbing. I'm a career embedded real-time firmware engineer, and it continues to astonish me that there's an entire generation of young "programmers" who don't understand basic computing principles, stacks, interrupts, I/O operations... At the end of the day their knowledge base seems to consist of knowing which tool to use where in an orchestration and how to plumb it all together. And if they don't know the answer they simply google it and Stack Overflow will tell them. Low code, no code, etc. (Python is perfect for quickly plumbing two systems together). This skill set is very limited and wouldn't even have gotten you a junior dev position when I started out. I'm not surprised it's easy to automate, as it will generally produce the same quality of code (and make the same mistakes) as a human dev who simply copies/pastes Stack Overflow solutions.

This is in stark contrast to the types of problems that most programmers used to solve in the old days (and a smaller number still do) - stuff that needed an engineering degree and complex problem-solving skills. When I started out 30 years ago, "programmers" and "software engineers" were essentially the same thing. They aren't now; there is a world of difference between your average programmer and a true software engineer today.

Not saying plumbers aren't valuable... they absolutely are, as more and more of the modern world is built by plumbing things together. Highly skilled software engineers are needed less and less, and that's a net good for humanity: no one needs to write operating systems anymore, so let's add value building on top of them. Those builders-on-top are the people making the big $$$; their skill set is quite valuable. We're in the middle of a bifurcation of software engineering careers. More and more positions will only require limited skills, and fewer and fewer (as a percentage) will remain highly skilled.

So is AI going to come in and help automate the plumbing? Heck yes, and rightly so... We've already automated call centers, warehouse logistics, clickbait article writing, carry-out order taking - the list goes on and on. I'd love to have an AI plumber I could trust to do most of the low-level work right (and in a CI/CD world you can just push out a fix if you missed something).

I don't believe for a second that today's latest and greatest "cutting edge" AI will ever be able to solve the hard problems that keep highly skilled people employed. New breakthroughs would be needed, and I'm extremely skeptical they're coming. Like the promises of fusion, general-purpose AI always seems just a decade or two away. Skilled labor is safe for now... maybe for a while yet.

The real problem, as I see it, is that AI automation is on course to eliminate most low-skilled jobs in the next century, which puts it on a collision course with the fact that most humans aren't capable of performing highly skilled work (half are below average by definition). A single parent working the GM line in the '50s was enough to afford an average family a decent life. Not so much where technology is going. At the end of the day the average human will have little to contribute to civilization, but will still expect to eat and breed.

Universal basic income has been touted as a solution to the coming crisis, but all that does is kick the can down the road. It leads to a world of too much idle time (and the devil will find work for idle hands) and ever-growing resource consumption. A perfect storm... At the end of the day, what's the point of existing when all you do is consume everything around you and don't add any value? Maybe that's someone's idea of utopia, but not mine.

This has been coming for a long time, AI art is just a small step on the current journey, not a big breakthrough but a new application in automation.

/rant

[1] https://en.wikipedia.org/wiki/ELIZA

3. unity1+pO1[view] [source] 2022-12-15 20:17:33
>>lordfr+O51
> entire generation of young "programmers" who don't understand basic computing principles, stacks, interrupts, I/O operations

Why would software engineers who work on web apps, Kubernetes, and the Internet in general need to understand interrupts? Not only will they never deal with any of that, they aren't supposed to. All of it has been abstracted away so that what we call the Internet can be possible.

All of that turned into specializations as the tech world progressed and the ecosystem grew. A software engineer specializing in hardware needs to know about interrupts but doesn't need to know how to do devops. For the software engineer who works on Internet apps, it's the opposite.

4. lordfr+s42[view] [source] 2022-12-15 21:33:35
>>unity1+pO1
I'm not dissing cloud engineering. I've learned enough to really respect the architects behind these large-scale systems.

My point was about skill level, not specialization. Specialization is great... we can build bigger and bigger things without having to engineer/understand everything beneath us. We stand on the shoulders of giants, as they say.

And I agree, there is no one specialization that's more valuable than another. It's contextual. If you have a legal problem, a specialized lawyer is more valuable than a specialized doctor. So yeah, I agree that if you have a cloud problem, you want a cloud engineer and not a firmware engineer. Although I should add that things like interrupts/events/synchronization and I/O operations are fairly universal computing concepts even in the cloud world. If you're a cloud programmer and you don't know how long an operation takes / its big-O complexity, or how much storage it uses / its persistence, you're probably going to have some explaining to do when your company gets next month's AWS bill.
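A toy sketch of what I mean (completely hypothetical, not from any real system): the same "count the duplicate ids" job done two ways. Neither needs a word about interrupts, but the difference between them is exactly the kind of thing a metered cloud bill measures.

    #include <stdlib.h>

    /* comparison helper for qsort */
    static int cmp_u32(const void *a, const void *b) {
        unsigned x = *(const unsigned *)a, y = *(const unsigned *)b;
        return (x > y) - (x < y);
    }

    /* O(n^2): for every id, rescan everything seen so far */
    size_t dups_naive(const unsigned *ids, size_t n) {
        size_t d = 0;
        for (size_t i = 0; i < n; i++)
            for (size_t j = 0; j < i; j++)
                if (ids[i] == ids[j]) { d++; break; }
        return d;
    }

    /* O(n log n): sort in place, then count adjacent repeats */
    size_t dups_sorted(unsigned *ids, size_t n) {
        size_t d = 0;
        qsort(ids, n, sizeof *ids, cmp_u32);
        for (size_t i = 1; i < n; i++)
            if (ids[i] == ids[i - 1]) d++;
        return d;
    }

Both are "correct" and both pass the unit test on a hundred ids; only one of them is still fine at ten million.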

And yes, plumbing is useful! Someone has to hook up the stuff that needs hooking up! But which task requires more skill: designing a good water-flow valve, or hooking one up? I'd argue the person designing the valve needs to be more skilled (they certainly need more schooling). The average plumber can't design a good flow valve, while the average non-plumber can fix a leaky sink.

AI is eating unskilled/low-skill work. In the '80s, production-line workers were afraid of robots. Well, here we are: no more pools of typists, automated call centers handling huge volumes of people, dark factories.

It's a terrible time to be an artist if AI can clipart-compose images of the same quality much faster than you can draw them by hand.

Back to the original comment: I'm merely suggesting that some programming jobs require a lot more skill than others. If software plumbing is easy, then it can and will be automated. If that were the only skill I possessed, I'd be worried about my job.

As with fusion, I just don't see general-purpose AI becoming a thing in my lifetime. For highly skilled programmers, it's going to be a lot longer before they're replaced.

Welcome to our digital future. It's very stressful for the human of average skill.

5. unity1+3m2[view] [source] 2022-12-15 23:15:23
>>lordfr+s42
> My point was about skill level, not specialization

I fail to see the skill level in someone working on the web knowing about interrupts, or in a firmware engineer knowing about devops, integrations, or React.

> Although I should add that things like interrupts/events/synchronization and I/O operations are fairly universal computing concepts even in the cloud world

Not really. I/O has nothing to do with the cloud, and likewise interrupts. Those remain buried way, way down in the hardware that runs the cloud, at a level that not even datacenter engineers reach.

> If you're a cloud programmer and you don't know how long an operation takes / its big-O complexity

That still has nothing to do with interrupts or hardware I/O.

6. lordfr+xB2[view] [source] 2022-12-16 01:03:34
>>unity1+3m2
> I fail to see the skill level in someone working on the web knowing about interrupts

As a firmware/app guy I'm not qualified to talk about relative skill sets between different areas of cloud development. I agree that interrupts/threads aren't important at all to the person writing a web interface; I should have found a better example. I'm not here to argue - for sure there are talented people up and down the stack.

What I can tell you is that I'm amazed at the mistakes I see this new generation of junior programmers making, the kind of stuff indicating they have little understanding of how computers actually work.

As an example, I continue to run into young devs that don't have any idea of what numeric over/underflow is. We do a lot of IoT and edge computing, so the ranges/limits/sizes of the data being passed around matter a lot. Attempting to explain the concept reveals that a great many of them have no mental model of how a computer even holds a number (let alone different variable sizes, types, signed/unsigned, etc.). When you explain that variables have a fixed size and don't have unlimited range, it's a revelation to many of them.
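A made-up toy example (hypothetical sensor-ish code, not from our codebase) of the kind of thing that trips them up:

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint8_t reading = 250;       /* one unsigned byte: it can only hold 0..255 */
        reading += 10;               /* 260 doesn't fit, silently wraps to 4 */

        uint8_t queued_msgs = 0;
        queued_msgs -= 1;            /* underflow: wraps around to 255 */

        printf("%d %d\n", reading, queued_msgs);   /* prints "4 255", no error, no warning */
        return 0;
    }

Nothing crashes and nothing throws; the wrong numbers just flow quietly downstream.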

Sometimes they'll argue that this stuff doesn't matter, even as you're showing them the error in their code. They feel the problem is that the other devs built it wrong, chose the wrong language or tool for the problem at hand, etc. We had a dev (he wrote test scripts) who would argue with his boss that everyone (including the app and firmware teams) should ditch their languages and write everything in Python, where mistakes can't be made. He was dead serious, and ended up quitting out of frustration. I'm sure that was a personality problem, but still, the lack of basic understanding astounded us, and the phrase "knows enough to be dangerous" comes to mind.

I find it strange that there is a new type of programmer who knows very little about how computers actually work. I find it stranger that they are even a bit productive in their careers, although I suspect it's because the problem domains they work in are much more tolerant to these kinds of errors. The CI/CD system is set up to catch/fix their problems, and hence those positions can tolerate what used to be considered a below-average programmer. Efficient? No. Good enough? Sure.

I suspect some of these positions can be automated before the others can.

7. unity1+JY2[view] [source] 2022-12-16 03:23:34
>>lordfr+xB2
> What I can tell you is that I'm amazed at the mistakes I see this new generation of junior programmers making, the kind of stuff indicating they have little understanding of how computers actually work.

> As an example, I continue to run into young devs that don't have any idea of what numeric over/underflow is

That doesn't happen in web application development either. You don't write code at a low enough level to cause an overflow or underflow. There are a zillion layers between your code and anything that could overflow.

> they have little understanding of how computers actually work.

'The computer' has been abstracted away at the level of the Internet. Not even the experts who tend the datacenters would ever come near anything related to a numeric overflow. That stuff is hidden deep inside the hardware, or deep in the software stack near the OS level of any given system. If anything causes an overflow in such a machine, they replace the machine rather than debug it. It's the hardware manufacturers' and OS developers' responsibility to deal with that. No company that does cloud or develops apps on the Internet needs to know about interrupts, numeric overflows, and whatnot.

> I find it stranger that they are even a bit productive in their careers, although I suspect it's because the problem domains they work in are much more tolerant to these kinds of errors

Interrupt errors don't happen in web development. You have no idea of the level of abstraction that has been built between the layers where they could happen and modern Internet apps. We are even abstracting away servers and databases at this point.

You are applying a hardware perspective to the Internet. That's not applicable.
