zlacker

[return to "Who knew the first AI battles would be fought by artists?"]
1. meebob+kc[view] [source] 2022-12-15 13:03:10
>>dredmo+(OP)
I've been finding that the strangest part of discussions around art AI among technical people is the complete lack of identification or empathy: it seems to me that most computer programmers should be just as afraid as artists in the face of technology like this!!! I am a failed artist (read: I studied painting in school, tried to make a go of being a commercial artist in animation, and couldn't make the cut), so I decided to do something easier and became a computer programmer, working for FAANG and other large companies and making absurd (to me!!) amounts of cash. In my humble estimation, making art is vastly more difficult than the huge majority of computer programming that is done. Art AI is terrifying if you want to make art for a living, and if AI is able to do these astonishingly difficult things, why shouldn't it, with some finagling, also be able to do the dumb, simple things most programmers do for their jobs?

The lack of empathy is incredibly depressing...

◧◩
2. akisel+D71[view] [source] 2022-12-15 17:01:15
>>meebob+kc
It’s not about empathy but about the fundamental nature of the job.

Developers will be fine because software engineering is an arms race - a rather unique position to be in as a professional. I saw this play out during the 2000s offshoring scare, when many of us thought we'd get outsourced to India. Instead, the industry exploded in size globally, and everything that made engineers more productive also made them a bigger threat to competitors, forcing everyone to hire or die.

Businesses only need so much copy or graphic design, but the second a competitor gains an advantage via software they have to respond in kind - even if it's a marginal advantage - because software costs so little to scale out. As the tech debt and the revenue that depends on it grow, the baseline number of staff required for maintenance and upkeep grows, because our job is to manage the complexity.

I think software is going to continue eating the world at an accelerated pace because AI opens up the uncanny valley: software that is too difficult to implement with human developers writing heuristics, but not so difficult that it requires artificial general intelligence. Unlike with artists, improvements in AI don’t threaten us; instead they open up entire classes of problems for us to tackle.

◧◩◪
3. oldstr+Ua1[view] [source] 2022-12-15 17:15:58
>>akisel+D71
Technically I'd imagine AI threatens developers (https://singularityhub.com/2022/12/13/deepminds-alphacode-co...) a lot more than it threatens artists, because there's a tangible (or 'objectively correct') problem being solved by the AI, whereas art is an entirely subjective endeavor, and ultimately the success of what is being made is left up to how someone is feeling. I also imagine humans will begin to look at AI-generated art very cynically. Maybe we all collectively agree we hate AI art, and it becomes as cliché as terrible stock photography. Or we just choose not to appreciate anything that doesn't come with a 'Made By Humans' authentication... Pretty simple solution for the artists.

Obviously a lot of money will be lost for artists in a variety of commercial fields, but the ultimate "success of art" will be unapproachable by AI given its subjective nature.

Developers, though, will be struggling to compete from both a speed and a technical point of view, and those hurdles can't simply be overcome by a shift in how someone feels. And you're right about the arms race; it just won't be happening among humans. It'll be computing power, AIs, and the people capable of programming those AIs.

◧◩◪◨
4. akisel+4h1[view] [source] 2022-12-15 17:43:43
>>oldstr+Ua1
If there’s a “tangible problem”, people solve it with a SaaS subscription. That’s not new.

We developers are hired because our coworkers can’t express what they really want. No one pays six figures to solve glorified Advent of Code prompts. The real prompts are much more complex, ever-changing as more information comes in, and locked in someone’s head, to be coaxed out by another human and iterated on together. Our coworkers are no more going to become prompt engineers than they were going to become backend engineers.

I say this as someone who used TabNine for over a year before CoPilot came out and now uses ChatGPT for architectural explorations and code scaffolding/testing. I’m bullish on AI, but I just don’t see the threat.

◧◩◪◨⬒
5. oldstr+9l1[view] [source] 2022-12-15 18:03:54
>>akisel+4h1
I'm just arguing that it's a lot easier for AI to replace something that has objectively or technically correct solutions vs something as subjective as art (where we can just decide we don't like it on a whim).
◧◩◪◨⬒⬓
6. akisel+To1[view] [source] 2022-12-15 18:21:34
>>oldstr+9l1
I’m arguing that there are no objectively or technically correct solutions to the work engineers are hired to do. You don’t “solve” a startup CEO or corp VP who changes their mind about the direction of the business every week. Ditto for consumers and whatever the latest fad they’re chasing is. They are agents of chaos, and we are the ones stuck trying to wrangle technology to do their bidding. As long as they are human, we’ll need the general intelligence of humans (or equivalent) to figure out what to code or prompt or install.
◧◩◪◨⬒⬓⬔
7. oldstr+RJ1[view] [source] 2022-12-15 19:57:37
>>akisel+To1
In the sense that someone asks "I need a program that takes x and does y" and the AI solves that problem satisfactorily, it's an objectively correct solution. There will be nuance to the problem and to how it's solved, but the end result is always an objectively correct answer: it either works, or it doesn't.