zlacker

[return to "Who knew the first AI battles would be fought by artists?"]
1. meebob+kc[view] [source] 2022-12-15 13:03:10
>>dredmo+(OP)
I've been finding that the strangest part of discussions around art AI among technical people is the complete lack of identification or empathy: it seems to me that most computer programmers should be just as afraid as artists in the face of technology like this!!! I am a failed artist (read: I studied painting in school, tried to make a go of being a commercial artist in animation, and couldn't make the cut), so I decided to do something easier and became a computer programmer, working for FAANG and other large companies and making absurd (to me!!) amounts of cash. In my humble estimation, making art is vastly more difficult than the huge majority of computer programming that gets done. Art AI is terrifying if you want to make art for a living, and if AI is able to do these astonishingly difficult things, why shouldn't it, with some finagling, also be able to do the dumb, simple things most programmers do for their jobs?

The lack of empathy is incredibly depressing...

◧◩
2. gus_ma+Ue[view] [source] 2022-12-15 13:16:38
>>meebob+kc
I think the correct way to build that empathy is to use an equivalent that technical people understand, like Copilot:

* Can a Copilot-like generator be trained on the GPL-licensed code of RMS? What is the license of the output?

* Can a Copilot-like generator be trained on the leaked source code of MS Windows? What is the license of the output?

◧◩◪
3. Terret+ai[view] [source] 2022-12-15 13:32:38
>>gus_ma+Ue
Your example is like saying we should have empathy for people who can whittle now that a 3D printer can extrude the same design in bulk. Or like empathy for London cabbies who had to learn the roads now that "anyone" can get from A to B with a phone.

Code should not need to be written by humans at all. There's no reason coding as it exists today should exist as a job in the future.

Any time I or a colleague is "debugging" something, I'm just sad that we're still so "dark ages" that the IDE isn't saying "THERE, humans, the bug is THERE!" in flashing red. The IDE has the potential to have perfect information, so "where is the bug?" is a solvable problem.

The job of coding today should continue to rise up the stack tomorrow, to the point where modules, libraries, and frameworks are just things machines generate in response to a dialog about "the job to be done".

The primary problem space of software is the business domain, yet today it requires people who speak barely abstracted machine language to implement anything -- still such painfully early days.

We're cavemen chipping at rocks to make fire, still amazed at the trick. No empathy, just self-awareness sufficient to provoke us into researching fusion.

◧◩◪◨
4. Kalium+Qw[view] [source] 2022-12-15 14:40:10
>>Terret+ai
We can and should have empathy for all those people.

The question is perhaps not whether we should have empathy for them; the question is what we should do with it once we have it. I have empathy for the cabbies with the Knowledge of London, but I don't think making any policy based on or around that empathy is wise.

This is tricky in practice. A surprising number of people regard prioritizing the internal emotional experience of empathy in policymaking as the same thing as experiencing empathy.

◧◩◪◨⬒
5. Terret+8G8[view] [source] 2022-12-17 19:51:13
>>Kalium+Qw
Agree with this 100%. Feel for the outdated; it sucks to be outmoded, but artificially prolonging the agony is not the way.
[go to top]