zlacker

[return to "Larry Ellison allegedly tried to have a professor fired for benchmarking Oracle"]
1. maskli+n6[view] [source] 2017-12-09 17:10:02
>>pavel_+(OP)
And remember,

> Do not fall into the trap of anthropomorphising Larry Ellison. You need to think of Larry Ellison the way you think of a lawnmower. You don't anthropomorphize your lawnmower; the lawnmower just mows the lawn. You stick your hand in there and it'll chop it off, the end. You don't think "oh, the lawnmower hates me" -- the lawnmower doesn't give a shit about you, the lawnmower can't hate you. Don't anthropomorphize the lawnmower. Don't fall into that trap about Oracle. — Bryan Cantrill (https://youtu.be/-zRN7XLCRhc?t=33m1s)

And

> I actually think that it does a disservice to not go to Nazi allegory, because if I don't use Nazi allegory when referring to Oracle, there's some critical understanding that I have left on the table […] In fact, as I have said before, I emphatically believe that if you had to explain the Nazis to someone who had never heard of World War 2 but was an Oracle customer, there's a very good chance that you would explain the Nazis in Oracle allegory. — also Bryan Cantrill (https://www.youtube.com/watch?v=79fvDDPaIoY&t=24m)

2. bigzen+27[view] [source] 2017-12-09 17:18:30
>>maskli+n6
While I agree with the lawnmower sentiment, should we really excuse unethical behavior by comparing humans to machines?
3. derefr+I7[view] [source] 2017-12-09 17:24:31
>>bigzen+27
Machines are among the few things most people are familiar with that don't hold the values that living beings (humans, other mammals, most vertebrates) usually do. (You might call Ellison a "force of nature", but that might not play well with people who attribute an "eventually-consistent omnibenevolence" to nature.)

Really, without the metaphor, what's going on is that Larry Ellison has modified himself to hold the values that a corporation holds, in order to drive that corporation more efficiently toward its corporate goals (e.g. increasing share value). Where human values and corporate values are in conflict, Ellison has chosen to forget his human values and, effectively, become the avatar of the corporation's interests. He's the "ideal CEO", in about the same way that Locutus of Borg is an ideal CEO.

A better analogy for this effect, for those who understand it, would be to compare Ellison to a paperclip maximizer (https://wiki.lesswrong.com/wiki/Paperclip_maximizer), but that isn't nearly as well-known a meme.
