zlacker

[parent] [thread] 3 comments
1. SeanAp+(OP)[view] [source] 2019-03-11 19:49:44
That depends on whether or not the machine has a conscious experience, and we have no way to interact with that question right now.

The reason we care about slavery is because it is bad for a conscious being, and we have decided that it is unethical to force someone to endure the experience of slavery. If there is no conscious being having experiences, then there isn't really an ethical problem here.

replies(2): >>lvoudo+x7 >>esrauc+48
2. lvoudo+x7[view] [source] 2019-03-11 20:44:43
>>SeanAp+(OP)
Isn't consciousness a manifestation of intelligence? I don't see how the two can be treated separately. Talking about AGI is talking about something that can achieve a level of intellect which can ask questions about "being", "self", "meaning" and all the rest that separate intelligence from mere calculation. Otherwise, what's the point of this whole endeavor?
replies(1): >>SeanAp+Cy2
3. esrauc+48[view] [source] 2019-03-11 20:49:19
>>SeanAp+(OP)
I think a lot of people would be opposed to specially lobotomized human slaves who didn't realize they were slaves. It really gets into hairy philosophy once we start approaching AGI.
4. SeanAp+Cy2[view] [source] [discussion] 2019-03-12 19:27:27
>>lvoudo+x7
No one knows what consciousness is. Every neuroscientist I've talked to has agreed that consciousness has to be an emergent property of some kind of computation, but there is currently no way to even interact with the question of what computation results in conscious experience.

It could be true that every complex problem solving system is conscious, and in that case maybe there are highly unintuitive conscious experiences, like being a society, or maybe it is an extremely specific type of computation that results in consciousness, and then it might be something very particular to humans.

We have no idea whatsoever.
