zlacker

[parent] [thread] 28 comments
1. cousin+(OP)[view] [source] 2018-02-15 09:22:00
This is a "culture war" topic, let's prepare for the shitshow in the comments.

I'd like to offer some advice to make things go a bit more smoothly. There's a widespread view that all beliefs are political, you can't be apolitical, and anyone arguing for a belief opposing yours must be an enemy. To me, that view is pretty much a type error. Beliefs are value-neutral. Only arguments for or against beliefs can be political or not.

More specifically, some arguments are rational (based on evidence) while other arguments are political (based on who benefits and who loses). You can be a very civil person but still reach for political arguments when defending your beliefs, and thus cause net harm. Or you can be a rude person who is nonetheless drawn to arguing based on evidence, and thus cause net benefit. It's up to you.

Now go forth and make a flamewar :-)

replies(6): >>duncan+8 >>jabot+R >>hnhg+T1 >>mehele+V1 >>tome+L3 >>tremon+O7
2. duncan+8[view] [source] 2018-02-15 09:26:08
>>cousin+(OP)
I'll be damned. Opened the comments expecting a shitshow and saw yours first. Thank you for brightening my day.
3. jabot+R[view] [source] 2018-02-15 09:38:03
>>cousin+(OP)
As a counterpoint: people can hold beliefs because those beliefs have certain consequences. Simple example: "all people are created equal". This is a belief that is (probably) frequently held because of its assumed beneficial consequences for society.

So if people choose to believe in something because it has certain consequences, then a belief can be political.

EDIT: To expand on this a little... It seems to me that you divorce a belief itself from its consequences. As there are a lot of beliefs that have immediate and direct political and social consequences, I think that this separation is questionable.

If you have a belief, you probably will act on that belief. Having a belief and _not_ acting on it _at all_ seems rather useless and abstract to me. I'm not saying that this doesn't happen, but in general, if some person has a belief, he (or she) will act on that belief.

So, to be blunt: for some beliefs, having them is a political act.

replies(4): >>akvadr+31 >>Scea91+W2 >>barrke+r7 >>cousin+vc
◧◩
4. akvadr+31[view] [source] [discussion] 2018-02-15 09:42:39
>>jabot+R
> People can hold beliefs because they have certain consequences.

Nearly all beliefs are of this type. I believe the Earth orbits the Sun because it helps me predict the seasons.

replies(1): >>jabot+A1
◧◩◪
5. jabot+A1[view] [source] [discussion] 2018-02-15 09:52:55
>>akvadr+31
In a narrow sense you are probably right.

In a more general sense, consider the "laws of physics". People believe in them not because of their consequences, but because they explain reality (and quite well). So people believe in gravity because it has worked for them in the past, and because it has been verified.

Also... _not_ believing in gravity will not make much of a difference for your life (as long as you don't start jumping off cliffs of course).

EDIT: I forgot about religion. AFAIK, most people believe in religion because they have been taught/indoctrinated/raised to believe in it, and _not_ because they have (deeply) thought about the social consequences of their particular religion.

replies(2): >>akvadr+b5 >>pbhjpb+j7
6. hnhg+T1[view] [source] 2018-02-15 09:59:37
>>cousin+(OP)
Many arguments are both partially rational and partially political, but lay claim to being only rational.

You might be implying that, but it's worth stating as a particular case.

I'd say this article probably fits that category, which is why the flamewar will emerge :)

7. mehele+V1[view] [source] 2018-02-15 09:59:47
>>cousin+(OP)
It's the is-ought problem, really. We can take a position on what is, but very often pop off to talk about what ought to be. Further, because people have differing levels of knowledge, understanding, and acceptance of evidence of varying accuracy, the range of rational positions on what is can be quite wide. Politics, to me, is not the opposite of rationality but where we take our "is" and extrapolate. As long as this extrapolation is reasonable given the evidence presented, it is still rational, since it's relative to that individual's understanding. So I'm not sure that separating arguments into rational and political is that helpful, except for dismissing others' points of view.

For example, there is plentiful evidence that women are less represented in certain STEM fields. But is that because they are ill-suited to those fields, or because the fields themselves are ill-suited to women? Do we accept this lack of representation as an inevitable consequence (it is this way, and it ought to be so)? Or do we broaden the possibilities and consider what these fields might be like, and what they might accomplish, if they were more female-friendly?

I'd take a wild stab in the dark that prior political worldview is probably a better predictor than most of which of those options seems the more appealing.

◧◩
8. Scea91+W2[view] [source] [discussion] 2018-02-15 10:16:53
>>jabot+R
Just note that believing something because it has nice consequences is logically invalid.

Formally:

1) A => B

2) I would really like B to be true, because it has some benefits.

3) Therefore A must be true.

If we are in search of truth and want to build a consistent model of the world then we just can't accept this kind of reasoning.

Also, even if A really turns out to be false, it doesn't mean that B can't be true and our world will forever be sad. I.e., even if it turns out that all people are not created equal, we can still live in a just and enlightened society which treats everyone fairly.

And of course the implication A => B may not even hold. Is it really true that if all people were created equal, society would benefit?
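
To make the invalidity concrete, here is a minimal sketch (my own, with hypothetical propositions, nothing from the article): even the stronger premise "B is actually true", rather than merely desirable, does not let you conclude A.

    # Hypothetical sketch: brute-force the truth table for "A => B, and B; therefore A".
    from itertools import product

    def implies(p, q):
        return (not p) or q

    counterexamples = [(a, b)
                       for a, b in product([False, True], repeat=2)
                       if implies(a, b) and b and not a]
    print(counterexamples)  # [(False, True)] -- premises hold, yet conclusion A fails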

replies(2): >>jabot+S3 >>SuoDua+ia
9. tome+L3[view] [source] 2018-02-15 10:29:19
>>cousin+(OP)
> Beliefs are value-neutral. Only arguments for or against beliefs can be political or not.

I wonder why you didn't go a step further and postulate "only actions can be political or not". Of course an argument is an action, but it could be claimed that only an argument qua action is political.

replies(2): >>jabot+44 >>cousin+Jb
◧◩◪
10. jabot+S3[view] [source] [discussion] 2018-02-15 10:31:04
>>Scea91+W2
First, please do not get too hung up on "all people are created equal". It's just an example in this context.

> If we are in search of truth and want to build a consistent model of the world then we just can't accept this kind of reasoning.

Yes, you can. It's called generalizing. Remember, we're not doing math here; this is ethics.

Think about all the different situations where being treated equally to someone else is a good thing. Now you generalize that into "everyone is created equal", and that turns into a justification for all these situations.

If you want to criticise the generalization into A, there is a way:

Construct some C so that

1) A => C

2) C is not beneficial

Alternatively, you can attack the implication "A => B" or you can question whether or not B is beneficial.

replies(2): >>Scea91+J7 >>badosu+yf
◧◩
11. jabot+44[view] [source] [discussion] 2018-02-15 10:32:53
>>tome+L3
Then again, a lot of beliefs have direct, strong, and immediate consequences for your actions. So if your action is political, and you took that action because of your belief, is your belief still apolitical?
replies(1): >>tome+y4
◧◩◪
12. tome+y4[view] [source] [discussion] 2018-02-15 10:41:33
>>jabot+44
There are a number of different lines of enquiry one might take on this subject (and I'm sure they have been taken by researchers in philosophy or politics). For example, is being able to restrain oneself from acting on one's deeply held beliefs necessary to function in civilised society? Isn't it even possible to hold mutually contradictory beliefs, thus making acting on them impossible under certain circumstances?
◧◩◪◨
13. akvadr+b5[view] [source] [discussion] 2018-02-15 10:52:13
>>jabot+A1
About religion: it may not be about societal consequences, but personal ones: acceptance by your peers, the afterlife, ...
◧◩◪◨
14. pbhjpb+j7[view] [source] [discussion] 2018-02-15 11:31:10
>>jabot+A1
The laws of physics don't explain reality; they model it to our best approximation.

>and because it has been verified. //

That may be true for you, but it is essentially an appeal to authority. Most scientists claim to be falsificationists, and the currently agreed scientific method is one of Popperian falsification.

Your comment on cliffs betrays an ignorance of epistemology. People didn't always have a theory of gravitation, and as far as we know no animals have one; that doesn't mean they jump off cliffs. One's beliefs about reality don't fundamentally change reality.

replies(1): >>jabot+M8
◧◩
15. barrke+r7[view] [source] [discussion] 2018-02-15 11:32:22
>>jabot+R
When beliefs run counter to reality, the divergence between what is, and what is perceived ought to be, can increase to the point that the state's monopoly on violence is used to prop up a belief system.

If the gap is big enough, I think the result becomes indistinguishable from a theocracy, with analogues to blasphemy and ostracization for describing reality too closely, enforced by the state.

◧◩◪◨
16. Scea91+J7[view] [source] [discussion] 2018-02-15 11:37:20
>>jabot+S3
> Think about all the different situations where being treated equally to someone else is a good thing. Now you generalize that into "everyone is created equal", and that turns into a justification for all these situations.

The valid generalization would be "everyone should be treated equally", not "everyone is created equal", which is a totally different thing.

replies(1): >>jabot+08
17. tremon+O7[view] [source] 2018-02-15 11:39:52
>>cousin+(OP)
> some arguments are rational while other arguments are political

These attributes are mostly orthogonal though. Most evidence, even if conclusive, will show that certain approaches benefit one group more than some others. There is no such thing as purely evidence-based policy, because there is no agreement on what to optimize for.

I think this whole evidence-based fad completely misses the point. It insinuates that there is a perfect possible outcome that will benefit all equally, and that is simply not the case. And by focusing on the outcomes, it distracts from the real sticking point: let's talk about how to harmonize goals, instead of only results.

◧◩◪◨⬒
18. jabot+08[view] [source] [discussion] 2018-02-15 11:42:18
>>Scea91+J7
Good point.

EDIT: I think what you're doing here is criticising A by constructing A' and arguing that it is better (by having the same consequences while being less general). That is a good way of criticising the generalization as well.

◧◩◪◨⬒
19. jabot+M8[view] [source] [discussion] 2018-02-15 11:54:21
>>pbhjpb+j7
That was not the point I was trying to make, sorry.

GP said:

>> People can hold beliefs because they have certain consequences.

> Nearly all beliefs are of this type. I believe the Earth orbits the Sun because it's helps me predict the seasons.

I think there are a lot of beliefs that are held not because they have beneficial consequences, but for other reasons.

Religion as a belief is held by most people because they have been brought up with it (converts notwithstanding).

Scientific theories are believed (i.e. held true, used to explain reality) because they predict things that actually happen and have not been falsified.

Neither of these is believed primarily because the belief itself has consequences, IMHO.

◧◩◪
20. SuoDua+ia[view] [source] [discussion] 2018-02-15 12:14:12
>>Scea91+W2
I would disagree with the generalization. Consider the following set of propositions:

1) The placebo effect predicts that believing I'll get over a bout of the flu quickly increases my chances of getting over the flu quickly

2) I believe in the placebo effect and would like to get over the flu quickly

3) Therefore, I choose to believe I will get over the flu quickly.

There is no reason besides practicality to believe proposition 3), but if propositions 1) and 2) are accepted, proposition 3) follows naturally. To my own way of thinking, some beliefs are self-justifying, but I'd be interested to hear how your worldview deals with this example.
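
For what it's worth, here's how I'd cash that out numerically (a sketch with made-up probabilities, purely illustrative): given premises 1) and 2), adopting the belief is simply the option with the higher expected payoff.

    # Hypothetical numbers: believing boosts the chance of a quick recovery.
    p_quick_if_believe = 0.6   # assumed placebo-boosted probability
    p_quick_if_not = 0.4       # assumed baseline probability
    utility_quick, utility_slow = 10, 0

    def expected_utility(p_quick):
        return p_quick * utility_quick + (1 - p_quick) * utility_slow

    print(expected_utility(p_quick_if_believe))  # 6.0 -- so choose to believe
    print(expected_utility(p_quick_if_not))      # 4.0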

edit: spacing

replies(1): >>Scea91+Dh
◧◩
21. cousin+Jb[view] [source] [discussion] 2018-02-15 12:31:44
>>tome+L3
Because actions are downstream from beliefs, while arguments are upstream of beliefs. I prefer to fix bugs upstream.
replies(1): >>tome+bk
◧◩
22. cousin+vc[view] [source] [discussion] 2018-02-15 12:42:30
>>jabot+R
Yes, beliefs are chosen for consequences. And the optimal way to choose a belief based on consequences is to choose the truth.

For example, my decision whether to take an umbrella today must be based on my honest best guess whether it'll rain today (and the relative utilities of various outcomes). If I shift my best guess one inch away from what's warranted by evidence, to obey social pressure or something, then acting-as-if the new belief was true will predictably lead to lower expected utility for me. That holds always, no matter how controversial the belief.

When social pressure is weak, it mostly makes people lie about their beliefs, while still acting-as-if their best guess was true. When social pressure gets strong, people start acting-as-if false things were true, and get lower utility. No amount of pressure can change the fact that actions based on accurate beliefs lead to higher expected utility. That's why I'm not a fan of social pressure on beliefs. For both individuals and societies, the best consequences are achieved by believing what is true.
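
To illustrate with made-up numbers (a sketch, not a claim about any particular situation): deciding with a probability estimate shifted away from the evidence picks the worse action, as judged against the probability the evidence actually warrants.

    # Hypothetical payoffs; assume the evidence warrants P(rain) = 0.6.
    P_RAIN = 0.6
    UTILITY = {                 # (took umbrella, it rained) -> payoff
        (True, True): -1,       # carried it, stayed dry
        (True, False): -1,      # carried it for nothing
        (False, True): -10,     # got soaked
        (False, False): 0,      # best case
    }

    def expected_utility(take, p_rain):
        return p_rain * UTILITY[(take, True)] + (1 - p_rain) * UTILITY[(take, False)]

    def decide(believed_p_rain):
        # act on whichever option looks best under the believed probability
        return max([True, False], key=lambda t: expected_utility(t, believed_p_rain))

    for believed in (0.6, 0.05):    # honest estimate vs. socially shifted one
        choice = decide(believed)
        print(believed, choice, expected_utility(choice, P_RAIN))
    # 0.6  True  -1.0   (honest belief: take the umbrella)
    # 0.05 False -6.0   (shifted belief: skip it, and lose expected utility)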

replies(2): >>Firade+1f >>jabot+cg
◧◩◪
23. Firade+1f[view] [source] [discussion] 2018-02-15 13:14:58
>>cousin+vc
Social pressure itself is a fact. In your umbrella example, you're restricting your definition of utility to the immediate benefit of dealing with rain, weighted against the cost of having to carry the umbrella. But your choice can have a social effect, too. In particular, you need to take into account that other people will not always act on exactly what is warranted by evidence.
◧◩◪◨
24. badosu+yf[view] [source] [discussion] 2018-02-15 13:20:55
>>jabot+S3
>> If we are in search of truth and want to build a consistent model of the world then we just can't accept this kind of reasoning.

> Yes, you can. It's called generalizing. Remember, we're not doing math here, this is ethics.

To strive for a consistent set of beliefs grounded in reality is, as far as I'm concerned, more akin to most branches of philosophy than to mathematics, and it seems particularly relevant to ethics (e.g. you could arrive at a set of beliefs about what counts as 'good' or 'evil' while ignoring all perception of reality, but would the result be desirable?).

◧◩◪
25. jabot+cg[view] [source] [discussion] 2018-02-15 13:25:46
>>cousin+vc
You are right in your juxtaposition of belief and truth.

Two minor points:

1) What is truth? How do you know whether something is true? By perceiving it? Is your perception not influenced by belief?

I'm not saying that we cannot know truth, ever. But we have to keep in mind we might be wrong, too...

2) Believing in something might _make_ it become true. Or, to put it differently: Belief may lead to actions which change reality - essentially a self-fulfilling prophecy.

replies(1): >>cousin+qg
◧◩◪◨
26. cousin+qg[view] [source] [discussion] 2018-02-15 13:27:39
>>jabot+cg
Yeah, agreed on both points.
◧◩◪◨
27. Scea91+Dh[view] [source] [discussion] 2018-02-15 13:41:49
>>SuoDua+ia
Quite an interesting counterexample. The trick is in the self-reference: the fact that you believe the premise actually affects the conclusion, which most of the time is not the case.

I might consider this an edge case and include it in my worldview as an exception ;)

◧◩◪
28. tome+bk[view] [source] [discussion] 2018-02-15 14:06:18
>>cousin+Jb
> arguments are upstream of beliefs

If only!

replies(1): >>cousin+yk
◧◩◪◨
29. cousin+yk[view] [source] [discussion] 2018-02-15 14:09:27
>>tome+bk
Aren't they? All beliefs have reasons. It's just that some of these reasons don't hold water and I'm trying to point out which.
[go to top]