Have you ever argued with someone about a political issue and lost your temper because they just didn’t seem to hear your arguments? Why don’t they hear them? Your arguments seem perfectly sound to you, but not to them. Could more arguments convince them? You think they could, so you try harder next time. The book I read holds the answer, and “it’s not about you” is the wrong one. It is about you, but also about them. The book is “The Bias That Divides Us: The Science and Politics of Myside Thinking” by Keith E. Stanovich.
As a scientist, I learned about cognitive biases early on. A bias is a trick of our cognition: a systematic distortion that occurs when information reaches the brain. For example, confirmation bias is the inclination to evaluate and interpret information in a way that favours the focal hypothesis. Confirmation bias is relatively benign, as long as information that contradicts the focal hypothesis still gets processed.
The author introduces two other biases. The first is belief bias: when the data presented to a reasoner contradict their beliefs about the world, the reasoner has difficulty reasoning about them. Belief bias concerns testable beliefs. A typical example is a logical syllogism:
1. All flowers need water.
2. Roses need water.
3. Therefore, roses are flowers.
The argument only appears valid; it is not. Many other things need water, which doesn’t mean they are all flowers. Consider this:
1. All flowers need water.
2. Humans need water.
3. Therefore, humans are flowers.
This time, it is very clear that the argument is invalid. This is belief bias at work: in the first case, people already believe the conclusion that roses are flowers, and that belief makes it hard for them to evaluate the reasoning.
The second bias the author introduces is myside bias. In contrast to belief bias, it concerns not testable beliefs but distal ones: our attitudes, dispositions, and values. Myside bias makes people perceive the same physical object or situation differently depending on their values.
A word of warning: the example is politically charged [1]. Participants were first asked whether they were for or against the Muslim ban (a proposal to bar people from Muslim countries from entering the US; the participants were American), and were then given the following statistics:
- p(terrorist|muslim): the probability that an immigrant from a Muslim country is a terrorist is 0.00004 %.
- p(muslim|terrorist): the probability that a terrorist immigrant is from a Muslim country is 72 %.
- p(muslim): the probability that an immigrant is from a Muslim country is 17 %.
- p(terrorist): the probability that an immigrant is a terrorist is 0.00001 %.
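These four quantities are not independent: Bayes’ rule ties them together, which lets us sanity-check the study’s numbers. A minimal sketch in Python (the variable names are mine):

```python
# Bayes' rule: P(terrorist | muslim) = P(muslim | terrorist) * P(terrorist) / P(muslim)
p_muslim_given_terrorist = 0.72      # 72 %
p_terrorist = 0.00001 / 100          # 0.00001 %, written as a proportion
p_muslim = 0.17                      # 17 %

p_terrorist_given_muslim = p_muslim_given_terrorist * p_terrorist / p_muslim
print(f"{p_terrorist_given_muslim * 100:.5f} %")  # about 0.00004 %, as in the study
```

So the four numbers are mutually consistent; the disagreement in the study is only about which of them matters.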
Those who were against the Muslim ban said that the most important probability was p(terrorist|muslim), while those who supported the Muslim ban selected p(muslim|terrorist). So far, nothing terribly surprising.
Next, the study asked the same participants a different question: whether they were for or against a weapons ban, and then gave them these statistics:
- p(weapons|shooting): out of 6 American adults who committed a mass shooting, 4 owned an assault weapon.
- p(shooting|weapons): out of 12 million American adults who owned an assault weapon, 4 committed a mass shooting.
- p(shooting): in the last few years, 6 out of 100 million American adults committed a mass shooting.
- p(weapons): in the last few years, 12 million out of 100 million American adults owned an assault weapon.
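Here the study gives raw counts rather than percentages, but they convert directly into the four probabilities, and Bayes’ rule again links them. A quick sketch, with the counts taken from the list above:

```python
# Raw counts from the study's framing.
adults = 100_000_000           # American adults
weapon_owners = 12_000_000     # adults who owned an assault weapon
shooters = 6                   # adults who committed a mass shooting
shooters_with_weapon = 4       # shooters who owned an assault weapon

p_weapons_given_shooting = shooters_with_weapon / shooters       # 4/6, about 67 %
p_shooting_given_weapons = shooters_with_weapon / weapon_owners  # about 0.00003 %
p_shooting = shooters / adults
p_weapons = weapon_owners / adults                               # 12 %

# Bayes' rule recovers one conditional probability from the other three numbers.
assert abs(p_weapons_given_shooting
           - p_shooting_given_weapons * p_weapons / p_shooting) < 1e-9
```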
Those in favour of the weapons ban said that p(weapons|shooting) was more important; those against it said that p(shooting|weapons) was more important.
The interesting thing is that those who were for the Muslim ban were often simultaneously against the weapons ban, and those who were against the Muslim ban were in favour of the weapons ban. In two very similar situations, participants selected whichever probability supported their opinion, whereas the rational behaviour would be to always pick the same kind of probability, for example the hit rate: p(muslim|terrorist) and p(weapons|shooting).
Yet it is important to refrain from concluding that one group was generally more biased or less educated. Myside bias is the odd one out: it has not been shown to correlate with other cognitive biases. It is slightly correlated with intelligence, but in an unexpected direction: more intelligent people exhibit a slightly larger myside bias.
Myside bias seems to be topic-related rather than person-related. People show a larger myside bias on topics where they hold a strong opinion in favour of one of the options. The direction of the opinion alone does not predict the bias: it is not that pro-weapon people display more myside bias per se. Rather, the strength of the opinion predicts the strength of the bias: it is pro-weapon people who strongly support assault-weapon ownership who show the larger myside bias.
The term bias usually implies irrational behaviour; myside bias, again, is different. It may have evolved as humans formed communities and lived in groups. For an ancient human, belonging to a group was crucial: someone who doubted the group’s opinions at every turn would quickly end up outside the group and die. In that sense, contemporary humans act rationally when exhibiting myside bias: they evaluate information not on whether it is true or false, but on whether it complies with the values of the group they belong to (or want to belong to).
Although myside bias is beneficial for the individual, it is detrimental to society. As society becomes more and more polarized, different groups argue endlessly instead of finding common ground, and progress stalls for lack of cooperation between the politically charged poles.
The author has several tips on how to reduce myside bias within oneself (if one chooses).
1. Don’t hold strong opinions.
2. Realize that your opinions on different topics may contradict each other. If they do, those opinions were probably absorbed from other people rather than formed by your own deliberation.
3. Detach from your opinions. A good way to detach is to practice perspective switching: take the position opposite to your own and argue for it (and against your own opinion).
I liked the book a lot. It helped me look at my own opinions, and those of my inner circle, in a new way. I can understand that some people would rather keep their opinions than engage with the arguments of the opposite group. But I would rather make sure my opinions are as much my own as possible than belong to a group while being dragged along by someone else’s vigor. Therefore, from now on, the only thing I will try to persuade other people to do is to practice perspective switching. That has never harmed anyone.
[1] Van Boven, L., Ramos, J., Montal-Rosenberg, R., Kogut, T., Sherman, D. K., & Slovic, P. (2019). It depends: Partisan evaluation of conditional probability importance. *Cognition*, *188*, 51-63.

Favourite quote:
“When arguing about a political issue, your opponents may be closer to you than you think because, in many cases, the issue isn’t the issue—the tribe is.”
September 2025