Interview with Dr. Christin Scholz
3 October 2024
Technically, polarisation is the existence or development of a persistent and extreme division in attitudes, identities, and behaviors. This might sound complicated, but it basically means that in society there are people (or groups) who hold very strong opinions on a topic, opinions that differ from those of other people or groups. These opinions are often extreme in the sense that individuals hold onto them and are reluctant to consider evidence from other points of view, which eventually leads to a disconnection between these two or more groups.
Researchers in political psychology have mostly focused on the United States, where there is a two-party system, so it is easy to see only two poles. In Europe, by contrast, there is a different type of polarisation, usually called 'pillarization', because there are several pillars: in many countries on the continent there are more than two political parties and more than two points of view to consider, which makes studying it much more complicated. Now, although the terminology may change, the idea is the same: there are different groups in society that are disconnected from others with different views. Furthermore, polarisation (or pillarization) should be viewed as a process, a movement of society towards a different state. Why? Because it can be a warning sign: if we see processes leading in this direction, we should address them before reaching a highly polarised state, as it is easier to prevent polarisation than to reverse it once it has fully developed.
I would say no, although what we consider political is also quite difficult to define. In recent years, many topics have become politicized, such as the discussion about COVID-19: who gets vaccinated and who doesn't, the issue of masks, curfews... In that sense, anything can become political as soon as certain elites start talking about it. However, polarisation can affect any topic you can think of. It can influence health issues (should abortion be allowed?), environmental behaviors (should we invest in recycling systems?)... Basically, any topic where there can be strong disagreement and that divides society.
"Humans have cognitive biases that predispose us to prefer information that aligns with our current opinions"
Christin Scholz
This is one of the big questions. Having a firm opinion is not necessarily a bad thing. In fact, people being involved in politics and having well-developed opinions on current issues is, theoretically, good for democracy. In principle, this should mean we have an informed electorate that makes well-founded decisions when going to the polls. The problem starts when these opinions become too rigid. Sometimes this is called cognitive rigidity, meaning that all information one receives is evaluated solely based on whether it aligns with one's opinion or not. Then, the person is not open to new evidence that could, or should, change their mind. In my work, we are constantly learning new things that make us change the way we do things. If we were not open to these changes, then having firm opinions would not allow us to progress. The same happens in democratic systems. Being involved in politics is not enough: being closed to evidence can also lead us to make bad decisions.
Polarisation is far from new. We are neither the first polarised society nor the last. However, what's different now is the level of attention it receives, both in public discourse and research. On the positive side, there’s a growing body of research helping us understand polarisation better. For instance, we now recognize that it occurs on multiple levels—on a large scale, events like wars or economic crises create societal divides that lead to polarisation. On a more personal level, cognitive biases push us to seek information that aligns with our views, which can contribute to polarised states over time. Additionally, today’s fragmented media environment plays a role. With so many different channels, blogs, and outlets catering to specific viewpoints, relatively extreme voices are often amplified, while the broader, less polarised majority remains quieter. This makes it seem as though society is more polarised than it truly is.
The mind is designed to be as accurate as possible but also efficient. This is partly because, not only today but in all civilizations throughout history, humans are exposed to a lot of information simultaneously. Therefore, the brain is constantly busy filtering information to make quick decisions: What is relevant to me? How should I act now? This leads us to be rational decision-makers, yes, but imperfect ones. Ideally, we would consider all available evidence exhaustively and then make a fully informed decision, but that’s not possible because we neither have the time nor the capacity to evaluate everything.
That's right. In the context of political polarisation, the most common shortcuts or biases are known as confirmation bias and cognitive dissonance. Confirmation bias refers to the tendency to prefer information that agrees with our worldview; that is, we take our perception of the world as the 'correct' view of any issue, and we love information that confirms it. Cognitive dissonance, on the other hand, refers to the fact that people generally want to maintain a positive self-image and feel that they act consistently, that their opinions and behaviors align with their worldview. If you suddenly tell me something that contradicts my opinion, I'll probably feel uncomfortable because there's a disagreement between my worldview and the information I'm receiving. So, what I will consciously or unconsciously do is reduce that dissonance and eliminate the discrepancy between what you're telling me and what I believe. How can I do that? By evaluating the evidence and, if it's valid, integrating it into my worldview; or by simply discrediting what you say. The first option undoubtedly requires much more effort.
Exactly. We have built up our worldview over time through our life experiences. In most scenarios, it is most efficient to rely on that earlier work, that is, to trust our prior worldview when evaluating new information. On average, this is an effective system, but it can have unwanted consequences when it leads us to ignore or discount new, contradictory information that is strong enough that it should change our worldview. Phrases like 'you don't know what you're talking about' or 'you have no experience in the matter' are the easy way to make myself feel better when new information threatens what I previously believed, and because it's easy, it's also more common. Gradually, we avoid any exposure to information contrary to our attitudes. If you're a left-wing person, you're likely to read more left-wing newspapers. Something similar happens with our behavior on social media: our contacts tend to be from a specific group. Additionally, we're more likely to share information that corroborates our views than information that contradicts them.
"Being involved in politics is not enough: being closed to evidence can also lead us to bad decisions"
Christin Scholz
I don't think the amount of information is necessarily the problem. I believe that reducing the media system to just two channels wouldn't help address those cognitive biases. What is more problematic is what we call media fragmentation: there are many channels, but each is very specialized, and if you choose to watch only one of them, you'll hear only one point of view. That one-sided exposure could contribute to a polarising tendency.
I wish I had a good answer for that. So far, a lot of research has been done, and although some approaches have shown promise in impacting certain elements of polarisation, we do not have a strong solution for the overarching issue. We spoke earlier about how polarisation happens at many different levels of society. Most approaches target polarisation at one of those levels. For instance, one might consider developing a less polarised party system, new ways to elect representatives, or changing the structure of the media environment, perhaps by regulating things like social media algorithms or imposing restrictions on certain types of journalism. However, making changes at one level affects dynamics at other levels of society, which makes the problem very complex. For instance, changing the media system has consequences for individual consumers of media content. What would happen if you're a fan of a media outlet with extreme views and the authorities ban it? That would be censorship, right? You can't just silence an opinion, because then those affected would become martyrs, and soon another outlet, legal or illegal, would emerge to meet the demand. My recommendation, for now, is to educate people about epistemic vigilance, which is basically learning to be aware of and handle our own biases when we encounter new information. We need to be aware that cognitive biases are inherent to humans and try to keep an open mind to new evidence. The world changes, we change, and it's okay to change your opinion if there are reasons to do so. We must teach and learn media literacy, that is, understanding why certain types of media act the way they do and what consequences this can have.
"Learning about epistemic vigilance should be a priority, meaning that we need to become smarter when handling new information"
Christin Scholz
Absolutely. We need to understand the political system, social structures, the human mind, and the biological basis of information processing. A single scientist or a single field of research cannot explain everything. That's why interactions between different fields are really useful for understanding polarisation. In fact, that was the main motivation for starting the Priority Research Area on Polarisation at the University of Amsterdam, where we try to apply a cross-level approach rather than focusing on just one small aspect of polarisation. For example, a project I'm currently involved in brings together communication scientists, political scientists, psychologists, and neuroscientists. We are using MRI machines to study cognitive biases as people process information that aligns with or contradicts their opinions, and to examine how these biases influence the ways in which people interact with different types of groups around them.
Although it happens every few years, it still shocks me: when there’s a new development in society, like artificial intelligence, the internet, or political polarisation… there’s always an avalanche of activity, a lot of public discussion, and a lot of research on it. In these cases, people initially get stuck in pessimism, and it seems inevitable to go through that phase of “Oh my God, what will happen if polarisation completely destroys all our democratic systems?”, or, as it was decades ago, “What will happen if everyone starts watching TV and ends up brainwashed by all this visual content?”, or, more recently, “How bad will it be for the youth raised with social media?”. What surprises me is this tendency to focus on everything that could go wrong. Then, generally, after the first few years of uncertainty, we start to see that the worst-case scenario isn’t always what happens. Often, we also see gray areas where good and bad things can happen, and maybe the bad isn’t as bad as we thought. We have now reached this more differentiated phase in our understanding of polarisation. We are beginning to acknowledge that not all aspects of polarisation (for instance, an informed and opinionated electorate) are necessarily bad, and that the level of polarisation portrayed in popular media and by the few who are particularly outspoken (for instance on online channels) is not necessarily representative of the silent majority. Stepping away from the initial panic lets us examine the different aspects and dynamics of polarisation in more detail and consider the problem in its full complexity across individuals, groups, and societal systems. I am hopeful that this differentiated analysis, rather than fear-mongering, will bring us closer to a solution in the coming years.