1.1. Introduction
We’ve all changed our minds at some point, about something. Maybe you were a cat person and became a dog person. Maybe you decided that the place you lived, the person you loved, or the religion you followed wasn’t working for you any more. But changing your mind is rarely easy, and it’s not something you set out to do. Because changing your mind means admitting, on some level, that you used to be wrong. It can be seen as an act of weakness.
One thing about changing your mind: it’s easier when you’re younger. We get less open to novelty as we get older. For example, if you aren’t listening to a certain style of music by the time you’re 28 or so, there’s a 95 percent chance you never will. By age 35, if you’re not eating sushi, there’s a 95 percent chance you never will.
Do most of us hold the beliefs we do because the people around us hold those beliefs, or are we more likely to assemble the people around us based on the beliefs we already hold?
The former is more often true. That is, we believe what we do because the people around us believe what they do. This is the way humanity evolved. We depend on other people. And this is the reason it’s not enough to give people the information they need to change their mind.
People tend to associate with other people who are very similar to themselves. So most of the time we end up talking to people who have very similar past experiences and similar views of the world, and we tend to underestimate this. People don’t realize how isolated their world is.
In worlds where our network is well-balanced and we’re actually eventually incorporating everybody’s viewpoint, the system works extremely well.
1.2. Why it’s hard to change your mind
This model where people just take facts, draw conclusions from them, and then base their opinions on those conclusions is completely wrong. It assumes that if you gave people the same kinds of information, they would make decisions the same way. They might have different experiences in their past, different influences, but somehow the fundamental ways in which they think about and process things would be the same. However, the more you look at the data, the more you realize that some people are very single-minded.
Well-educated people who consume a lot of information tend to hold disproportionately extreme views, apparently because they’re really good at seeking out information that confirms their priors and ignoring information that might run counter to them. They start out with an emotional commitment to a certain idea, and then they use their formidable cognitive powers to organize facts to support what they wanted to believe anyhow. It takes a really big external shock, one that clearly proves them wrong, to change that.
There was a group of about a quarter to a third of the subjects who actually became more polarized, who interpreted the information heavily in the direction of their priors, and actually ended up with more extreme positions after the experiment than before.
It seems to be a very important question whether the beliefs we hold about the outside world are somehow connected to these beliefs about ourselves. When there is a link between these beliefs, it’s not so clear that we should be changing our minds, or what the costs and benefits of doing so would be.
One reason people can see exactly the same information and come away with different conclusions is how we interpret and store information in our brains. It’s very easy to compress things into small pieces that we can remember. We don’t like breaking things down in detail; most of us settle for a superficial understanding.
There is another factor that contributes to our reluctance to change our minds: overconfidence. People seem to exaggerate their own past performance in their heads when that performance was bad. The conclusion is that people use memory selectively: they remember good outcomes and tend to forget bad ones.
There’s also the possibility that people who’ve been at something for a while, and who may consider themselves experts, simply don’t believe that non-experts have information worth paying attention to.
As you can see, there are many reasons why a given person might be reluctant to change their mind about a given thing: ego, selective memory, overconfidence, the cost of losing family or friends. But let’s say you remain committed to changing minds — your own or someone else’s. How do you get that done?
1.3. How to change your mind
There is no silver bullet. It’s really hard. Think of something you have a really strong opinion about: the best ways to address climate change, the perils of income inequality, how to balance privacy and security. Now think about why you have such a strong opinion. How well do you think you could explain your position?
If you’re forced to give an explanation, you have to really understand, and you have to confront the fact that you might not understand. Whereas when you give reasons, then you do what people do around the Thanksgiving dinner table. They talk about their feelings about it, what they like, what they don’t like.
One experiment Prof. Steven Sloman has done is asking people to explain — not reason, as he pointed out, but to actually explain, at the nuts-and-bolts level — how something works.
That’s true not only for big, thorny issues like climate change or income inequality, but even for everyday things like toilets, zippers, and ballpoint pens.
Unless you are a plumber, or you make zippers or ballpoint pens, you probably can’t explain these very well — even though, before you were asked the question, you would have thought you could. This gap between what you know and what you think you know is called, naturally, the “illusion of explanatory depth.”
The illusion of explanatory depth was first demonstrated by a pair of psychologists, Rozenblit and Keil. They asked people how well they understood how everyday objects worked, and people gave a rating between one and seven. Then they said, “Okay, how does it work? Explain in as much detail as you can.” People struggled and struggled and realized they couldn’t. When they were again asked how well they understood, their ratings tended to be lower. In other words, people themselves admitted that they had been living in an illusion — that they understood how these things worked when, in fact, they didn’t. The source of the illusion seems to be that people fail to distinguish what they know from what others know. We’re constantly depending on other people, and the actual processing that goes on is distributed among people in our community.
In other words, someone knows how a toilet works: the plumber. And you know the plumber; or, even if you don’t know the plumber, you know how to find a plumber. It’s as if the sense of understanding is contagious. When other people understand, you feel like you understand.
You can see how the illusion of explanatory depth could be helpful in some scenarios — you don’t need to know everything for yourself, as long as you know someone who knows someone who knows something. But you could also imagine scenarios in which the illusion could be problematic.
The mind is something that’s shared with other people; it exists within a community, not within a skull. And so, when you’re changing your mind, you’re doing one of two things: you’re either dissociating yourself from your community — which is really hard and not necessarily good for you — or you have to change the mind of the entire community. And is that important? Well, the closer we are to truth, the more likely we are to succeed as individuals and as a species. But it’s hard.
Reference: Based on the podcast How to Change Your Mind