In *The Influential Mind*, Tali Sharot shows how difficult it can be to alter someone's beliefs and actions by introducing data to prove that we are right and they are wrong. The approach fails because, faced with facts that clash with their beliefs, people either come up with counter-arguments which actually strengthen the erroneous belief, or switch off from the argument completely and attack something else (the person, perhaps, or the source of the information as corrupt or elitist).
Essentially this is a form of confirmation bias – our tendency to cherry-pick information that confirms what we already believe and to ignore information that doesn't. It's also why we think that people who think like us are smarter than those who don't. The bias is most pronounced around emotionally charged beliefs – the idea that vaccines cause autism, for example.
John Stuart Mill, writing in 1869:
“So long as an opinion is strongly rooted in the feelings, it gains rather than loses in stability by having a preponderating weight of argument against it… when it rests solely on feeling, the worse it fares in argumentative contest, the more persuaded its adherents are that their feeling must have some deeper ground which the arguments do not reach.”
Just as well he wasn’t on Twitter.
We can see how this plays out in the media all the time – ever tried arguing against Brexit with facts? The facts just get disregarded, explained away, or ignored because they clash with the person's stronger, emotional beliefs. It's much, much nicer to find the things that confirm we're right than to face the possibility that we could be wrong – and to face what being wrong would mean for our sense of identity and self when the belief in question is so emotional.
This problem is obviously amplified by the internet – we tend to follow people who believe the same things we do, and unfollow them if they post things we disagree with. Google itself tailors results to what we already believe, creating an echo chamber of self-confirmation.
Turns out the best way to change someone's mind is to present things in a manner that doesn't contradict their deeper beliefs; in other words, to tell them the story in a way that matches the story they already tell themselves. It's all about stories in the end, as it pretty much always has been.
There was a great example of this on Twitter recently. I'm disappointed to say I can't find the original source now, so some of this is paraphrased, but the main point stands:
A junior doctor in the US was being mentored by a more senior one. The way this works is for the junior to see the patient alone, then report back to the senior doctor on what happened and decide together how to proceed.
The junior spoke to a patient who was refusing vaccines for her child because she was convinced they would cause autism. In trying to talk her round, the junior doctor discovered she held similar views on chemtrails, the government, Russia, and so on. He reported back to his supervisor in frustration; the senior doctor smiled and went to talk to the patient himself. She repeated all her fears and beliefs about the vaccines, along with her other, more conspiratorial theories. When she'd finished, the senior doctor said: “Have you ever thought that the talk about vaccines causing autism is just a conspiracy theory spread by the Russians and Chinese to weaken American children?”
She allowed the vaccination.
Rather than trying to argue against what she inherently believed, he used her own story to carry the message he wanted.
Again and again we see how important it is to understand the customer – their beliefs, problems, and stories – and then to find how we can fit ourselves into that, rather than attempting to push our own stories onto them. More on this to come, as we look at how to rework our services and offering around a more client-centric approach.