“A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.”
People who were repeatedly exposed to the phrase “the body temperature of a chicken” were more likely to accept as true the statement that “the body temperature of a chicken is 144°” (or any other arbitrary number). The familiarity of one phrase in the statement sufficed to make the whole statement feel familiar, and therefore true. If you cannot remember the source of a statement, and have no way to relate it to other things you know, you have no option but to go with the sense of cognitive ease. We are vulnerable.
Your environment or social context primes you to think (or not think) in certain ways. For example, the ubiquitous portraits of the national leader in dictatorial societies convey the feeling that “Big Brother is watching,” which can lead to an actual reduction in spontaneous thought and independent action. Your subjective experience consists largely of the story that your System 2 (the slower, more deliberate part of your mind) tells itself about what is going on. Priming, however, is processed by System 1 (the fast, automatic part of your mind). System 1 therefore sets up System 2’s interpretation of events. Magicians build careers on this notion: priming their audiences with sleight-of-hand techniques to skew their understanding of what is happening, leaving them asking, “How did you do that?” We may willingly let our System 2 guard down with magicians, but how in control are we, really, of what we experience?
Repetition of ideas (whether true or false) pushes us toward belief simply because repetition fosters cognitive ease and a sense of familiarity. This is called the mere exposure effect. The bias is exacerbated by several other characteristics of System 1. First, since System 1 operates automatically and cannot really be turned off, it is difficult to prevent the errors that come with its intuitive analysis of the environment. Second, System 1 is quite sure that its intuitive judgments are right. The result is that people tend to be overconfident and place too much faith in their intuitions. As Kahneman says, “System 1 is not prone to doubt. It suppresses ambiguity and spontaneously constructs stories that are as coherent as possible.” An essential hallmark of System 1 is that it is “gullible and biased to believe.” This plays out in several ways. We tend to look for (and uncritically accept) information that supports our existing beliefs, a form of confirmation bias. In addition, Kahneman argues that “System 1 understands sentences by trying to make them true, and the selective activation of compatible thoughts produces a family of systematic errors that make us gullible and prone to believe too strongly whatever we believe.” Unless a message is immediately negated, the associations it evokes will spread as if the message were true. System 2 is in charge of doubt, but only if it is activated. Without the active intervention of System 2 to process and analyze doubts, “we will continue to automatically process the information available to us as if it were true.”
– Daniel Kahneman, Nobel laureate and author of Thinking, Fast and Slow
– Greg Cashman, author of What Causes War?
Action:
Let’s all try not to mix up what we hear with what we believe to be true.
If this post made you rethink your approach to life in any way, please do others a service and SHARE the love.
Let’s Foster Critical Thinking.
Read More @ www.borisgodin.com
Follow us on Facebook @ Exploration of Human Condition