
    Not Born Yesterday



      For instance, if you, like most people, got stuck on the ten-cents

      answer to the Bat and Ball problem, and someone had told you

      that the correct answer was five cents, your initial reaction would

      have been to reject their statement. In this case, your System 2

      would have had to do some work to make you accept a sound

      belief, which is much more typical than cases in which System 2

      does extra work to make us reject an unfounded belief.
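
      To see why, it helps to spell out the arithmetic. Here is a minimal worked
      version, assuming the standard statement of the problem (a bat and a ball
      cost $1.10 together, and the bat costs $1.00 more than the ball):

```latex
% Let b be the price of the ball in dollars.
% The bat then costs b + 1.00, and together they cost 1.10:
\begin{align*}
  b + (b + 1.00) &= 1.10 \\
  2b + 1.00     &= 1.10 \\
  2b            &= 0.10 \\
  b             &= 0.05
\end{align*}
% The ball costs five cents. The intuitive ten-cents answer fails:
% the bat would then cost $1.10, for a total of $1.20.
```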

      There is no experimental evidence suggesting a systematic as-

      sociation between being less analytically inclined— less likely

      to use one’s System 2— and being more likely to accept empiri-

      cally dubious beliefs. Instead, we observe a complex interaction

      between people’s inclinations to use different cognitive mecha-

      nisms and the type of empirically dubious beliefs they accept.

      Beliefs that resonate with people’s background views should be

      more successful among those who rely less on System 2, whether


      or not these beliefs are correct. But an overreliance on System 2

      can also lead to the acceptance of questionable beliefs that stem

      from seemingly strong, but in fact flawed, arguments.

      This is what we observe: the association between analytic

      thinking and the acceptance of empirically dubious beliefs is

      anything but straightforward. Analytic thinking is related to

      atheism but only in some countries.51 In Japan, being more

      analytically inclined is correlated with a greater acceptance of

      paranormal beliefs.52 Where brainwashing techniques failed to

      convert any POWs to the virtues of communism, the sophisti-

      cated arguments of Marx and Engels convinced a fair number

      of Western thinkers. Indeed, intellectuals are usually the first to

      accept new and apparently implausible ideas. Many of these

      ideas have been proven right (from plate tectonics to quantum

      physics), but a large number have been misguided (from cold

      fusion to the humoral theory of disease).

      Even when relative lack of sophistication seems to coincide

      with gullibility, there is no evidence to suggest the former is caus-

      ing the latter. On some measures, young children can be said

      to be more gullible than their older peers or than adults.53 For

      instance, it is difficult for three-year-olds to understand that

      someone is lying to them and to stop trusting them.54 (In other

      respects, obviously, three-year-olds are incredibly pigheaded,

      as any parent who has tried to get their toddler to eat broccoli

      or go to bed early knows.) But this apparent (and partial) gull-

      ibility isn’t caused by a lack of cognitive maturation. Instead, it

      reflects the realities of toddlers’ environment: compared with

      adults, small children know very little, and they can usually trust

      what the adults around them say.55 In the environment in which

      we evolved, young children were nearly always in the vicinity of

      their mothers, who had limited incentive to deceive them, and

      who would have prevented most abuse. This strong assumption


      of trustworthiness adopted by young children is, in some ways,

      similar to that found in bees, which have even fewer reasons to

      mistrust other bees than young children have to mistrust their

      caregivers. In neither case does lack of sophistication play any

      explanatory role in why some agents trust or do not trust others.

      The logic of evolution makes it essentially impossible for gull-

      ibility to be a stable trait. Gullible individuals would be taken

      advantage of until they stopped paying attention to messages. In-

      stead, humans have to be vigilant. An arms race view of the

      evolution of vigilance is intuitively appealing, with senders evolv-

      ing to manipulate receivers, and receivers evolving to ward off

      these attempts. Even though this arms race view parallels nicely

      the popular association between lack of sophistication and gull-

      ibility, it is mistaken. Instead, openness and vigilance evolved

      hand in hand as human communication became increasingly

      broad and powerful. We can now explore in more detail the cog-

      nitive mechanisms that allow us to be both open and vigilant

      toward communication: How do we decide what to believe, who

      knows best, who to trust, and what to feel?

      4

      WHAT TO BELIEVE?

      Imagine you are a foodie. You love all sorts of different

      cuisines. There’s one exception, though: Swiss cuisine. Based on

      a number of experiences, you have come to think it is mediocre

      at best. Then your friend Jacques tells you that a new Swiss res-

      taurant has opened in the neighborhood, and that it is really

      good. What do you do?

      Even such a mundane piece of communication illustrates the

      variety of cues that you ought to consider when evaluating any

      message. Has Jacques been to the restaurant, or has he just heard

      about it? Does he particularly like Swiss cuisine, or is he knowl-

      edgeable about food in general? Does he have shares in this new

      venture? The next two chapters are devoted to identifying and

      understanding the cues that relate to the source of the message.

      Here I focus on the content of the message.

      Imagine that Jacques is as knowledgeable about eating out as

      you are and has no reason to oversell this restaurant. How do you

      integrate his point of view— that the Swiss restaurant is

      great— with your skepticism toward Swiss cuisine? Evaluating

      messages in light of our preexisting beliefs is the task of the most

      basic open vigilance mechanism: plausibility checking.

      On the one hand, it is obvious enough that we should use our

      preexisting views and knowledge when evaluating what we’re


      told. If someone tells you the moon is made of cheese, some

      skepticism is called for. If you have consistently had positive in-

      teractions with Juanita over the years, and someone tells you

      she has been a complete jerk to them, you should treat that piece

      of information with caution.

      On the other hand, doesn’t relying on our preexisting beliefs

      open the door to bias? If we reject everything that conflicts with

      our preexisting views, don’t we become hopelessly stubborn and

      prejudiced?

      How to Deal with Contrary Opinions

      Experimental evidence suggests the risk of irrational stub-

      bornness is real. In some circumstances, people seem to be-

      come even more entrenched in their views when presented

      with contrary evidence—to use the earlier example, it is as if

      you would become even more sure that Swiss cuisine sucks

      after being told that a Swiss restaurant was great. Psycholo-

      gists call this phenomenon the backfire effect. It has been re-

      peatedly observed; for instance, in an experiment that took

      place in the years following the second Iraq War. U.S. presi-

      dent George W. Bush and his government had invoked as a

      reason for invading Iraq the supposed development of weap-

      ons of mass destruction by Iraqi leader Saddam Hussein. Even

      though no such weapons were ever found, the belief they existed

      persisted for years, especially among conservatives, who had

      been more likely to support Bush and the Iraq War. In this

      context, political scientists Brendan Nyhan and Jason Reifler

      presented American conservatives with authoritative infor-

      mation about the absence of weapons of mass destruction in

      Iraq.1 Instead of changing their minds in light of this new infor-

      mation, even a little bit, the participants became more convinced


      that there had been weapons of mass destruction. A few years

      later, the same researchers would observe a similar effect

      among staunch opponents of vaccination: presenting anti-

      vaxxers with information on the safety and usefulness of the

      flu vaccine lowered even further their intention of getting the

      flu shot.2

      Surely, though, the backfire effect has to be the exception

      rather than the rule. Imagine you’re asked to guess the length of

      the Nile. You think it is about seven thousand kilometers long.

      Someone says it is closer to five thousand kilometers. If the back-

      fire effect were the rule, after several more iterations of the argu-

      ment, you would be saying the Nile is long enough to circle the

      earth several times over. Fortunately, that doesn’t happen. In this

      kind of situation— you think the Nile is seven thousand kilo-

      meters long, someone else thinks it is five thousand— people

      move about a third of the way toward the other opinion and very

      rarely away from it.3
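
      As an illustration of that "third of the way" average (the numbers below
      are just the Nile example's, and the one-third figure is a reported
      average, not a fixed rule):

```latex
% Your estimate: 7,000 km. The other person's estimate: 5,000 km.
% Moving one third of the way toward their opinion:
\[
  7000 + \frac{5000 - 7000}{3} \approx 7000 - 667 = 6333 \text{ km}
\]
% A backfire effect, by contrast, would push the estimate
% above 7,000 km, away from the other opinion.
```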

      Even on sensitive issues, such as politics or health, backfire

      effects are very rare. Nyhan and Reifler had shown that conser-

      vatives told about the absence of weapons of mass destruction

      in Iraq had become even more convinced of the weapons’ exis-

      tence. Political scientists Thomas Wood and Ethan Porter re-

      cently attempted to replicate this finding. They succeeded but

      found this was the only instance of a backfire effect out of thirty

      persuasion attempts. In the twenty- nine other cases, in which

      participants were provided with a factual statement relating to

      U.S. politics (for example, that gun violence has declined, or that

      there are fewer abortions than ever), their opinions moved in line

      with the new, reliable information. This was true even when the

      information went against their preexisting opinion and their po-

      litical stance.4 As a rule, when people are presented with mes-

      sages from credible sources that challenge their views, they move


      some of the way toward incorporating this new information into

      their worldview.5

      In the examples we’ve seen so far, there was a direct clash

      between people’s beliefs (that there were weapons of mass

      destruction in Iraq, say) and what they were told (that there

      were no such weapons). The case of the Swiss restaurant is a little

      subtler. You don’t have an opinion about the specific restaurant

      Jacques recommended, only a prejudice against Swiss cuisine in

      general. In this case, the best thing to do is somewhat counter-

      intuitive. On the one hand, you are justified in doubting Jacques’s

      opinion and thinking this new Swiss restaurant is likely to be bad.

      But you shouldn’t then become even more sure that Swiss cui-

      sine in general is poor— that would be a backfire effect. Instead,

      your beliefs about Swiss cuisine in general should become some-

      what less negative, so that if enough (competent and trustwor-

      thy) people tell you that Swiss restaurants are great, you end up

      changing your mind.6

      Beyond Plausibility Checking: Argumentation

      Plausibility checking is an ever-present filter, weighing on

      whether messages are accepted or rejected. On the whole, this

      filtering role is mostly negative. If plausibility checking lets in

      only messages that fit with our prior beliefs, not much change

      of mind is going to occur— since we already essentially agree

      with the message. This is why you often need to recognize the

      qualities of a source of information— that it is reliable and of

      goodwill—to change your mind. There is, however, an exception,

      a case in which plausibility checking on its own, with no infor-

      mation whatsoever about the source, gives us a reason to accept

      a novel piece of information: when the new information in-

      creases the coherence of our beliefs.7


      Insight problems are good examples of how a new piece of

      information can be accepted purely based on its content. Take

      the following problem:

      Ciara and Saoirse were born on the same day of the same

      month of the same year to the same mother and the same father,

      yet they are not twins.

      How is this possible?

      If you don’t already know the answer, give it a minute or two.

      Now imagine someone saying “They’re part of triplets.” Even

      if you have no trust whatsoever in the individual telling you this,

      and even though this is a new piece of information, you will ac-

      cept the answer. It just makes sense: by resolving the inconsis-

      tency between two girls being born at the same time of the same

      mother and their not being twins, it makes your beliefs more

      coherent.

      In some cases, simply being told something is not enough to

      change our minds, even though accepting the information would

      make our beliefs more coherent. Take the following problem:

      Paul is looking at Linda.

      Linda is looking at John.

      Paul is married but John is not.

      Is a person who is married looking at a person who is not

      married?

      Yes / No / Cannot be determined.

      Think about it for as long as you like (it is one of my favorite

      logical puzzles, which my colleagues and I have used in many

      experiments).8

      Now that you’ve settled on an answer, imagine that your friend

      Chetana tells you, “The correct answer is Yes.” Unless you happen


      to already think Yes is the correct answer, you will likely believe

      Chetana has gotten the problem completely wrong. You prob-

      ably reached the conclusion that the correct answer is Cannot be

      determined— indeed, you are likely quite sure this is the correct

      answer.9

      Yet Chetana would be right, and you would be better off ac-

      cepting Yes as a correct answer. Why? Because Linda must be

      either married or not married. If she is married, then it is true

      that someone who is married (Linda) is looking at someone who

      is not married (John). But if she is not married, it is also true that

      someone who is married (Paul) is looking at someone who is not

      married (Linda). Because it is always true that someone who is

      married is looking at someone who is not married, the correct

      answer is Yes.
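
      If you prefer to check the case analysis mechanically, here is a minimal
      sketch in Python (the names and structure are mine, purely illustrative)
      that enumerates Linda's two possible statuses:

```python
# Paul is married, John is not; Linda's status is unknown.
# "Looking at" pairs from the puzzle:
looking_at = [("Paul", "Linda"), ("Linda", "John")]

# Try both possibilities for Linda and check whether some
# married person is looking at some unmarried person.
for linda_married in (True, False):
    married = {"Paul": True, "John": False, "Linda": linda_married}
    answer = any(married[a] and not married[b] for a, b in looking_at)
    print(f"Linda married = {linda_married}: {answer}")

# Prints True in both cases: whatever Linda's status, a married
# person is looking at an unmarried one, so the answer is Yes.
```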

      Once you accept the Yes answer, you are better off. Yet, among

      the people who initially provide the wrong answer (so, the vast

      majority of people), essentially no one accepts the answer Yes if

      they are just told so without the accompanying argument.10 They

      need the reason to help them connect the dots.

      Arguments aren’t only useful for logical problems; they are

      also omnipresent in everyday life. Going to visit a client with a

      colleague, you plan on taking the metro line 6 to reach your des-

      tination. She suggests taking the bus instead. You point out that

      the metro would be faster, but she reminds you the metro con-

      ductors are on strike, convincing you to take the bus. If you had

      not accepted her argument, you would have gone to the metro,

      found it closed, and wasted valuable time.

      The cognitive mechanism people rely on to evaluate argu-

      ments can be called reasoning. Reasoning gives you intuitions

      about the quality of arguments. When you hear the argument

      for the Yes answer, or for why you should take the bus, reason-

      ing tells you that these are good arguments that warrant chang-


      ing your mind. The same mechanism is used when we attempt

      to convince others, as we consider potential arguments with

      which to reach that end.11

      Reasoning works in a way that is very similar to plausibility

      checking. Plausibility checking uses our preexisting beliefs to

      evaluate what we’re told. Reasoning uses our preexisting infer-

      ential mechanisms instead. The argument that you shouldn’t take

      the metro because the conductors are on strike works because

      you naturally draw inferences from “The conductors are on

      strike” to “The metro will be closed” to “We can’t take the

      metro.”12 If you had thought of the strike yourself, you would

      have drawn the same inference and accepted the same conclu-

      sion: your colleague was just helping you connect the dots.

      In the metro case, the dots are very easy to connect, and you

      might have done so without help. In other instances, however,

      the task is much harder, as in the problem with Linda, Paul, and

      John. A new mathematical proof connects dots in a way that is

      entirely novel and very hard to reach, yet the people who under-

      stand the proof only need their preexisting intuitions about the

      validity of each step to evaluate it.

      This view of reasoning helps explain the debates surrounding

      the Socratic method. In Plato’s Meno, Socrates walks a young

     
