New Internationalist: Why do some people appear to be overly suspicious or even paranoid about plots and conspiracies, especially in high places or by people in positions of authority?
Evan Harrington: To some extent I believe that suspiciousness is part of human nature. Within the field of evolutionary psychology, researchers have attempted to link observable trends in human behavior to our distant collective evolutionary past. Seen in this light, it makes sense that distant ancestors who were suspicious of strangers would have had an advantage over those who were overly trusting. But suspiciousness can backfire too, evolutionarily, in that individuals who were obsessively suspicious would not have traded anything with anyone and so would have been at a distinct disadvantage. Humans evolved in small societies, not unlike chimpanzees and bonobos, who are our closest relatives. If our evolutionary ancestors trusted all strangers, they likely would have been taken advantage of quite often. However, if they trusted members of their own social groups while being suspicious of strangers, then they would have advanced their own interests and the interests of their social groups, while remaining safe from predation by strangers. Of course, this is pure conjecture and not open to empirical testing.
From a more social perspective, in America it is common to see leaders who are narrow-minded and greedy. There have been numerous examples recently of corporate leaders whose actions benefited themselves to the detriment of the company and stockholders. The Bush White House has been called one of the most secretive administrations in our history. So there may be some very good reasons to distrust authority figures. When people lack information on a topic they feel strongly about, the situation is ripe for rumor and gossip. Rumors tend to grow when people feel strongly about the issue (remember the “Paul is Dead” rumor after the Beatles released the Abbey Road album?). I see gossip, rumors, and conspiracy theories as a sort of continuum. What sets conspiracy theories apart is that there must be a will to believe, where people suspend disbelief much as they do when reading a book or watching a movie. Only then do you get people endorsing what appears to be a silly or foolish conspiracy theory. A person probably has to be deeply involved in an issue before they accept a conspiracy theory as truth. Extreme events, such as being in the center of a disaster like Hurricane Katrina, might lead to a suspension of disbelief. I understand there were quite a few conspiracy theories used by residents of New Orleans to explain the incompetence of the rescue efforts.
New Internationalist: How is it that many conspiracy theorists grasp onto extraordinary claims put forward by self-described “truth-tellers” while rejecting, or not even seeing, copious solid evidence that undermines these claims?
Evan Harrington: I think there is a leap of faith that occurs, and this is quite similar to a religious conversion experience. A person may spend months or even years thinking about an issue before they come to endorse it. Within cults, the conversion experience happens at a much faster rate. I once spent two weeks at a summer camp run by the Unification Church, otherwise known as the “Moonies”. I was engaged in participant observation of their cult recruitment techniques. I witnessed troubled people from New York City being initiated into the fold within the course of several days, culminating in a religious experience for them. Once an individual makes such a deep investment in a belief system, whether it is a religion or a conspiracy theory, it can be very difficult to dissuade them. Experiments have shown that we all, to some extent, have a “disconfirmation bias” in which we try to explain away information that doesn’t fit what we already believe. When it comes to a strongly held belief system, disconfirmation bias will be bolstered by the threat to personal identity that destruction of the belief would entail. Some clinicians would call this a threat to ego identity.
New Internationalist: The brain seems to like solving puzzles, but some people can’t accept that some puzzles can’t be solved with the available evidence. Do we as humans have a need to “fill in the blanks” or “connect the dots” to resolve ambiguity or lack of information? Are some people more prone to this than others?
Evan Harrington: Humans are excellent at seeing patterns. Part of our evolutionary heritage is that we are amazingly adept at seeing patterns in the events that happen around us. Cognitive psychologists have studied this for years. The need to see patterns, or to “resolve ambiguity,” is not really detrimental. In fact, it is to our advantage that we humans are able to take enormous amounts of information from our social worlds, process this information quickly and efficiently, and then act accordingly. The price we pay for being so efficient at organizing social information is that we often make errors in certain predictable areas. Some cognitive psychologists have spent their careers identifying what these areas are and how they operate. Amos Tversky and Daniel Kahneman were pioneers in this research. People routinely overestimate the risk of flying in airplanes (I do this myself) and underestimate the risk of driving on the freeway. According to Tversky and Kahneman, it is much easier to imagine a plane crash than a car crash, for the simple reason that plane crashes are more dramatic. Flying in airplanes is also much rarer for most people. Since it is easier for us to recall dramatic plane crashes, and because our own experiences with planes are rare (compared to driving), we tend to overestimate the risk associated with planes. Swimming pools are far more deadly to children than is having a loaded gun in the house, though many people will say that the gun poses a greater risk, simply because gun accidents are very easy to remember. So we have a strong tendency to make associations between events while often ignoring base rate information, which causes our errors when thinking about risk.
On the other hand, there has been some psychological research indicating that some people have a greater need for resolution of puzzles than others. Some people do appear to have a greater intolerance for ambiguity than others. Researchers working with patients exhibiting anxiety disorders have found that intolerance for ambiguity is associated with higher levels of worry and anxiety. This would make an interesting psychology study: does intolerance for ambiguity correlate with willingness to believe irrational things? One might hypothesize that a good conspiracy theory reduces ambiguity, thus reducing anxiety for those who are intolerant of ambiguity. Believing in a conspiracy theory might be expected to increase anxiety, but paradoxically it might reduce anxiety for these people. This might be a good topic for someone’s dissertation.
New Internationalist: Do some people tend to see the world in dualistic ways with little appreciation for nuance or complexity?
Evan Harrington: Milton Rokeach, a personality theorist active in the 1960s, identified rigidity of thinking as a personality style, but there hasn’t been a lot of research specifically on this topic. The 18th-century French chemist and tax collector Antoine Lavoisier noted that the mind tends to get “creased” into certain ways of looking at the world. Lavoisier was certainly correct to an extent – it’s easy to get into a routine and follow it. Changing one’s path through life on a daily basis can make things more exciting, but it can also be bothersome. Ellen Langer has studied what she calls mindlessness, which is essentially this tendency to adopt an autopilot approach to the mundane areas of life. If you think of mindfulness as a basic yearning for novel experiences and a desire to learn new things, then it makes sense that people high on mindfulness would have more appreciation for ambiguity and detail.
New Internationalist: How is dualism related to demonization?
Evan Harrington: Categorization, what you term dualism, is another commonly seen element of human nature. That’s not to say that we have to see the world in terms of us-and-them, but it occurs quite frequently and at a young age. Psychologist Henri Tajfel did in the lab what Jane Elliot did in the classroom with her blue-eyed and brown-eyed students. Tajfel divided children into groups based on meaningless variables such as a coin toss, or an estimate of how many dots were on a wall. Placed into groups, the children tended to support other members of their own group even to their own detriment, apparently out of a desire for their group to succeed. We’re the good group and, by extension, you must be the not-so-good group. If dichotomies are set up then differences will tend to be accentuated, although research indicates that members of the more powerful of two groups will be somewhat oblivious to disparities of power, or will come to believe that their accumulation of power is justifiable and based on merit rather than oppression. To the extent that disparity of power is seen by the weaker group as being unjustifiable, members of the weaker group may attempt to change the situation by joining the more successful group, changing the power dynamic, or in rare instances through social conflict (e.g., riots or terrorism). When you think of the global conflict being encouraged by Salafists like the al Qaeda organization, it is clear that the roots are in dualism and the different values held by modern Western countries and the types of fundamentalist Muslims who seek a return to the power of the Islamic Caliphate. Those who join the Jihad appear to be more educated than average, usually with secular educations; they feel powerless within industrialized European countries and seek to address the global power imbalance through terror.
They appear to have convinced themselves that in this way they will return Islam to the superpower status it enjoyed during the European medieval period. I would hypothesize that conspiracy theories are extremely common at al Qaeda training camps.
One point that Tajfel made is that group identity is transitory and shifts from one situation to the next. One thing that absolutely astonishes me is that cultures in which genocide was practiced can, within a few years’ time, go back to an apparent state of peace. If no one fans the flames of ethnic division then the group identities of former rivals might shift so that being a member of a nation becomes more important, for example.
New Internationalist: I argue that when conspiracy theories flourish in a community they are a “narrative form of scapegoating.” Why are demonization and scapegoating mechanisms that are so common among humans? We all seem to be tempted by demonization and scapegoating, but some people embrace it wholeheartedly. Why?
Evan Harrington: I think the answer lies in the allure of sadism. Social psychologist Philip Zimbardo recently published an excellent book examining the events at Abu Ghraib prison, and how social forces can compel people to do things they ordinarily would never dream of doing. When I think of the abuses at Abu Ghraib I think of children who pull the tails of cats, oblivious to the distress they are causing. To some extent, children need to be socialized to be empathic (e.g., Timmy, how do you think Fluffy felt when you pulled his tail?). There is an allure to cruelty, and with children the answer is to ask them to be both empathic and self-reflective. If children see the animal as something with feelings, and if they consider themselves to be kind-natured, then they won’t want to hurt the cat. At Abu Ghraib, people with no experience in corrections were acting as prison guards, and they had no rules regarding how to handle prisoners. These soldiers were tired from massive overtime duties, they were anxious from all the hostile action outside the walls of the prison, they were bored, and they had no oversight from commanding officers. As Zimbardo says, it’s not a question of whether abuses will happen, it’s a question of how bad they will be. I believe the tendency to scapegoat is always there; empathy and self-reflection prevent most of us from acting on these impulses. The power of the situation to set aside empathy and self-reflection cannot be overstated. The guards at Abu Ghraib for the most part had been known as good people prior to being in the prison setting. One guard, interviewed for Rory Kennedy’s documentary The Ghosts of Abu Ghraib, said that setting foot in the prison was like going into another world, one in which the old self was left behind. It was the situation that brought out these behaviors, in which people wholeheartedly acted in a heartless manner and, judging by the photos, with amusement and glee.
Conspiracy theories may indeed be related to scapegoating, but I doubt it is a one-to-one relationship. Many examples of historical scapegoating exist where conspiracy theories were absent. In my days as a graduate student I attended conferences held by fringe therapists who believed in a massive conspiracy of incestuous satanic cannibals. These cannibals were thought to have killed 50,000 people a year in the USA. This was a situation in which conspiracy theories flourished yet there was little scapegoating (and needless to say, no evidence for the grand cult). Conspiracy theories may at times act to justify a larger aggressive agenda such as genocide, or they may help individuals justify instances of cruelty they are committing, but at other times conspiracy theories may simply be beliefs that do not lend themselves to skeptical analysis. Perhaps the most powerful tool against conspiracy theories is the skeptical mind, but skepticism and the scientific approach are sadly not among the primary values in the American educational system.