Wednesday, 26 September 2007

Can psychology save the world?

Scott Lilienfeld: "The most important psychology experiment that’s never been done would determine whether psychology can save the world.

Yes, that statement is admittedly more than a bit hyperbolic. And this experiment will probably never be conducted, at least not in our lifetimes or even our great-grandchildren’s lifetimes. But it is at least worth pondering as a Gedanken experiment. This experiment rests on three premises for which, I contend, there is substantial, although not yet definitive, support.

Premise #1: The greatest threat to the world is ideological fanaticism. By ideological fanaticism, I mean the unshakeable conviction that one’s belief system and that of other in-group members are always right and righteous, and that others’ belief systems are always wrong and wrongheaded – even to the point that those who hold them must be eliminated. Contra Hitchens (2007), religion per se is not a threat to the world, although certain religious beliefs can provide the scaffolding for ideological fanaticism, as we can see in the contemporary wave of Islamic extremism. As many historians have observed, the three most deadly political movements of the 20th century - Hitler’s Nazism, Mao Tse-Tung’s Cultural Revolution, and Pol Pot’s Khmer Rouge - were largely or entirely secular. What unites all of these movements, including Islamic extremism, is the deeply entrenched belief that one’s enemies are not merely misguided, but so profoundly misguided that they are wicked and must be liquidated.

Premise #2: Biased thinking is a necessary, although not sufficient, condition for ideological fanaticism. Among the most malignant biases, and those most relevant to ideological fanaticism, are: (1) Naïve realism: the erroneous belief that the world is precisely as we see it (Ross & Ward, 1996). Naïve realism in turn often leads to the assumption that “because I perceive reality objectively, others who disagree with me must be foolish, irrational, or evil” (see Pronin, Puccio, & Ross, 2002); (2) Bias blind spot (“not me” bias): the erroneous belief that we are not biased, although others are (Pronin, Gilovich, & Ross, 2004); and (3) Confirmation bias: the tendency to selectively seek out information consistent with one’s beliefs and to ignore, minimize, or distort information that is not (Nickerson, 1998).

Premise #3: Critical thinking is the most effective (partial) antidote against ideological fanaticism. By critical thinking, I mean thinking designed to overcome one’s biases, especially the three aforementioned biases.

Regrettably, malignant biases in thinking are virtually never addressed explicitly or even implicitly in educational curricula, which is troubling given that so much of everyday life - left-wing political blogs, right-wing political talk radio, political book-buying habits (Krebs, 2007), ad infinitum - reinforces them. Moreover, our selection of friends can generate not only communal reinforcement for our biases (Carroll, 2003), but also the erroneous belief that our views are shared by most or all other reasonable people (i.e., a false consensus effect; Ross, Greene, & House, 1977). In some Islamic countries, of course, much of the educational curriculum comprises indoctrination into a cultural and religious worldview that implies that one’s enemies are mistaken, blasphemous, and despicable. In the United States, some social critics (e.g., Bloom, 1987; Horowitz, 2007) have charged that the higher educational system typically engenders an insidious indoctrination into left-wing ideology. The merits of these arguments aside, it is undeniable that even among highly educated individuals (a group that includes many or most terrorists; Sageman, 2004), the capacity to appreciate views other than one’s own is hardly normative.

So, the most important psychological experiment never done would (1) begin with the construction of a comprehensive, evidence-based educational programme for debiasing children and adolescents in multiple countries against malignant biases, (2) randomly assign some students to receive this programme and others to receive standard educational curricula, and (3) measure the long-term effects of the debiasing programme on well-validated attitudinal and behavioural measures of ideological fanaticism. To some extent, the goal of this programme would be to inculcate not merely knowledge but wisdom (Sternberg, 2001), particularly aspects of wisdom that necessitate an awareness of one’s biases and limitations, and the capacity to recognize the merits of differing viewpoints (e.g., Meacham, 1990, pp. 181-211).

The greatest obstacle to conducting this experiment, aside from the sheer pragmatic difficulty of administering a large-scale curriculum across multiple countries, is the surprising paucity of research on effective debiasing strategies. Nevertheless, at least some controlled research suggests that encouraging individuals to seriously entertain viewpoints other than their own (e.g., “considering the opposite”) can partly immunize them against confirmation bias and related biases (Kray & Galinsky, 2003; Wilson, Centerbar, & Brekke, 2002). Whether such educational debiasing efforts, implemented on a massive scale, would help to inoculate future generations against ideological fanaticism is unknown. But launching such an endeavour by conducting small-scale pilot studies would seem to be a worthwhile starting point."

Dr Scott O Lilienfeld is Professor of Psychology, Emory University, Atlanta.



Robin Hanson said...

Scott, your grand vision to overcome bias fits well at our blog. Consider joining us.

Anonymous said...

Certainly an interesting idea, Dr. Lilienfeld, but you may run into some problems down the road. Let’s be clear about what we’re arguing against. Is the problem of ideological fanaticism A) that certain people have the “wrong” beliefs, i.e., beliefs that are morally distasteful to us – or B) that they go too far in attempting to implement or espouse their beliefs? In the US, of course, we have no problem with people who say morally distasteful things – in fact, as you know, it’s a protected right in our Constitution. We do have a problem with people who break laws; laws that comprise a set of codified morals that we, as a society, will not tolerate being violated. There are certain ways we, as a society, can accept being offended, and certain ways we will not. Where do we come up with this line? I find it hard to accept that what we consider to be unassailable values – egalitarianism, the dignity of the individual, etc. – are the result of entirely rational, objective deliberation. I believe these things, but it’s not necessarily because I have equally considered the alternatives and made a choice as to which ones to follow.

Consider the possibility that you teach a generation of children to doubt their gut reactions, to think critically about their preconceived notions and to consider other peoples’ point of view. (This is a large part of the education that I myself received at a small liberal arts college – an education which I think was, for the most part, a good one). You’re left with a problem, one that I still struggle with: we still must rely on some information about what values to hold dear. How do we arrive at the morally correct way to act if we have to see everyone’s values and prescriptions for behavior as potentially equally valid? Ultimately, we must say – Yes, other people have reasons for believing what they believe, but I have my own beliefs. How then do we act in the wider world, where moral codes may conflict with one another and intersect? How do we decide which views to moderate and which views to act decisively on?

I am by no means a supporter of a return to “traditional” moral principles – that would be even worse. I don’t think that teaching tolerance for other moral points of view leaves people without any values to turn to, but it runs the risk of either A) leaving individuals and societal actors indecisive at potentially critical moments (it’s hard to be deliberative AND act fast), or, more likely, B) giving rise to a “stealth” code of political correctness whose practitioners claim to be (and believe themselves to be) open and respectful of other views, but in fact are equally dogmatic about their worldview.

I generally support your idea, mainly because I believe that the most important lesson to be learned is moderation in the face of competing ideologies – put simply, being respectful and courteous to others you may disagree with – but I guess I believe the human capacity towards bias, particularly via unconscious emotional arousal, is stronger than our cognitive ability to overcome it. In other words, our propensity to rationalize generally wins out over our capacity for reason.

Anonymous said...

I think you can teach critical thinking to kids using much less controversial topics, like this:

Jane values saving her money for an emergency. John always wants to have the latest and greatest toys. A cool new iPhone will cost a thousand bucks this year (including the cost of a one-year service agreement). Would you expect Jane to buy it? Would you expect John to buy it? Would you buy it yourself?

When you get a kid who says "I'd buy it, and everyone would buy it" or "I wouldn't buy it, and no one else would buy it either," then you know that you've got someone who can't think about different people's values.

You don't have to teach this at the level of religious beliefs. Consumer education is perfectly capable of teaching these skills.

Anonymous said...

"By ideological fanaticism, I mean the unshakeable conviction that one’s belief system and that of other in-group members is always right and righteous, and that others’ belief systems are always wrong and wrongheaded"

The definition is over-inclusive. E.g., my belief in the basic worth of every human being (and my belief that torture is inexcusable) is held with unshakeable conviction. I also believe that those who share this belief are right and righteous, and that those who do not appear to believe in this are wrong and wrongheaded. Is this ideological fanaticism?

Also, the reason why psychology will not save the world is because psychologists struggle when it comes to advising what and how we OUGHT to think and behave. One can never argue values from facts. Values are more appropriately and more competently discussed by philosophers (and perhaps, although rarely, politicians).

Don Cox said...

"my belief in the basic worth of every human being (and my belief that torture is inexcusable) is held with unshakeable conviction. I also believe that those who share this belief are right and righteous, and that those who do not appear to believe in this are wrong and wrongheaded. Is this ideological fanaticism?"

Yes. Ask the average cat how much human beings are worth and whether it matters if they are tortured. Your viewpoint (which I share) is specifically human-centric. Now consider that many people genuinely believe that only Muslims are fully human, and non-Muslims are by nature subhuman. Then again, what is your attitude to laboratory experiments on monkeys?

Anonymous said...


If you really believe that having the unshakeable belief that human beings have equal worth qualifies as 'ideological fanaticism' then you empty the term 'ideological fanaticism' of its meaning. But I do agree that it's wrong that animals do not, in our culture anyway, appear to have been accorded the same level of rights.

I guess my wider point is that the content of what people believe determines the application of the term 'ideological fanaticism' almost as much as (if not more than) the conviction with which the belief is held. So perhaps the terms of the debate over whether something is ideological fanaticism or not should move away from discussing the person's 'psychology' towards discussing the truth or otherwise of the actual issue (in most cases anyway).

In a nutshell, if you agree with someone's view you're unlikely to call them an ideological fanatic – even if they meet the criteria proposed by Lilienfeld.

Bit of a truism, I know, but we need to be careful. I don't agree with religious fundamentalism either, but let's not scientise a dispute over values. Let's keep the debate on the right terms... or at least let's be honest that it's WHAT the religious fundamentalist believes that's at stake, not the WAY he/she believes it.
