Why Hawks Win
By Daniel Kahneman and Jonathan Renshon
Why are hawks so influential? The answer
may lie deep in the human mind. People have dozens of decision-making
biases, and almost all favor conflict rather than concession.
A look at why the tough guys win more than they should.
National leaders get all sorts of advice in
times of tension and conflict. But often the competing counsel
can be broken down into two basic categories. On one side
are the hawks: They tend to favor coercive action, are more
willing to use military force, and are more likely to doubt
the value of offering concessions. When they look at adversaries
overseas, they often see unremittingly hostile regimes that
only understand the language of force. On the other side are
the doves, skeptical about the usefulness of force and more
inclined to contemplate political solutions. Where hawks see
little in their adversaries but hostility, doves often point
to subtle openings for dialogue.
As the hawks and doves thrust and parry, one
hopes that the decision makers will hear their arguments on
the merits and weigh them judiciously before choosing a course
of action. Don't count on it. Modern psychology suggests
that policymakers come to the debate predisposed to believe
their hawkish advisors more than the doves. There are numerous
reasons for the burden of persuasion that doves carry, and
some of them have nothing to do with politics or strategy.
In fact, a bias in favor of hawkish beliefs and preferences
is built into the fabric of the human mind.
Social and cognitive psychologists have identified
a number of predictable errors (psychologists call them biases)
in the ways that humans judge situations and evaluate risks.
Biases have been documented both in the laboratory and in
the real world, mostly in situations that have no connection
to international politics. For example, people are prone to
exaggerating their strengths: About 80 percent of us believe
that our driving skills are better than average. In situations
of potential conflict, the same optimistic bias makes politicians
and generals receptive to advisors who offer highly favorable
estimates of the outcomes of war. Such a predisposition, often
shared by leaders on both sides of a conflict, is likely to
produce a disaster. And this is not an isolated example.
In fact, when we constructed a list of the
biases uncovered in 40 years of psychological research, we
were startled by what we found: All the biases in our list
favor hawks. These psychological impulses (only a few
of which we discuss here) incline national leaders to
exaggerate the evil intentions of adversaries, to misjudge
how adversaries perceive them, to be overly sanguine when
hostilities start, and overly reluctant to make necessary
concessions in negotiations. In short, these biases have the
effect of making wars more likely to begin and more difficult
to end.
None of this means that hawks are always wrong.
One need only recall the debates between British hawks and
doves before World War II to remember that doves can easily
find themselves on the wrong side of history. More generally,
there are some strong arguments for deliberately instituting
a hawkish bias. It is perfectly reasonable, for example, to
demand far more than a 50-50 chance of being right before
we accept the promises of a dangerous adversary. The biases
that we have examined, however, operate over and beyond such
rules of prudence and are not the product of thoughtful consideration.
Our conclusion is not that hawkish advisors are necessarily
wrong, only that they are likely to be more persuasive than
they deserve to be.
VISION PROBLEMS
Several well-known laboratory demonstrations
have examined the way people assess their adversary's
intelligence, willingness to negotiate, and hostility, as
well as the way they view their own position. The results
are sobering. Even when people are aware of the context and
possible constraints on another party's behavior, they
often do not factor it in when assessing the other side's
motives. Yet, people still assume that outside observers grasp
the constraints on their own behavior. With armies on high
alert, it's an instinct that leaders can ill afford to
ignore.
Imagine, for example, that you have been placed
in a room and asked to watch a series of student speeches
on the policies of Venezuelan leader Hugo Chávez. You've
been told in advance that the students were assigned the task
of either attacking or supporting Chávez and had no
choice in the matter. Now, suppose that you are then asked
to assess the political leanings of these students. Shrewd
observers, of course, would factor in the context and adjust
their assessments accordingly. A student who gave an enthusiastic
pro-Chávez speech was merely doing what she was told,
not revealing anything about her true attitudes. In fact,
many experiments suggest that people would overwhelmingly
rate the pro-Chávez speakers as more leftist. Even
when alerted to context that should affect their judgment,
people tend to ignore it. Instead, they attribute the behavior
they see to the person's nature, character, or persistent
motives. This bias is so robust and common that social psychologists
have given it a lofty title: They call it the fundamental
attribution error.
The effect of this failure in conflict situations
can be pernicious. A policymaker or diplomat involved in a
tense exchange with a foreign government is likely to observe
a great deal of hostile behavior by that country's representatives.
Some of that behavior may indeed be the result of deep hostility.
But some of it is simply a response to the current situation
as it is perceived by the other side. What is ironic is that
individuals who attribute others' behavior to deep hostility
are quite likely to explain away their own behavior as a result
of being pushed into a corner by an adversary.
The tendency of both sides of a dispute to view themselves
as reacting to the other's provocative behavior is a
familiar feature of marital quarrels, and it is found as well
in international conflicts. During the run-up to World War
I, the leaders of every one of the nations that would soon
be at war perceived themselves as significantly less hostile
than their adversaries.
If people are often poorly equipped to explain
the behavior of their adversaries, they are also bad at understanding
how they appear to others. This bias can manifest itself at
critical stages in international crises, when signals are
rarely as clear as diplomats and generals believe them to
be. Consider the Korean War, just one example of how misperception
and a failure to appreciate an adversary's assessment
of intentions can lead to hawkish outcomes. In October 1950,
as coalition forces were moving rapidly up the Korean Peninsula,
policymakers in Washington were debating how far to advance
and attempting to predict Chinas response. U.S. Secretary
of State Dean Acheson was convinced that "no possible
shred of evidence could have existed in the minds of the Chinese
Communists about the non-threatening intentions of the forces
of the United Nations." Because U.S. leaders knew that
their intentions toward China were not hostile, they assumed
that the Chinese knew this as well. Washington was, therefore,
incapable of interpreting the Chinese intervention as a reaction
to a threat. Instead, the Americans interpreted the Chinese
reaction as an expression of fundamental hostility toward
the United States. Some historians now believe that Chinese
leaders may in fact have seen advancing Allied forces as a
threat to their regime.
CARELESSLY OPTIMISTIC
Excessive optimism is one of the most significant
biases that psychologists have identified. Psychological research
has shown that a large majority of people believe themselves
to be smarter, more attractive, and more talented than average,
and they commonly overestimate their future success. People
are also prone to an "illusion of control": They
consistently exaggerate the amount of control they have over
outcomes that are important to them, even when the outcomes
are in fact random or determined by other forces. It is not
difficult to see that this error may have led American policymakers
astray as they laid the groundwork for the ongoing war in
Iraq.
Indeed, the optimistic bias and the illusion
of control are particularly rampant in the run-up to conflict.
A hawk's preference for military action over diplomatic
measures is often built upon the assumption that victory will
come easily and swiftly. Predictions that the Iraq war would
be a "cakewalk," offered up by some supporters of
that conflict, are just the latest in a long string of bad
hawkish predictions. After all, Washington elites treated
the first major battle of the Civil War as a social outing,
so sure were they that federal troops would rout rebel forces.
General Noel de Castelnau, chief of staff for the French Army
at the outset of World War I, declared, "Give me 700,000
men and I will conquer Europe." In fact, almost every
decision maker involved in what would become the most destructive
war in history up to that point predicted not only victory
for his side, but a relatively quick and easy victory. These
delusions and exaggerations cannot be explained away as a
product of incomplete or incorrect information. Optimistic
generals will be found, usually on both sides, before the
beginning of every military conflict.
If optimism is the order of the day when it
comes to assessing one's own chances in armed conflict,
however, gloom usually prevails when evaluating another side's
concessions. Psychologically, we are receptive not only to
hawks' arguments for war but also to their case against
negotiated solutions. The intuition that something is worth
less simply because the other side has offered it is referred
to in academic circles as "reactive devaluation."
The very fact that a concession is offered by somebody perceived
as hostile undermines the content of the proposal. What was
said matters less than who said it. And so, for example, American
policymakers would likely look very skeptically on any concessions
made by the regime in Tehran. Some of that skepticism could
be the rational product of past experience, but some of it
may also result from unconscious (and not necessarily
rational) devaluation.
Evidence suggests that this bias is a significant
stumbling block in negotiations between adversaries. In one
experiment, Israeli Jews evaluated an actual Israeli-authored
peace plan less favorably when it was attributed to the Palestinians
than when it was attributed to their own government. Pro-Israel
Americans saw a hypothetical peace proposal as biased in favor
of Palestinians when authorship was attributed to Palestinians,
but as "evenhanded" when they were told it was authored
by Israelis.
DOUBLE OR NOTHING
It is apparent that hawks often have the upper
hand as decision makers wrestle with questions of war and
peace. And those advantages do not disappear as soon as the
first bullets have flown. As the strategic calculus shifts
to territory won or lost and casualties suffered, a new idiosyncrasy
in human decision making appears: our deep-seated aversion
to cutting our losses. Imagine, for example, the choice between:
Option A: A sure loss of $890
Option B: A 90 percent chance to lose $1,000
and a 10 percent chance to lose nothing.
In this situation, a large majority of decision
makers will prefer the gamble in Option B, even though the
other choice is statistically superior. People prefer to avoid
a certain loss in favor of a potential loss, even if they
risk losing significantly more. When things are going badly
in a conflict, the aversion to cutting one's losses,
often compounded by wishful thinking, is likely to dominate
the calculus of the losing side. This brew of psychological
factors tends to cause conflicts to endure long beyond the
point where a reasonable observer would see the outcome as
a near certainty. Many other factors pull in the same direction,
notably the fact that for the leaders who have led their nation
to the brink of defeat, the consequences of giving up will
usually not be worse if the conflict is prolonged, even if
they are worse for the citizens they lead.
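As a quick check on why Option A is the statistically superior choice, here is a minimal expected-value sketch in Python. The probabilities and dollar amounts are those of the example above; the code is merely an illustrative sketch, not part of the original article.

    # Compare the expected loss of the two options from the example above.
    sure_loss = 890                          # Option A: a certain loss of $890

    p_loss, loss_amount = 0.90, 1000         # Option B: 90% chance to lose $1,000;
    expected_loss_b = p_loss * loss_amount   # the 10% chance of losing nothing adds 0

    print(f"Option A expected loss: ${sure_loss}")             # $890
    print(f"Option B expected loss: ${expected_loss_b:.0f}")   # $900

    # On average the gamble costs $10 more, yet most people still choose it.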
U.S. policymakers faced this dilemma at many
points in Vietnam and today in Iraq. To withdraw now is to
accept a sure loss, and that option is deeply unattractive.
The option of hanging on will therefore be relatively attractive,
even if the chances of success are small and the cost of delaying
failure is high.
Hawks, of course, can cite many moments in
recent history when adversaries actually were unremittingly
hostile and when force produced the desired result or should
have been applied much earlier. The clear evidence of a psychological
bias in favor of aggressive outcomes cannot decide the perennial
debates between the hawks and the doves. It won't point
the international community in a clear direction on Iran or
North Korea. But understanding the biases that most of us
harbor can at least help ensure that the hawks don't
win more arguments than they should.
Daniel Kahneman is a Nobel laureate in
economics and Eugene Higgins professor of psychology and professor
of public affairs at Princeton University's Woodrow Wilson
School of Public and International Affairs.
Jonathan Renshon is a doctoral student
in the Department of Government at Harvard University and
author of Why Leaders Choose War: The Psychology of Prevention
(Westport: Praeger Security International, 2006).
Going to War: The Psychology of Military Interventionism
(translation and adaptation presented on PsicoCafè)
Dear patrons, today I would like to point you to a very interesting article that appeared in Foreign Policy, written by Daniel Kahneman and Jonathan Renshon. For the few who may not know, Kahneman is the only psychologist to have won a Nobel Prize in the history of our young discipline; Renshon is a doctoral student in the Department of Government at Harvard University and the author of the book "Why Leaders Choose War: The Psychology of Prevention".
The article explains how, when the decision whether to go to war must be made, certain cognitive distortions (biases) built into the human mind incline political leaders toward the arguments of the "hawks" (the interventionists) rather than those of the "doves" (the conciliators).
Biases are predictable errors, distortions of thinking that occur when human beings judge situations and assess risks. Social and cognitive psychologists have identified them by observing them both in the laboratory and in the real world.
One of these is the tendency to overestimate one's own chances. For example, about 80 percent of us believe we drive better than average. In a political-military decision-making context, this distortion leads decision makers to estimate the probability of a favorable outcome of war (favorable for the one deciding) with the same unjustified optimism. And since the adversary is a human being too, he will estimate his own chances of winning in just the same way, and the damage is quickly done.
Another very common bias has been identified in the way people evaluate others' intentions. Even when we are aware of the context and the circumstances that might explain the other party's behavior, we tend not to take these elements into account.
Imagine having a group of subjects in a room who are asked to watch videos in which students talk about the policies of Venezuelan leader Chávez. Before starting, you tell everyone that the students were required to deliver either a pro-Chávez or an anti-Chávez speech, with no choice in the matter.
Now ask your subjects to assess each student's political position.
Unreasonable as it may seem, you will find that people tend, for example, to judge the students who gave a pro-Chávez speech as more left-wing, even though they know perfectly well that it was an assigned performance and that the speech says nothing about the student's real political convictions!
There is nothing to be done: your subjects will tend to ignore the contextual information and attribute the speech they heard to the speaker's firm, persistent convictions. This distortion is so robust and so common that social psychologists have called it the "fundamental attribution error."
In a conflict situation, you can easily see that a government involved in a tense exchange with a foreign government will tend to perceive the behavior of the foreign country's representatives as hostile, even if it is aware that such behavior might be motivated by the circumstances as perceived by the other side.
Ironically, the fundamental attribution error has a flip side. When it comes to ourselves, we instead give the greatest weight to the circumstances and the justifying context, and we readily tell ourselves that we have been backed into a corner by the enemy: "It's not my fault, they are forcing me!"
The tendency of both sides to perceive themselves as victims of the other's provocative behavior can be found historically in almost every conflict: in the run-up to World War I, the leaders of every country perceived themselves as significantly less hostile than their enemies.
Another very common cognitive distortion is the illusion of control: it consists in overestimating one's own control over the outcomes of an action, even when those outcomes are, in essence, heavily dependent on chance or on other interacting forces. It is the conviction that leads one to believe that victory will be quick and easy.
"Give me 700,000 men and I will conquer Europe," said General Noel de Castelnau, chief of staff of the French Army, at the outset of World War I.
The millions of deaths that war produced were not only the result of poor military skill or of the absence of accurate, rational logistical information; they were, according to Kahneman, the fruit of a ruinous illusion of control.
Another important cognitive distortion is "reactive devaluation," and it concerns the other party's concessions. That is, we tend to undervalue the content of a negotiating proposal if it is offered by a counterpart perceived as hostile. What is said matters less than who said it.
This is the case with America and the concessions of the Tehran regime. The skepticism with which concessions coming from that regime are routinely received is certainly the fruit of past experience, but it is also the unconscious, and not necessarily rational, result of devaluation.
If an Israeli has to evaluate a peace plan, he judges it more favorably when he hears it from an Israeli representative than when he hears it from a Palestinian one. The very same peace plan!
The last cognitive distortion discussed in the article is loss aversion.
Imagine you have two choices:
Option 1: a sure loss of 890 euros
Option 2: a 90 percent chance of losing 1,000 euros and a 10 percent chance of losing nothing.
Most people will choose Option 2: that is, they prefer to avoid a certain loss in favor of a potential loss, even at the risk of losing significantly more.
Think for a moment about Iraq: withdrawing now means accepting a sure loss, while staying seems a relatively more attractive option, even though the chances of success are slim and the cost of failure even greater.
Kahneman says clearly in the article that this does not at all mean that the hawks are not right in certain historical contexts; he only argues that their arguments are often far more persuasive because they tap into these basic ways in which the human mind works.
Allow me a moment of pride: these are just a few of the fundamental contributions that psychology, as a science of humankind, has made; it would be wise to take them to heart.