
How to Learn Lessons from History— And How Not To

  • Adam Garfinkle
  • May 20, 2001
  • Wachman Center for Civic and International Literacy

The matter of learning lessons from history has been a prodigious source of aphorism and free advice. George Santayana famously warned that those who fail to learn the lessons of history are doomed to repeat it. Aldous Huxley quipped that the most important thing we learn from history is that we never learn from history. Oliver Wendell Holmes defined history as one damned thing after another (and George Shultz, following Holmes, defined foreign policy as the same damned thing after another). But what do we really know about how to learn from history, and how not to?

We know, first and foremost, that how we may learn from history is a function of the purpose to which the exercise is put. One attitude holds that one learns from history as a form of aesthetics, not as a means to more practical ends. A.J.P. Taylor, one of the most prominent and most respected historians of the 20th century, believed that the writing of history is “an art just like painting or architecture and is designed like them only to give intellectual and artistic pleasure.”

But most of those who seek knowledge from history have more purposeful ends in mind, particularly policymakers and policy analysts who discern a moral responsibility to protect their nations and their fellows from gratuitous harm. Trying to learn from history for such purposes is both necessary and very difficult. It is necessary because the past is the only data we have from which to glean patterns of behavior whose endurance as features of human nature, presumably, makes them relevant to our own times and problems. The method is first inductive— sifting historical evidence to produce general truths— and then deductive— applying those truths to other, present circumstances taken to be more or less analogous.

And this is where the difficulty comes in. Hard as it is to derive useful general lessons from historical data, it is even harder to apply those lessons to contemporary events as they unfold. After all, the entire process depends on the effective use of reasoning by analogy. Since it is the nature of the contemporary that we can discern its true shape only after it has passed us, deciding which lessons from the past apply to the present— and which do not— must flow from intuition, not the exercise of (even social) science. Guessing wrong can be worse than not guessing at all, for to get the basic analogy wrong, but stick to it anyway, is to risk generative error: mistakes that breed further mistakes.

This is what observers mean when they say that diplomats often prepare to prevent, and generals to fight, “the last war.” Thus, as the general line goes, European diplomats eager to prevent a repetition of World War I— a war whose outbreak and prolongation were viewed as an accident— inadvertently brought on World War II. They used appeasement— not at all a dirty word in the 1920s and 1930s— to avoid the brittle psychology of anticipation among powers with limited goals that had caused the war in 1914. The problem was that Nazi Germany, fascist Italy, and militarist Japan were not candidates to stumble into a war they did not wish, but predatory regimes for which the diplomacy of appeasement represented an opportunity to seize advantage. Similarly, after World War II, American statesmen in the early years of the Cold War were riveted by the lessons of Munich. Appeasement became a dirty word, and the decisions to fight in Korea and Vietnam were based on reasoning by analogy that Soviet Communism was just as unappeasable and evil as German Fascism had been. (Was this reasoning mistaken? Many thought so in the wake of the Vietnam War debacle. Now that the Cold War is over, and ended as it did, it’s not so clear after all.)

Reasoning by analogy need not concern only events of world-historical scale. It goes on all the time, and usually concerns events of lesser weight. Much of the time the process is reasonably sensible and coherent. Thus, judging from an array of past Soviet behaviors, in which the Soviets pursued their interests until stopped by countervailing pressure, the principals of the Reagan Administration reasoned that to counter the Soviet deployment of intermediate-range ballistic missiles aimed at Europe, NATO would have to deploy such missiles as well. Though it was politically difficult, this was done, and the eventual result was a treaty that eliminated this entire class of nuclear-weapons delivery systems (the Intermediate-Range Nuclear Forces [INF] Treaty of 1987).

But mistakes happen, too. The Clinton Administration, confronted by instability in Kosovo in 1998, was determined not to repeat the mistakes it thought it had made in Bosnia a few years earlier. There were superficial reasons to see these problems as “more or less” similar, but in its concern not to err again it ignored differences that proved to be just as significant. (See my “Kosovo Is Not Bosnia,” FPRI E-Notes, June 19, 1998.) The result was a war that arguably: worsened the humanitarian dimension of the conflict; harmed NATO and stimulated fissiparous anti-alliance tendencies in Europe; worsened U.S.-Russian and U.S.-Chinese relations; emboldened Albanian nationalism, which is now causing havoc in Macedonia; and settled nothing with respect to Kosovo’s future. And it could have been even worse. We are still not sure why the Serbs relented after 78 days of bombing, stamina that surprised and shocked U.S. officials— but it should not have, and would not have, had they not reasoned wrongly by analogy to Bosnia.

Aside from good intuition— and occasional episodes of dumb luck— what distinguishes failed from successful efforts to reason by analogy? In general, failure is more likely when the effort is plagued by the following conditions:

  • The problem of the “evoked set”;
  • Excessive reliance on ideological premises;
  • Emotional indulgence;
  • Unwitting bias born of selective interests;
  • Extremes of either solipsism or “groupthink.”

The evoked set is a term taken from cognitive psychology that describes the tendency to see what we expect to see, and to ignore what we do not expect to see in a given context. (As Robert Jervis put it, “I’ll see it when I believe it.” See his book Perception and Misperception in International Relations, published by Princeton University Press, 1976.) It happens that, when it comes to international politics, people form their sense of how the world works from their seminal adult experience and from the most recent experience deemed “more or less” analogous to the problem at hand. This is a very small historical sample, and the urge to jam all contemporary problems into one or the other of these frameworks can be very unfortunate. What this means, at a minimum, is that policymakers who know history will have a larger and more useful repertoire of analogues with which to think about current problems than policymakers who don’t know history.

As for ideology, belief in certain central premises can be helpful if they happen to be correct. For example, Senator Henry Jackson once said to me that you can tell what sort of “neighbor” a particular regime is likely to be just by looking at how it treats its own people. This corollary to the democratic peace theory, itself derived from the essential principles of the Scottish Enlightenment, I have found to be invariably true. But if core beliefs are not correct— if, for example, every conflict must be read, as in Marxian analysis, as a function of class warfare— the believer will be fundamentally misled. This is why Marxian analysts and the Soviet government grossly underestimated the importance of nationalism— in Eastern Europe, in Afghanistan, and elsewhere. The lesson here: try to stay nimble by at least occasionally revisiting core beliefs to make sure they still make sense in light of accumulated experience.

As for emotional indulgence, this is a real killer. It is natural that the egos of important people become bound up with particular views. Some people spend their entire later professional lives trying to vindicate judgments made and acts committed in earlier years. The excessive involvement of such emotions in one’s judgments is very injurious to clear and effective analysis. There are many examples, but one that comes to mind: the senior ranks of the U.S. Army in the early 1960s had been trained, indoctrinated, and promoted on the basis of a view of fighting limited wars that turned out to be very inappropriate for the Vietnam War. But so much were their emotions and egos bound up in their views that they refused to budge from their approach despite mounting evidence of failure. Indeed, one army officer in Vietnam said, “I’m not going to destroy the traditions and doctrine of the United States Army just to win this lousy war.” (Quoted in Brian M. Jenkins, “The Unchangeable War,” published by RAND, 1973.)

As for the problem of selective interests, this tends to distort the analysis of the past more than the application of the past to the present. If one wishes to study the French Revolution, for example, or any other historical event that is literally inaccessible to direct observation, one has to rely on the accumulation of prior written materials and, if available, physical evidence. If one reads about the events in Paris in 1789 in a book written in New York in 1938, the witting or unwitting biases of the 1938 author, based on his or her interest in the subject and on the literary and intellectual fashions of the day, will inevitably have filtered both information and interpretation. This is a classical problem treated in the philosophy of history, and to appreciate it one need not accept post-modern assertions that objective reality does not exist but only seems to, thanks to a stream of “narratives.” But it is true that objectivity in history can only be approximate, and that it is affected by the selectivity born of the interests that the historian brings to the subject. The practical significance of this observation is simply the caveat that the user of historical analogues must take care with his tools. That means, unsurprisingly, that the more one knows about the subject, and about the breadth and depth of scholarship on it, the better.

Finally in this regard, the effective use of analogies ought to be a collective effort, but not just any kind of collective effort. Since learning from history and applying those lessons is so hard, it stands to reason that one ought to check one’s conclusions against the efforts of competent others. On the other hand, Milton Rokeach, Solomon Asch, Irving Janis, and others pioneered research on the phenomenon of groupthink (Janis’ term), the tendency of certain group dynamics to generate pressure toward conformity in thinking. In other words, rather than multiplying possibilities and options, which to a point is a good thing, groupthink shrinks them. Groupthink is just as dangerous as solipsism, and more common when groups of senior policymakers find themselves under pressure to make decisions. The lesson within the “lesson”? Senior policymakers need to use “B-team” or “devil’s advocate” tactics to make sure that consultative processes designed to discover options do not end up burying them instead.

Put simply, there is a difference between a seat-of-the-pants style of reasoning by analogy and a more deliberate one informed by the canon of social science. Not that social science is perfect, or that it offers an infallible formula for getting this sort of thing right— not at all. Nevertheless, the scientific attitude toward the subject works for policy analysis even if the scientific method, strictly speaking, cannot. That attitude consists of six elements that, taken together, form a sort of ethic of investigation.

  • First, distinguish facts from non-facts. (Note: Most statistics are non-facts.)
  • Second, evaluate information without bias toward the source of the information.
  • Third, credit the achievements of others and share information and ideas with them.
  • Fourth, strictly follow the standard rules of evidence to ward off skewed conclusions.
  • Fifth, distinguish correlation from causality (see the sketch after this list).
  • Sixth, bear in mind the critical role of context when examining discrete phenomena.
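
Of these six, the fifth is perhaps the easiest to make concrete. The following sketch is mine, not the author’s: a few lines of Python (using NumPy, with entirely hypothetical variables) in which a hidden common cause z drives both x and y, manufacturing a strong correlation between two quantities that have no causal influence on each other at all.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # Hypothetical data: a hidden confounder z drives both x and y.
    z = rng.normal(size=n)
    x = z + rng.normal(scale=0.5, size=n)  # x depends on z, never on y
    y = z + rng.normal(scale=0.5, size=n)  # y depends on z, never on x

    # The raw correlation is strong even though neither causes the other.
    print(f"corr(x, y) = {np.corrcoef(x, y)[0, 1]:.2f}")  # roughly 0.8

    # Regressing the confounder out of both variables removes the effect.
    x_resid = x - np.polyval(np.polyfit(z, x, 1), z)
    y_resid = y - np.polyval(np.polyfit(z, y, 1), z)
    print(f"partial corr = {np.corrcoef(x_resid, y_resid)[0, 1]:.2f}")  # roughly 0.0

An analyst who stopped at the first number would infer a relationship between x and y that does not exist; the user of historical analogues who mistakes co-occurrence for cause falls into the same trap.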

These six elements suggest, among other things, that a liberal arts education— the sort of education that future senior policymakers in this culture are likely to have— should include training in natural science. The point is not so much to master the contents of biology or chemistry as to impart a solid grounding in knowing and applying rules of evidence to empirical problems.

Following these six elements of an ethic of investigation will not guarantee good results every time. But it will improve one’s score, and ignoring them will almost certainly bring misfortune. There are no guarantees in life, save those proverbial ones having to do with death and taxes. But there are better and worse ways of making one’s way, and that includes making one’s way with the lessons of history.