Can Americans Count to Three?: The Anglo-Protestant Basis of U.S. Foreign Policy

  • Adam Garfinkle
  • February 13, 2018
  • Center for the Study of America and the West

An earlier and slightly different version of this essay was published in Orbis, Winter 2018, under the title “The Anglo-Protestant Basis of U.S. Foreign Policy.”

Many general templates have been advanced over the years to describe the core nature of U.S. foreign policy. The standard realism-versus-idealism schema found its finest form in Robert Osgood’s 1953 book Ideals and Self-Interest in America’s Foreign Relations. More recently, moving from two to two-times-two options, we have the four-“schools” approach of Walter Russell Mead in Special Providence (2001). Other offerings, too, speckled through time, are useful for their parsimony to students set to pondering a complex subject. But all the better-known schemas overlook or too heavily discount a central point of a “fish is the last to discover water” type.

Starting about three decades ago, a different synoptic sketch for thinking about U.S. foreign policy occurred to me, and it worked as well as any simple schema can, at least as a post-19th century framework. But I kept it to myself until, years later, a scholar published a roughly similar argument, freeing me to express my own long-held hunch.[1] A few others proffering similar arguments have added their voices, making for what amounts to a small “school” of thought.[2]

So what is this school’s basic template? It is that, denials to the contrary notwithstanding, the United States does too have an ideology that issues from a distinctive strategic culture. That strategic culture is essentially a secularized manqué of Anglo-Protestantism, leavened with certain key Enlightenment principles that themselves derive partly from the Abrahamic moral tradition, and of course partly from Hellenistic thought as transmuted via Rome. The ideology derived from it asserts democratic government and market capitalism, linked to the point of necessary mutual reinforcement, to be valid best-practice principles everywhere, and principles with definitive positive implications for global security. Unaware of its particularist origins in religious culture, most Americans since at least the dawn of the 20th century have believed that this secularized ideology is universally applicable and self-evidently superior to all others. Both the ideology and the fact that we rarely recognize it for what it is have gotten us into much trouble. So long as the ideology endures in its unselfconscious form, so will the trouble.

This contention is meant as no mere metaphor; it is offered up as a proposition suitable to a sustained argument. Six brief exemplary analyses follow, constituting the main burden of this essay, to illustrate that argument. I think this template explains the essence of these episodes, taken together, better than the alternatives on offer.

But before we come to examples, we need to elaborate the basic template, since it is little known for an obvious “fishy” reason: American Christians, still the majority of the American population and the founders of American society and political culture, think we separate church and state. So to most who gloss along on such things it is ipso facto impossible that the former could have anything fundamental to do with the latter; to suggest otherwise smacks of civil-religious heresy. Yet it does.

The Basic Idea

American political culture as a whole, not just its foreign policy culture, is a secularized, Scottish Enlightenment-accented form of Anglo-Protestantism.[3] As such, as an underlying orientation to political life both domestic and foreign, its core predicates are anti-hierarchical, anti-traditionalist, and anti-communalist compared to its Catholic and late-medieval forebears. It is, conversely, highly individualistic and egalitarian. As the world’s first mass, deep-literacy-dependent religious expression, it is also resolutely scripturalist, hence contract-oriented, in both politics and commerce.

Anglo-Protestantism in America is of course not monolithic; the characteristic, if unselfconscious, American way of thinking about the world bears nontrivial inner divisions. The Puritans were Calvinists and so, later, were Presbyterians and Congregationalists; but many British and other early European migrants to North America were Anglicans/Episcopalians and Lutherans, and in due course also Quakers, Baptists, Methodists, and others. Among the highborn American elite by the time of the Revolution, some leaned the Dutch Reformed/Calvinist variant of Protestantism into the mix of worldly affairs while others leaned different variants into it. They understood well their mutual and fundamental antipathy in what was still very much a religious age; 17th-century American colonial history reminds us, for example, both that Anglicans ran evangelizing Puritans out of the Old Dominion at the point of pikes and swords, and that Boston features two burial grounds only a few hundred yards apart, both established in or around 1630, for a reason.

Indeed, it seems that at least some of the major cleavages in American politics and foreign policy over the years have reflected latter-day extensions of the main factions and lesser shards of the English Civil War. That war was in essence not a war of the Reformation but a war within it, pitting Enlightenment-friendly modernists (Anglicans et al.) against “boomeranging” anti-modernists (Dutch Reformed et al.) who, though enchanted by Enlightenment rationality, recoiled from nascent modernity’s insistence on individual agency, secular political space (recall how the Massachusetts Bay Colony was governed), and a linear teleology of this-worldly progress. They did so by cooking up a radical form of determinism, “double predestination,” a broth of much older fare.[4]

In any event, the Dutch Reformed theological variant decompressed over a few generations and, as Max Weber famously explained, came to yield unanticipated behaviors such that all variants of Protestantism in America gradually fused in their unwitting masquerade of secular viewpoints in the public sphere. Over time the religious energies dissipated, migrating fully into politics, such that while the content and vocabulary of American passions changed, the underlying cognitive syntax did not.[5] Appendix I provides a thumbnail concordance of religious and secular vocabulary.

In sum, American society moved from conscious efforts to apply particular religious views to political and social realities to the “social gospel” this-worlding of religious values, and then to the values taking on a life of their own in the absence of consciously religious thinking. So to argue for the preeminence of the Protestant-Enlightenment template in the formation of U.S. foreign policy is not to stake a claim about theology; it is not about the specific content of thinking so much as the style, or more accurately the syntax, of the thinking that matters: how evaluative thoughts are formed; how rules of evidence are shaped amid deductive and inductive possibilities; and perhaps most important of all, the way questions are generated according to whether and which variables are assumed to be dichotomous, cardinal, or ordinal. Under the right conditions that syntax can attach itself as a means of interpretation and guidance to nearly any set of problems.

For example, consider Woodrow Wilson (a Presbyterian) at Versailles. One sees, in this case, the Calvinist backdrop brought to U.S. foreign policy life, as Kurth put it, in Wilson’s “relentless opposition to the Habsburg Monarchy (the very embodiment of hierarchy and community, tradition and custom, and the only Roman Catholic great power) in the name of self-determination, which was an individualist . . . conception . . . ; and in his insistence upon the abstraction of collective security, as written down in the Covenant of the League of Nations.” All of this, with its dichotomous formulation of options and its deductive bias, seemed matter-of-factly normal and obvious to Wilson and to most Americans at the time, but did so only “to a people growing up in a culture shaped at its origins by Protestantism, rather than by some other religion.”[6]

Rest assured, the matter did not end with Wilson or World War I. The six examples that follow are all of post-World War II provenance, yet they differ substantively: understanding the USSR; controlling nuclear weapons; modernization theory; economic development praxis; the internationalization of consumerism; and democracy promotion.

Example I: The Problem with the USSR 

In Cold War days a debate ensued as to whether the problem we in the West had with the Soviet Union had mainly to do with the fact that it was Communist or the fact that it was Russian. Did the West face an ideologically driven adversary or a culture-driven one? Was the problem a discrete set of fairly modern ideas in the heads of the Soviet leadership, or some Russian thing deeper, older, more diffuse, and less formally articulated?

Most questions about complex issues that feature this sort of dichotomous arrangement are poor questions. It’s rarely one or the other, but usually some confluence of several factors. Nevertheless, the Communist explanation took strong pride of place over the Russian explanation in early Cold War years. The Communists believed certain things that were wrong, evil, contrary to human nature as divinely endowed. They vaulted communal interests over individual liberty; they concentrated power in a dictatorship rather than separated it; and they took a flexibly transactional approach to truth. They also had written texts, an ideology, that propounded these errors and evils; and they were armed, dangerous, and, above all—rather like the Devil—devious.

From an American point of view, therefore, the USSR was nearly the perfect enemy as a wrong-believing heathen power, and worse for being an apostate from Christian Europe. This para-theological approach helps to explain why otherwise intelligent people missed ample evidence of polycentric Communism displayed right before their eyes. Richard Nixon and Henry Kissinger are still thought of as genius innovators for understanding that Communist China could be leveraged against Communist Russia, but the more relevant observation is summed up in a question: What took us so long to work this out?

The question is pertinent because of arguably the worst mistake of U.S. Cold War policy: the Vietnam War. There were realist strategic arguments for taking a stand in Vietnam, but the main motive behind the ill-fated commitment to preserve a non-communist government in Saigon was the premise that North Vietnam was part of a monolithic international communist movement, and that the fall of South Vietnam, like the threatened fall of South Korea before it, would spell disaster ultimately on a global scale. That was not just how the war was “sold” by the political elite; it’s what most policymakers, senior and otherwise, really believed.

The ideological approach to analyzing the Soviets reflected Americans’ own affection for some of the worse features of ideological thinking. Not least of these was the two-valued orientation, as S.I. Hayakawa had famously put it in Language in Thought and Action back in 1949. The old cliché that it takes one to know one was, in this case anyway, a source of woeful delusion. We presumed the Soviets ideology-besotted, while we merely knew self-evident truths—just a bunch of simple, straightforward “Merkins,” as Lyndon Johnson used to put it. The idea that both sides were products of mostly discrete protracted historical-cultural developments seemed not to occur to us.

Those few who were expert in Russian history and culture tended to the other side of the debate, and some did not fail to notice the irony of an American ideological pot calling the Soviet kettle a Communist shade of black. Stalin’s Georgian origins aside, the USSR was to them a Russian Empire in socialist disguise. They saw more continuity than discontinuity in Moscow’s behavior over time, and now that the Soviet Union is Russia again, the culture-based view clearly looks in retrospect to have been far more correct than not. One striking example of continuity from pre-Soviet times into Soviet times and back out again is the world-class Russian penchant for official lying.[7] Other examples, such as the instinct of officials to steal from the state—which bears a certain kind of logic in a situation where first the czar and then the proletariat party claimed to own everything—are not scarce.

But did Communism play no significant role in how Soviet leaders thought and acted? The question breaks down into two others: Where Russian culture and Communist ideology dovetailed and reinforced each other, as seems often to have been the case, what vocabulary prevailed?[8] And when Russian culture and Communist ideology pointed in different directions, how was the dissonance resolved into a Soviet/Russian “operational code,” to borrow Nathan Leites’s famous terminology? I offer no answers to these questions. I know but little Russian, and the fact that all four of my grandparents were born in the Russian Empire does not compensate. Besides, my interest is not in the Russian part of the story, but in the American part.

In retrospect, the mistaken if widely shared American view that it was Communism alone or overwhelmingly that made the USSR what it was as an adversary flowed from Americans’ own penchant to see the world through a highly abstract Manichean lens. Consider that in 1956 Secretary of State John Foster Dulles famously asserted that “neutrality has increasingly become obsolete and, except under very exceptional circumstances, it is an immoral and shortsighted conception.” In other words, it’s my way or the highway between the “free world” and the “communist world,” and there could be no legitimate third or fourth world in between. That is exactly the kind of two-valued, zero-sum thinking that underlay the premise behind the Vietnam War.

The two-valued orientation remains with us. On September 21, 2001, at another moment of American “exceptionalist” arousal, President George W. Bush said, “Every nation, in every region, now has a decision to make. Either you are with us, or you are with the terrorists.” This was a fully ecumenical sentiment: a week before, then-Senator Hillary Clinton had said, “Every nation has to either be with us, or against us. Those who harbor terrorists, or who finance them, are going to pay a price.” And just a few months ago, on May 16, 2017, UN Ambassador Nikki Haley told the other nations of the world gathered at Turtle Bay, “You either support North Korea, or you support us. You have to choose, you have to pick a side.”

This is how Americans tend to talk—before the Cold War, during the Cold War, and since the Cold War—because this is how most Americans tend to think, educated ones as well as less-educated ones. The American two-valued orientation as applied to thinking about world affairs is not a class marker. It appears to be culturally embedded across classes, the observable exceptions proving the proverbial rule.

Example II: Nuclear Strategy and Arms Control 

In the decade or two after nuclear weapons were invented, Americans set about understanding how they—and their evolving systems of delivery—affected military strategy and geopolitics writ large. The effort created something between a cottage industry and a guild in the United States, leaving in its wake hundreds of books, thousands of essays, and no shortage of classified and unclassified reports.

Despite manifold disagreements and varied perspectives, the overwhelming bias in the early period of this effort was what it is fair to call positivist-universalist in character. One can see this bias vividly in the development of game theory, a mathematically based approach to the logic of strategic interaction. The basic idea was that the weapons and their delivery systems pronounced their own impact according to a universally applicable logic. Cold metallic facts drove their own implications such that human decision-makers in different cultures would understand the basic characteristics of these weapons systems in the same way, and so reach the same basic conclusions about their meaning and utility. Americans, Soviets, British, French, Chinese, and others would therefore be able to read the meaning of others’ words and deeds without much muss and fuss. In other words, there was only one truth about nuclear weapons, and a person either had it or didn’t have it.

It was on the basis of this core epistemological assumption that Robert McNamara, President Kennedy’s Secretary of Defense, and others developed an initial game-theoretical approach to arms control. The central concept, before the age of counterforce complicated things, was that nuclear weapons could not be used as an adjunct to a political or ideological ambition because the level of destruction they would cause would overwhelm any rational actor’s calculation of costs and benefits in favor of non-use. This became known as the condition of mutual assured destruction, which then evolved from condition to doctrine in the American mind—and the word is used here advisedly.
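
To make that game-theoretic logic concrete, here is a minimal sketch of the assured-destruction claim as a two-player game. The payoff numbers are purely illustrative assumptions, not anything drawn from McNamara’s analysis: retaliation is treated as certain, so a first strike destroys the striker as surely as the struck.

```python
# Illustrative sketch of the "assured destruction" premise as a 2x2 game.
# All payoff numbers are stylized assumptions chosen for illustration.

STRATEGIES = ("strike first", "refrain")

# PAYOFFS[(row_choice, col_choice)] = (row player's payoff, column player's payoff)
PAYOFFS = {
    ("strike first", "strike first"): (-100, -100),
    ("strike first", "refrain"):      (-100, -100),  # retaliation is assured
    ("refrain", "strike first"):      (-100, -100),
    ("refrain", "refrain"):           (0, 0),        # stable mutual non-use
}

def best_responses(player: int) -> dict:
    """For each choice the opponent might make, list this player's
    payoff-maximizing replies."""
    replies = {}
    for other in STRATEGIES:
        def payoff(mine):
            cell = (mine, other) if player == 0 else (other, mine)
            return PAYOFFS[cell][player]
        best = max(payoff(s) for s in STRATEGIES)
        replies[other] = [s for s in STRATEGIES if payoff(s) == best]
    return replies

for player in (0, 1):
    print(f"player {player} best responses: {best_responses(player)}")

# Against a refraining opponent, only "refrain" maximizes payoff; against a
# striker, nothing helps. "Refrain" is weakly dominant, so mutual non-use is
# the predicted outcome -- provided every decision-maker reads the payoffs
# the way the modeler does, which is exactly the assumption at issue here.
```

On these assumed payoffs the model predicts restraint on all sides; the Soviet buildup past parity, discussed below, was in effect evidence that Moscow was not playing this game with these payoffs.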

It followed in McNamara’s thinking, displayed in a book titled The Essence of Security, that a plateau of mutual sufficiency would eventually be reached in the buildup of nuclear forces, after which there would be no logic to building more. It would then be possible for the U.S. and Soviet governments to negotiate a condition of stable parity that would reinforce the supposedly universally understood predicate of mutual assured destruction. That condition of stability precluded robust efforts at missile defense, for anything that threw assured destruction into doubt was deemed destabilizing.

This calculation was the basis of U.S. arms control thinking well into the 1970s, and explains parsimoniously the outcome of the SALT I negotiations: an effort to establish parity in offensive forces and to proscribe defenses through the ABM Treaty. But two problems soon arose. The first was technological: the advent of counterforce complicated the assumptions of stable assured destruction. But the second was more daunting: Soviet behavior did not conform to the pattern of behavior predicted by the theory. Specifically, when the Soviet arsenal reached rough parity with the U.S. arsenal, the Soviets did not acknowledge any plateau, but kept building.

The high priests of arms control doctrine were thus confronted with a challenge. Most responded by doubling down on the theory, explaining away observed Soviet behavior with increasingly ornate excuses. They resembled panicked Ptolemaic astronomers trying to explain away Copernicus.

Not everyone bought into the doctrine. As noted above, Nathan Leites at the RAND Corporation suggested as early as 1951 that different historical experiences, languages, and religious views might lead some people to see the same facts—in this case facts about nuclear weapons systems—differently from others. The Kremlin’s understanding was its “operational code.” Years later, Jack Snyder at Columbia University followed Leites by coining the term “strategic culture,” which he defined as the “sum total of ideals, conditional emotional responses, and patterns of habitual behavior that members of the national strategic community have acquired through instruction or imitation and share with each other with regard to . . . strategy.”[9] Snyder argued that individuals within the USSR were socialized into a specific mode of thinking, the persistence of which qualified as a manifestation of a “culture” rather than mere policy. And later still Snyder’s Columbia colleague Robert Legvold published an influential essay at the very cusp of the 1979 SALT II debate in which he showed that U.S. and Soviet strategic doctrines were not the same and laid out how those differences affected analysis of the SALT II Treaty’s terms.[10]

In other words, contrary to early assumptions, nuclear weapons did not pronounce their own single uncluttered meaning; the range of meanings was culture-bound and multiple, not monadic and universal. Many Americans had a hard time with this idea and persisted in believing in the original formulation long after any evidence supported doing so. Indeed, many insisted even decades later that if the United States could deter the Soviet Union during the Cold War, then it did not matter if Iraq or Iran or North Korea got nuclear weapons because it could easily deter use of their much smaller arsenals, as well.

This superimposition of Cold War conditions onto circumstances and cultures radically different from those of the U.S.-Soviet competition during the Cold War illustrates the bias of the two-valued orientation and the penchant to seize upon abstract one-size-fits-all solutions to complex problems.[11] It vividly manifests the Manichean orientation: either deterrence exists or it doesn’t, and there is no third possibility. If the weapons are so destructive as to make their use irrational, then, since political leaders are rational, cultural differences and details of context are irrelevant and deterrence will exist.

Example III: Modernization Theory

When it comes to cottage industries and guild-like academic/policy science niches, modernization theory generated nearly as much ink as did strategic weapons theory in the 1950s and 1960s. The reason is straightforward: The global swath of the Cold War meant that newly independent countries squeezed from the dilapidated wombs of the European colonial empires were “in play.” The reigning theory was that communism thrived on discontent caused by poverty and exploitation, compounded by the executive/administrative weakness of most new formally sovereign entities. It should therefore be U.S. policy to promote the modernization of traditional societies newly emerged into national freedom in order to protect them from communist subversion. An example of such thinking resides in a 1962 book by President Kennedy’s Secretary of State, Dean Rusk, titled Winds of Freedom.

The lion’s share of policy concern was economic development in order to overcome poverty, but modernization was understood even then to be about more than economics. Societies and their polities had to become modern too, or at any rate were likely to be swept along toward modernity whether they liked it or not. The often-unstated assumption of the literature was that there was but one path to modernization out of traditional society. Some societies were further advanced along this road while other societies were further behind, but there was only one path: That is what the supposed laws of history or social science, according to one’s tastes, ordained.

There was nothing particularly American or Anglo about this assumption; nor was there anything new in it. In Würzburg, Franconia, in the 1744 Residenz of the Schönborn prince-bishops, there is a magnificent ceiling fresco by Giovanni Battista Tiepolo called “Allegory of the Planets and Continents,” completed in 1752. In the painting all the cultures of the earth are looking toward—and seem to be making their way to—the epicenter of culture and refinement, which just happens to be in Würzburg. The cultures are ranked from primitive to refined by their distance from the epicenter, such that Tiepolo’s rendition of Native Americans, for example, is pretty far down the wall. But the assumption painted into the whole is that they, like every other culture, are on the road to Franconia, and one day, if they keep the ideal of the Schönborn prince-bishops in mind, they too will get there—allegorically speaking, of course. Everyone would be Catholic, yes; but in the material splendor of the Residenz one could be forgiven for presuming that an afterthought. These were worldly men and women.

The point is not just about Western conceit or materialism, but about the presumably singular road to this-worldly refinement. That same presumption became formalized in the 19th century under the Three Age System theory, from whence it passed into the 20th-century modernization literature. A good broad example, focused on the Middle East, is Daniel Lerner’s influential 1958 book The Passing of Traditional Society.

It was assumed, too, that as modernization proceeded, superstition—also known to the divines of Western secular modernity as traditional forms of religion—would melt away before the power of scientific rationality, its attendant depredations and excuses for inequality and exploitation with it. Religious institutions would become privatized as in the West, and benign secularity would reign supreme. Insofar as it persisted, religion would become more “sophisticated,” turned essentially into a form of communal therapy in which the old theology became a respected if sometimes embarrassing vestige of times past. In other words, everyone would have his own culturally specific version of weekend Unitarianism, in which what used to be ritual was transformed into mere ceremony. This general thesis of what it meant to become modern adorns in a minor key Peter Berger’s justly famous 1967 book The Sacred Canopy.

As with strategic nuclear doctrine, the main problem with modernization theory (and its corollary about religion) was that reality did not keep pace. It turned out that modernization was neither ineluctable nor, where it seemed to be occurring, uniform. It turned out that achieving the Weberian characteristics of a modern state—a prerequisite for Western-style democracy and complex market capitalism—came easier to some cultures than to others. Japan figured it out in the 19th century and some heretofore traditional societies on Europe’s fringe did, too—Finland is a good example. But much of the world remained within traditional patrimonial social structures. These structures were, and still are, not just different in degree on some stretched out unitary timeline; they are different in kind. Trying to export modern forms of democracy and market capitalism to tribal-patrimonial societies is a little like trying to attach a standard bicycle rack to a horse. You can perhaps tie it up there with effort, but it won’t work very well and it won’t stay there for long.

In due course, even modernization theory’s analysis of religion fell to pieces. Among other scholars, Ernst Gellner showed how in many Muslim societies upward mobility (associated with greater literacy, urbanization, education, and labor specialization) correlated with greater piety and stricter levels of observance, not the reverse.[12] He and others analyzed this phenomenon under the rubric of neo-fundamentalism, which showed that the sociological location of religious institutions within cultures varies. Martin Marty at the University of Chicago created the Fundamentalism Project to record the varieties on a global scale, and wiser observers, including Peter Berger, changed their minds to more carefully re-evaluate the range of linkages between modernization and religion.

But so powerful were the initial unilinear/universalist assumptions of modernization theory that, as with strategic nuclear theory, some people woke up very late to the collapse of their doctrine. I recall Madeleine Albright intoning with surprise sometime in the mid-1990s that religion was still a powerful force in political life in much of the world. This was at least a dozen years after Bernard Lewis had proclaimed “The Return of Islam,” after religious nationalism in South Asia and elsewhere had already become an obvious fact of life, and after libraries already bulged with new analyses of these and other cases. I was embarrassed for her, catching as she was the tailwind of the zeitgeist many years after its headwinds had passed through; and she was hardly the only one.

Example IV: Economic Development Praxis

It is possible to see economic development theory as part and parcel of modernization doctrine, or to see it separately. How one parses this typological question presupposes a series of views on issues that were debated widely from the early 1960s into at least the 1980s. One such debate displayed the character of a standard chicken-and-egg conundrum about sequencing: Does economic development need to precede political modernization, or does political modernization need to precede sustainable economic development?

This and other disagreements notwithstanding, an overarching if tacit consensus held that, as Walter A. McDougall has put it, “the Third World was clay, that Americans were master potters, and that metrics could measure success.”[13] This consensus was of a piece with the positivist/hyper-rational and culture-blind social engineering mentality that characterized American domestic anti-poverty programs of the same era.

As to the sequencing debate, Samuel Huntington settled the matter intellectually in favor of the primacy of political institutions.[14] Except that he didn’t settle it in terms of praxis, because even then everyone knew that political modernization was hard and slow while economic development—as amazing as it sounds to say it today—was thought to be much simpler and quicker to achieve. The development people at the U.S. Agency for International Development, the World Bank, and prestigious non-governmental institutions (for prime examples, the Ford and Rockefeller Foundations) conceived the route to economic development as a technical exercise in which the more and higher the quality of the uniform inputs, the greater and quicker would be the desired and predictable outputs. An appropriate metaphor is that of a vending machine. Different machines might offer different products, but they were all still vending machines and one could build and repair all of them basically the same way.

Again the literature is instructive. An ur-source for this way of thinking is the work of yet another Kennedy Administration “best and brightest”: Walt Whitman Rostow. His book The Stages of Economic Growth posited a method to achieve “take-off” that applied to all countries no matter their divergent histories, cultures, or factor-endowment blessings. It was based on abstract economic theory and helped form the thinking behind the 1961 birth of the U.S. Agency for International Development—USAID.

Any economy could grow, no matter its specific political arrangements, so long as those arrangements allowed for certain policy techniques to be applied:  import substitution; technical education; and especially technology transfer. The idea was that if you took a machine that worked wonders for productivity in a Western economy and shipped it to, say, India or Sierra Leone, it would—provided there were people who knew how to operate and maintain the machine and a market for what it produced—be as productive there as it was where it was invented. Again, the main presumption here was the universal applicability of technique based on abstract theory. The added assumption, true to the developing behemoth of macroeconomic orthodoxy, was that all human beings are interchangeable and that all can be presumed to be rational actors set on maximizing value.

Of course this did not work, as a few observers such as Peter Bauer and later Albert O. Hirschman warned it would not. Before long, some economists began to understand the reasons. Technology embodied the capital/labor ratio of the society that produced it, so if you simply shipped a machine to a place with a different capital/labor ratio, and other different factor endowments, it would not produce comparable net economic results. The result of technology transfer to India, mainly in agricultural techniques, proved a case in point. Machinery developed in a place with a lot of capital relative to labor, sent to a place where capital was scarce but labor plentiful, produced crops only at the cost of driving huge numbers of people off the land before a nascent industrial economy could assimilate them. The result was sprawling slums like those of Calcutta.[15]
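
A back-of-the-envelope sketch can make the capital/labor point concrete. Every number below is invented for illustration—none comes from the Indian case: the same machine that pays for itself where credit is cheap and wages are high is a net loss where credit is dear and wages are low.

```python
# Illustrative arithmetic (all figures invented) of why a machine embodies
# the capital/labor ratio of the economy that designed it.

def annual_net_saving(machine_price, interest_rate, workers_displaced, wage):
    """Yearly labor cost saved minus the yearly cost of tying up the capital."""
    capital_cost = machine_price * interest_rate
    labor_saved = workers_displaced * wage
    return labor_saved - capital_cost

HARVESTER = 200_000  # hypothetical purchase price; displaces 50 field workers

# Capital-rich, labor-scarce economy: cheap credit, high wages.
rich = annual_net_saving(HARVESTER, interest_rate=0.05,
                         workers_displaced=50, wage=3_000)
print(f"capital-rich economy:   {rich:+,}")  # +140,000: the machine pays

# Capital-scarce, labor-abundant economy: dear credit, low wages.
poor = annual_net_saving(HARVESTER, interest_rate=0.20,
                         workers_displaced=50, wage=300)
print(f"capital-scarce economy: {poor:+,}")  # -25,000: a net loss, plus 50
                                             # workers driven off the land
```

And even this arithmetic understates the damage, since it prices displaced workers at their wage and says nothing about where they go once the land no longer needs them.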

Cultural differences mattered, too, it turned out. So in one West African country a USAID project designed to bring water up the hill from a river to a village using electrical pumps failed not so much because reliable supplies of electricity could not be maintained, but because the pumps displaced the lines of women who traditionally passed the water, bucket by bucket, up the hill each morning. That line of women served an important social communications function for the village and without it things went a little haywire. So the men and the women together took the pumps out of service, turned them upside down and planted flowers in them. When all this was explained to puzzled USAID personnel who had returned to review the project’s progress, they were predictably nonplussed. These were not “ugly” Americans, merely clueless ones of the generic variety who had yet to understand that one size does not fit all when it comes to the kaleidoscopic intersection of technology, social structure, and culture.

Example V: Consumerism as an Export

Americans often display the personality of second- or third-generation lapsed Calvinists. Without being explicit or even self-conscious about it, they tend to conceive of societal virtue in material terms: a “good” society is a prosperous society, and a prosperous society is evidence that the people who created it are “good.” Affluence is seen not as a means to a higher end, but as a sign of extant social virtue.

This belief helps to explain yet another cottage industry, that being the avalanche of books and essays in the Anglo-American world designed to show that market capitalism is both compatible with and structurally encourages egalitarian democratic politics, and that egalitarian democracies repay the favor by conducing to affluence-generating free markets. There are sophisticated versions of this thesis from John Locke and Adam Smith all the way to Michael Novak and other contemporaries. And there are simpler faith-based versions that manage to overlook the fact that some wealthy countries are not true electoral democracies (e.g., Singapore) and some electoral democracies are not wealthy countries (e.g., Kenya).

The point for our purposes is that here we have, yet again, a secular faith-based universalist one-size-fits-all abstract theory about the relationships that make up the discipline that used to be properly called political economy. And the theory is wanting when faced with reality because its abstract premises cannot be replicated in the real world.

There are several reasons why pure egalitarian democracies cannot exist within mass-society national entities, and why all tend, some a little and some a lot, toward oligarchy under conditions of modern industrial economics.[16] To the extent they tend toward oligarchy, economies will produce varying degrees of structural inequality even if there were no other “natural” reasons for it (but of course there are other reasons: Adams’s and Jefferson’s famous agreement on a “natural aristocracy of talent and virtue” points to them). And inequality creates tensions within formal democratic politics, at least when “factions,” as Madison called them in The Federalist, form around class interests. So oligarchical, class-inflected politics will invariably distort markets, making them less than wholly free even if there were no other reasons for it (but of course there are other reasons: for example, the non-transparency of much economically relevant information and the need for major infrastructural utilities to be public or public-private monopolies).

The result is that democratic publics must struggle to maintain democracy and to keep markets maximally free against those interests that would bend them to their own benefit. Economic inequality will always produce political inequality—which often produces reified structural economic inequality, as with aristocracies—unless institutions are devised and properly led to limit it. The tendency of wide-open systems to plutocracy and to mild and less-mild forms of political corruption is a given, for the only way political authority can enforce egalitarianism is for that authority to be crushingly authoritarian—which only opens the way for the “populist” authoritarian elite to plunder the commons instead. This is why the weight of Greco-Roman thinking on this point held that real tyranny comes about only as the result of a deteriorated democracy.

The relationship between democratic politics and market capitalism is thus one of everlasting tension. It is neither automatic by design nor homeostatic by function as depicted in the doctrinal versions of the democratic capitalist para-Calvinist faith. The relationship between the two is neither perfectly compatible nor perfectly incompatible; it is something in between that is worth the fight to preserve for the simple reason that, whatever the complications, it works better than any known and available alternative. But the Manichean mind, whether turned toward status quo conservatism or toward revolution, resists resting content with this in-between reality.

So what happened when the multiple-small “shopkeeper” economic system idealized by economic theorists morphed during the forty years after World War I into the juggernaut of industrial-scale consumerism lorded over by ever-larger corporations and “big” labor? Americans told themselves a tale of hard work, virtue, and divine election in the evolving corpus of the American civil religion, and as usual assumed that what they were thrilled with everyone else in the world would be thrilled with, too. That was because there was but one universal measure of social wellbeing, and we possessed the “good news” of what it was to share with others (as long as they were not communists).

It is easy to illustrate the maturation of this conviction, especially in the post-World War II period, admixed as it was with the tuning of the American civil religion to the frequency of the struggle against communism. McDougall points out that the newly strengthened neo-Calvinist dispensation (he does not call it exactly that) did not go uncriticized. The headiness of American materialist-boosted triumphalism of those days led Daniel Boorstin to write in 1962 that whenever the gods wish to punish us they make us believe our own advertising. Everywhere Americans looked in their hall of mirrors they saw distorted reflections of themselves, leading McDougall to conclude that Americans “believed the future must inevitably be shaped by the three things they assumed all people wanted—freedom, science, and stuff.”[17]

Will Herberg, too, in his 1955 book Protestant, Catholic, Jew, argued that the identification of religion with national purpose engendered a messianic need to, in Herberg’s words, “bring the American Way of Life, compounded almost equally of democracy and free enterprise, to every corner of the globe.”[18] Americans did again what they always do when they believe they have in hand a self-evident truth: They evangelized their presumably universally valid Anglo-Protestant/Enlightenment-lite creed on an appropriately universal scale, nicely pairing the materialist doxy of the Enlightenment with the evangelical methodology of a still relatively new and vibrant religious culture: Protestantism.

As economic metaphors formed the vanguard of an American-led global Calvinist surge, many otherwise temperamentally conservative Americans lost their sense of tragedy. The ethos of the Eisenhower era was to deploy those Americans who understood what McDougall shrewdly calls the “mystical meaning of progress” in order to create a safer and better world. Conflict was supposed to surrender to cooperation, greed to discipline, coercion to self-government. But though conservative in some senses, this aspiration became a revolutionary and in time highly disruptive American export to the world.

The appeal of a process that could indeed conduce to global cooperation and peace was huge in the aftermath of two world wars. It is therefore no surprise that as the idea of a global corporate commonwealth came of age, the foundation stone of the American civil religion’s foreign policy eschatology took firmer shape in the form of democratic peace theory. The doyens of the faith identified a need, and the alchemists (a.k.a. political scientists) obliged. World peace would prevail once democracies prevailed globally, and to achieve that the high priests of modernization and economic development would create thriving middle-class societies whose modes of economic behavior—not to exclude consumerism—would provide the material ballast necessary for the transformation to the desired this-worldly messianic victory. Even the communist world would succumb eventually through the deus ex machina of “convergence.”

And then, amazingly, in the 1989-91 period the prophecy seemed suddenly to come true as American moral virtue and consumer-driven affluence walked hand in hand into earthly glory, flattening the Soviet bloc in all aspects save the one that ended up mattering least—military power. The Berlin Wall fell, the USSR disintegrated without a shot being fired, and the unipolar moment was upon us. Then, during the belle époque just around the corner, the Dow Jones index quadrupled, and the Federal budget came into balance—hallelujah! So the miracles of the first fifteen years after the end of World War II that sired a form of spiritualized triumphalism in Americans turned out to be a mere rehearsal for what happened in the 1990s.

The Clinton Administration imbibed the entire spiked punchbowl. With history ended and universal best practice now unarguably established, there was no need for strategy in the traditional sense. Now that the great streams of American power, ideals, and wisdom had joined together, all that was left was detailing work: the expansion of best practice out to global scale. What had worked in America would, again, work everywhere. This was the latter-day echo of what John Kennedy had said in his famous 1962 Yale commencement speech, where he proclaimed what amounted to the end of ideology and history both, averring that with scarcity conquered the work ahead would be mainly a technical exercise.

So in 1992 the U.S. government did what it could not yet do in 1962: It deployed economic shock-therapy experts to Russia without a thought to the historically shaped dearth of institutions and attitudes necessary to sustain turbo-capitalist ways. Capital was freed further to flow globally, and the Washington Consensus pry bar helped it to flow even where some locals had their doubts about the wisdom of it. The result was a series of financial crises around the world, not to exclude Russia. When that warning went unheeded and market fundamentalist dogma led Robert Rubin, Larry Summers, and other Democrats to push the pedal to the floor on U.S. banking deregulation in 1999 and 2000, the result in due course was the meltdown of 2007-08.

Meanwhile, the global fallout from the American corporate penetration of heretofore relatively economically and culturally sheltered societies was to further weaken a series of fragile states, not least in the Middle East, and open a demon’s cupboard of festering sectarian and ethnic identity-politics energies. It is an exaggeration to say that in such fashion America caused al-Qaeda and ISIS, but, however inadvertently, we did have something to do with it.

Remarkably, whenever Americans saddle up in maximal ideological gear—ideological meaning here secularized theological “civil religion” gear taken as matter-of-factly true—they somehow manage to persuade themselves that they are the least ideological folk on the planet. That illusion came home to roost toward the close of the Clinton era. We became “the indispensable power” and recklessly said so out loud, as true-believing evangelists are wont to do. U.S. foreign policy became a global therapy exercise, with Americans as the wise doctors and everyone else as either nurses or needy patients. It was “foreign policy as social work” in Michael Mandelbaum’s perfect locution, and marked another victory lap for the prophet Philip Rieff.[19]

None of the patients thrived, however, not even in the Balkans and less so in Somalia, Haiti, and, as ever, Israel/Palestine; but we were not daunted. All this self-enraptured arrogance carried over but mildly diminished into the first eight months of the George W. Bush period, promises of humility yet unredeemed, until, once again, reality intruded on September 11, 2001, and Americans struggled to figure out what had gone wrong.

We are struggling still. When everything looks peachy to us, ideological cleavages seem to disappear, at home and in the world. We interpret that to mean that everyone is singing from the same hymnal. When things turn scary, (someone else’s) ideology suddenly returns, or geopolitics “returns,” or something else returns that never actually went away in the first place. If people abroad hate and want to do nasty things to us, it must be because they are thinking the wrong way, so it falls to us to convert them into thinking the right way. It is very Protestant, after all, to think that intentions (right belief) are more important than and must prefigure outcomes (works).

As already suggested, that’s how we came upon, first, the monolithic world Communist threat and the birth of USIA, and more recently the scourge of extremist ideology against which we must, we are endlessly told, fight a “war of ideas.” These are, plainly put, calls to convert the heathens. That social circumstances and cultural dynamics many centuries in the making, which live well below airy abstractions in foreigners’ minds just as they do in ours, might have anything to do with our problems never seems to occur to us. Which brings us to our final example.

Example VI: Democracy Promotion

Democracy promotion in one form or another has long been part of U.S. foreign policy, going back to the 1820s’ support for Latin American independence from Spain—not that the success of independence sired much success for democracy there until many years later. As American power waxed, its leaders’ capacity for idealist indulgence waxed with it, boiling to a crowning froth with Woodrow Wilson at Vera Cruz and then Versailles. The same impulses grew anew with the opportunities anticipated from victory in World War II, after which the democracy promotion plank became part and parcel of “the diplomatic theology of containment,” in William Inboden’s apt phrase, pointing to the culmination of democratic peace theory beyond the success of containment.

With that success, geopolitics seemed to vanish and, as nature abhors a vacuum, the chalice of American foreign policy energies filled with the aforementioned Clintonian reign over the U.S.-led global corporate commonwealth. Democracy promotion, admixed with a mélange of human rights imperatives, rose in stature. The State Department’s sense of its purpose morphed accordingly from the tradition of supporting U.S. interests to the evangelical calling to transform (read: convert) other countries’ societies and political orders so that they could participate as fuller partners in the U.S.-led commonwealth.[20]

Then, with 9/11, the young George W. Bush Administration vaulted democracy promotion from one plank of U.S. foreign policy among others to the very means of salvation for the Republic in the face of a presumed new existential threat. This is not the place to rehearse again in detail the U.S. reaction to 9/11.[21]  Suffice it to say that both the left-of-center meliorist theory of the case (the wrong thinkers had to be deprived of converts through the mercies of charity) and the then-tenured right-of-center democracy-deficit theory of the case (the wrong thinkers had to be born again into the faith of freedom) missed the essence of the terrorist threat, which was, ironically enough, an explicitly religion-based mobilization of political energies designed to defeat those inside and outside Middle Eastern societies thought bent on destroying the corporate identity of the umma. Both American theories of the case amounted to highly abstract faith-based interpretations of what had happened, both of them tethered to the Cold War experience, and hence of what to do about it: a “Marshall Plan for the Middle East” versus a forced-march democracy promotion campaign.

President Bush’s para-religious view, complete with a one-time etymologically innocent use of the word “crusade,” that democracy (and free markets) is the natural default condition of all humanity came to be known, through the dark arts of speechwriting, as the “forward strategy of freedom.” Its presumption that merely removing artificial obstacles to liberal institutions can bring stable democracy into being fairly quickly—just as seemed to occur in Mitteleuropa after the Soviet Union collapsed—is without any support in history or social science. These views are, again plainly put, matters of faith inextricably bound up with the “exceptionalism” baked into the American Protestant founding.

The results of the “forward strategy” policy were as breathtakingly paradoxical as they were tragic. The more it bore down on the Middle East, with guns in Iraq and with the BMENA (Broader Middle East and North Africa) Initiative everywhere it could gain access, the more effectively the Islamist-inclined were able to repurpose Western energies jujitsu-like to gain leverage over their domestic adversaries. Besides, fairly rapid democratization, even had it been possible, would not have stabilized Arab societies and made them less likely to spark off political violence; as with rapid economic growth, it would have made such violence more likely, as anyone who understands Schumpeter’s term “creative destruction” knows—and as anyone who paid attention to the “progress” of the ludicrously misnamed “Arab Spring” without scales for eyelids saw. We are fortunate that the “forward strategy of freedom” did not “succeed” for any longer than it did.

The tragedies are obvious. The less obvious paradox lies in the fact that when the Bush Administration campaigned to spread democracy in the Middle East it never occurred to most of its principals that what they presumed to be a strictly secular endeavor would be interpreted in the Muslim world through a religious prism. When Abu Musab al-Zarqawi, the late leader of al-Qaeda in Iraq and the founding father of ISIS, tried to persuade Iraqis not to vote because “democracy” was a front tactic for Christian evangelism, a slippery slope leading to apostasy, he spoke a language that resonated in the ears of many Iraqis and other Muslim Arabs.

The locals were essentially correct about this. We Americans were speaking a creedal tongue we thought entirely separate from “religion”—a word that does not exist as such in Arabic—because we “separate church from state,” a principle about whose original meaning most contemporary Americans have little clue. In truth, American political culture is, as we have been at pains to establish, not as far from religion as most Americans think it is. We are a theotropic society, whether expressed in traditional or ersatz secular ways. But most Americans can’t see that any more than a fish can perceive water.

American longing to spread the democracy gospel to the Muslims is the 21st-century version of what was, in the 19th century, a more honest and self-aware missionary movement. We might persuade ourselves that our deepest beliefs can be compartmentalized into what is “political” and what is “religious,” and through such persuasion actually bring such compartmentalization about, at least to some extent and for a while. But Middle Easterners possess no such compartments by dint of a history sans Renaissance or Reformation, and not until recently have they discovered any reason to engage in that kind of self-persuasion. Not that theology and ideology are identical; but as essentially creedal systems they are bound to blur in cultures in which political theology, to use Mark Lilla’s apt term, has never been vanquished or, in most cases, even seriously challenged.[22] The ironic upshot is a demonic version of O. Henry’s “The Gift of the Magi,” where parallel but separate behaviors led not to serendipity and bliss, but to anger and violence. Alas, foreign policy conducted as though it were a passion play, posing cosmic good against cosmic evil, is bound to make dramatic (word carefully chosen) mistakes.

Can Americans Count to Three?

What all six of these examples have in common is clear. They all manifest one-size-fits-all faith-based universalist assumptions, assumptions that do not, because they must not, ever contradict or complicate each other. In a two-valued world there can be no incommensurate “goods.” (Isaiah Berlin would chortle were he still among us.)

This penchant for lazy universalism radically discounts the awareness and significance of cultural differences, to the point that American leaders rarely seem to feel any particular need to know all that much about a country before they invade it—from Vietnam to Iraq to Libya, and several other places in between. It is also why we hear presidents and other members of the American political class repeat endlessly the mantra that people all over the world are basically the same and want the same things for their children. This is a guaranteed feel-good line that makes tactical political sense in an ever more multicultural America, but as a statement about global realities it could hardly be more misleading.

All the examples also demonstrate the power of Americans’ belief in abstract assertions that have not been and cannot be empirically supported. We invariably start with abstract principles and work deductively downward, so that, to paraphrase Walter Lippmann from Public Opinion, we do not see and then define, but first define and then see. And how we define reposes in highly Manichean creedal systems that are, to repeat, anti-hierarchical, egalitarian, and scripturalist/contractual at their core. The unstated assumption is that if we get the basic truth right, the details will take care of themselves.[23]

We are deploying the two-valued orientation again now in the way we try to understand the problem of Middle Eastern-bred political violence, chalking it up to “extremist ideology” (read: wrong thinking). Hence the redoubling of calls for a “war of ideas” every time a new terrorist atrocity occurs.[24] This is a replay of the ideological perspective on why the Soviet Union was a menace to us. As was the case then, it is an erroneous perspective now. As before, too, a much deeper concatenation of cultural and sociological factors explains better what we wish to understand about the contemporary Muslim world. But developing that understanding takes time, work, intelligence, and perhaps most important, a tolerance for nuance and ambiguity that abrades against a Manichean mentality that credits only “right thinking” and “wrong thinking.”[25] It takes, in other words, an ability and willingness to count to three.

America is not the only political culture to exhibit a two-valued orientation. “My way or the highway” remarks have been expressed by non-American political figures as well, though they are rarer. After all, the two-valued orientation resides deep in primal human nature—as far as categorical distinctions go, it was mankind’s first pedagogical achievement to be able to distinguish one class of things from another. So at different levels within the same culture, layers of abstractions usually contain both two-valued and non-two-valued orientations, some used in some circumstances, others in different circumstances. Most adults acquire the knack of knowing when a two-valued template applies and when it does not.

Anglo-American culture, however, seems to more strongly embody the two-valued orientation than other Western/Caucasian cultures. To take an uncomfortable but non-trivial example, in antebellum and Jim Crow-era American law anyone with so much as a single drop of non-“white” blood was considered “colored.” That sensibility was inherited from the mother country, but was not prevalent on the Continent in France, Portugal, Italy, and other European countries, and it was law-embedded in none of them.

Neither are Americans alone in having antecedents in religious culture shape their contemporary political conceptions and foreign policy behaviors. The phenomenon is widespread because it has to be; cultural templates applicable to politics must come from somewhere if they do not simply fall out of the sky, and they only rarely come ready-made in the form of ideological manifestos. Were that not the case, domestic political and especially foreign policy behaviors would flow only from the rational deliberations of the moment, aided by whatever memory decision-making elites bring to bear. That would be like describing individual behavior without reference to any normal pre- or subconscious brain functioning.

Clearly, too, the characteristic American way of thinking about the nation’s relations with the world emerged over time, the overseas projection of its Anglo-Protestant/Enlightenment-infused idealism growing with American power and hence with its ambit of ambition and choice.[26] Wilsonianism, as it is commonly called, may have been latent all along, but it did not emerge in full until early in the last century.

Additionally, this characteristic syntax has mattered more when elites are focused on emotionally pitched international crises, when we are aroused by confusion, fear, or a sense of opportunity. In normal times, the professionalization of foreign policy expertise and the bureaucratic routinization of its functions insulate most lower-level decisions from the great swirling currents of American politics, so that the American “operational code” becomes muted in a sheath of quotidian responsibilities. But in unstuck times we quickly revert to our standard mental-syntactical form. The speed with which the George W. Bush Administration traveled from promised humility to projectile hubris is the clearest case in point we may ever have. The late Michael Kelly wins the prize for succinctness, having described American foreign policy in those early post-9/11 days as “secular evangelism, armed.”

America has been and remains different even within Christendom. It really is, as G.K. Chesterton said, a nation with the soul of a church. Can anyone imagine anything like the Left Behind series becoming a major bestseller in Western Europe? Can any nation match America for evangelical energy when it comes to spreading (presumed) international best-practice governance? Is any other foreign ministry saddled by its legislature with issuing “human rights,” “trafficking-in-persons,” and “religious freedom” reports each and every year—all of them essentially state-of-evangelism documents? And can any other nation match America for its passive-aggressive romance with apocalyptic end-of-the-world framings of national security issues—nuclear Armageddon, nuclear winter, popular versions of global warming, and more?

Indeed, might it be that the distended American fear of apocalyptically minded Islam derives not from its being so alien to American ways of thinking, but rather from its being so close? If we are the Sons of Light, someone has to be the Sons of Darkness. The Soviets were good at it, but the al-Qaeda/ISIS specter is much better. The Soviets were merely atheists in gray suits; al-Qaeda/ISIS is—yes, you guessed it—the robed Antichrist. ISIS even has a black flag!

What Now?

All that said, the characteristic American way of apprehending the world may now be changing qualitatively rather than merely evolving, as its civil religion has evolved over many decades heretofore. Hints repose in the fact that neither the previous nor the sitting president shares the exceptionalist narrative the way all their predecessors did, yet both got elected anyway—and from very different domains within the electorate. That could presage a new normal, which would suggest a massive if gradual failure of public myth maintenance, to invoke the late William McNeill’s language, over the past half century. But it is too soon to say.

If the dousing of traditional American Protestant/Enlightenment-based ideology in foreign policy is truly upon us, it remains to be seen whether that turns out on balance to be a good thing. That will depend on which creedal anchors eventually replace it, because some such anchors must be present. For the time being, it may be that the syntactical residua of the anti-modernist Dutch Reformed inheritance are replacing the Anglican variety of Anglo-Protestantism, in other words, that the basic predicates of modernity—individual agency, secularity in politics and the arts, and the idea of worldly progress—are all in decline.

We will know eventually if U.S. foreign policy behavior begins to sound and act in a manner consistently different from the past. But what it will then sound and act like is anyone’s guess.


[1] That scholar was James Kurth; see his “The Protestant Deformation,” Orbis (Spring 1998), updated and refined at my bidding as “George W. Bush and the Protestant Deformation,” The American Interest (Winter 2005). I kept my schema to myself because, as an observant Jew, I felt awkward making a critical argument implicating a Christian religious view before a mostly Christian audience. Once Professor Kurth, not just a Protestant but a deacon of his church, made an even more searing argument, my concern abated. Hence my articles and essays including, among others, “Die bewaffneten Missionare” [“The Armed Missionaries”], Die Zeit, January 30, 2003 [reprinted in Michael Thumann, ed., Der Islam und der Westen (Berlin: Taschenbuch Verlag, 2003)]; “Reflections on the 9/11 Decade,” The American Interest Online, September 1, 2011; and “Missionary Creep in Egypt,” The American Interest (Autumn 2013).

[2] Note particularly Walter A. McDougall, The Tragedy of U.S. Foreign Policy: How America’s Civil Religion Betrayed the National Interest (Yale University Press, 2016). Others who have grasped a piece of the template include William Inboden, Religion and American Foreign Policy, 1945-1960: The Soul of Containment (Cambridge University Press, 2008); Jonathan Herzog, The Spiritual-Industrial Complex (Oxford University Press, 2011); and Andrew Preston, Sword of the Spirit, Shield of Faith (Knopf, 2012). Even Henry Kissinger has let slip comments in this regard; for one example, note this remark from “Stability in Iraq and Beyond,” Washington Post, January 21, 2007: “Covert operations should not be confused with missionary work.”

[3] The Enlightenment’s contribution to the mix includes preeminently a ratification of the universalism inherent in Christianity, but also its bringing the “age of reason” to bear on early Protestantism—both of them commingled 16th-century developments. How to square the developing Enlightenment faith in science with the still-new faith in the Protestant God was a tricky task, which different Protestant confessions handled differently. Exactly how that proceeded to success, however, is beyond the scope of this essay.

[4] Kevin Phillips’ The Cousins’ War (1999) moots this argument, but takes it on a long and different, and I think shallower, journey than he might have.

[5] This declension over time is what Kurth means by “the Protestant Deformation.”

[6] Kurth, “The Protestant Deformation,” p. 11. Emphasis in the original.

[7] Noted piquantly and illustrated in George Walden, “Through the Mist,” The American Interest (July-August 2017).

[8] A pre-Soviet and Soviet-era Oxford English Dictionary-style analysis of the origins and changing uses of the word “mir” might prove interesting in this regard.

[9] Jack Snyder, “The Soviet Strategic Culture: Implications for Limited Nuclear Options,” RAND Corporation, 1977.

[10] Robert Legvold, “U.S. and Soviet Strategic Doctrine and SALT,” Survival, January-February 1979.

[11] See Adam Garfinkle, “Culture and Deterrence,” Foreign Policy Research Institute E-Note, August 25, 2006.

[12] A key text is Ernest Gellner, Muslim Society (Cambridge University Press, 1981).

[13] McDougall, The Tragedy of U.S. Foreign Policy, p. 282.

[14] Huntington, Political Order in Changing Societies (Yale University Press, 1968).

[15] One of the earliest debunkers of development doctrine was Paul Strassmann, Technological Change & Economic Development (Cornell University Press, 1968).

[16] Many of these explanations have long been known, from the work—among a great many—of William Graham Sumner in the 19th century to Robert Michels in the early 20th. There are even predicates of understanding from premodern times, going back all the way to Aristotle.

[17] McDougall, p. 244.

[18] McDougall, pp. 272-73.

[19] The reference is to his The Triumph of the Therapeutic: Uses of Faith After Freud (University of Chicago Press, 1966).

[20] See James Jeffrey, “The State of the State Department,” The American Interest (July-August 2017).

[21] Garfinkle, “Reflections on the 9/11 Decade.”

[22] Mark Lilla, The Stillborn God: Religion, Politics, and the Modern West (Knopf, 2007).

[23] An extended illustration of this belief at work may be found in Dov Zakheim, A Vulcan’s Tale (Washington, DC: Brookings, 2011).

[24] See Adam Garfinkle, “How We Misunderstand the Sources of Religious Violence: The 2016 Templeton Lecture on Religion and Politics,” FPRI E-Note, December 19, 2016.

[25] Powerful support for my long-held contention resides in Marc Sageman, Turning to Political Violence: The Emergence of Terrorism (University of Pennsylvania Press, 2017).

[26] One view holds that American exceptionalism has always driven abstract American thinking about the world, but that in the earlier years of the Republic it was directed inward, as if to say: “We are different and better than you foreigners; you cannot be like us; and all we ask is that you leave us alone.” When we concluded that the world would not leave us alone, and we thought we had the power to change others, our exceptionalism flipped to point outward—and then inward again, as during the interwar period, when many Americans concluded they had been right the first time. This thesis makes a cameo appearance in Edmund Stillman and William Pfaff, Power and Impotence: The Failure of American Foreign Policy (Random House, 1966).