What We Mean by the West

The subject today is the meaning of “the West” in the sense of Western civilization. The first and most obvious point to make is that the meaning of the West is a function of who is using the word. Those who feel themselves to be part of the West, who think of the West as “we,” will surely have flattering things to say about their civilization. Those who think of the West as the “other” are likely to define it in less flattering terms. The basic meaning of the word is “where the sun sets,” one of the cardinal directions. Chinese geomancers drafted elaborate and codified rules about what that direction meant as opposed to the East, North, or South. But we in the West have nothing so precise as the Chinese: to us the West connotes all sorts of characteristics desired by some, eschewed by others.

In the United States, for instance, the West conjures up the Wild West of our historic frontier, a place of freedom, open spaces, new starts, and a certain manliness. But it was also a place where danger, loneliness (largely due to the paucity of women), and lawlessness often prevailed. At the same time, Americans have habitually embraced a contradictory meaning of the West. For inasmuch as all North America was the West vis-à-vis the Old World that colonists and later immigrants had left behind, the West was considered a “more perfect” place conducive not to danger and lawlessness, but to liberty, equality, and prosperity. Americans were “new men under new skies,” as Frederick Jackson Turner proclaimed.
