For most of the twentieth century, the idea of the West figured prominently in America’s understanding of its role in the world.1 During the two world wars and much of the Cold War, the United States saw itself as the defender of Western civilization. At the beginning of the twenty-first century, however, the idea of the West is largely absent from American discourse about world affairs. There is much talk of globalization today, but little of Westernization. Is this because globalization is really just a more acceptable name for Westernization? Or is it because the American elite has now abandoned Western civilization for the sake of some post-Western project, be it the global economy or something else? What is the meaning of the West in the age of globalization, and what is to be its fate?