For well over a century, Western films have embodied the United States' most fundamental doctrine, expansionism, and depicted, in a uniquely American way, the archetypal battle between good and evil. Westerns also portray a country defined and redefined by complex crises. World War II transformed the genre as well as the nation's identity. Since then, Hollywood filmmakers have been fighting America's ideological wars onscreen by translating...