For well over a century, Western films have embodied the United States' most fundamental doctrine, expansionism, and depicted, in a uniquely American way, the archetypal battle between good and evil. Westerns also depict a country defined and re-defined by complex crises...