The American West is an evocative term that conjures up images of cowboys and Indians, covered wagons, sheriffs and outlaws, and endless prairies, as well as contemporary images ranging from national parks to the oil, aerospace, and film industries. In addition, the West encompasses not only the past and present of the area west of the Mississippi but also the frontier as it moved across each of the fifty American states, offering the promise of freedom...