Americans' cultural love affair with their country's landscape started in the nineteenth century, when expansionism was often promoted as a divine mission, the West was still the frontier, and scenery...