The West is vital to the myth of America. It is where radical individualism and beautiful landscapes merge in a sort of earthly paradise. Or so we've been led to believe by cinematic and literary...