The Second World War changed how the United States saw women's roles. Not only could women work, they could do the work that men did. They could work in homes and hospitals, but they could also work in...