After World War II, the pivotal event in twentieth-century American history, life both at home and abroad seemed more complex and more dangerous than ever before. The war wrought lasting political, economic, and social changes: the federal government's centralization and regulation of economic affairs, new roles for women and minorities in American life, and the emergence of the United States as a world leader all remained in place after the soldiers...