Journalism and the American Experience offers a comprehensive examination of the critical role journalism has played in the struggle over America's democratic institutions and culture. Journalism is central to the story of the nation's founding and has continued to shape debates over public policy, American exceptionalism, and the meaning and significance of the United States in world history. Placed at the intersection of American...