In this provocative book, H. W. Brands confronts the vital question of why an ever-increasing number of Americans do not trust the federal government to improve their lives and to heal major social ills. How is it that government has come to be seen as the source of many of our problems, rather than the potential means of their solution? How has the word "liberal" become a term of abuse in American political discourse? From the...