The idea of the United States as a Christian nation is a powerful, seductive, and potentially destructive theme in American life, culture, and politics. And yet, as Richard T. Hughes reveals in this compelling book, the biblical vision of the "kingdom of God" stands at odds with the values and actions of an American empire that sanctions war instead of peace, promotes dominance and oppression instead of reconciliation, and exalts wealth and power instead...