The United States looked inward throughout most of its history, preferring to avoid foreign entanglements, as George Washington famously advised in his Farewell Address. After World War II, however, Americans became more inclined to break with the past and take a prominent place on the world stage...