The United States' involvement in the two world wars of the twentieth century drastically changed the country's role in the world, as well as the world's perception of the country. Until 1917, America had largely stayed out of European affairs. That all changed...