American higher education was transformed between the end of the Civil War and the beginning of World War I. During this period, U.S. colleges underwent fundamental changes that helped create the university as we know it today. Most significantly, the study of...