Relations between Western nations and their colonial subjects changed dramatically in the second half of the twentieth century. As nearly all of the West's colonies gained independence by 1975, Western attitudes toward colonialism shifted as well, and terms such as "empire" and "colonialism," once used with pride, acquired strongly negative connotations. Yet while colonialism has been discredited, precisely when and how that happened remains unclear. This book explores...