Has America always been capitalist? Today, the US sees itself as the heartland of the international capitalist system, its society and politics deeply intertwined with its economic system. This book examines the history of North America from the founding of the colonies to debunk the myth that America is 'naturally' capitalist. From the first white-settler colonies, capitalist economic elements were apparent, but far from dominant, and did not drive...