Most of us probably think of America as being settled by British, Protestant colonists who fought the Indians, tamed the wilderness, and brought "democracy" - or at least a representative republic - to...