California is one of several states that make up the American West. It became part of the United States in 1848 and was admitted as a state in 1850. The land that would become California was, like so much of the West, originally inhabited by Native Americans; Spain claimed it in the sixteenth century and later colonized it as part of New Spain. After the Mexican-American War (1846-1848), the United States acquired the land that eventually became Texas, New Mexico, Utah, Nevada,...