From the colonial period to modern times, Christians have played a major role in the development of the United States of America. At the founding, Christians had significant input into the creation of the guiding documents and laws that helped the United States grow into a strong, free nation. From the previous century into the current one, however, Christian influence has begun to wane, and respect for Bible-believing evangelical Christians within...