Religion has always played an important role in the history of the United States. The Catholic faith was first brought to the North American continent by the Spanish in the 1500s. Over the next 300 years, Catholic missionaries and settlers from Spain and, later, Latin America came to what is now California and the Southwest. In the 1600s, European settlers began establishing colonies along the east coast of North America. Although some of these settlers were Catholic, the vast majority were Protestants, most of them from England. As the new nation formed, it was the Protestant branch of the Christian faith that had the strongest influence on the development of the religious climate in the United States.
