Are You Aware? Founding Faith
This is Part I in our “Freedom of Religion” Awareness Wednesday series. Read the other posts in the series here.

It is a common misconception that the United States was founded as a “Christian nation” and that the founders intended it as such. It is true that the American colonies were largely established by Christians and that Christianity had a profound effect on the architects of the nation. But history does not support the claim that our government ever was Christian, or was intended to be by those who conceived of and orchestrated its emergence. It was, in part, the oppression felt from both the British monarchy, with its supposed “divine…