The Townhall — Social Issues

America has never been a Christian nation

By David Wanderi

Editor’s note: The opinions expressed here are those of the author. View more opinion on ScoonTV.

People in this country are turning away from religion more than ever, specifically Christianity. In 2019, a Gallup report found that U.S. church membership had declined approximately 20% over the previous two decades. As a Christian myself, I have heard the statement, “America is turning its back on God,” far too many times. That statement is usually followed by a warning to the congregation that God’s wrath will almost certainly come if the country doesn’t turn back to God. However, when I sit and think about America’s inception as well as its overall history, it is clear to me that it has never been a nation devoted to God, let alone Christianity.

Founding fathers like Thomas Jefferson went out of their way during the country’s inception to create a clear line of separation between religion and the government.

He famously wrote, “Believing with you that religion is a matter which lies solely between man and his God, that he owes account to none other for his faith or his worship, that the legislative powers of government reach actions only and not opinions, I contemplate with sovereign reverence that act of the whole American people which declared that their legislature should ‘make no law respecting an establishment of religion, or prohibiting the free exercise thereof,’ thus building a wall of separation between church and state.”

For America to be a Christian nation, the government would have to commit itself to the God of the Christian Bible in some capacity. However, this constitutional line in the sand prevents the government from committing to any religion. Unlike countries such as Afghanistan, Qatar, or Saudi Arabia, whose official state religion is Islam, America has no official religious identity.

Another jarring realization when analyzing America’s history is the giant stain of the transatlantic slave trade. Slavery had been a part of society long before both America and Christianity, but the manner in which chattel slavery was conducted makes it hard to believe America was a country with true reverence for the Christian God. I say this because of the multitude of passages in the Christian Bible that express God’s hatred for injustice committed against people, to the extent of God punishing nations for doing these very things.

The God of the Bible is not only a God who forgives, but also a God who avenges. Has America truly ever repented at a governmental level for the sins of chattel slavery? I would argue no, largely due to the fact that reparations were never enacted to African descendants of slavery. 

Even if you would argue that the slavery angle is a bit of a reach, you must also remember that during this period slave masters would give their slaves a bastardized version of the Bible known as “The Slave Bible,” which heavily omitted parts of the Old Testament. Many Old Testament stories, like the story of Moses in the Book of Exodus, gave slaves like Nat Turner the hope and conviction to fight for freedom. Thus, slave owners banned those stories.

Removing parts of the Bible and spreading an alternative narrative with it is literal heresy. If America was ever a Christian country, then its Christianity was a different one than that of the Bible. Frederick Douglass once wrote, “I should regard being the slave of a religious master the greatest calamity that could befall me. For of all slaveholders with whom I have ever met, religious slaveholders are the worst.”

Douglass witnessed so-called Christian slave owners mercilessly beat their slaves and then return to their homes to read their religious devotionals. The cognitive dissonance of both those slave masters and a large part of America is ridiculous, to say the least.

Now, although I believe America has never been a Christian nation, that isn’t to say America hasn’t had many practitioners of the Christian faith, especially ones who were a positive force. From the founding fathers, to the pilgrims, and to the freed slaves, there have been many believers of the Christian faith in the United States. The Christian faith has given many men and women positive convictions which have given them the strength to fight back against injustice both in this country and outside this country. All I’m arguing is that the government of America has never been one that truly sides itself with the God of the Bible.

Christians in this country have unfortunately merged the kingdom of heaven and the government of America together, and as a result have perverted the faith into religious patriotism. The God of the Bible cares for the wellbeing of America just as much as he cares for the wellbeing of China.

Until Christians separate their love for the Gospel of Jesus Christ from the radical patriotism of this country, we’ll continue to see a decline in the Christian faith. It’s not that people don’t want to believe in it; it’s that the genuine faith is no longer being spread.


