Kevin H: Where Are We Taking America Back To?
America has lost its Christian morals.
We are no longer a Christian nation.
We need to get back to the way things used to be.
We hear these things all the more during election season. (With some exceptions this time around, as there seems to be serious cognitive dissonance among some segments who have previously concerned themselves with such matters.)
There are morals that have unquestionably degraded in this country over the past ten, twenty, or fifty years (or whatever timeline one would like to use). The practice and acceptance of sexual behaviors that violate God’s commands seemingly grow by the day. For quite a while now, a growing baby in the womb has been a choice to kill or not, rather than a precious and completely vulnerable human life worthy of protection. The sex and violence and language and attitudes that pervade our media continually become more deranged, vulgar, prevalent, and accepted.
The acceptance of traditional Christianity in this country continues to decline. Christianity and its basic values had long been respected and, in some measure, integrated into the culture of this country. That respect and acceptance have eroded over the years, and those who hold to traditional orthodox Christian beliefs are now often viewed as reprehensible, unenlightened bigots.
My question is: where and what exactly do we need to get back to in order to recover our Christian identity?
When African Americans had to ride in the back of the bus and couldn’t use the same drinking fountains as white people? Or when women weren’t allowed to vote and were treated like second-class citizens? Maybe when slavery was legal and masters would beat, rape, and dominate their slaves as they liked? How about when the people native to this land were killed and driven from their lands by force and made to settle in substantially less desirable territory?
At least when it comes to behavior, you see, it is a matter of perspective.
Although this country had long seen itself as Christian by name, it hasn’t always acted the part. And so, while in some ways our country has become decidedly less Christian in its behavior, in other ways it has become more so. For instance, would an African American woman feel like our society and the laws of the land treat her in a more God-honoring fashion in 1850 or 1950 or today?
Even beyond behavior, if we study the founding of our country from all the different settlements and colonies to the formation of a national government, we find a mixed bag as to how integrated Christianity was in our government and laws.
Some settlements and colonies were formed by Christians escaping religious persecution. Of these, some required strict adherence to their brand and rules of Christianity, while others allowed for more freedom of religion. Yet other settlements and colonies were created to establish a presence and accumulate wealth for the mother country back in Europe, with little emphasis on religion. Yes, there may have been Christians involved in these settlements, but they weren’t oriented around religion.
When our national government was formed, it was soon established that “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof.” At the same time, some states had laws on the books requiring adherence to various specific Christian practices or favoring specific sects of Christianity. Many of our founding fathers were Christians, but not all. Some who claimed to be Christian were heretical or heterodox on numerous fundamentals of the faith. So again, we see a mishmash of the influence and intentional integration of Christianity, or the lack thereof, in the beginnings of our country.
God has never made a covenant with our nation like he did with Israel in the Old Testament. America is NOT the new Israel. We can claim no special status because of any supposed covenant.
So where do we go from here? Christians are losing their favored and respected status in our society. Those who hold to traditional orthodox beliefs are experiencing increased mocking and condescension from our culture, and it would seem that trend will only worsen. The laws of our land which grant freedom of expression of religion are being challenged, especially in regard to aspects focused on Christian beliefs and behavior. We can expect these pressures to become even greater.
Jesus told us to expect persecution (John 15:18-20). The persecution we experience in our country today is very minor compared to what Christians experience in many other countries and cultures around the world, and what many Christians have experienced over the last 2,000 years.
Ironically, much of the persecution that has occurred in the history of our country has been committed by “Christians”. But going forward, the persecution of Christians could become worse in our society. I pray that it doesn’t, but it very well could.
Now this is not to say that Christians shouldn’t ever get involved in government or political issues, but maybe our concern shouldn’t be so much with taking our country back for God and more with leading our lives as Christians. If each of us personally carries ourselves in a Christ-like fashion and works to spread the gospel, and the Church collectively does the same, maybe this is our best chance to influence society for the better, for our culture and country to become more “Christian.” And maybe it doesn’t work out that way. Maybe the world ends up hating us because it hated Christ first. We do not have final control over these things.
In the end, our ultimate hope is not in the United States of America. It is not in our government or elected officials or culture. It is with Him whose kingdom is not of this world.
(I apologize to readers here outside of the U.S., as this article was tailored toward America. But, at the very least, I imagine that many of you in Western cultures are dealing with similar issues.)