Post-Christian Cities in America - Should America turn to God again to stop its culture from dying?

It is sad to know that religion is no longer welcome in the public square in America. It is well known that people keep moving away from God, that the values of society are dying, and that materialism is flourishing. Should this nation turn to God again to reestablish its very foundations? Or what could the future of America be?

It is sad. However, American churches have increasingly been propagating lies rather than the Truth of Scripture since 1901. This is the eventual outcome. We are blessed that belief in God still exists anywhere in our country.

I handle this by praying for those who have been misled by false teachings, for those who have been misleading others with them, and for the fallout of that (which is usually that their children disdain the church and God once they are free, as adults, to make up their own minds).

America has been seen as the “Christian” country of the world, and compared to many other countries, I can see why.

But the truth of the matter is that when the Founding Fathers of this country drew up the Constitution, the Declaration of Independence, and the Bill of Rights, during a period when there was no option in the whole world other than to believe in God, or at least to take a religion seriously, some of those Founding Fathers were already disregarding the things of God. Benjamin Franklin once said that the churches and their steeples ought to be removed, as they were a “blight” on the landscape!

The world has never seen a fully theocratic nation since the nation of Israel before Christ. Israel is God’s chosen nation; America is not. God has blessed us greatly. In gratitude, I pray we will all evangelize and read our Bibles so as to be able to answer the questions, so that those who have been misled might finally see that God is good and, yes, worth “turning to” again.
