We are now experiencing in America a subtle but continuing attempt to convince us that God is not real. It is rarely done openly or in an easily discernible way.
Truth and transparency are not the weapons of choice for the enemies of God; deception, division, and the gradual erosion of Christian values are their favored tools. While we are still described as a Christian nation, we are moving further from God's truth and light with each passing day. If God were real to America, would we remove Him from our schools, our government, and our daily lives? In the midst of so many social and economic problems in our country, wouldn't you think that if we really believed God were real, the first thing we would do is turn to Him? And yet He is rarely, if ever, mentioned in the answers to our problems. The dialogue centers on anything and everything but God. It seems as though He is treated as a fictional character, not worthy of our trust and faith, of no consequence whatsoever in the solution to what is wrong with America.
Why, if we believe He is real, is He excluded, even as a partial answer, from the mounting dilemmas of our country? Greed, pornography, moral collapse, the absence of truth, and leaders with personal agendas are all symptoms of spiritual bankruptcy in a nation that no longer thinks God is real. Until we believe He is real again and make Him the centerpiece of the nation He has blessed so fully, I believe we will continue to experience problem after problem. God is real, and so is the disaster awaiting us if we continue to turn our backs on Him.