Over the next couple of days, I want to share excerpts from a couple of articles I have read. One is by Jon Meacham of Newsweek. He says, "Let's be clear: while the percentage of Christians may be shrinking, rumors of the death of Christianity are greatly exaggerated . . . there is no doubt that the nation remains vibrantly religious -- far more, for instance, than Europe."
Then he clarifies something that all of us need to realize and accept. He says, "What, then, does it mean to talk of 'Christian America'? Evangelical Christians have long believed that the United States should be a nation whose political life is based upon and governed by their interpretation of biblical and theological principles. If the church believes drinking to be a sin, for instance, then the laws of the state should ban the consumption of alcohol. If the church believes the theory of evolution conflicts with a literal reading of the Book of Genesis, then the public schools should tailor their lessons accordingly. If the church believes abortion should be outlawed, then the legislatures and courts of the land should follow suit."
I would submit (and I am only just coming to an understanding of this) that Christianity does not thrive when it is controlling the political agenda. How is that really so different from the Islamic states we have so strongly criticized? No, Christianity has always thrived when it lived as the "counterculture" that Jesus came to model. But more from Meacham's article tomorrow.