For those who are heavily invested in the Christian religion, at least as it's usually been understood in America, the news that America is entering into a 'post-Christian' epoch is understandably alarming.
But here are six reasons why I do not think Kingdom people should weep over the demise of American Christianity.
1. America has never been, and will never be, a "Christian" nation in any significant sense. Among other things, America, like every other fallen, demonically-oppressed nation, is incapable of loving its enemies, doing good to those who mistreat it or blessing those who persecute it. By applying the term 'Christian' to America, we've massively watered down its meaning -- which undoubtedly helps explain why the vast majority of American Christians assume being 'Christian' is perfectly compatible with hating and killing your national enemies . . . The sooner the label 'Christian' gets divorced from this country, the better. It provides hope that someday the word 'Christian' might actually mean 'Christ-like' once again.
2. There's a good bit of research demonstrating that the majority of Americans identify themselves as 'Christian' when asked by a pollster, but when asked what this label actually means in terms of core values and lifestyle choices, it becomes apparent that for the majority of them the meaning of 'Christian' is basically 'American.' I submit that the main problem Kingdom people confront in spreading the Kingdom in America is that a majority of people assume they are already in the Kingdom -- they are 'Christian' -- simply by virtue of being American . . . If fewer people are identifying themselves as 'Christian,' this is good, for it means there's one less major illusion that Kingdom people have to confront and work through as they invite these folks into the Kingdom.
I will finish these thoughts tomorrow.