Meacham: The End of Christian America | Newsweek Religion | Newsweek.com.
Good article (or at least as good as you can expect from the usual religious commentary in Newsweek and Time). Here’s an excerpt I found important:
“The decline and fall of the modern religious right’s notion of a Christian America creates a calmer political environment and, for many believers, may help open the way for a more theologically serious religious life.”
Amen. I can’t have a conversation with someone who already assumes that the USA is a ‘Christian nation,’ because these people already have a fixed idea of what the United States should be like, and this picture in their head, this idyllic ‘America,’ seems to privilege THEM (i.e. white, middle-class, straight ‘Christians’).
I don’t think the United States, or any country, has EVER been a ‘Christian nation,’ and that’s what differentiates me from these conservative types. Furthermore, the ‘founding fathers’ didn’t create the USA to be Christian, and I don’t think we should be taking advice from them in the first place, since they were slaveholders. And have any of the conservative Christians actually read what Jefferson wrote about religion?
Rant, rant, rant….
As a true conservative, I think the founders were a bunch of treacherous traitors. They should not be emulated. It’s never moral to overthrow a -lawful- government, and there have been many governments far worse than eighteenth-century Britain’s. The founders were products of the Reformation, the Enlightenment, and rudimentary capitalism. They were self-important, and did not want to pay taxes to their rightful sovereign. Bastards. Yeah, you’re right, America was never a Christian nation. However, at the founding most of the states had established churches; Massachusetts, I believe, was the last to disestablish, in the 1830s.