5 Reasons America Is Not — And Has Never Been — a Christian Nation
The myth that America is a “Christian nation” is not only untrue, but promotes the pernicious idea that non-Christians are second-class citizens.
June 24, 2012
“The United States is a Christian nation.” If I had a nickel for every time I’ve heard this statement at a religious Right meeting or in the media, I wouldn’t be rich—but I’d probably have enough to buy a really cool iPad. The assertion is widely believed by followers of the religious Right and often repeated—and, too often, it seeps into the beliefs of the rest of the population as well. But like other myths that are widely accepted (you use only 10 percent of your brain, vitamin C helps you get over a cold, and the like), it lacks a factual basis.
Over the years, numerous scholars, historians, lawyers, and judges have debunked the “Christian nation” myth. Yet it persists. Does it have any basis in American history? Why is the myth so powerful? What psychological need does it fill?
I’m not a lawyer, and my research in this area has been influenced and informed by scholars who have done much more in-depth work. The problem with some of this material, great as it is, is that it tends to be—how shall I say this politely?—dense. If I were a lawyer (the kind found on television dramas, not a real one), I would present the case against the Christian nation myth in a handful of easily digestible informational nuggets. Swallow them, and you’ll be armed for your next confrontation with Cousin Lloyd who sends money to Pat Robertson.
There are essentially five arguments that refute the Christian nation myth. I’m going to outline them here and then take a look at the history of the myth. From there, we’ll briefly examine the myth’s enduring legacy and how it still affects politics and public policy today.