
A Christian Nation? Since When?

America may be a nation of believers, but when it comes to this country’s identity as a “Christian nation,” our beliefs are all over the map.

Just a few weeks ago, Public Policy Polling reported that 57 percent of Republicans favored officially making the United States a Christian nation. But in 2007, a survey by the First Amendment Center showed that 55 percent of Americans believed it already was one.

The confusion is understandable. For all our talk about separation of church and state, religious language has been written into our political culture in countless ways. It is inscribed in our pledge of patriotism, marked on our money, carved into the walls of our courts and our Capitol. Perhaps because it is everywhere, we assume it has been from the beginning.

But the founding fathers didn’t create the ceremonies and slogans that come to mind when we consider whether this is a Christian nation. Our grandfathers did.

Back in the 1930s, business leaders found themselves on the defensive. Their public prestige had plummeted with the Great Crash; their private businesses were under attack by Franklin D. Roosevelt’s New Deal from above and labor from below. To regain the upper hand, corporate leaders fought back on all fronts. They waged a figurative war in statehouses and, occasionally, a literal one in the streets; their campaigns extended from courts of law to the court of public opinion. But nothing worked particularly well until they began an inspired public relations offensive that cast capitalism as the handmaiden of Christianity. ...

Read entire article at NYT