Is America a Christian Nation?

In a recent Associated Press article, Peter Smith considers the claim that the United States was founded as a Christian nation. He rightly points out that “Christian nation” means different things to different people. Despite conflicting historical evidence, 60% of U.S. adults believe the founders intended the United States to be a Christian nation, and 45% believe it should be a Christian nation. So, is there a meaning of “Christian nation” supported by the biblical use of “Christian”?

Read the full article here.
