Sunday, February 14, 2010
How Christian Were The Founders?
Today's New York Times Magazine has a long essay about the effort of some Texas conservatives to have textbooks changed so that they reflect what they, the conservatives, believe to be America's "Christian" lineage. Anybody who knows me for more than five minutes knows how I feel about that idea. But I'm curious: what do others think about this? Should America be thought of as having been founded as a Christian nation? Is it a Christian nation now? And digging a little deeper, does any geopolitical entity qualify as a Christian nation? Obviously this all depends on the meaning you attach to the key terms. What is meant by the term nation? What does it mean to be Christian, both individually and corporately as the church?