It is a common notion among American Christians that our country was once intended to share our faith, in legislation and in culture, and that only in the last century has that intention begun to fade. But is this historically true? And is it an idea supported by scripture? We try to cut through our own cultural assumptions and revisionist history to answer these questions as best we can, and we invite you to share your opinions with us!