To the utter shock of Christians, the collapse of the Christian worldview in America and the West is happening. Why is this occurring? Have we been so blinded by the attack of the enemy on issues such as biblical history and biblical authority that we couldn't see this massive erosion taking place? Are there answers that can help get us back to a sure foundation?