It's popular today to point to modern and historical examples of Christians behaving shamefully. Certainly, some of this criticism is deserved. But has Christianity had a positive impact on the world? What responsibility do Christians have to influence it for good, and what might that look like?