More and more these days, people are associating Christianity with a political movement. If you scan the headlines or listen to the newscasts, you will often hear the words Christian and evangelical used to describe a set of political beliefs, a group of voters, or an agenda. Is this really what we want to portray as Christians?