Is relativism really the dominant belief system in American culture today? The stamping out of right and wrong has come to a halt, as those who once pushed a "don't judge me, let me live my truth" ideology now take up moral stones to throw at the President. What does the Bible say about truth, and does it even matter?