Society has often strayed from conveying truth, an irony captured by the word "etymology" itself, which derives from the Greek étymon, meaning "true sense." Throughout history, word origins and meanings have shifted to enforce rules, relay ideas, and push societal and cultural agendas. This is commonplace and should not be surprising; still, does the shifting of terms have a net positive effect on our modern culture?