Is the left actually "woke"? That depends on your definition, but if "woke" means being informed about the realities of the world, then the left is as far from woke as a group could get. While tearing down statues in the United States, it turns a blind eye to genocide around the world.