With all of the calamity closing out the year, we have been living under the illusion that America is a great place to live despite its atrocious past. Some people believe it's not so bad because of all of its advancements, but what if America hasn't really changed from its past and has merely changed the meaning of evil?