America has never been White. White people weren't even in America first. You've been led to believe the lie, but that doesn't make it true. America has never been a White nation. Even when the evidence shows otherwise, people still believe what they've been told, without ever trying to find out the truth for themselves. This is the way of the entire world: people prefer a lie to the truth!
Become a supporter of this podcast: https://www.spreaker.com/podcast/relationships-and-relatable-life-chronicles--4126439/support.