“Christianity is a white man’s religion”
“Black Christians are being manipulated.” “Christianity was forced on Black people.”
“I’m an original African man, so I can’t be a Christian. It’s a foreign religion.”
Does a perception being mainstream make it correct?
To claim that Christianity is a white man’s religion, you would have to rewrite the entire Bible. What evidence is there to back this claim? Does this claim even hold any water?
Have I been misguided as a Black Christian?
What do you think about these statements? Have you had thoughts like these at some point? Have you been asked? How did you respond?
Join Oyindamola Bethel on this episode of One Truth Podcast as she answers these questions in depth and backs them up with irrefutable evidence.