The word Genesis simply means “beginnings.” The book of Genesis is not only the first book in the Bible but also describes how God began all things. It forms the foundation of our Christian understanding of God and His son, Jesus Christ. In fact, without the book of Genesis, Jesus makes very little sense.