God cares about our land! In the beginning, God instructed man to care for and rule over the land that they were given. After man sinned, the earth was cursed, and man has felt its effects ever since. Now that we have been redeemed from the curse through Jesus, we must take our rightful position again and bring healing to our land, causing it to partner with us to see God’s Kingdom come on earth.