I came across this article yesterday that states almost all mainline Christian groups have grown in their acceptance of homosexuality. My question is, what happened?
Was there an addition to the Bible that I didn’t hear about?
Is God’s word not only relevant when we agree with it? (Check out this video for how I think of this idea!)
Thousands of years of Christianity have gone by in which homosexuality was considered a sin. Is that something we just change because culture changes? Is our faith so shallow that we must blow with the winds of popular opinion?
If you’re not a Bible-based Christian, then I totally understand your ideas “evolving” on this topic, because you don’t have a written standard from which your faith comes. But for those of us out there who confess to believe in the Bible (as those in the study do), how is it that our opinions are changing so quickly?
Maybe we need to spend a little more time examining God’s Word and a little less time examining what our neighbors’ opinions are…