I believe it’s about time that we, as Christians, start
being more direct about what the Bible says about these subjects. As our society
becomes more and more liberal, and less and less godly, it seems that when we post
the truth we are “hurting someone’s feelings”. God did say, “speak the truth in
love”, but the opposition always feels like we are pointing the finger. God’s
Word is true, and we need to be more direct about speaking it.