I was wondering something today after watching an interview with Bill Maher and former Governor Mike Huckabee, who also ran for President (and whom I voted for!).
Bill Maher was talking about his new movie Religulous, and he and Huckabee were discussing "religion and faith".
That made me wonder about other Christians' biblical views.
So, here's a question for you, Christian...
Do you believe that the Bible is the Word of God? I mean the true, actual, God-breathed, God-inspired, authoritative Word of God? Do you believe it can and should be taken literally, in its entirety? Do you believe that even though "men wrote the Bible," it is still the Word of God, and that only He (God) inspired these men?
The reason I ask is that I was very disappointed and even disgusted that our President George W. Bush (who claims to be a Christian) said that he does not believe the Bible should be taken literally. He even laughed and smirked as if that were such a ridiculous question!
It makes me sick to my stomach that anyone who claims to be a Christian would dare say such a thing! It hurts to hear "so-called" Christians say that the Bible is just a bunch of stories, and that some things you can take literally, but some things are just metaphors or allegories. I hate hearing that Jesus was just a good man and that the Bible is just full of examples of how we should live. I can't stand it when people claiming to be Christians say that Jesus is just one of the ways to heaven, and that Jesus is right for them personally but may not be the right truth for someone else. I am so sick of the politically correct, polite crap!
As a Christian, you should believe that the Bible is the authoritative Word of God and should be taken literally!
As a Christian, you should believe that Jesus Christ is the ONLY way to God the Father! No amount of good works or being a really nice person will earn your way into heaven! It is ONLY through repentance and faith! And if there were another way... why would God allow His only Son, Jesus Christ, to be punished, to pay for our sins, to die such a horrible death on that cross?!
As a true Christian, it should offend you deeply to hear other self-proclaimed Christians fail to stand up for and defend the truth!
As true Christians, we should believe it, preach it, teach it, and live it!
I'm sorry to sound angry, but it's because I am. I'm so angry at our country for allowing the Christian faith that we were founded on to be spit upon! I'm angry that we have allowed words like Jesus, sin, and hell to be considered offensive and even "hate crimes"! I'm angry that we've ripped prayer and God out of our schools, but force...errr...I mean, teach "tolerance" of very offensive things and even teach sex education, which also encourages "safe sex" with birth control and condoms! I'm angry that this country allows us to murder our unborn babies, feed our lusts with inappropriate entertainment, and shame the sanctity of marriage, all while we spit in the face of God! I'm angry that we all have to walk around on eggshells so as not to offend anyone, for fear of being sued! I'm angry that "Freedom of Religion" and "Separation of Church and State" have been so badly misused and abused! I'm angry that this country has turned its back on our Lord!
And I'm angry that our leaders have, too!