Originally posted by siriuswriter: i don't think most fundamental christians would care in regards to their religion. it's mostly about discovering god in yourself, learning to put all your faith in that.
The Bible declares, "Test everything. Hold on to the good" (1 Thessalonians 5:21). The foundation of the Christian faith has nothing to do with self-imposed philosophy or self-gratifying emotion -- mere "feelings" of security. On the contrary, trusting in such feelings is characteristic of those outside the faith.
Originally posted by siriuswriter: like, if you get sick, it's because you did something to displease god. [at least one friend i had accused me of that when i was diagnosed with kidney failure.]
What a shame! You should have said to your so-called Christian friend, "Why do you speak such words when I am in a time of need?" Remember: biblical Christians strive to emulate Christ; they are re-born. And so your friend should have offered comfort instead of legalism (which Jesus detests). Your friend is no better than you in the eyes of God.
Originally posted by siriuswriter: jerusalem in today's culture is a place for jews and muslims.
Christians too!
Originally posted by siriuswriter: maybe they'd see it as a chance to proselytize - like, OMG the temple has fallen REPENT AND BE SAVED, but, admittedly, a person would have to be bordering on fanaticism to say something like that out loud.
Or to even think it! I agree!