A thought.

Are things in the world actually getting worse? Not that there aren’t bad things happening, or that the lack of morality and character shouldn’t cause concern…but are things really any worse now than they have been in the past?

Are things really so different now than they have been?

Why do so many people (especially Evangelical Christians) seem so worried, going about wringing their hands over the state of things? I’m not saying that we shouldn’t try to change things for the better. But isn’t this the natural state of a fallen world?

Why are we so surprised and shocked by the fact that there is evil?
