Last month, in “The Aura of Infallibility”, I talked about how some religious believers declare themselves and their beliefs to be infallible in order to ward off the frightening possibility of having to decide what is true. This is, obviously, a futile tactic. We can proclaim ourselves to be immune to error as often as we like, but reality is unlikely to be impressed. Human beliefs, no matter how strongly or confidently held, do not decide the way the universe is.

What these believers fear, more than anything, is the slippery slope: that once they change even one of their beliefs, all the rest will follow. As one of them put it:

Yet, how can one know anything for sure about Jesus if the Bible that reveals him is wrong often or even from time to time? Is the Virgin Birth wrong? Is Jesus both God and man, or is that wrong? What about the Trinity? All such doctrines are attacked by secularists and non-believers as much as the Young Earth doctrine; why not jettison those as well? And if not, why not? How can you know what is right and what is wrong in the Bible?

Perhaps this fear is indeed valid in their case: since all their beliefs are reasonless and based only on faith, if they abandon one, there’s no reason not to abandon all the others. But this doesn’t mean that belief in general is a futile endeavor. It means that we should strive to hold substantive beliefs – ones which are justified by facts. That way, even when one of our beliefs is shown to be wrong, we still have good reason to continue holding the others. The evidence of the world anchors our beliefs and keeps them from sliding down the slippery slope of faith.

Human fallibility is obvious in every aspect of our lives. Although our ability to predict and therefore control the world has been increasing gradually since the scientific revolution, that understanding has been hard-won, with many mistakes, missteps and blind alleys along the way. In the realm of morality, each new era reveals the painful ignorance of the last (and there’s no reason to believe ours will be an exception). In politics, war, corruption and scandal are rampant, and even if human nature is largely good, it’s not hard to get the opposite impression from skimming the news.

If there’s one lesson to take away from all this, it’s the following: We need systems and institutions that are capable of self-correction. Since we’ll always make mistakes, we need to set up a framework that permits us to learn from them and not repeat them in the future.

The scientific method is the essence of self-correction, which is one reason it’s been so wildly successful at finding out the way the world works. The system of scientific peer review and replicability exposes every idea to critical scrutiny and probing tests. Only the soundest, best-supported ideas can pass through this gauntlet. And even when wrong ideas survive initial scrutiny, they can always be overturned later by new evidence. Science’s tendency to shower rewards on those who disprove conventional wisdom gives this system a built-in method of correcting its own errors.

Democracy, too, is well-provisioned for self-correction. A monarchy, or any other system of absolute rule, offers people little recourse if their ruler turns out to be a bad one. By contrast, regular elections keep the system healthy and its officials accountable to the popular will, by giving voters a regular opportunity to throw out the ones they dislike.

Religion, on the other hand, is a system of thought notably lacking in mechanisms for self-correction. The vast majority of religions do the exact opposite – they assume that all significant truth was handed down at their founding, perfect and complete, and that nothing of significance remains to be learned. There is no reward in religion for those who introduce new beliefs into the system or argue against old dogmas. In fact, most religions are set up specifically to discourage that possibility, with some going so far as to pronounce curses and divine wrath on anyone who tries it. There is no system of voting or other means by which lay believers can express their discontent or call for a change of direction. And in many religions, an oligarchical elite of clergy chooses its own successors, shutting ordinary followers out of the decision-making process altogether.

Of course, many religions have changed to reflect scientific and ethical advances made since their founding. But it’s not an unfair generalization to say that these changes almost never originate “top-down”, beginning with the official hierarchy and then propagating downward the way a new creedal statement would. Instead, they usually begin with ordinary believers who wake up to the errors taught by their faith (often with assistance from nonbelievers, who’ve played an important role in many major social reform movements). And these reform movements always face fierce opposition from entrenched religious leaders, who slander and demonize them at every turn. The abolition of slavery, the women’s suffrage movement, the civil rights movement, the introduction of birth control – all these and others were denounced by clergy and religious leaders, their advocates labeled “godless atheists” whether they were believers or not. Most of these leaders resisted correction until their dying breath, and reform came only when the societal consensus had grown too overwhelming to resist any longer.

Today, as we face ever more serious threats and crises, we can no longer wait for the rigid guardians of orthodoxy to give way. In place of dogmatic faith, we need all our societal institutions to be built on the idea of self-correction.
