The Language of God: From Atheism to Belief

The Language of God, Chapter 1

By B.J. Marshall

The chapter begins with a description of Collins' experiences growing up. His parents shrugged off the business world and lived an agrarian life on huge tracts of land in the Shenandoah Valley. His father went on to teach at a women's college. Collins was homeschooled, and faith did not play a part in his upbringing. He went to an Episcopal church, but it was more for music appreciation than theology. OK, so we have a picture here of an ardent scientist who really didn't have a place for theism.

It's funny what a picture of atheists he paints, writing from a theist's perspective. Collins recounts how, as a student at the University of Virginia, conversations would easily turn to religion, where Collins' sense of the spiritual was easily challenged by the one or two "aggressive atheists one finds in almost every college dormitory" (p.15). Later, he says he enjoyed his agnosticism because it was "convenient to ignore the need to be answerable to any higher spiritual authority" (p.16). Collins likens this to practicing the "willful blindness" of his number one idol, C.S. Lewis (p.16). After college, he pursued a Ph.D. at Yale and shifted to atheism, where he felt "quite comfortable challenging" the spiritual beliefs of others (p.16).

That comfort obviously didn't last long. After moving from chemistry to biology and being accepted by the University of North Carolina, he did work that put him in intimate contact with very ill patients nearing death. He was astounded by their spirituality, and in one conversation an elderly woman simply asked him what he believed. He said he wasn't really sure, and admitted to himself that he had never really weighed the arguments for and against belief. He realized he "could no longer rely on the robustness of [his] atheistic position" (p.20). How does one go from being quite comfortable challenging theists to having one's "robust" atheistic worldview crumble? Apparently all it takes is an elderly sick woman asking what one believes. Incredible. I would have loved to have seen the atheistic Collins in action when he felt comfortable challenging the spiritual beliefs of others. Of course, Collins never says whether those challenges ended in his favor; I'm guessing they probably didn't.

So Collins decided to look for answers, and a Methodist minister pointed him to the theology of C.S. Lewis. Collins marveled at how Lewis' arguments seemed to anticipate what he himself was thinking. The idea that most shook Collins' views of science and spirit: The Moral Law, and a Christian penchant for capitalizing random words. He then details a number of everyday moral disputes, noting how it seems to be a universal human attribute to defer to some sort of unstated higher standard. "Though other animals may at times appear to show glimmerings of a moral sense, they are certainly not widespread, and in many instances other species' behavior seems to be in dramatic contrast to any sense of universal rightness" (p.23).

Of course, Collins never cites his sources, so we are left wondering how he knows how narrowly spread these non-human glimmers of morality are, and how he is able to measure other animals' behavior against "any" sense of universal rightness - to say nothing of his assumption that humankind's behavior is all that noble and aligned with universal rightness. Ask Hitler, Pol Pot, or even Mother Teresa.

Jessica Pierce and Marc Bekoff's book Wild Justice highlights the broad range of what we would call moral behaviors - fairness, trust, empathy, reciprocity, and more - in other animals. (I say "other" animals, because too often theists imagine humans apart from the animals.) Pierce was interviewed on the Conversations from the Pale Blue Dot podcast, and her web site collects many examples.

(I encourage you to add your own examples in the comments)

Over at Why Evolution Is True, Greg Mayer addresses this same excerpt of Collins' book, so I will simply link to it rather than expound here.

Not only does he misjudge whether animals can be moral, he goes further and conflates morality with truth: "Let me stop here to point out that the conclusion that the Moral Law exists is in serious conflict with the current post-modernistic philosophy, which argues that there are no absolute right and wrongs.... If there is no absolute truth, can postmodernism itself be true?" (p.24). I'm not here to discuss the merits of postmodern philosophy, but I do find it amusing that he leaps from moral relativism to the rejection of all absolute truth. Someone who holds a relativistic standard of morality can still be perfectly comfortable thinking it's absolutely true that all rocks dropped in Paris will fall to the ground.

Collins sees altruism as a stumbling block to naturalistic explanations. He claims that selfless altruism - he explicitly rules out reciprocal altruism - cannot be attributed to individual selfish genes that want to perpetuate themselves. He gives three arguments from sociobiologists such as E.O. Wilson (though Collins never cites his sources, so we don't know without looking it up ourselves whether Wilson actually posited any of these three) that Collins thinks fail:

  1. Altruism as positive attribute for mate selection,
  2. Altruism as indirect reciprocal benefits, and
  3. Altruism as benefiting the whole group.

Before we unpack these arguments, we should note Collins' statement that if altruistic behavior could be credibly explained by its positive value to natural selection, "the interpretation of many of the requirements of the Moral Law as a signpost to God would potentially be in trouble" (p.25). Well, sorry for Collins' sake, but there's a lot of literature out there (check the references at the bottom of the page) explaining how evolution could have led to altruism - and not just in humans. Dawkins mentions four good reasons for individuals to be altruistic in "The God Delusion" (p.250-251); a small simulation after the list shows the reciprocity-based ones in action:

  1. Genetic kinship: We evolved in small groups, affording plenty of opportunity for kin altruism to develop,
  2. Reciprocal altruism: This one is ruled out by Collins' standards, but we'd have had plenty of opportunity to develop it, given that we'd meet the same people over and over,
  3. Reputation: Dawkins states that biologists see a survival benefit not only in being a good reciprocator but in having a reputation as a good reciprocator, and
  4. Conspicuous consumption: Those who can provide food/shelter/protection with no expectation of compensation can flaunt their superiority.
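
Dawkins' reciprocity and reputation points are game theory at heart, and the logic is easy to watch run. Below is a minimal sketch of my own (nothing from Dawkins or Collins, and the payoff numbers are the standard illustrative ones) of a repeated prisoner's-dilemma-style game, in which a reciprocator prospers whenever encounters repeat:

```python
# A toy model of reciprocal altruism as a repeated game. The payoff
# values are the conventional illustrative ones: mutual cooperation
# beats mutual defection, but a lone defector exploits a lone cooperator.
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def play(strategy_a, strategy_b, rounds=200):
    """Total payoff to player A over repeated encounters with player B."""
    total, last_a, last_b = 0, "C", "C"  # both treated as cooperative at first
    for _ in range(rounds):
        move_a, move_b = strategy_a(last_b), strategy_b(last_a)
        total += PAYOFF[(move_a, move_b)]
        last_a, last_b = move_a, move_b
    return total

def tit_for_tat(opponents_last_move):
    return opponents_last_move  # cooperate first, then mirror the partner

def always_defect(_):
    return "D"

print(play(tit_for_tat, tit_for_tat))    # 600: sustained mutual cooperation
print(play(always_defect, tit_for_tat))  # 204: one exploit, then a stalemate
print(play(tit_for_tat, always_defect))  # 199: burned once, never again
```

When the same individuals meet over and over - exactly the condition Dawkins describes for our small ancestral groups - the strategy that repays kindness with kindness out-earns the pure exploiter, with no Moral Lawgiver required.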

Additionally, I argue that reciprocal altruism might not be as plainly visible as Collins thinks. OK, there's the obvious "I scratch your back, you scratch mine." But I think there are plenty of examples of altruistic behavior with no tangible return. For example: we had a huge snowstorm, and I helped my elderly neighbor shovel out her parking space. I am not expecting anything from it, and it doesn't even fall in line with Dawkins' other categories (genetic kinship, reputation, or conspicuous consumption). I did it because it fulfilled in me a desire to help my neighbor. I felt good doing it. Collins would not have been able to see that.

So, it appears that altruism as a positive attribute for mate selection corresponds nicely with kin altruism and conspicuous consumption. Having a reputation for being generous does indirectly benefit oneself, so maybe Collins would cry foul and call that a type of reciprocal altruism. As for altruism benefiting the whole group, it seems to me (and others) that any group of individuals with unique and sometimes competing desires, living cooperatively, will seek a stable solution to cohabitation.

Since Collins thinks that altruism must come from outside humanity, where does it come from? Well, he quotes his beloved Lewis again (Collins says he was stunned by the logic you are about to read): "If there was a controlling power outside the universe, it could not show itself to us as one of the facts inside the universe - no more than the architect of a house could actually be a wall or staircase or fireplace in that house. The only way in which we could expect it to show itself would be inside ourselves...." (p.29). OK, so let me get the analogy straight. We can't expect God/architect to be a fact inside the universe/house, so the only place we would expect to find God/architect is within us?? I'm inside the universe too, so why should I expect to find God in me and not outside me? On what grounds is this good logic? It would at least explain why I've been haunted by my architect. Oh wait - no I haven't.

Collins has now found God, and he wonders what sort of God this is. He rules deism out of hand on the grounds that, if Collins did indeed perceive God, then God must want a relationship with him. Sadly, I've tried using that logic on Alyson Hannigan for years, in vain. Given the high standards of the Moral Law, Collins concludes this God must be holy and righteous. He doesn't even consider Euthyphro's dilemma in trying to figure out the correlation between his God and the Moral Law: does God arbitrarily dictate what is moral (in which case, isn't he amoral?), or does God call stuff moral because that stuff is moral (in which case, why is God the middleman?). He also apparently didn't consider any other god who might desire a relationship with him. Nope - just Yahweh. Well, Jesus: Yahweh 2.0.

It became clear to Collins that science would get him nowhere in questioning God. Collins states that, if God exists, then he must be outside the natural world (but inside all of us, I guess), and therefore outside the purview of science. Oh, if he could only get off that easily.

August 15, 2010, 12:50 pm • Posted in: The Library • 28 comments

Under Green Leaves

In an old essay on Ebon Musings, "Finding Beauty in the Mundane", I wrote in a contemplative mood:

Have you ever considered the trees? Though their kind of life is far grander, slower and more patient than ours, they are each individuals, as different as human beings are. They add beauty to the world, give peace in their dappled shade, freshen the air and enrich the earth, and turn even the most hard-edged urban environment into a blossoming garden. We humans grew up beneath the trees, and we love them still...

Several years later, I still find this to be true. Whether I'm depressed or whether I'm already feeling good, it's almost always the case that visiting a botanical garden or a nature preserve, or even just going for a walk on a tree-lined street, noticeably improves my mood. The sight of sunlight slanting down through green leaves never fails to give me a sense of calm and peace. I tend to think the cause is that looking up at a tree reawakens one's sense of perspective: it's hard to see your own troubles as so serious in the presence of an organism that measures time only in years and decades.

But trees have more than just aesthetic benefits. Human beings feel an instinctive attraction to nature and wilderness, what E.O. Wilson called biophilia, and we flourish in its presence. For example, in one famous study, surgical patients who could see trees outside their window recovered faster and required fewer painkillers than patients whose window looked out on a brick wall. Other studies have found that greener urban areas have lower crime rates, and that being in green environments lessens the symptoms of ADHD and improves schoolchildren's academic performance. (And that's not even to mention trees' many environmental and economic benefits.)

The most likely explanation for this is that millions of years of evolution have instilled in us a built-in preference for certain kinds of environments, namely those most similar to our species' ancestral habitat. Wilson argues that this is the savanna, an open grassland broken up by patches of forest. This is the habitat we evolved in, the one we're best adapted to, and when we're placed in such an environment, we tend to fare better both mentally and physically. Urban environments, by contrast, present very different stressors that the human species never evolved to deal with.

I wonder if this feeling of displacement from nature is something that plays a role in religious conversions. When people live only in cities, surrounded by concrete and fluorescent lights, separated from nature, they do feel a sense of isolation and loss, and most of them don't know why. Religious proselytizers, of course, claim they can offer something to fill that void, and to people who don't know the true cause of these feelings, it's probably an effective sales pitch.

But when you know the true source of these feelings, the imitation can't compare to the reality. As I found for myself, the feeling of awe induced by direct contact with nature at its most spectacular is an ecstasy that easily compares to anything offered by any church. That's a piece of knowledge we ought to spread more widely. If more people understood the true, natural roots of human spirituality, the artificial attractions of religion might not prove so resilient.

August 9, 2010, 5:52 am • Posted in: The Garden • 22 comments

The Contributions of Freethinkers: Richard Leakey

Atheists have a great number of famous names to our credit. We can justly claim renowned composers, scientists, musicians, civil rights leaders - and conservationists, as we'll see in today's post on the contributions of freethinkers.

Richard Leakey was born in Nairobi in 1944, son of the famous archaeologist Louis Leakey. The elder Leakey was a strong supporter of racial equality, and Richard's upbringing reflected that belief. He started school soon after the Mau Mau rebellion had been defeated, and when he spoke up in favor of the native Kenyans, his classmates taunted him as a "nigger lover", beat him, spat on him and forced him into a wire cage. Several online sources say that he also resolved never to be a Christian after he was caned for missing chapel services.

Partly due to incidents like this, Richard never finished high school. But despite this, he showed an impressive aptitude of his own for finding fossils of human ancestors - including Turkana Boy, one of the most complete hominid skeletons ever unearthed, which was discovered by a paleontological team under his direction. He also showed impressive skill at administration, becoming director of the National Museums of Kenya at just 25.

In 1989, in response to an international outcry over the slaughter of elephants and rhinos by poachers, President Daniel Arap Moi appointed Leakey head of the Kenya Wildlife Service and tasked him with protecting Kenya's endangered wildlife. Leakey accomplished this in characteristically bold fashion - by creating well-armed, specially-trained park ranger units that were authorized to shoot poachers on sight. Draconian though this seems, it was effective: almost a hundred poachers were killed during his first year at KWS, and poaching rates declined thereafter. Leakey also made international headlines when he burned 12 tons of confiscated illegal ivory, worth more than $3 million, in a massive bonfire.

In 1993, Leakey was flying a small private plane that crashed near the Great Rift Valley. This is widely believed, though never proved, to have been sabotage by someone seeking to assassinate him, probably in revenge for the anti-poaching campaign. He survived the crash, though he was badly injured and both his legs had to be amputated. Within a few months, however, he was up and walking again on prosthetics and back on the job.

Unfortunately, as a crusading reformist, Leakey may have been too zealous even for his own government. President Moi demanded that he reinstate 1,600 KWS employees who had been fired for corruption or inefficiency, and when Leakey refused, Moi gutted the agency, taking away most of its budget and power. Leakey resigned in protest, and in 1995, founded a new political party, Safina, devoted to the cause of reform. His campaign drew angry threats from British settlers who felt his zeal was putting them in jeopardy, and on one occasion, he was attacked by a mob loyal to Moi's party. As always, however, he refused to quit, and two years later, he won a seat in Kenya's Parliament. A year after that, with international lenders withholding funds because of pervasive corruption, Moi asked Leakey to rejoin his administration. As a January 2010 article in Sierra puts it:

So Richard Leakey, five times accused of treason — and of being a racist, colonialist, and atheist (the only accusation to which he pleads guilty) — was named head of Kenya's Public Service.

This time, Leakey had even more power than before: in his new job, he had authority second only to the president. But even that wasn't enough, and when his anti-corruption efforts ran into repeated political roadblocks, he quit for the second time - and swore off politics for good.

At 65, Leakey still lives in Kenya, hale and hearty after two kidney transplants and still working to advance the cause of conservation in the country where he's spent nearly all his life. His most recent achievement is the launch of WildlifeDirect, a website that directly connects Western donors with conservationists and field biologists working with threatened and endangered species throughout the world. In 2008, WildlifeDirect helped to fund and train 700 park rangers in the Democratic Republic of the Congo.

Throughout his life, Leakey's zeal for combating corruption has been exceeded only by his passion for bridging the gap between humans and nature, whether through unearthing our fossil past or preserving our threatened present for posterity. It's plain that his being an atheist didn't deprive him of an ethical compass. If anything, it contributed to the sense of profound interconnection with the natural world that's driven all the greatest advocates of conservation, past and present. Richard Leakey is one freethinker that atheists can be proud to have on our side.

April 21, 2010, 8:06 pm • Posted in: The Loft • 10 comments

Another Branch on the Human Family Tree

I haven't written about any new transitional fossils in a while, so it's a great pleasure for me to mention this one: a hominid skeleton nicknamed "Ardi", a specimen of Ardipithecus ramidus. This species was known from other fossil fragments, but Ardi is one of the oldest and most complete hominids found so far, and may give us the most insight yet into what the common ancestor of humans and chimpanzees looked like.

Image copyright 2009, Jay Matternes.

Ardi lived about 4.4 million years ago (by comparison, Lucy and her fellow australopithecines are about 3.4 million years old), in the Middle Awash region of modern-day Ethiopia. Today it's an arid badlands, but in that era, it was a lushly forested woodland, cool and wet but geologically active, with frequent volcanic episodes (a great boon to biologists, since volcanic rock and ash strata are easily dated with radiometric methods and give us good estimates of when a certain fossil lived). Primitive elephants, giraffes, horses, antelope, rhinos and monkeys are well-known from this area, as are other hominid specimens.
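
To give a sense of how those strata become dates: every radiometric method rests on the same age equation, t = ln(1 + D/P) / λ, where λ is the parent isotope's decay constant and D/P is the measured daughter-to-parent ratio. Here's a back-of-the-envelope sketch of my own using potassium-40; the ratio below is invented for illustration, and real potassium-argon dating applies a branching correction that this sketch simply folds into the ratio:

```python
import math

# Generic radiometric age equation: t = ln(1 + D/P) / lambda.
# Illustrative only: the ratio below is invented, and real K-Ar dating
# corrects for the fact that only a fraction of K-40 decays yield Ar-40.

K40_HALF_LIFE_YEARS = 1.248e9  # half-life of potassium-40
DECAY_CONSTANT = math.log(2) / K40_HALF_LIFE_YEARS

def radiometric_age(daughter_parent_ratio):
    """Age in years implied by a measured daughter/parent isotope ratio."""
    return math.log(1.0 + daughter_parent_ratio) / DECAY_CONSTANT

# A hypothetical ash layer with an effective Ar-40/K-40 ratio of 0.00245
# works out to roughly Ardi's published age:
print(f"{radiometric_age(0.00245) / 1e6:.1f} million years")  # ~4.4
```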

The fossil itself is believed to be a female. The bones were so poorly fossilized, according to the Science paper by Tim White and colleagues, that they would crumble if touched. The researchers painstakingly chipped them free of the surrounding rock with dental picks, bamboo, and porcupine quills (!). Nearly 15 years of preparation and study passed between the fossil's discovery and its publication - but by all accounts, it was worth the wait.

In life, Ardi would have stood just under four feet tall and weighed about 110 pounds. The skull was small - about 325 cc, roughly the same size as a chimp's. Ardi's teeth suggest she was an omnivore, and by comparing other A. ramidus teeth and bones found in the region, White and his colleagues found little difference in tooth size or body size between male and female individuals. This suggests that their mating style was relatively peaceful, with little competition for mates (as compared to chimpanzees, whose males have massive canine teeth used to intimidate potential rivals) and possibly more stable pair-bonding and group cohesion.

Ardi's hands, feet and pelvis tell us a lot about how she got around. Hominids like Lucy show a mosaic of bipedal and arboreal adaptations - as Laelaps puts it, they "had their hands in the trees and their feet on the ground" - and Ardi shows a more primitive version of the same pattern, much as we'd expect from an ancestor of that age.

She stood and walked upright, though not as well as Lucy or us, and her feet were becoming more rigid like ours, except that she also had an opposable big toe useful for grasping. Her arms were long enough to reach her knees when she stood upright, but her hands were not adapted for knuckle-walking. Nor did they have the specializations for climbing and hanging from trees that we see in modern apes. She still lived in the trees, but would have moved through them more slowly and carefully than chimps or orangutans, and was capable of descending to the ground and walking. This refutes the once-popular belief that bipedalism first developed when human ancestors left the forest for the savanna and adapted to stand upright so as to see over the grass - as species like Ardi show, bipedalism evolved before we left the trees.

Another popular but erroneous idea that Ardi refutes is that the common ancestor of humans and chimps looked basically like a chimp, and that humans have changed significantly while modern chimps are little different from our common ancestor. This is probably tied to the misconception of the "great chain of being" that sees humans as the highest or most advanced form of life on Earth. Ardi, who probably lived relatively near the time when our two lineages split, instead shows that both humans and chimpanzees have evolved and specialized since the time of our common ancestor, becoming adapted to two very different ways of life.

Other articles:

Carl Zimmer. "Ardipithecus: We Meet At Last." The Loom, 1 October 2009.

Tim D. White, Berhane Asfaw, Yonas Beyene, Yohannes Haile-Selassie, C. Owen Lovejoy, Gen Suwa, and Giday WoldeGabriel. "Ardipithecus ramidus and the Paleobiology of Early Hominids." Science 326 (2 October 2009): 64, 75-86. (Full text online; requires free registration.)

October 3, 2009, 3:18 pm • Posted in: The Observatory • 6 comments

The Case for a Creator: Meet Your Ancestors

The Case for a Creator, Chapter 3

In the final section of chapter 3, Strobel and Wells turn to the evidence that creationists loathe above all else: the fossil hominids that make up the human family tree. Human ancestors are not only a clear, obvious transition that even a layperson can understand; they also directly demonstrate that we ourselves are a product of evolution, thus striking at the desire to be separate, special creations that almost certainly motivates nearly all creationists.

I strongly suspect that creationism as a movement would never have arisen if scientists hadn't insisted on encompassing the human species in evolution's family tree. Whatever the creationists say, they don't really care about turtles or oak trees or earthworms. If scientists were willing to grant that human beings were special, unrelated to the rest of Earthlife, creationists would probably have been happy to concede that every other species came about from a process of mindless natural selection. But the evidence doesn't support a separate origin for humanity, and the idea that we might be one of those animals - a relative of slime molds and toadstools, of centipedes and cyanobacteria - enrages creationists, who can't bear to believe in a universe in which they are not the central and most important figure. In their quest to reclaim that sense of specialness, they would gladly obliterate the best theory ever devised to explain the true origins and diversity of life as we now see it.

And this leads us to the last section of Strobel's interview with Jonathan Wells. We begin with Java Man, who, according to his discoverer Eugene Dubois as quoted by Strobel, "represents a stage in the development of modern man from a smaller-brained ancestor" [p.61]. Strobel points out - for once, correctly - that the find consisted of a skullcap, a femur and some teeth, but that the femur and the teeth are now believed to belong to different species.

Nevertheless, Strobel writes as though Java Man is an isolated find, a single fossil fragment drifting in a void of uncertainty. As usual, the creationists have ignored the abundant corroboratory evidence. Java Man is just one specimen of a well-known hominid species, Homo erectus, that is known from many other specimens - including Sangiran 17, a far more complete skull that was also found on Java - and even more spectacularly, the Turkana Boy, a nearly complete skeleton of an approximately 12-year-old erectus boy found near Lake Turkana in Kenya. All these specimens, including Java Man, share the characteristics that make them unlike modern humans: a sloping forehead, heavy brow ridges, a large jaw with no chin, and a braincase much smaller than ours (between 750 and 1100 cc, depending on the specimen's age, while most modern sapiens have brains of about 1350 cc).

What do the creationists think Homo erectus is? We never find out Strobel's viewpoint, since neither he nor Wells ever mentions these fossils. The closest he ever comes is asserting that Java Man is a "true member of the human family" [p.62]. That's actually correct, although it doesn't mean what Strobel thinks it does.

Aside from this brief discussion of Java Man, we hear nothing more about any specific fossil. Wells spends the rest of this brief section complaining about how artistic reconstruction of fossils is a speculative field [p.62] and quote-mining science writers who point out that we cannot reconstruct exact lines of descent from fossils - which is true, but Wells acts as if this means that every theory ever devised about human evolution is worthless. The lesson he takes away is not that we must be careful to only propose testable hypotheses supported by the evidence, but that "Darwinists assume the story of human life is an evolutionary one, and then they plug the fossils into a preexisting narrative where they seem to fit" [p.63], as if the fossils themselves had no meaning and could be used to support any conceivable hypothesis equally well.

I also want to highlight one particularly obnoxious bit of dishonesty. Here's Wells quoting science writer Henry Gee:

"In fact, he said that all the fossil evidence for human evolution 'between ten and five million years ago - several thousand generations of living creatures - can be fitted into a small box.'" [p.63]

It's true that the oldest fossil evidence of human evolution - the species nearest the branch point of humans and other apes - is fragmentary. But by definition, those species would be the least humanlike. What Wells neglects to mention is that all the most important fossil evidence showing how humans became human is younger than five million years! Australopithecus afarensis, and the other australopithecines, are between 4 and 3 million years old. Homo habilis is between 2.5 and 1.5 million years old. Homo erectus is between 2 million and half a million years old. We have multiple fossils for most of these species and others, far more than would fit in a "small box". Wells' sleazy tactics would be like a defense attorney getting a witness to admit that he saw nothing unusual between 5 and 6 PM, and triumphantly concluding his client was innocent - even though the crime took place at 7.

Again, what stands out about this section is how little time Strobel and Wells spend on discussing the actual fossils of human ancestors. We never hear about Turkana Boy. We never hear about Lucy or Homo habilis. What were these creatures? How does the intelligent-design worldview explain them? This is a question Wells steers well clear of, other than repeating postmodernist claims that any explanation is just as good as any other.

Now I'll do something that Strobel and Wells never do: show you the fossils so you can see them for yourself. Here's a table, with pictures, which lists some of the most important hominid specimens and shows what creationists think about each of them.

As you can see from the table, although all the creationists are adamant that every fossil is either fully human or fully ape, they can't agree on which is which. (Java Man in particular is an almost even split, especially if you count Strobel and Wells' claim that it's human.) This, of course, is exactly what we would expect if these fossils were genuinely transitional: being intermediate between two groups, they would resist unambiguous classification as one or the other. Ironically, the creationists themselves provide the best testimony of that.

July 31, 2009, 6:55 am • Posted in: The Observatory • 164 comments

Noble Africa

To those who are following the continuing genocide in Darfur, every day brings grim headlines:

Fighting has prompted thousands of people in the southern part of Sudan's Darfur region to seek security and shelter at a refugee camp in the northern part of the war-torn area, according to the United Nations.

...An estimated 300,000 people in the western Sudanese region have been killed through combat, disease or malnutrition, according to the United Nations. An additional 2.7 million people have been forced to flee their homes because of fighting among rebels, government forces and the violent Janjaweed militias.

Though its plight has attracted the most attention, Darfur is far from the only troubled region of Africa. There's the failed state of Somalia, now a haven for terrorism and piracy, and the outbreaks of famine and cholera brought on by the near-total collapse of Zimbabwe in the face of dictator Robert Mugabe's refusal to surrender power, to name just the two most prominent examples from recent headlines. How did we let this happen?

Africa was the human race's first home. It is our birthplace, our cradle. The continent should be a sacred place to all of us, a living temple of memory reminding us of our origins. Instead, it's poverty-stricken, politically fractured, still laboring under corrupt autocracies and mired in backwardness and superstition. The picture is not all bleak - there are success stories, and notable bright spots - but even so, Africa as a whole lags behind the rest of the world, and still struggles with the legacy of imperialism and the unbridged chasms of its own political divides.

And yet, there was a time when all humans were Africans. Though we've spread all over the world in successive waves of migration, our genes have not forgotten the past. Whether you're European or Asian, from the Arctic or from Polynesia, your heritage can be traced back to families that lived in Africa millions of years ago. If you care to categorize on the basis of something as superficial as skin color, then you can know to a certainty that the blood of black men and women flows in your veins.

It was Charles Darwin who ventured the bold guess that the human race evolved in Africa, and the evidence has vindicated him. It's in Africa that we find the bones of our earliest known ancestors and our close cousins in the human family tree: species like Lucy's, Australopithecus afarensis, small hominids who had the heavy brows and brain size of chimpanzees but stood and walked upright like us. It's in Africa, at Laetoli, that we find the oldest trace evidence of human bipedalism: two trails of footprints frozen in stone, more than three and a half million years old, where three people - perhaps a family of man, woman and child - walked together across a field of new-fallen volcanic ash. It's in Africa, in Tanzania's Olduvai Gorge, that we find some of the earliest stone tools. And more controversially, it's in Africa, at sites like Kenya's Koobi Fora, that we find possibly the earliest evidence for the domestication of fire.

In short, it's in Africa that we learned to be human. It was under the shade of Africa's trees that we first descended to the ground, and on African savannas that we stood upright and walked for the first time. The songs of our childhood were first sung beneath an African dawn; the stories that echo in your bones were first told around African campfires.

Of course, we did not stay in our birthplace forever. As the population grew and wanderlust took the human spirit, we flowed out in successive waves of settlement and conquest. We spread north into the fertile crescent of the Middle East, where we first domesticated animals and plants and built the world's oldest cities, and into Ice Age Europe, where we eradicated our brothers - the stocky, heavy-browed Neanderthals, who had lived and thrived in the frozen landscape for tens of thousands of years until we arrived. We walked across the Bering Strait into the Americas and fanned out across the Pacific by raft and canoe. We spread over the face of the earth, building mighty civilizations and forging empires in battle and conquest. And, in due time, the conquerors returned - to their own birthplace, had they but known it - and put it under their heel as well.

It took centuries for Africa to throw off that yoke, and the injuries that it suffered still are not fully healed. Its people still grapple with endemic disease, with political corruption and with their own tribalisms, all of which are exacerbated by poverty and international neglect. But still and all, Africa is a noble continent, not in the condescending caricature of the "noble savage", but nobility in the true sense of the word: those whose blood is purest, whose lineage traces back longest. It is still the home of the most deeply rooted branches of the human family tree: as Richard Dawkins writes in The Ancestor's Tale, the disappearance of everyone outside Africa would decrease human genetic diversity only slightly, while the disappearance of everyone in that continent would mean the loss of most of our species' gene pool. Compared to Africa, all the rest of humanity is a prodigal son, descended from a relatively small number of restless wanderers who left a great and ancient family to seek their fortunes in the world.

In the slow ascent of human progress, we have many milestones left to reach. There are ancient trouble spots throughout the world, and we can count ourselves more advanced as we overcome each of them. But the turmoil of Africa is our species' greatest shame. I can imagine an Earth where Africa takes its rightful place among the pantheon of peoples; an Africa where the archaeological sites of humanity's origin are sites of pilgrimage, sacred places preserved for all to see and walk in the footsteps of our ancestors. I can imagine an Africa that's peaceful and prosperous, where gleaming cities exist alongside the simple beauty and grandeur of the savannahs and rainforests that were our childhood home. We may, perhaps, have no right to call ourselves truly advanced until that world is a reality.

March 6, 2009, 7:55 am • Posted in: The Observatory • 23 comments

The Story of Atheism

In my previous post, I wrote some thoughts on the power of storytelling and how atheists can use it to our benefit. In this post, I intend to apply those principles to tell a story: the story of atheism.

Because gods are fundamentally human creations, this is also a story of humanity. It opens in the time when the human race was newborn, when we had first come of age as conscious beings who could look around and conceptualize the world. I don't know the exact nature of the beings in whose minds these ideas first appeared - they may not have been modern Homo sapiens, but they were undoubtedly our ancestors and deserve to be described as such.

The end product is somewhat similar to my atheist psalm, "The Gods", and somewhat similar to treatises on the origins of religion like Dennett's Breaking the Spell. In the name of narrative convenience and brevity, some details have been omitted from this story. Nevertheless, I think it gives an adequate, if simplified, account of events that actually happened in our past. Editorial suggestions are, as always, welcome.


In the beginning, Humanity was lonely and afraid. We had tremendous potential, but we were still simple creatures, knowing only the rudiments of survival, and at the mercy of a world that was chaotic and full of danger. Like children lost in the wilderness, we knew that we existed, but not where we had come from, nor what happened to us when we died.

To ease our loneliness and fear, in our imaginations we filled the world with other people: people who lived in fire and water, in earth and trees, in sun and moon. From what we knew then, this was reasonable: after all, the only other things we knew of that reacted to us with as much complexity and inscrutability as these natural phenomena were our fellow human beings. And if the natural events that governed our lives were personified, then perhaps those people could be supplicated in times of trouble, perhaps they could be persuaded to have mercy on us. But because these other people were invisible, we called them spirits; and because we could not control the seasons or the weather, we reasoned that these spirits must be more powerful than us.

When agriculture was discovered, our population expanded and we became sedentary. But this meant we were even more dependent on nature's favor, and staying in the good graces of the spirits became even more important. Thus, in our eyes, they became more powerful still, and were elevated from spirits to gods - invisible beings who had power over our lives, and who had to be appeased above all else. This was the birth of religion, as our duties to the gods became formalized, crystallizing from folk superstitions about what had seemed to bring prosperity in the past.

These ideas stayed with us, and as our knowledge and our civilization expanded, they too began to grow in scope. As tribes merged into nations, the gods ran together, like drops of water merging. When war was kindled, the rulers sought to fill their people with courage by assuring them that the gods were on their side and would see that they prevailed over the enemy - or, at worst, that their spirits would end up in a pleasant afterlife. And as human power continued to grow and nations were forged into empires, the gods of the victors grew ever more powerful, the success of their worshippers tangible proof of their expanding dominion over the earth.

At first, the gods and the earthly ruler were one, and the voice of the king was assumed to be the voice of the divine. Through assertions of power both earthly and otherworldly, their sway was initially absolute. But as the gods grew in power and influence, it became more advantageous to claim the right to speak for them. This was especially true when disaster struck a society, when the rulers had made bad decisions and their link to the gods could be doubted. Small wonder, then, that prophets began to appear who preached that the existing authorities were corrupt, that the gods wanted something different of us, and that they themselves had been granted insight into this new path. And small wonder, too, that the more persuasive of these prophets attracted followings of their own.

What this led to was a decoupling of religion from the state apparatus and a flowering of religious innovation, as new sects of every kind arose, expressing all the creativity of which the human mind is capable. Wherever there was a human need unmet by the existing society, new religions sprang up promising to fill it. Of course, the state-run religions still existed and often lashed out harshly at their competitors. In other places, new religions grew in power until they became the established authority, or were coopted by an existing state whose rulers found their tenets to be useful. And old religions that had become bureaucratic and impersonal were often outcompeted by younger, more vibrant faiths and dwindled away, their gods' voices fading to nothingness as their followers died out and their temples crumbled.

All this was the pattern of human society for millennia. Belief in differing gods led to bloody wars between societies, but also sustained a shared cultural identity within a society, leading to a stable equilibrium. Every era had skeptics and doubters of the established faith, but few of them gained any great following, since they had no alternative religion to offer on which they could build a power base. But in one society in particular, there came an era of enlightenment, when great thinkers dared to ask questions of the world... and in at least one time, at least one place, there were enough skeptical minds put together to fan the embers that had been smoldering throughout human history into flame. The scientific age had dawned.

At its essence, the scientific era was underlain by a simple, revolutionary idea: statements about the world should not be accepted on the basis of faith, but proven by open and systematic testing. But simple as it sounds, the advances it brought us were immense. Fired by the thrill of discovery, the heralds of the scientific age sent their new paradigm sweeping out over the world like a universal acid, dissolving the superstitions and dogmas that had for so long impeded our thinking.

In the light of science, the natural phenomena that had once seemed so inscrutable, so humanlike, lost their mystery as the hidden rules underlying them were laid bare in all their grand, mechanical glory. We peered into the dark and discovered that the cosmos was not a place of thundering spirits or leering devils, but a vast machine, one whose guiding principles meshed with all the harmonious elegance and regularity of great gears. Even life itself, so long thought to be supernatural, was revealed to be another machine, albeit a particularly complex and subtle kind. The deities and demons that had once dwelled in the interstices of our ignorance washed away like sand in water, as we learned about the origins of the world, of the human species, of the mind. At least in some measure, we have grasped the truth, and learned that it was far more intricate, more satisfying, and more wondrous than the imaginings of our youth. Science does not have every answer, nor does it offer guidance for every aspect of life, but when it comes to finding out how the world works, it has no equal.

The reverberations of this era of change are still with us. We live in a time, one ongoing since the Enlightenment, when the old certainties of faith are shifting underfoot. Every sect has dealt differently with these changes, but none has entirely avoided them. Some people are moving their gods into ever more rarefied realms to escape the relentless probing, crafting deities whose existence is indistinguishable from their nonexistence. Others, more militant, are reaffirming the old creeds with fiery zealotry, denouncing scientists for their godlessness, and boasting and cheering one another for their stubborn clinging to faiths that are childlike in their ignorant simplicity. Still others, probably the majority, have come to a reluctant accommodation with the scientific outlook, but bank their hopes on finding tangible traces of the gods in the shrinking areas we haven't yet investigated - an unsustainable compromise, whether they know it or not.

And now, into this new world, come those who did not grow up in the shadow of gods, and who have taken the simple, revolutionary step of asking why we should believe any proposition for which there is no evidence. The crude fundamentalisms of humanity are all alike in their falsehood; the unfalsifiable beliefs are all alike in their irrelevance. In place of chasing these shadows and clutching at these mirages, this new generation of free thinkers has come to the realization that we should turn our attention to the things that are real, that are verifiable - the only important things. In place of trying to appease phantoms of our imagination, we should turn our attention to bringing goodness into this world and easing the burden of our fellow creatures.

The atheist view can seem cold and comfortless to novices, for it does not promise that all our hurts will be succored. Nor does it give us guardians hovering above to guide our steps. But where atheism requires us to abandon the consolations of childhood, it brings in their place the maturity of adulthood. Instead of clouded sight, it brings clear vision. Instead of gods and angels to watch out for us, it brings the realization that we must look out for each other. We live in a vast and uncaring cosmos, but we have each other to depend on, and the freedom to succeed or fail by our own efforts.

This is our story, and we are all characters in it, as well as the storytellers. But unlike any other character, we see the story we are in, and our choices will write the next chapter. In spite of everything, the darkness of our past may come sweeping back, and our future may be a fall back into the same precipice we have been painfully climbing out of. Or the slow, frustrating, yet upward trajectory of history may continue, into a bright future that surpasses our imagination as far as the truth surpasses the imaginings of the past.

January 5, 2009, 10:47 pm • Posted in: The Observatory • 19 comments

The Scars of Evolution

Human beings, like all other species on this planet, have a history. We came into existence through a process of slow, grinding trial-and-error, occurring over geological time via the sieve of differential survival. And like all species, our bodies and our genes reflect and bear witness to that history. Far from being perfect, one-time creations, we still bear the scars of the evolutionary process that made us.

This post will discuss some of the lines of evidence which hint at humanity's past. I won't repeat that well-known example of an evolutionary vestige, the human appendix. Instead, I'm going to focus on a few other examples that aren't as widely discussed.

Toes. It's only because we're used to having toes that we don't usually consider how strange they are. Why do our feet have these stubby, non-functional digits on the ends? They can't grip nearly as well as fingers, and we don't need them to balance or to walk. (Why not just have a fused front of the foot?) By contrast, anyone who observes other primate species can see that they have, not two hands and two feet, but four hands, all of which are good for grasping. As human beings gained the ability to stand and walk upright, our feet lost their grasping function, but the digits themselves, though now shrunken and largely useless, remain.

Lanugo. This little-known developmental phenomenon is an important clue to our mammalian past. Lanugo is a coat of fine, downy hair that fetuses grow while in the womb, covering the entire body except for the soles of the feet and the palms of the hands. Typically, lanugo is shed by the seventh or eighth month of pregnancy, although premature infants may retain it for several weeks after birth. The question is why we grow it at all, and the theory of evolution can easily explain this as a vestigial characteristic retained from our furry ancestors.

Goosebumps. Fitting neatly together with lanugo is the vestigial human trait called the pilomotor reflex. When a person is cold or frightened, tiny muscles at the base of each hair contract, causing body hair to stand on end. In animals with thicker fur, this is a useful reflex: erect hairs trap air to create a layer of insulation, and they also make the animal appear larger and more intimidating. In humans, however, it is pointless. Like lanugo, goosebumps are a giveaway clue indicating that relatively hairless human beings are descended from furry progenitors.

Hiccups. Yes, hiccups are a sign of humanity's evolutionary past. In fact, unlike goosebumps or lanugo, which merely point to our shared history with hairier mammals, hiccups point all the way back to the time when humanity's ancestors were amphibians. According to this article by Neil Shubin (HT: The Panda's Thumb), the hiccup reflex is controlled by an area of the brain that we share with tadpoles. The hiccup consists of a sharp inhalation followed by a closing of the glottis (the valve at the top of the windpipe). In tadpoles, which have this same reflex, the inhalation draws water into the mouth, where the gills can process the oxygen it contains, but closes the glottis so the water does not enter the lungs. For tadpoles, it's a vital breathing reflex; in humans, it's a hiccup. And the same measures that often arrest hiccups in human beings (inhaling carbon dioxide, stretching the chest wall by taking a deep breath) also stop the gill-breathing reflex in tadpoles.

The true human tail. One of the most shocking - for creationists, anyway - human atavisms is the true human tail. On rare occasions, human infants are born with short, non-prehensile, but undeniably real tails, up to several inches in length and containing nerves, blood vessels, muscle fibers, and sometimes even extra vertebrae. They can move through voluntary muscle contraction.

In fact, all human embryos grow tails while in the womb, and normally they are reabsorbed before birth. The true human tail is the result when this does not happen. Usually they are surgically removed, although they are benign. For an evolutionary scientist, the reason why we grow them is obvious: we are descended from an ancestor species which had them. For creationists, who claim that human beings were created complete and separate as we currently are, it must be difficult to explain why we have so many vestigial structures that link us to other species of mammals.

The fused chromosome 2. It's long been known that human beings have 23 pairs of chromosomes, while the other great apes, such as gorillas and chimpanzees, have 24. It is all but impossible that the lineage that led to humans could have completely lost an entire chromosome's worth of genetic material and still produced a viable organism. Where, then, did the extra pair go?

Chromosomes are tipped with distinctive segments of DNA called telomeres and have another special segment called a centromere in the middle. Lo and behold, human chromosome 2 has a telomere at one end, then a functioning centromere, then a stretch of telomeric repeats in the middle, then a second, inactivated centromere, then a final telomere - exactly the structure we would expect to find if two ancestral chromosomes had fused end-to-end into one. When we compare this chromosome to the two corresponding ape chromosomes, we find a compelling match, indicating that this chromosomal fusion occurred at some point after the human lineage split from our ape relatives.
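
For the curious, here's a toy sketch of my own (not from any genetics paper; the sequence is made up) of what that fusion signature looks like in raw sequence: the telomere repeat TTAGGG runs up to the fusion point, then flips to its reverse complement CCCTAA on the far side. In the real chromosome, the repeats near the 2q13 fusion site are degraded rather than perfect, and the analysis is done on the actual genome assembly:

```python
# Toy illustration of a head-to-head telomere fusion signature.
# The sequence below is invented; real data would come from the human
# genome assembly, where the repeats near the fusion site are degraded.

FORWARD = "TTAGGG"  # telomere repeat running toward the junction
REVERSE = "CCCTAA"  # its reverse complement, on the far side

def find_fusion_point(seq):
    """Index where forward telomere repeats give way to reversed ones."""
    for i in range(len(FORWARD), len(seq) - len(REVERSE) + 1):
        if seq[i - len(FORWARD):i] == FORWARD and seq[i:i + len(REVERSE)] == REVERSE:
            return i
    return None

toy_sequence = "TTAGGG" * 4 + "CCCTAA" * 4  # hypothetical fusion junction
print(find_fusion_point(toy_sequence))      # -> 24
```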

The vitamin C pseudogene. Unlike most mammals, human beings can't synthesize their own vitamin C; we must ingest it as part of our diet, or else we develop scurvy. Under the hypothesis of special creation, humans were created this way from the beginning, so we wouldn't expect evidence that we once had this ability but have since lost it. However, according to evolution, we are descended from other mammals, and since most mammals can make their own vitamin C, we'd expect that human ancestors had this ability at some point as well. If so, our genes may preserve evidence of it.

Sure enough, human beings have a version of the vitamin C synthesis gene, but ours is "broken", disabled by mutations. Our primate relatives, who also lack this ability, also have broken versions of the gene. Just as evolutionary theory would predict, the same disabling mutations that exist in the human gene can be found in the genes of chimpanzees, orangutans, and macaques - compelling evidence that we are all descended from a primate common ancestor who incurred this mutation at some point in the past. (It's likely that this mutation wasn't selected against because all primate diets are rich in fruit, providing abundant vitamin C.)
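
The logic of the inference can be put in a few lines of code. This is my own toy illustration - the mutation labels are invented placeholders, not real GULO gene coordinates - but it captures the reasoning: damage shared by every species in the group is most parsimoniously explained by a single event in their common ancestor:

```python
# Invented placeholder data: which inactivating changes appear in each
# species' copy of the vitamin C (GULO) pseudogene. Real analyses use
# actual sequence alignments, not labels like these.
pseudogene_mutations = {
    "human":     {"deletion_a", "stop_codon_b", "frameshift_c"},
    "chimp":     {"deletion_a", "stop_codon_b", "frameshift_c"},
    "orangutan": {"deletion_a", "stop_codon_b"},
    "macaque":   {"deletion_a"},
}

# Mutations present in every species must predate their divergence:
shared_by_all = set.intersection(*pseudogene_mutations.values())
print(shared_by_all)  # -> {'deletion_a'}: inherited from a common ancestor
```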

Taken together, the scars of evolution provide abundant evidence of humanity's history. Like all species on this planet, we are not unique special creations. We are one end result of a long process of mutation sieved through selection, a countless series of adaptive compromises and tradeoffs. Our very bodies testify to the natural forces that have shaped us through the vast expanses of time.

March 10, 2008, 7:31 am • Posted in: The Observatory • 108 comments

Book Review: The God Part of the Brain

(Author's Note: The following review was solicited and is written in accordance with this site's policy for such reviews.)

Summary: Contains many interesting ideas, but the informed reader will find much to take issue with.

Atheist Matthew Alper's The God Part of the Brain seeks to explain the religiosity of humankind in terms of human evolution and the biology of conscious experience. Alper's hypothesis is that the increased intelligence that gave human beings an evolutionary advantage also gave us the ability to foresee our own inevitable deaths. To prevent people from becoming debilitated by this knowledge, evolution counteracted death anxiety by instilling in us a biological predisposition to believe in gods, a soul, and an afterlife. Now that we understand why we believe in these things, he argues, there is sufficient evidence to conclude that they are all just cognitive illusions and none of them are real.

Although this book contains many attention-getting ideas, I believe the skeptical, knowledgeable reader will find many good reasons to doubt its thesis. Alper has no formal scientific training that I know of, and is a layman when it comes to biology - and it shows. His conclusion that religious belief is genetically hardwired into the entire human species, so that belief in God is a human trait as natural and universal as language or walking upright, is far too sweeping, and not nearly enough evidence is presented to support it. Other than a brief, footnoted reference to a single twin study, his entire line of argument rests on the assertion that belief in gods, an afterlife, and a spiritual realm is found in every human culture, even if the specifics of that belief differ, and that the only explanation for this is that such belief is a genetically programmed instinct.

Since this is where Alper begins, this is where I will begin as well. It is not the case that every human culture since the dawn of time has believed in a dualistic, Platonic conception of reality. Here is how he puts it:

...every human culture has perceived reality as consisting of two distinct substances or realms: the physical and the spiritual.

...every culture has maintained a belief in some form of a spiritual reality. As this realm transcends the physical, things comprised of spirit are immune to the laws of physical nature, to the forces of change, death, and decay. Things therefore which exist as a part of the spiritual realm are subsequently perceived as being indestructible, eternal, and everlasting. (p.3)

While reading this passage, the counterexample that immediately came to my mind was Buddhism. Contrary to Alper's claims, Buddhism generally does not posit a distinct substance called "spirit" that is immune to the laws of physical decay. On the contrary, the core Buddhist tenet of anatman (literally "no-soul") teaches that human minds, far from being the imperishable ghost in the machine that Western religions envision, are made up only of mutable aggregates called skandhas that are mistakenly identified as an imperishable self. The belief that the self is immutable and permanent is one of the fundamental ideas Buddhism teaches against, regarding it as a delusion that causes all the suffering people experience. Buddhism generalizes this principle to the belief that all things are transient and impermanent. As explained on this site:

The one great law of the universe, then, is change. Phenomena come into being, mature and disappear. They are the result of conditions; when the conditions change, they also change or disappear. Even those things which appear as permanent are impermanent. Entire universes come into being, mature and disintegrate. Buddhism does not recognize a primal cause, nor does it recognize the existence of a permanent, unchangeable substance in anything. Rather, it sees all things as constantly changing, as conditionally created.

Alper's understanding of Buddhism is seriously lacking. Several times, he mentions the Buddhist concept of nirvana, but speaks of it as if it were equivalent to the afterlife in the Western religions, a place where the immortal souls of the deceased go to dwell. Again, this is a gross mischaracterization of Buddhist teaching, which regards nirvana as a state of non-existence, insofar as it can be described in words at all. In fact, the word literally means "extinction".

Other examples could be adduced - the ancient Greek Atomists, some forms of Judaism - to show that not all cultures or religions believed in an immortal soul and a spiritual afterlife, as Alper incorrectly claims. The basic point is that the fundamental claim underlying all his assertions, the supposed universality of human belief in the spiritual, simply is not true.

In addition to this, I also find fault with Alper's scientific claims, specifically his claim that the only way to explain a universal or near-universal human belief is as a hardwired adaptation. Granted, in non-intelligent, non-sentient species that live their lives propelled entirely by instinct, it is sound to infer that any universally observed behavior is dictated by genes. But human beings are obviously not such a species, and it is here that Alper's analogies between human religion and the reflexive responses of planarians to light fall short. In addition to instinct, we have a wholly new level of mental and cultural complexity not shared by other species, and this undermines any simplistic claim that all our behaviors must be programmed by our genes.

Consider a parallel case. All human cultures have also worn clothes, in some form or another. Does this mean that clothes-wearing is also hardwired into us, programmed in our genes? Do the As, Ts, Cs and Gs of our DNA spell out instructions on how to cut and stitch a pair of pants, somewhere on our chromosomes? Are cultures that prefer robes, kilts or togas made up of mutants carrying an alternative allele of the clothes-wearing gene?

As any reputable biologist would agree, this is plainly absurd. There are very good cultural reasons why people wear clothes, including protection from the elements, societal notions of modesty, the desire to attract the opposite sex, and displays of social status. This commonality can be accounted for by basic, general similarities in the architecture of the human mind, and does not require elaborate scenarios postulating a specific selective advantage for early clothes-wearers. As compared to the null hypothesis, the claim that there exists a specific "clothes-wearing" gene is a positive assertion and as such takes on the burden of proof. Without empirical evidence to support such an idea, it becomes nothing more than a speculative "just-so" story, an example of armchair theorizing unsupported by the facts.

If religion is not a hardwired instinct, how else can its prevalence be explained? There are three alternatives:

1. The memetic explanation (adaptive): Religion is not hardwired in our genes, but has spread and become universal because it offered an advantage to human cultural groups that practiced it - societal cohesion and cooperation, willingness to sacrifice oneself in war, the establishment of law and order through divine-command morality, or whatever else - and groups that did not have this advantage were unable to compete with those that did, and eventually died out.

2. The spandrel explanation: Religion per se is not hardwired in our genes, but is an accidental byproduct of some other beneficial adaptation that evolution selected for in our species' past, such as the propensity to participate in dominance hierarchies, the desire to seek cause-and-effect relationships in the world, or the urge to anthropomorphize natural phenomena we do not understand. (See Daniel Dennett's Breaking the Spell for a run-down of these possibilities.)

3. The memetic explanation (parasitic): Religion is not hardwired in our genes, but has spread and become universal because it is advantageous to the religious memes themselves to do so. In this explanation, religion is like a common cold virus, evolving in ways that improve its own propagation even if this has deleterious effects on the human beings who act as its hosts. (A toy model of this explanation appears just after this list.)
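
To make the parasitic explanation concrete, here is a minimal sketch of contact-driven meme propagation. The population size, contact count, and transmission rate are all illustrative assumptions of mine, not figures from Alper or the memetics literature; the point is only that person-to-person spread can saturate a population with no genetic change in any host.

```python
# A toy model of a belief spreading person-to-person like a cold virus.
# All numbers here are illustrative assumptions.
import random

random.seed(42)
population = 1000
believers = {0}               # a single initial carrier of the meme
contacts_per_generation = 30  # people each believer talks to
transmission_chance = 0.01    # chance per contact of passing the meme on

for generation in range(50):
    converts = set()
    for _ in believers:
        for person in random.sample(range(population), contacts_per_generation):
            if random.random() < transmission_chance:
                converts.add(person)
    believers |= converts

# Starting from one carrier, the meme saturates the population within
# ~50 generations - universality without any "religion gene".
print(f"{len(believers)} believers out of {population}")
```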

Note, also, that these explanations are not mutually exclusive. Like most complex natural phenomena, religion probably has multiple underlying causes, and the true explanation will almost certainly involve all of them to some degree. Personally, I lean towards a combination of 2 and 3, with a dash of 1. I do not, however, believe that religion is genetically hardwired into us, or that it would have entailed any adaptive advantage to humanity if it were.

Alper's hypothesis is an extreme version of genetic determinism: any cultural behavior that is widely or universally practiced must be dictated by genes that force us to instinctually behave in that way. No mainstream biologist or evolutionary psychologist that I know of holds to such a strong version of this idea, not even Richard Dawkins, who has been derided as an "ultra-Darwinian" by his critics.

There is another obvious counterexample to this claim: if Alper is correct, how could there be such people as atheists? He offers two possibilities to explain this. The first is that, like most genetic traits, religiosity exhibits a range of variation, and some people will be born with more or less capacity for it than others:

...there are those we might call spiritually/religiously deficient, those born with an unusually underdeveloped spiritual/religious function.... These are society's spiritually retarded, if you will, or, in keeping with the musical metaphor, those we might call spiritually tone deaf. (p.183)

This hypothesis does have one highly testable implication: there should be a genetic difference between theists and atheists. If it is true that religious belief is a preprogrammed genetic instinct to counteract the otherwise unbearable knowledge of mortality, it should follow that people who lack religious belief but are not crippled by dread must have a different gene that enables them to cope in another way. I strongly doubt any study will ever be performed that finds such a thing, but if one ever were, that would be compelling evidence in support of Alper's thesis.
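
To see what that prediction would look like in practice, here is a minimal sketch. The normal distribution, the score scale, and the cutoff are hypothetical stand-ins of my own, not anything from Alper's book; the sketch only shows that if his model were right, the two groups would be genetically distinguishable.

```python
# A hypothetical rendering of the testable implication: if religiosity
# were a normally distributed genetic trait, and atheists were simply
# its low tail, the groups' genetic scores would differ measurably.
import random
import statistics

random.seed(7)
scores = [random.gauss(100, 15) for _ in range(100_000)]  # trait scores
threshold = 75  # hypothetical cutoff below which belief never develops

atheists = [s for s in scores if s < threshold]
believers = [s for s in scores if s >= threshold]

print(statistics.mean(atheists))    # roughly 69
print(statistics.mean(believers))   # roughly 102
print(len(atheists) / len(scores))  # about 5% of the population
```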

But then again, there is another problem: what about people who start out as theists - often adherents of very intense, fundamentalist forms of theism, which Alper says lie at the opposite end of the bell curve of variation from atheism - and later deconvert and become atheists? Many such stories could be produced. Are we to believe that these people's genes changed during their lifetimes? Obviously not.

Alper's suggestion is that these people's innate proclivities toward religion may have "atrophied", or that they have "chosen to suppress" them (p.183) - but if this is possible, it undercuts his entire hypothesis and throws its falsifiability into serious question. Alper's entire point is that the knowledge of one's future death is such a horrifying and debilitating awareness that people lacking a spiritual part of the brain literally could not survive and were driven to extinction (he says the knowledge was "jeopardizing our very existence" (p.183)). But now he implies that people can suppress this tendency without serious repercussions?

In a later chapter, the book also puts forth an inventive hypothesis, albeit one that strikes me as highly unlikely to be true. It suggests that America's high degree of religiosity compared to most other First World nations is due to a founder effect: most of the early immigrants were religious devotees fleeing persecution, who brought their genetic tendency toward dedicated religious practice to their new nation. If this were the case, how would we account for the fact that New England - settled by the Puritans, one of the most fanatically religious of all America's immigrant groups, as Alper himself documents - is today relatively secular compared to the Bible-belt South, which was originally colonized for economic profit? I suspect, again, that the reasons for the United States' religiosity are cultural, not genetic: the Constitution's guarantees of a secular government have created a spirit of free-market competition among faiths, as opposed to the established European churches that became complacent and apathetic for lack of competition.

There is one more point I have to comment on. Despite being an atheist, and despite proclaiming his confidence in science as the only truly effective method of understanding the world, Alper at one point makes a truly bizarre philosophical claim that contradicts much of what he himself has said:

As all of our perspectives are relative, no species, nor any individual within a species, can ever claim that its interpretation of reality constitutes any absolute truth... just as flies possess fly "truths," humans possess human "truths," neither being any more genuine or "real," just different. (p.226)

How can this not be read as a repudiation of everything he has spent the previous two hundred pages arguing? If different claims to truth are merely a matter of opinion and there is no way to determine which is more accurate, then his claims that evolution has given us a propensity to believe in God should also be viewed as mere opinion, no more valid than any alternative possibility.

This sloppy thinking is all too characteristic of the book, unfortunately. There are some interesting nuggets of information to be had, such as its citation of a deliciously ironic study finding that religious fundamentalists, not atheists, have often had strained and difficult relationships with their fathers. But its major argument is little more than armchair philosophizing, lacking substantial evidentiary confirmation and contradicted in important ways by much of the evidence we do have.

April 1, 2007, 10:30 am • Posted in: The Library

Are Evolved Minds Reliable Truth-Finders?

In recent years, Christian apologists such as Alvin Plantinga have advanced arguments purporting to prove that evolutionary naturalism is a self-refuting worldview. According to these people, if evolution is true and there is no intelligent creator-god, then humans' sensory and rational faculties were created by a blind process that is not concerned with truth or falsity, and therefore those faculties themselves could not reliably detect truth or falsity. The conclusion, as Plantinga and others would have it, is that if we believe evolutionary naturalism is true, we must distrust our own conclusions, including the belief in evolutionary naturalism. In this post, I will show that this argument is not just wrong, it is obviously wrong. An atheist has more than sufficient grounds to believe that their sensory and cognitive faculties are reliable, and it is not just probable but inevitable that a process of naturalistic evolution would result in this.

Over the past century and a half, our scientific study of the world has led to the conclusion that the human species, like all other life on this planet, was created by a process of evolution. Briefly described, evolution is a process by which living things better suited to survival in their environment tend to reproduce more abundantly, while those less well suited reproduce less abundantly. As a result, genes that harm survival tend to fade away, while genes that contribute to survival are passed on and become more common, making them available for further refinement in subsequent generations.
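
As a minimal sketch of that logic - with a 5% survival advantage chosen as an illustrative assumption, not a measurement - consider how even a modest edge drives a gene toward universality:

```python
# Replicator-style update: each gene variant's share of the next
# generation is proportional to its fitness. The 5% advantage is an
# illustrative assumption.

def next_frequency(p, w_advantaged=1.05, w_baseline=1.00):
    """Return the advantaged variant's frequency after one generation."""
    mean_fitness = p * w_advantaged + (1 - p) * w_baseline
    return p * w_advantaged / mean_fitness

p = 0.01  # the advantageous variant starts out rare
for _ in range(400):
    p = next_frequency(p)
print(f"frequency after 400 generations: {p:.4f}")  # close to 1.0
```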

Our minds and senses, like all other adaptations of living species, were shaped by evolution. And like all other adaptations, they could only have persisted to the degree that they aided our survival. If they did nothing but generate false beliefs, then at best they would not harm our chances of survival, and far more likely they would substantially decrease them. In either case, they would soon be eliminated by natural selection - in the latter case because they were an impediment to survival, in the former because they were simply a waste of energy that could more usefully be spent elsewhere (like the eyes of blind cave fish). The human brain accounts for roughly a fifth of the body's resting oxygen and energy consumption, despite being only about 2% of its mass; natural selection could never maintain such a costly organ unless it conferred substantial survival benefits.

Clearly, then, in order for these faculties to persist, they must confer some survival benefit, and it is not difficult to see what that benefit is. What Christian apologists have ignored is that the ability to accurately perceive one's environment and respond appropriately is essential to survival. In this respect, evolution is concerned with the truth or falsity of a creature's beliefs, because while the evolutionary process is blind with respect to method, it is most definitely not blind with respect to results. A creature that could not respond correctly to its environment, or that did so only imperfectly, would be at a significant survival disadvantage compared to one that could perceive more accurately. Therefore, it should be obvious that, all else being equal, evolution will always favor greater accuracy of sensory perception - both the ability to sense the environment with greater fidelity and the disposition to respond correctly to those sensory impressions.

Consider a simple example: a bacterium trying to swim toward a source of nutrition. Suppose this bacterium has chemical receptors on its surface that can detect molecules drifting through the liquid medium all around it. To gain the maximum amount of nutrition, the bacterium needs to be able to sense the gradient - the direction in which nutrient molecules are more concentrated - since that will probably be the direction in which their source is located. Which, then, will have a greater chance of reproducing and passing on its genes - the bacterium that can accurately sense the gradient and move in that direction, or one that is blind to the gradient and strikes out in a random direction?
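
Here is a minimal sketch of that comparison. The linear nutrient field, unit step sizes, and trial count are all illustrative assumptions; the point is only that even crude gradient-sensing vastly outperforms blind wandering:

```python
# Toy chemotaxis: nutrient concentration rises along the x-axis, and we
# compare a gradient-sensing bacterium against a blind random walker.
import math
import random

def nutrient(x):
    return x  # concentration increases linearly along x (an assumption)

def gradient_follower(steps=100):
    x = 0.0
    for _ in range(steps):
        x += 1.0  # always swims up the sensed gradient
    return nutrient(x)

def random_walker(steps=100):
    x = y = 0.0
    for _ in range(steps):
        angle = random.uniform(0, 2 * math.pi)
        x += math.cos(angle)
        y += math.sin(angle)
    return nutrient(x)

random.seed(1)
trials = 1000
print(gradient_follower())  # 100.0: ends up where food is densest
print(sum(random_walker() for _ in range(trials)) / trials)  # near 0
```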

A bacterium has none of a human being's rich mental life, of course, and apologists such as Plantinga argue that while evolution would select for correct actions, it would not necessarily select for correct beliefs. But though this could be true for creatures whose actions are decoupled from their beliefs, human beings are not like this. If a creature will face more situations in its lifetime than its genes can explicitly program it for - if it cannot live solely by the autopilot of instinct, as human beings cannot and do not - then that creature must perceive its environment correctly in order to respond correctly. Accurate belief is the only sure way to produce correct action.

As an example of this, consider a more complex case: a troop of apes living in a forest, where interactions between individuals are a way of life. This mode of existence would favor a whole slew of new cognitive abilities: recognizing individuals and remembering their status in the group, remembering which group members are likely to reciprocate your favors, determining whether another individual can be bribed or deceived and being resistant to deception in turn, and group cooperation in hunting and defense. These cognitive feats all require sophisticated skills, including long-term memory and the ability to infer the contents of another individual's mind, and any individual ape that did poorly at these tasks would be outcompeted and taken advantage of by those that were superior, if not exiled from the group entirely. On the other hand, even a very imperfect capability to do these things would provide a selective foothold; the better a given ape was at it, the more that ape would prosper, and so once the capability existed at all it would be liable to refinement and improvement through natural selection. It should be clear that in these circumstances no false belief would give selective advantage in the way that true belief would.
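
One of those skills - remembering which individuals reciprocate - can be given a minimal sketch in the style of the iterated prisoner's dilemma. The payoff numbers and strategies below are illustrative assumptions of mine, not data from primatology:

```python
# Repeated exchanges against a partner who never reciprocates: an ape
# that remembers being cheated stops cooperating; one that cannot
# remember is exploited every round. Payoffs are illustrative.

PAYOFF = {("help", "help"): 3, ("help", "cheat"): -1,
          ("cheat", "help"): 4, ("cheat", "cheat"): 0}

def total_payoff(strategy, rounds=20):
    """Play repeated exchanges against an always-cheating partner."""
    score, last_partner_move = 0, "help"  # optimistically help at first
    for _ in range(rounds):
        move = strategy(last_partner_move)
        partner_move = "cheat"  # this partner never reciprocates
        score += PAYOFF[(move, partner_move)]
        last_partner_move = partner_move
    return score

remembers = lambda last: "help" if last == "help" else "cheat"  # tit-for-tat
forgets = lambda last: "help"  # no memory of past cheating

print(total_payoff(remembers))  # -1: cheated once, then stops helping
print(total_payoff(forgets))    # -20: exploited in every round
```

The remembering ape loses once and adapts; the forgetful one is bled dry, which is exactly the selective pressure described above.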

Finally, consider a case involving a characteristically human ability: the manufacture and use of tools. Tool-making was a major evolutionary advantage that conferred a significant benefit on the primitive virtuosos that were best at it. However, it also requires even more skills in one's mental toolbox: sensitivity to fine-grained details of the environment, the ability to notice correlations, infer causality, imagine possible futures, classify objects into abstract categories, detect failures, and improve one's technique through practice and testing. None of this would be possible without a sophisticated and highly accurate set of perceptual and reasoning abilities. Again, false beliefs about what the best kind of rock is to chip into tools, or whether a blunt end will be just as good for a spear as a sharp point, or indeed any step of the process, will inevitably put their possessor at a severe disadvantage compared to the hominids who got it right.

Of course, this is not to say that evolution will produce perfect sensory perception. It is obvious that we possess no such thing, and there are good reasons why: evolution is a process of tradeoffs, and takes shortcuts whenever possible, tending to produce a "good enough" solution rather than a perfect one. This explains many common errors in human reasoning and perception, such as the urge to anthropomorphize natural phenomena, or our susceptibility to certain kinds of optical illusions. It should be a matter of no dispute that human brains are not perfectly reliable. However, we are not helpless to correct our own perceptual mistakes. Using our pattern-recognition abilities, we can perceive when our efforts have failed and alter our plans accordingly. More specifically, when we recognize a defect in our perception, we can overcome it with a prosthesis that compensates for the defect. An optical illusion such as the Müller-Lyer illusion can be overcome by using a physical prosthesis, such as a ruler. More subtle defects in our perception can be corrected by using a mental prosthesis - the scientific method.

Though the result is not perfect, evolution will clearly produce at least generally reliable mental tools for environmental perception, pattern recognition, and abstract reasoning in any intelligent being. In light of this, the burden of proof is on the presuppositionalists to explain why an evolutionary naturalist should not consider their own beliefs reliable.

In any case, this argument is not uniquely applicable to atheists. Christians have their own defeaters which by any rational reading should force them to believe that their minds and senses are unreliable.

For example, if a Christian believes that the Bible is true, they must believe that there are circumstances under which God will deceive people and cause them to believe lies (2 Chronicles 18:21-22, 2 Thessalonians 2:11-12). But if this is true, how can any Christian know that they are not one of the people God is deluding? By definition, if you are one of those people, you would not know it; a deception is not a deception if the person experiencing it recognizes it as such, and an omnipotent being could easily create a deception that a person could not see through. This means that a Christian must always admit the possibility that any of their beliefs may be delusions sent by God; but this in turn means that a Christian can never have complete confidence in any of their beliefs, including the belief that God sends people delusions or even the belief that Christianity is true. The Christian worldview undermines itself just as totally as Christians claim atheism does.

This conclusion is just a special case of the more general conclusion that belief systems incorporating inscrutable, unlimited supernatural beings can never give sufficient grounds for considering your beliefs justified, since there is always the possibility that those supernatural beings are deceiving you in undetectable ways for unknowable reasons of their own. By contrast, atheism excludes such malevolent possibilities; and while this does not prove atheism true, it does mean that it is consistent and that it provides a sufficient foundation for holding evidence-based beliefs in the first place. We can be, and often are, mistaken, but atheism at least offers us a chance to discover and correct those mistakes, without fear of mischievous supernatural beings thwarting our every attempt at finding out the truth.

February 13, 2006, 12:44 pm • Posted in: The Library
