E. O. Wilson on how we got to be this way

Distinguished evolutionary biologist Edward O. Wilson, who revived the term consilience in the sense it carried at the conference I attended last weekend, gave the keynote address. His talk was based on his latest book, The Social Conquest of Earth.

Wilson began his talk with three haunting questions that Gauguin wrote on a painting he made toward the end of his life: Where do we come from? What are we? Where are we going? These questions are central to philosophy, religion, and science.

Consilience conference

This past weekend, I attended an intense and very interesting conference in St. Louis on the topic of consilience (Consilience: Evolution in Biology, the Human Sciences and the Humanities). The term consilience in this context refers to the unification of knowledge in the sciences and humanities proposed by biologist Edward O. Wilson in his book by that name. Very roughly speaking, the idea is to find a bridge between the different areas of human knowledge and put them all on a common footing by looking at the social sciences and humanities in the light of the findings and methods of evolutionary biology, cognitive science, and other scientific fields. (It’s been a few years since I read Wilson’s book, so that definition is more my best understanding at the moment than it is a summation of his views. There is plenty more to say on what consilience means, and I intend to say it here in the near future.)

I learned a huge amount at the conference, both from the talks (20 talks in three days by a star-studded cast of scientists and humanists) and from the discussions over meals or drinks, and I met a lot of great people. I had hoped to blog from St. Louis as things unfolded, but my input channels were saturated (three pens ran out of ink, and I filled about a quarter of a Moleskine notebook with my notes), and I had no energy left for output. However, I did write a few blog posts after I got home:

Big history: Perspective!

You may remember the Total Perspective Vortex from the Hitchhiker’s Guide books by Douglas Adams. The vortex gave its unfortunate occupant a realistic view of his or her place in the context of the entire cosmos, a humbling vision that drove the victim mad. I have to say that as I learn more and more about our proper place in the grand scheme of things, I become exhilarated rather than distressed. Gaining greater perspective is a thrill.

The emerging interdisciplinary field of big history seeks to weave together all we know of the history of the universe and everything in it and to teach this coherent big picture. Geologist Walter Alvarez was one of the co-discoverers of the evidence that an asteroid impact caused the extinction of the dinosaurs, an event that had a tremendous effect on the subsequent history of the planet. He began teaching a big history course at Berkeley a few years ago, and he came to IU last week to talk about big history, which he described as a bridge between science and the humanities.

One of the problems when you teach big history is that creatures who have a maximum lifespan of about a century (and generally don’t last even that long) have a hard time grasping the time scales involved. Or as Carl Sagan put it:

Part of the resistance to Darwin and Wallace derives from our difficulty in imagining the passage of millennia, much less the aeons. What does seventy million years mean to beings who live only one-millionth as long? We are like butterflies who flutter for a day and think it is forever.
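Sagan’s ratio is easy to check (the arithmetic here is mine, not his):

```latex
\frac{7 \times 10^{7}\ \text{years}}{70\ \text{years}} = 10^{6},
\quad \text{so a seventy-year lifespan is one-millionth of those seventy million years.}
```

The butterfly comparison is looser (a day is roughly 1/25,000 of a seventy-year life), but the point stands: these ratios lie far beyond anything our intuition evolved to handle.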

To address this problem, one of Alvarez’s students, Roland Saekow, wanted to find a way to synthesize all of the timelines on the handouts for the big history course into a single zoomable presentation. He and Alvarez worked on the problem, and eventually their project grew into the ChronoZoom Project. At the website, you can get an overview of all the timelines and then zoom in on a particular time period. Dots scattered about the timelines contain relevant images, video, and other embedded material. A couple of tours are also available. The project is in beta, but already it’s enough to light up the mind. It will be interesting to watch it develop and see how the timelines fill up with information.
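To see why zooming is the natural interface, consider the orders of magnitude such a timeline has to span. Here is a minimal toy sketch of the idea in Python; the events and the data model are my own invented illustration, not ChronoZoom’s actual implementation:

```python
# Toy sketch of a zoomable timeline: events at wildly different time
# scales, with a zoom that narrows the visible window. (My own
# illustration, not ChronoZoom's code.)

from dataclasses import dataclass

@dataclass
class Event:
    name: str
    start: float  # years before present (older bound)
    end: float    # years before present (younger bound)

TIMELINE = [
    Event("Big Bang to first galaxies", 13.8e9, 13.0e9),
    Event("Formation of the Earth", 4.6e9, 4.5e9),
    Event("Dinosaur extinction", 66e6, 66e6),
    Event("Recorded human history", 5000, 0),
]

def zoom(events, newest, oldest):
    """Keep only the events that overlap the window [newest, oldest]."""
    return [e for e in events if e.start >= newest and e.end <= oldest]

# Stepping the window down by factors of ten is the only practical way
# to navigate a span of roughly ten orders of magnitude:
for oldest in (13.8e9, 1e8, 1e4):
    visible = zoom(TIMELINE, 0, oldest)
    print(f"back to {oldest:.1e} years: {[e.name for e in visible]}")
```

A linear timeline drawn at a scale where recorded history is visible would have to be millions of screens wide to reach back to the Big Bang; that is exactly the problem zooming solves.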

In his talk, Alvarez focused on two concepts in big history: little big history and contingency. A little big history covers a single place or thing in light of all the timelines we have: cosmic, geological, evolutionary. The basic idea is that you can look at just about anything in terms of “How did this come to be?” and uncover a story in deep time. He offered a fascinating glimpse of this approach with a sketch of what a little big history of Spain (where his family came from) might look like.

One cool thing about this approach is that it shows you the connections between the different fields of academic study: for example, how the geological factors that shaped Spain’s high, dry interior are connected to the presence of so many words of Spanish origin in American cowboy lingo (the interior was not suitable for agriculture but was good for herding livestock, so the Spanish already had experience that was useful when they came to the American west, and that fact lingers in the language). It also gives people a much better sense of the shape of the cosmos and of existence, the time spans involved, and the incredible complexity of cause and effect. This hard-won knowledge of who and where and when we are, at least in the broadest outlines, is one of the most valuable birthrights of every human, in my opinion.

An example of this complexity is the idea of contingency, that the facts that seem so solid to us in hindsight actually emerged from a confusing matrix of possibilities. The topic of contingency in history is vast, and I’m sure everyone has personal examples. Here’s one of mine: In his early 20s, my father went to a bank to apply for a job, but the woman he needed to talk to in the personnel department was out having lunch, so he went to another company where a friend of his worked and applied there. He got the job, and that’s how he met my mother. If the person at the bank hadn’t been out to lunch when he came by, I probably wouldn’t be here (or my kids, or the grandchild about to be born).

On a bigger scale, as Alvarez pointed out, that asteroid that hit Earth had about a seven-minute window of opportunity, which is not much out of the 4.5-billion-year history of the solar system, especially considering the magnitude of the changes it caused. I think the real point of understanding contingencies, although Alvarez didn’t mention this, is that it clarifies the haphazard nature of existence. There is no plan. The only meaning is the one we supply. Events are a complex web that we understand only incompletely, and the stories we impose upon them are the best we can do with limited facts. They could have been different.
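To spell out just how small that window is (the arithmetic is mine, not Alvarez’s):

```latex
\frac{7\ \text{min}}{4.5 \times 10^{9}\ \text{yr} \times 5.26 \times 10^{5}\ \text{min/yr}}
\approx \frac{7}{2.4 \times 10^{15}} \approx 3 \times 10^{-15}
```

about three parts in a quadrillion of the solar system’s history.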

Maybe I shouldn’t have dismissed the Total Perspective Vortex so quickly; there is a certain terror involved sometimes in seeing one’s own small life in the context of the big picture. However, I’ll leave you with these inspiring words from Mountains of the Mind, by Robert Macfarlane:

Contemplating the immensities of deep time, you face, in a way that is both exquisite and horrifying, the total collapse of your present, compacted to nothingness by the pressures of pasts and futures too extensive to envisage. And it is a physical as well as a cerebral horror, for to acknowledge that the hard rock of a mountain is vulnerable to the attrition of time is of necessity to reflect on the appalling transience of the human body.

Yet there is also something curiously exhilarating about the contemplation of deep time. True, you learn yourself to be a blip in the larger projects of the universe. But you are also rewarded with the realization that you do exist—as unlikely as it may seem, you do exist.

In addition to the ChronoZoom page, there are a few books about big history.

This one is by David Christian, more or less the founder of the field: Maps of Time: An Introduction to Big History.

Here’s one where Christian zooms in on human history: This Fleeting World: A Short History of Humanity.

Here’s a competing popular book of big history (the reviews are mixed, and I haven’t read it): Big History: From the Big Bang to the Present, by Cynthia Stokes Brown.

And finally, a scholarly look at not just the big history of the universe but big history as a field: Big History and the Future of Humanity, by Fred Spier.

Movie review: Cave of Forgotten Dreams

Cave of Forgotten Dreams, documentary film by Werner Herzog, 2010

This documentary is almost certainly as close as you will ever get to exploring the Chauvet Cave in southern France, home to the earliest known cave art, and Werner Herzog provides an excellent vicarious visit. The cave was discovered in 1994 and speedily locked up to protect its prehistoric treasure, a multitude of paintings of wild animals roughly twice as old as any previously known. The cave was evidently visited by humans in two different periods: the Aurignacian, roughly 30,000 to 32,000 years ago, and the Gravettian, roughly 25,000 to 27,000 years ago. Most of the art is from the earlier period; an enigmatic footprint left by a young boy, paired with the tracks of a wolf, is among the few remnants of the later period. About 20,000 years ago, a rock slide covered the entrance to the cave, which lay undisturbed until 1994. I groped for an analogy; perhaps it is as if some future beings in the year 29,000 by our current calendar found an iPad amongst the debris at the lowest levels of the city of Troy.

Herzog does these beautiful images justice; he filmed them in 3D under fairly restricted conditions (access to the cave is very limited). The result takes full advantage of the light of moving flashlights, the looming shadows of the film crew and scientists, and the billows and depressions in the stone walls, which the creators themselves exploited to present their visions of their animal cohort. The film also shows some footage of the world outside the cave, including the Ardèche River and a natural stone bridge called the Pont d’Arc. Although one of the messages of the film, underscored by its title, is that the past is in many ways lost to us, this view of the paintings was evocative of the conditions under which they were created. For all our distance in time from these anonymous artists, it was easy to think that we could sense something of their world.

Herzog spends a lot of time on the paintings themselves, noting their proto-cinematic aspects (a bison drawn with eight legs, for example, in an attempt to portray movement). There are some satisfyingly long slow pans over the images in all their mysterious beauty: a series of four overlapping horses, a rhinoceros with an exaggerated horn, a pair of rhinos apparently locked in combat, and many more. (The soundtrack gets a little intrusive in spots; silence would have been a fine alternative to what struck me as generic shapeless mystical music.) He also interviews some of the people who study the cave, a passionate and sometimes eccentric bunch. Jean-Michel Geneste, the Chauvet Cave Research Project’s director, describes the rich fauna of the time:

“You have to imagine lions, bears, leopards, wolves, foxes, in very large numbers, and among all these carnivores and predators—humans!”

Archaeologist Wulf Hein, talking about what other sites can tell us about the other arts of the period, appears wearing a rough fur garment of some sort and holding a replica of a tiny bone flute. He gestures at the German valley behind him and speaks as if he were an eyewitness setting the scene for a story:

“In the valley down there, reindeer and mammoth were passing, and it was very cold.”

Another researcher discusses the sounds we can imagine from the paintings, for example, the open mouths of horses suggesting their whinnies. These interviews support another message of the film, that although we can never reconstruct the past fully, we can represent it (and in fact we seem compelled to do so).

One of the most poignant signs of these early humans, to me anyway, was a series of red handprints they left behind. I am always moved by the sight of prehistoric handprints; they are one of the most vivid reminders of the humanity of these long-lost people (“I was here!” they seem to be saying). In this case, one of the people who left handprints had a crooked little finger, so his path through the cave can be traced by the recognizable handprints he left behind. The shadow of the unknown in which so many people once lived makes it particularly astonishing when we can identify a specific individual among those many, many anonymous generations.

In addition to its human traces, the cave contains things left behind by other animals: bones, some of them gnawed, perhaps by cave bears; bones of the bears themselves, including a skull that has since been encased in glittering calcite; scratches the bears made on the walls, some under the paintings and some over them. Most of the stalactites and rippling curtains of stone evidently formed after the rock slide that sealed the cave, so the painters would not have seen them. They emphasize the vast amount of time that has passed since the cave’s earlier users left it.

All in all, I highly recommend this film, particularly if you are fascinated by what we can understand of the lives of prehistoric humans or by the way scientists investigate these early ancestors. It is a dazzling visit to a mostly vanished world.

Writing, loss, and memory

I’ve been thinking about this quote from The Country of Language by Scott Russell Sanders: “And I knew that my impulse to write is bound up with my desire to salvage worthy moments from the river of time. Maybe all art is a hedge against loss.”

It’s always been a challenge to me to know what to put in and what to leave out when I write. When I was in probably fourth or fifth grade, I was given an assignment to write about my spring vacation from school, which I think consisted of a long weekend around Easter. We were supposed to hand this in on the morning after the vacation ended. I’m sure the teacher wanted just a page or two summing up the key events—an Easter egg hunt, a family dinner—but I started writing on the first evening of the break, all about coming home from school that day and what Mom said to me and what we had for dinner and what my brothers and sister and I did when we played in the yard that evening. I did the same thing the next day, in what must have been excruciatingly tedious detail.

The experience now reminds me of Lewis Carroll’s fictional map with a scale of one mile to the mile. It was the first time I thought about the writer’s problem of when to summarize and when to zoom into the details. (I seem to remember that about ten pages into the thing, it started to drive my mother nuts, and I was fairly frustrated too.) So “worthy moments” is a key part of the Sanders quote. You can’t possibly capture all the moments, and you wouldn’t really want to. It’s taken me longer to realize (or admit) that you can’t even capture all the worthy moments.

The other thing that occurs to me is that I also try to save worthy moments in the form of physical objects. I have a folder full of expired museum passes and train ticket stubs and similar ephemera from a trip to Paris this summer, not to mention some Euro coins in a small bowl. Handling these things again reminds me that those magical two weeks were real, and helps me focus my energies on getting back there someday. This is all well and good; that trip was just under six months ago. However, I have taken enough trips and lived through enough noteworthy events that I don’t have room for every bit of memorabilia from every one of them. My house is small, and life is short. Storing and looking at things from past experiences crowds out the space and time needed for new ones.

This leads back to writing, because sometimes writing about a particular place or time or event can be enough to preserve it in my mind, and I can jettison the physical reminders. This past spring I finally threw away an old set of bookshelves, the first I ever bought. They were made of particleboard and showing their age, but I clung to them because for someone who has as many books as I do, bookshelves are more than just another piece of furniture. I bought this set when I was 15, using money I had won in a creative writing contest. I painted them myself. They weren’t just bookshelves; they expressed the optimism and pride of my 15-year-old self. But the set was in fact a slowly crumbling object, falling apart unevenly and no longer standing up straight. It helped to write down my memories of them and let the bookshelves themselves go. A small file on the hard drive is much easier to find room for than the shelves themselves, but it still allows me to bolster my identity by hanging onto the feelings of that younger self.

In the much longer run, however, even the small file will have to go. One of the ideas about which I feel most passionately is the value of the written word to the human species. More than four decades after I got my first library card, it is still sometimes a wonder to me that we can enter the minds of people long gone, let them transmit their thoughts to us, perhaps discuss those thoughts with others, and maybe even send a few down the pipeline ourselves to future minds. It is one of the most magical things that apes do. However, the amount of human wisdom and experience that has been preserved, as vast as it is, is only a fraction of the knowledge and thought and sheer human personality and wit that have been produced through the ages. And, if I am honest with myself, I realize that the amount of it that I will be able to comprehend, even if I live into my 80s or 90s, is the merest crumb. What I leave behind will probably be no more than the wake of the boats I saw passing on the Seine this summer, an evanescent ripple that blends quickly into the countless other agitations that move across the water.

This thought used to distress me, but I’ve cleaned out enough closets and hauled enough stuff to the curb or to Goodwill that I am content to realize that old things have to go, and someday I will be an old thing whose time has come. Even this realization, however, I would mark in words. The following poem is by Carl Sandburg; it’s from a collection called Smoke and Steel. Because the entire book is available for free from Google Books, I don’t think I’m taking anything away from Sandburg’s estate by posting this poem here.

Stars, Songs, Faces

Gather the stars if you wish it so.
Gather the songs and keep them.
Gather the faces of women.
Gather for keeping years and years.
And then . . .
Loosen your hands, let go and say good-by.
Let the stars and songs go.
Let the faces and years go.
Loosen your hands and say good-by.

Book review: Delusions of Gender

Delusions of Gender: How Our Minds, Society, and Neurosexism Create Difference, by Cordelia Fine

I enjoy seeing images and reading descriptions of how people perceived the potential of the near future. For example, I recently saw an illustration from the early 1900s that showed a room full of people in the year 2000 wearing full Edwardian dress and sitting around a radium fireplace. What is often amusing about these forecasts is that certain areas of life seem so set in stone that no one can imagine them changing. Clothing and hairstyles are certainly a good example. Gender roles are even more interesting: A 1967 vision of the future features a home computer that mom will use to shop for clothes and dad will use to pay the bills and see how much he owes in taxes. The technology is expected to develop, but the people are seen as fairly static.

The theme of Cordelia Fine’s latest book is that these unexamined assumptions help create the reality we study when we examine human behavior and also influence our interpretation of what we see. The first section of the book examines the way our assumptions about gender influence the very behavior we study when we look for inherent gender differences. For example, spatial reasoning is generally taken to be a particularly male skill. Fine cites a study where women outperformed men on a mental rotation task when it was presented in terms of stereotypically female activities (interior decorating, for example), but men did better than women when it was presented in terms of stereotypically male activities (nuclear propulsion engineering, say). This is just one of many studies that reveal how responsive we are to social cues—even something as subtle as checking a box to indicate our gender before starting a test. Fine covers other aspects of how expectations color the ways that we perceive other people’s behavior, such as different reactions to more or less the same behavior in men and women in leadership positions.

To me, the chapter on stereotype threat seemed simultaneously the saddest and the most promising part of this section. Stereotype threat is what you’re under when you are performing a task that a social group you belong to is believed to be bad at, and it often has a negative effect on progress or performance. Fine writes, “It’s disconcerting to think that those who belong to negatively stereotyped groups might be pervasively hampered by stereotype threat effects in their academic life.” It seems like such a waste to me that thousands of capable brains, especially young brains, are not performing up to their potential because they are stressed by false beliefs along the lines of “Girls can’t do math” or “Guys are not good with words” (or beliefs about what people of your ethnicity are capable of or good at, for that matter). On the other hand, this does indicate that the human race has reserves of untapped, or inadequately tapped, potential on which to draw.

The second section is about the ways in which new results in neuroscience are often interpreted in terms of the same old gender stereotypes. It was surprising to me how many of the things that I thought I knew about gender differences are in fact not really well established. For example, you may have heard about (shoot, I may have written about) articles investigating the influence of fetal testosterone, which is supposedly the basis of some gender differences. I learned that no one has found a good correlation between any of the measurements made so far and any of the various supposedly masculine skills or behaviors that have been examined. (Fine makes an excellent point in the context of this discussion: Men are sometimes said to be better at science, but we haven’t even identified precisely which cognitive abilities make for a successful career in science, so it’s a bit premature to begin trying to identify the prenatal influences that produce the scientifically minded brain.) Another example: Females are said to have a larger corpus callosum, the band of nerve fibers connecting the brain’s hemispheres, but this finding is not cast in stone; in fact, it was rejected in a 2008 review article.

Furthermore, Fine makes the point that even when gender differences in brain activity are truly identified, they don’t necessarily represent something hardwired. We’re learning to understand the brain as a fairly plastic thing, and the effects of socialization surely show up there (where else would they appear?). She quotes neurophysiologist Ruth Bleier to the effect that “Biology can be said to define possibilities but not determine them; it is never irrelevant but it is also not determinant.” It almost seems to me that we are flexible creatures who for some reason really like to see each other in terms of fixed, either-or categories. Anyway, if you have read much in the way of popular science regarding gender differences in the brain, this section of the book may give you some surprises; it certainly provides a lot of valuable information. Chapter 14, “Brain Scams,” is particularly useful as a corrective to some of the pop psychology takes on gender differences.

The third section goes into how socialization occurs, and in particular how parents pass their beliefs about gender on to their children. For example, Fine has a fascinating discussion of gender-neutral child-rearing. This much-lauded concept is extremely hard to realize in practice; she describes the efforts one couple made to provide a truly gender-neutral background for their children, and it was a Herculean undertaking. People tend to overlook the many influences at work and attribute their children’s choices to genetics if they do not match what the parent is overtly promoting. (If I give a little girl toy trucks and a chemistry set but she wants dolls and a makeup set instead, it must be her hardwired femininity coming out, not all the advertising she sees, the trips down the toy aisle at the store, the television shows and books and movies that promote gender stereotypes, my own unconscious beliefs and behaviors, or the way her friends behave.)

It was almost amusing to read about some of the studies of supposedly gendered preferences measured in very young children. Six-month-old babies acted more interested in a pink doll or a blue truck depending on whether they were female or male, respectively, which is taken to indicate innate gender-based preferences. Has evolution really had time to teach human babies much about trucks (or the color blue, for that matter)? The people who care for them, on the other hand, have had time to teach them plenty, even at six months. (By the way, one interesting fact I picked up is that the current color-coding scheme using pink and blue is fairly recent; in fact, through the end of the 19th century, babies and young children of both genders generally wore white dresses, and when colors began to be used, pink was originally the color for boys.)

Fine has many other amusing but pointed observations about parent-child interactions. For example, even parents who want to stretch the gender boundaries for their daughters will be much more rigid about maintaining them in their sons. She quotes a mother whose son kept asking for a Barbie, so she and her husband “compromised” by getting him a NASCAR Barbie. (There are negative words for women that have no male equivalent—think of “slut,” for example, and words of that ilk—but males get the short end of the stick on this one: the word “tomboy” is, at least these days, attractive in a way that the rough male equivalent, “sissy,” is not.)

I highly recommend the book. The science of human behavior acquires various encrustations of half-baked or misunderstood sorta-facts as it works its way into the popular consciousness; aspects are emphasized or ignored for political or social reasons, and in some cases, the studies are over-interpreted or poorly done to begin with. Gender roles are one of the more touchy areas where science is easily misinterpreted or influenced by biases, although as Fine points out, “to those interested in gender equality there is nothing at all frightening about good science. It is only carelessly done science, or poorly interpreted science, or the neurosexism it feeds, that creates cause for concern.” Regarding that untapped potential I mentioned earlier in this review, Fine has this to say in the epilogue:

When a woman persists with a high-level math course or runs as a presidential candidate, or a father leaves work early to pick up the children from school, they are altering, little by little, the implicit patterns of the minds around them. As society slowly changes, so too do the differences between male and female selves, abilities, emotion, values, interests, hormones, and brains—because each is inextricably intimate with the social context in which it develops and functions.

Notes for the curious:

I wasn’t making it up about the radium fireplace (you’ll need to scroll down to find it) or the 1967 vision of gender stereotyped computer commerce.

The mental rotation study is described in Spatial Cognition and Gender: Instructional and Stimulus Influences on Mental Image Rotation Performance, M. J. Sharps, J. L. Price, and J. K. Williams, Psychology of Women Quarterly, 18(3): 413–425, 1994.

The study about toy dolls and trucks is described in Sex Differences in Infants’ Visual Interest in Toys, G. M. Alexander, T. Wilcox, and R. Woods, Archives of Sexual Behavior, 38(3): 427–433, 2009.

For more on color-coded children’s clothing, see this Smithsonian Magazine article. Check out the slide show, which features Franklin Roosevelt as a little boy.

Beauty, meaning, and freedom: How science changed my worldview

Last week I wrote that science enriches rather than impoverishes my worldview. I thought it might be useful to describe more precisely what I mean by this. It’s easy to speak broadly about science, meaning, and beauty, but it’s not always very clear exactly what these words mean in terms of the real-life story of how someone came to adopt a particular philosophy of life. I hope you will excuse me for a longish digression into the personal.

There’s a joke about Catholicism that everything is forbidden unless it’s compulsory. I grew up in a devoutly Catholic home in which life was hedged about by prohibitions ranging from the absurd to the devastating. The thing that bothers me the most, looking back, is that these prohibitions were never questioned, even if human well-being or thriving had to be sacrificed to them. God had set up the rules long ago, and the chain of command ran from him to the pope to the priest to my parents. Don’t ask why. (I mean, come on: we were primates! Healthy young primates are born to wonder why, and “The pope said so” is not a very good answer. If humans are lucky, they make a lifetime habit of asking questions.)

Religion felt to me like a constant presence nudging me to examine everything I did, said, or thought and check for wrongdoing. The classic complaint of ex-Catholics is the church’s attitude toward many normal desires, and that was certainly part of the problem, but it went way beyond that. For example, before church on Sundays, we were supposed to fast for an hour before taking communion. That’s not that big a deal, once you get past the mental contortions required of the trusting young mind when beloved elders present utterly bizarre beliefs about eating god. The silly thing about it was that we were also supposed to remember to brush our teeth at least an hour in advance, lest we accidentally swallow some of the toothpaste and thus break the required fast. It’s easy enough to laugh at it now, but given all the other things I was taught (that god could visibly leave the host and shame the intended recipient if he were offended, for example), which I was unfortunately unable to question when I was a kid, this prohibition was yet another source of existential dread, a way that you might be offending an all-powerful, irritable force without even realizing it, on grounds that seemed hazy at best. (OCD, anyone?)

As I was growing up, I didn’t learn all that much about science. I read randomly here and there, particularly in astronomy, but still, my science education was incomplete enough that I did not have to confront the discrepancies between my belief in the Biblical story about the creation of the world and the things we had learned about that topic since the Iron Age. Frankly, my memories of exactly how my youthful brain dealt with this subject are a blur, but I do remember that my conception of the history of humans, the earth, and the universe was constrained by the story told by the Bible and the Catholic church.

My scientific ignorance didn’t matter all that much, though, because there was plainly no chance of being a scientist. Dream as I might about observing the stars, it seemed that women weren’t really cut out for science anyway, or indeed for any job other than motherhood, teaching grade school, or nursing. Not to knock any of those jobs, but that’s a limited set of opportunities. (It may sound like I’m 90 years old and talking about the prejudices of a bygone era, but I was born in 1961, so my era is not quite as bygone as all that, and I’m sure this sort of approach to young women’s potential is alive and well in fundamentalist churches today.)

My problem wasn’t just that I needed to learn to disagree with my parents about the meaning of women’s lives and find my own place in the world. With the best of intentions, they described my role in life and the possibilities open to me in terms of obedience to the will of a strict, all-powerful being who had little truck with women’s liberation and evidently little use for me except as a potential mother (unless I wanted to enter a convent and worship him unceasingly all my life, of course). Their claim to have god on their side distorted the power balance and the ordinary course of human generational differences. (This is why I am utterly opposed to any distortion of normal human interactions that arises from one side claiming to be speaking for a deity of any type. Sorry, no dice. We’re all just primates, and we speak for ourselves.)

I married and had children very young. The marriage had some wonderful moments, and my sons are an enduring source of joy. However, it was not a happy marriage. As it ended, quite predictably, after a few years, I began to think about going back to school. I took a correspondence course in astronomy and felt my mind boggle at the scale and complexity of the universe that I began to learn about. Once I removed the narrow framework of the story I was taught as a child, the cosmos seemed to expand in a heady rush.

I did the classic thing of walking around a lamp with a globe and a ball to figure out how the seasons and the phases of the moon worked (it’s embarrassing to admit that I was 21 before I learned that). I saw a partial solar eclipse, went out and looked at a new comet I had learned about in the pages of Astronomy magazine, and caught a few episodes of Cosmos on TV. I learned about the Big Bang and the synthesis of chemical elements heavier than hydrogen in stars and the dispersal of those elements throughout surrounding space when stars died. It’s a cliche by now to say that we and everything we know are star stuff, but grasping the truth of this was a powerful, permanently mind-altering experience. I learned enough to understand, at least in the most rudimentary outline, how life had evolved on this planet, and to begin to comprehend, as well as a short-lived creature can, the astonishingly long time periods involved.

My sense of both the timeline on which humans appeared and the vast space in which we found ourselves shifted dramatically. It was one of the most liberating experiences of my life. Before the huge panorama of space and time that unfolded before my delighted, awestruck eyes, the constricting walls of thou-shalt and thou-shalt-not were reduced to a manageable, even ignorable, size. Faced with the vast and intricate story of the universe as we know it so far, the stories I had been told of guilt and sin and redemption, stories that justified all the limitations enforced by worry and fear, began to look faded, childish, parochial, and distant. It was the most tremendous relief. I felt a dark weight rolling away from my mind, which became more and more free to move about in a much larger and more brightly lit space.

One thing I remember vividly about this time period after the divorce was spending summer evenings sitting outside and looking at the stars. I borrowed a small cheap telescope and looked at the moon, whatever planets were out, and various star clusters and nebulae. Late in the evening, I would put the telescope aside and simply watch the sky. If you sit long enough, you get a wonderful sense of the earth’s rotation. At that time of year, the Milky Way slowly crosses the sky; the spring constellations that were low in the west at sunset give way to the stars of summer, and eventually fall stars creep into view over the eastern horizon in the small hours before dawn.

I remember sitting out there one night and feeling a light breeze pass by as I watched the face of the night sky wheel by overhead. I found a deep pleasure in understanding, in rough terms anyway, the source of the wind (ultimately, the sun, which heats the earth unevenly), and knowing enough about what I was seeing in the night sky to feel like I was part of a fascinating universe and was able to comprehend it. The wind moved my hair, and it moved the leaves on a small tree nearby. I felt a sense of communion with the living world around me and the cosmos from which it had arisen. (It was eerily perfect, a couple of years later, to run across Kenneth Rexroth’s beautiful description of a similar experience in his poem The Heart of Herakles.)

More importantly, I felt like I had a right to be there. I was just another carbon-based life form, just like the tree. Far from feeling reduced in rank by realizing that I was an animal, I found it glorious to realize that I was not born sinful and flawed; I did not have to justify my existence by living up to someone else’s standards for perfection, masquerading as the divine will. I did not even need to bother any more about the censorious words of busybodies at church. Whatever kind of person I was, whatever my needs and interests and desires were, I needed to honor and fulfill those without hurting anyone else. That’s it. I felt like I had a place in the world that I could occupy without apology.

I don’t mean to make it sound like I was transformed overnight; I wasn’t. It took years to unravel the worst of the knots in my mind, and doubtless the people close to me can identify places still in need of work. (One of my sons said to me recently that a fundamentalist religious upbringing is a good way to create atheists and freethinkers, and this is true, but I think there must be easier and more humane ways to do it.) But the knowledge that the true story of the universe was vastly more intricate and wonderful than I had been taught, and the feeling of being part of a beautiful, totally natural universe that we can explore using science, is priceless to me and remains at the core of my belief system today.

Mortality and evolution

Well, it’s spring (in the northern hemisphere, anyway) and new life is bursting out everywhere you look, but again I’m going to talk to you about mortality. Specifically, I’m going to talk about a recent paper that looked at the way that thoughts of death affect people’s beliefs about science. Like the paper I discussed in a recent post about mortality, intolerance, and mindfulness, this one uses terror management theory to frame an investigation into how people react to an existential threat. According to this theory, contemplating our own deaths produces anxiety that we ward off by various psychological defenses. Among these defenses is a stronger belief in worldviews that provide meaning, order, and perhaps a promise of immortality in some form or another.

Three researchers examined how thoughts of death affect people’s acceptance or rejection of evolutionary theory, the foundation of biological science, and intelligent design (ID), which is often couched in scientific language but in fact is not well founded in science. Evolutionary theory is obviously incompatible with belief in a literal instantaneous creation by a deity and arguably with any form of belief in an orderly world created with some purpose. Unfortunately, many people also see evolutionary theory as draining the meaning and purpose from human life. ID is essentially a response to this perception, and as such it offers a more obvious and more traditional sense of meaning.

In a series of studies, the researchers asked participants to write about either their own death or dental pain (which also arouses negative emotions but presumably does not tap into existential anxiety). Then the participants read brief selections from Michael Behe, arguing for ID, and/or Richard Dawkins, arguing for evolutionary theory, and answered questions about how they rated the author’s expertise and how much they agreed with what he was saying. In three studies with a range of participants (some undergrads, some older people from a variety of backgrounds), the authors found that by and large those who had written about death were more likely to agree with Behe than with Dawkins. This was pretty much what they had expected; they figured that ID bolsters the psychological defenses that people tend to draw on when confronted with thoughts of their own death.

The really fascinating stuff comes in the fourth and fifth studies. In the fourth, some of the participants were also given a brief reading from Carl Sagan in which he describes science as providing not just knowledge but meaning and comes down squarely on the side of being courageous enough to accept the universe as it really is, to the best of our current knowledge, and to make our own meaning. The fifth study used only the Behe and Dawkins readings; participants were all college students in the natural sciences. In both, writing about death did not push participants toward ID or away from evolutionary theory; in fact, it made them more likely to reject ID.

I thought these were exciting results, particularly the study that included the Carl Sagan reading. I think our ability to use reason and careful observation to understand the world around us, and to be, as Sagan put it, courageous enough to accept the truths we find, is one of the greatest things about us as a species. The rejection of not just scientific findings but also the values that underlie scientific research is deeply troubling for a number of reasons. I don’t think that accepting science leads to an impoverished worldview; in fact, for me it’s exactly the opposite. Presenting science in a way that is unflinchingly honest about the situation in which we find ourselves and its implications for traditional religious beliefs and at the same time nurtures the deep sense of meaning that people hunger for is crucial. When I read the passage from Sagan that was used in this study, I was impressed at how well he did this. The fact that his words can change the way people react to an existential threat is heartening. A current debate in the online atheist community centers around whether unvarnished, honest rejection of the supernatural is compatible with (a) changing people’s minds or (b) persuading people of the beauty and meaning to be found in science. Although this study involves only a snapshot of people’s reactions to various readings, I think it suggests that science can be presented both honestly and compellingly. Now we need to understand and apply the best techniques that people have found for doing this.

This paper was published in PLoS One, so anyone can read it: Tracy JL, Hart J, Martens JP (2011) Death and Science: The Existential Underpinnings of Belief in Intelligent Design and Discomfort with Evolution. PLoS ONE 6(3): e17349. doi:10.1371/journal.pone.0017349. Published March 30, 2011.

More about mortality, intolerance, and mindfulness

A few weeks ago, I posted about a study that looked at how mindfulness affected people’s reactions to thoughts of their own deaths. Thanks to a good-hearted reader, I now have a copy of the paper itself, which provides a lot more information than the news article I was using.

The research was done in the context of something called terror management theory, which suggests that humans are motivated strongly by the fear of death. Past research has shown that after being reminded of their own mortality, people tend to become more biased and judgmental (favoring their own group over others, choosing harsher punishments for various offenses). The idea, very roughly, is that people defend themselves against uneasiness about death by clinging more strongly to the things in their worldview that provide identity and meaning. The paper looks at how mindfulness influences this behavior.

One thing I wondered about was what measure was used to determine how mindful the participants were. It turns out that their degree of mindfulness was assessed using the Mindful Attention Awareness Scale (MAAS). (The MAAS is described in this paper, which notes that it “is focused on the presence or absence of attention to and awareness of what is occurring in the present”; it suggests that the degree of mindfulness can be increased with practice, and describes some findings that indicated various correlations between MAAS results and the Big Five personality traits. Those two things together strike me as quite intriguing because they seem to me to be saying that perhaps some personality traits can, at least to some degree, be altered.)

Anyway, the paper about mindfulness and fear of death covered seven different studies that looked at a variety of interlinked questions. The general outline was that participants wrote about either what they imagined their own death would be like, or what watching television is like, and then answered a questionnaire that examined their attitudes or reactions (e.g., the degree to which they favored something written by a foreign, pro-US author versus something by a foreign, anti-US author). One thing that struck me was that in four of the seven studies, the participants were undergrads in their late teens and early twenties. In two of them, the age extended upward to the mid-thirties, and in one of them it extended up to the mid-sixties (the paper didn’t give a breakdown by age). I wonder how age differences might affect the results.

I personally have felt fairly anxious when thinking about my own death; being dead poses no terrors for me, because I think my consciousness will be extinguished when my brain dies. What bothers me is the moment of death and the knowledge that my time has run out: that there are no second chances left, that all of the things I haven’t been able to do or experience will remain forever undone. I have wondered how really old people, who realistically must know that their time on earth is mostly behind them and that death will come sooner rather than later, deal with what that must feel like. I can only assume that if I am fortunate enough to make it into my 80s or 90s, I will have found a way to cope. (I hope I’ll be cheerful enough that I can make jokes about not buying green bananas.) It would be interesting to know how more or less mindful people of different ages would compare with each other in these sorts of studies (or whether there’s any difference in younger people who have been in a life-threatening situation).