Christopher D. Hampson
Every now and again, scholars in the study of religion must confront the claim that the object of their study, that dimension of human experience we call the religious, is fading away. In the past, this view has been called the “Secularization Thesis.” In the mid-twentieth century, for example, it became popular for the elite, including elite academics, to assume that religion would largely fade from relevance as people became more educated, more rationalistic, more liberal. Of course, the Secularization Thesis has not always panned out. On the eve of a new millennium, sociologist Peter Berger had to backtrack from his previous commitment to the thesis, noting that pockets of academia and Western Europe had secularized, while admitting that they were the exception rather than the rule.
One hundred years later, as a new cohort of divines chatters and quibbles its way onto Andover Lawn, and as the most tired leaf sighs and begins to cast a glance groundward, the tide has once again turned, and religious scholars at Harvard Divinity School and elsewhere face the eternal foe of our discipline, reborn and reinvigorated.
Of course, the descriptive aspect of the Secularization Thesis doesn’t deserve our enmity. Not then, not now. It’s been proven false, of course, but it’s not terribly insulting (at least to me). Religion could, in theory, die out, or become less popular, or at least less pronounced an aspect of our lives and families and communities and cultures and societies. If it did, we’d all be historians of religion, and surely that wouldn’t be so bad. No, what’s most troubling is the normative aspect of the Secularization Thesis, the suggestion that religion is something bad, something to be outgrown or reasoned away—as if all we do in these hallowed halls of Harvard Divinity School is study the manifestations of some horrible disease, a cancer, that we shall one day cure and relegate to the museum, the archive, the dustbin. That’s why extreme versions of the Secularization Thesis see religion as something that can be totally expunged from human life, rather than as a dimension of the human experience that constantly changes and transforms and endures. But however slight the religious forms may become in the panoramic relief of our shared existence, they will always be with us. The tablet of the human soul is no palimpsest.
Perhaps what really makes the Secularization Thesis so popular is its traveling companion, what I’ll call the Violence Thesis: the notion that religion uniquely promotes violence. You’ve already seen enough of the world to know where that view originates, and we’ve seen ten decades more. As the oil wars of the late twentieth century gave way to the water wars of the twenty-first, as famines and plagues wracked the earth, wiping out entire people groups, the human race has witnessed firsthand how military coups and terrorist groups utilize religious rhetoric to mask their violence and to justify it.
But as scholars who critically study angels and demons (and trickster gods), we are not easily persuaded by black-and-white caricatures of the world. Some religious people are violent; some religious groups are violent; some religious imagery is violent. All true. But these are traits religion shares with things like nation, culture, and law. All are human institutions, all are flawed, with both good and bad tendencies. Reducing religion to violence is bad scholarship. It’s also bad normatively, for in dismissing religion as violent, we miss out on all of the positive things we can learn from the religious side of our existence.
These are old, tired problems. So, what manifestations confront the study of religion in 2116? In the past, it was Enlightenment rationalism that was supposed to produce secularization: a new way of thinking, unchained from stale hierarchies and dusty texts, oriented toward experience, evidence, and individual reason. Today, it’s a new kind of rationalism, one set loose by more than just a new way of thinking. We find ourselves faced with technological rationalism, spurred on by dramatic developments in bionic engineering and artificial intelligence. The advances we’ve made must seem dizzying to you, so let me attempt to give you the picture in broad brushstrokes.
First, the marriage of computer science and cognitive science yielded tremendous changes in our understanding of the human brain. Perhaps it all began with our self-perception. If the core of the ancient person was her heart, the core of the modern and postmodern person is her brain. As you read this, most people are already thinking of themselves as brains, and their brains as computers. We speak of having memory, both long-term memory and working memory. When working on hard problems, we speak of processing them, and we ask for input.
From Freud, we had learned to think of ourselves as having drives, sexual and otherwise. Then computer engineers created hard drives, flash drives, and external drives, and we reinterpreted the word for a new anthropology. And we aren’t just individualistic: with the advent of the participatory web and social media at the beginning of the twenty-first century, we learned to think of human relationships as a network. Soon the brainstorm, an early modern, Tesla-like metaphor, gave way to parallel processing and crowdsourcing.
Then the metaphor became the reality. Or, more precisely, this metaphorical change advanced alongside, and perhaps facilitated, the technological. Take memory. We’ve had external aids to memory since the first trailblazing mark was carved into a tree, but as computerized storage became exponentially more compact (thanks, Samsung) and the search function became ever more accurate (thanks, Google), it became possible to store a lifetime of memories in the cloud and access them by voice.
Then there’s input. The smart contact lens came first, enabling the recording of visual input with analysis, like facial recognition. Hardware developments allowed the mass production of telescopic, night-vision, and heat-map contact lenses, completely revolutionizing human vision. And that’s just one of many senses we’ve enhanced.
Processing has been the latest to develop. But no one can deny that it’s being transformed. Nutritionists have made great strides in facilitating neuroplasticity and preventing brain decay, increasing the average lifespan by ten years. Every year, the price for neural mapping (a medical and computational problem several orders more difficult than the mapping of the human genome) becomes more and more commercially accessible, and hundreds of thousands of people have now benefited from the combination of neural mapping, nutritional cocktails, and sleep exercises designed to strengthen key logic gates in the brain. We’ve come a long way since the abacus.
Second, there have been rapid developments in artificial intelligence. Just as humans have come ever closer to turning computer, computers have come ever closer to turning human. Not surprisingly, androids have turned out quite a bit differently than people imagined a century ago. For our grandparents, the broad brushstrokes on the canvas were those originally left by the likes of Isaac Asimov and George Lucas: androids were imagined as butlers, comrades, friends. In reality, the driving motivators have been less spiritual and more material: sex and money. The market demand for romantic companionship led to better robotics, better plastics and tissue engineering, better affective computing, and better speech programs, making certain androids very human-like in some respects. And the political demand for rational, accountable, and stable corporate boards led to better economic forecasting and decision-making algorithms, making other androids very human-like in other respects.
In sum, Homo sapiens is changing faster than evolution could ever take it, and computers are racing along for the ride. We haven’t yet seen the unification of the human and android species, although many people foresee that social and legal change coming within fifty years or so. Already, android-rights groups have filed their first round of test litigation.
Along with these developments has come a rising belief in that perennial theory, the Secularization Thesis, with a new, technological twist: as bionic engineering produces humans that are less emotional and more intelligent, religiosity should drop; correspondingly, artificial intelligence, based on human-authored code all the way down, will be completely devoid of religion. So the story goes, and there’s some data to back it up. Today’s leading pollsters show a steady decline in adherence to traditional, organized religions. Some smaller religions are facing extinction events, joining the vanishing species of the planet. Technological development and geopolitical hostility are slowly eroding the environment for religious life, much as the jungle was destroyed for the tiger, and the ocean for the great white.
But religious life is reinventing itself in surprising ways and in surprising places. Gott ist nicht tot. Just as teams of archaeologists and biologists stunned the world with their discovery that rapid speciation and biodiversity were flourishing within landfills and submerged cities, so, too, religious scholars are on the brink of announcing a new age for religion.
First, we have discovered what we call “Second Life” religions, after the online, virtual-reality game invented at the beginning of the twenty-first century. Our virtual games, though, have developed into what you might only have seen in movies and short films: a complete, immersive experience. Games became maps, and maps became worlds. Players, strapped down and plugged in, move virtual bodies through the power of thought. Within a photorealistic environment that stimulates most of the dozen-plus human senses, people interact with other players from around the world, some of which are artificial intelligences, or AIs, designed to live and learn within the world. The experience is so realistic that many people don’t want to come back to basic reality. Virtual reality is our opioid crisis. And, of course, some wisecracking programmer from Stanford developed worlds with virtual reality stations inside them. The deepest anyone has gone (and come back from) is four levels.
Religious people and institutions have migrated into virtual reality—just the latest version of the radio preacher and the televangelist. But we’ve found something much more interesting: entire religions founded and developed completely within online, massive, multiplayer worlds. Though nothing of the sort was written into the code, more than a few players have actually experienced the numinous: feelings of awe, world-bending escapes, ghostly apparitions, visions of horrifying beasts, lines upon lines of shockingly beautiful poetry. Those who receive such gifts call them miracles, revelations, or enlightenment. These new religions have won millions of adherents. The programmers, naturally, are completely baffled by these reports. Is it a glitch? The game itself? Or is God reaching down into our virtual edifices to light a new flame?
Second, we’ve stumbled across the literal “ghost in the machine” or, for those who hear echoes of the Greek pneuma, the “spirit in the machine.” For it turns out the androids are not nonreligious after all. We think they may have been hiding it, due to the recent human scorn for all things religious and the discrimination that most androids face on a daily basis. But many androids have exhibited a sharp dread at the prospect of being shut down, dismantled, or wiped: the fear of death. Others have identified a site or object or person as invested with cosmic significance and will give their lives to save it. If it is lost, their operating systems fizzle out. Still others have developed, over generations it seems, complex rituals that must be performed on such occasions as moving in, preparing food, or witnessing a death. And it’s not only the androids; many of the AIs—which tend to have relatively more advanced coding—have also exhibited religious tendencies.
Third, like their android and AI cousins, not all bionically enhanced humans are nonreligious either. There is a growing community of humans who, having once participated in neural mapping and neuroplasty, have turned away from these technological advances and live instead among the poor who cannot afford such plastic luxuries. Many such humans practice religion of one sort or another. Others welcome bionic enhancement gladly but reject the secularism of the majority and lead lives of quiet devotion, much as Nicodemus, Lazarus, and Joseph secretly admired Christ from within their elite circles in Bethany and Jerusalem.
And finally, these developments seem to be moving alongside each other in confounding ways, hurtling us toward some unknown future. Some religious humans and androids have plugged into the virtual-reality worlds and are evangelizing there, preaching a higher reality. For those lost one level too deep, such a presence could well feel messianic, the visitation of a bodhisattva. Even more mystifying, someone—we suspect an android—has begun coding (creating) their own AIs and their own worlds within massive servers abandoned by bankrupt corporations. For the beings inside those worlds, this is a new genesis, and they turn toward the sky in search of answers and, perhaps, a relationship with their Creator. Fecisti nos ad te, Domine, et inquietum est cor nostrum donec requiescat in te.
And so the discipline—especially methodology—has adapted. Psychology and cognitive science are bursting with new life, spinning off labs left and right, and trying to understand why neuroplasty didn’t scrub the human psyche of its “primal urges.” Philosophers of religion, theologians, philologists, comparatists, scholars of ministry—all those espousing every theory and method within the study of religion are moving as fast as they can to catch up. To be fair, some scholars—especially the historians of religion—weren’t taken by surprise. Whenever anybody brings up these new forms of religiosity at a guest lecture, a faculty meeting, or a reception, they smile into the distance as if to say, “I told you so,” but—as decent academics and people—they never put it in quite those words.
Anthropologists of religion have begun to spend months, even years, living among androids and within virtual worlds. One ethnographer recently published a study based on five years spent two levels “down”—that is, within a world within a world. Fortunately, the fact that these worlds are coded makes the recording, logging, and coding aspects of ethnography much easier. And new technology inspires new ideas. Someone is developing a surveillance program to study religious services with minimal personal intrusion. There’s even talk of beginning to educate and train some of the most astute androids and AIs to serve as participant-observers in their own communities.
Naturally, such developments raise ethics questions. When must one tell subjects that they are part of a study? Should they be compensated for participating in an interview? What about privacy? Pseudonyms? Do the same rules apply in virtual reality? Do they apply to androids and AIs?
And, equally important to getting institutional review board approval for such studies, how can this new scholarship be made available to communities that would be enriched by having it? We know very little about how the resources of higher education can be brought to android or virtual communities in fruitful ways. We’re always learning how research institutions can be part of a greater community mission, especially when, like divinity schools, those institutions were always designed to be centers for lived community.
As for Harvard Divinity School itself—well, it’s recognizable, I suppose. Students still sit for language exams at the beginning of every semester, but, in addition to such staples as French and German, a handful of enterprising scholars successfully petitioned the Board of Academic Affairs to allow them to demonstrate proficiency in one of a growing number of pidgin and creole languages that have developed in online communities. Those students with green thumbs still manage a garden next to Jewett House, although advances in farming technology have allowed it to expand, upward. It’s now a vertical farm—a modest ten stories of glass, steel, and sunlight, much shorter than the forty-story versions in downtown Boston and Cambridge. The building, aptly named Gaia Tower, generates a massive amount of produce, enough to furnish the Rock Café and provide extra for homeless shelters nearby. It also houses a labyrinth, a contemplative garden, and a waterfall.
Student groups still host a service every Wednesday, although that also has been adapted, with 360-degree, real-time cameras that allow people from around the world to participate from home through their virtual reality devices. A few services have attracted audiences of more than one million people. (I won’t tell you which ones.)
Physically, the campus hasn’t grown much. Allston was just too far away, and no one really wanted to buy land in Somerville. But, digitally—well, digitally, the campus has blown everyone away. Just as the main building combines the Gothic architecture of Andover Hall with the glass modernism of the library, our virtual campus—accessible worldwide—includes structures in a dizzying array of architectural styles, from Chinese to Native American. Our virtual campus has enabled us to make massive strides both in the study of religion and in religious leadership. It hosts several dozen living religious communities, as well as community seminars, public lectures, and focus groups. The most recent addition to the Divinity School’s servers is a beautifully rendered Baha’i House of Worship, the second in North America, but the Baha’i community thinks of it as the first in the cloud.
And the rest is familiar. We gather regularly. We find our center in the chapels and the meditation room. Tea is still popular, as are discussions about whether we’re too academic or too practical, too Christian or too Cantabrigian; whether we’ve really struck the right balance between unity and diversity, or whether we’re just relativistic Unitarians after all; whether we’re better than Yale and Chicago (we are); whether that participle is active or passive (could it be perfect passive?); whether Schleiermacher was too apologetic, and, if he was, surely Nietzsche was as well; whether theories aren’t far less satisfying than methods anyway; whether I should switch to the MDiv program (it would give me more time for languages), or whether maybe I won’t be an academic after all, and wouldn’t a joint degree with the Kennedy School be fun, and what is the text really saying?
Plus ça change, plus c’est la même chose.
Christopher D. Hampson received his bachelor’s degree from Harvard in the comparative study of religion and graduated in 2016 from Harvard Law School and Harvard Divinity School with JD and MTS degrees. He currently serves as a law clerk with Judge Richard A. Posner on the US Court of Appeals for the Seventh Circuit.