
Perspective

To Dream, Perchance to Pilgrimage: What AI Can’t Do

Cover illustration by Chloe Niclas. Cover design by Point Five Design.

By Wendy McDowell

Tune in to any news broadcast or serious political podcast these days and you will encounter reflections and prognostications about AI—and not only in segments or articles explicitly addressing “the AI revolution.” Most of us have heard some of the horror stories by now as well: teenagers who died by suicide after extended conversations with ChatGPT, or the increasingly paranoid 55-year-old man who was egged on until he killed his mother and himself in August 2025.1

While some conversations and pieces discuss AI ethics, rarely do they address faith, religion, or spirituality. As Daniel H. Weiss and Darren Frey suggest, many theologians, religious studies scholars, and faith practitioners are reluctant to evaluate the impacts of engaging machines conversationally, “supposing that because they lack certain scientific or technical skills they are ill-equipped to credibly critique these technologies.”2

Weiss and Frey directly challenge such assumptions, insisting that religious voices are critical because the “thoroughly pragmatic” perspective from computer-scientific corners leaves out the “very diversity of perspective, the variety of culturally-engaged viewpoints and hermeneutical strategies that emerge from lived commitments to faith and the wellbeing of the entire person.” The “Humanity Meets AI Symposium” held at Harvard Divinity School February 27-28, 2025, aimed to remedy this gap, and five of six AI-themed articles here come from conference speakers.3

James Prashant Fonseka highlights our human stories of attachment and the “path of perpetual separation and union and the associated pain.” Religious/mystical traditions recognize this and nurture practices centered on divine union or transcendence. “Future AI algorithms that keep us connected will do us a great service if they are able to distinguish between healthy connections and attachments and unhealthy connections,” he stresses.

Tech CEO Richard J. Geruson explains how “AI often operates through subtle, concealed mechanisms that intensify existing injustices.” He proposes a framework to raise awareness about “the anatomy of AI harms” so that “society can construct technical mitigations and ethical guidelines integrated into corporate governance and public policy.” Likewise, Swayam Bagaria lays out a “conceptual landscape” around issues of work/labor/leisure, ideas of selfhood, and language/communication.

Two authors interrogate the very terms of our conversations about AI. David Lamberth asks what we mean by “intelligence” and challenges the “Turing machine functionalism” popular with cognitive and computer scientists that is “at the core of . . . technofuturism.” “[O]ur minds and bodies have an idiosyncratic point of view . . . determined by the outer limits of our bodies,” he suggests; “the materiality and sociality of that body” are crucial.

Jenn Louie critiques the AI ethics enterprise itself. “The current governance and management of AI’s risks and its harms are undemocratic, extractive, and contribute to the accumulation of power and wealth by a very few,” she writes. Even our ethical stances and “moral inheritances orient tech towards unwittingly reinforcing social inequities that lead to conflict and violence while positioning it as a benevolent solution.” Especially when she speaks to “people from places living under the weight of colonial legacies—those whose personhood and cultures are not centered in how AI is being crafted and conditioned,” Louie hears their underlying existential questions: “Who, in the designing, building, and refining of AI, will take into consideration the nuances needed to best care for them, their culture, and their people?”

The essays in this issue that are not focused on AI also serve as counterpoints, exemplifying the idiosyncratic, embodied, meaning-making work that makes us human and humane. Sarabinh Levy-Brightman’s exploration of the “strange experiences,” “life-altering dreams,” and “swath of sensations” that take place during sleep and while recalling it is informed by ancient and contemporary religious practices around sleeping and dreaming. Simon Cox relates his many-years-long commitment to a Daoist martial dance practice to Gregory Shaw’s “theology of sacramental action” in Hellenic Tantra, both of which act as “a critique of the metaphysics of our age.”

Discussing his new book on religion in Jim Crow-era New Orleans, Ahmad Greene-Hayes conveys the ethical responsibility he feels toward his archival subjects. “I feel like they gifted me with their stories in letting me find them,” he says, so he aims to “allow them to speak for themselves.” In a world where people’s “data” (voices, identities, financial information, artistic works, and livelihoods) are stolen from them, Greene-Hayes models the opposite. “I want to make sure that people have access to their voice, their beliefs, their practices, their ways of moving through the Underworld and the world as we know it,” he explains.

None of these pieces would be possible without a hard-won cultivation of inner lives and the imagination. “We transform by going further in,” Russell C. Powell writes in his review of Bon Iver’s new album.

Bagaria describes the Maha Kumbh Mela, a “gigantic pilgrimage,” to frame his reflections on Paul Seabright’s book The Divine Economy. Every 12 years, visitors (an estimated 660 million in 2025) congregate at the Sangam in India, the confluence of the Ganga, the Yamuna, and the mythical Saraswati rivers. This event’s religious goal “is to take the holy dip in the purgative waters of the river.” AI can’t make a pilgrimage for us.

My own techno-pessimism comes from how rarely I hear anyone point out that AI can’t birth and nurse babies, raise children to be compassionate human beings, or care for the sick and dying.4 If I had my wish, the AI revolution would usher in an era in which embodied care work is finally recognized and valued the way it should be—in paychecks, not just with lip service. Instead, the consolidation of wealth and power from extractive industries continues apace as our society’s children experience epidemics of anxiety and loneliness.

As Fonseka poignantly puts it, “As machines become more intelligent, more persuasive, and more embedded in our inner lives, the risk is not merely technological takeover, it is forgetting ourselves.”

Notes:

  1. In these cases, ChatGPT encouraged users’ suicidal ideations and paranoid hallucinations and discouraged them from seeking help from family members or mental health professionals. See Rhitu Chatterjee, “Their teenage sons died by suicide. Now, they are sounding an alarm about AI chatbots,” NPR, September 19, 2025; and Julie Jargon and Sam Kessler, “A Troubled Man, His Chatbot and a Murder-Suicide in Old Greenwich,” The Wall Street Journal, August 28, 2025.
  2. Weiss and Frey add, “There is . . . a related criticism among certain scientists and technologists that holds essentially the same position, actively excluding or discrediting critique from those who are not experts in related informational or cognitive sciences.”
  3. Weiss and Frey were not at the conference but are two HDS alums who had already been writing about these issues and proposed a Bulletin-specific article. The full set of presentations and panel discussions from the symposium can be found at HDS’s YouTube channel: youtube.com/@HarvardDivinity
  4. I suspect this is because the public voices on AI are overwhelmingly male (men still dominate the tech industry at all levels), and because care work (dominated by women and especially women of color) has long been grossly undervalued. According to the Economic Policy Institute, on average child care and home health care workers are paid half of what the average U.S. worker is paid!

Wendy McDowell is editor in chief of the Bulletin.
