"I think religions all go to the same path, you know I think it’s all religions are a way of how to live your life and they all kinda lead to the same goal" (145). Thus did one young American recently express a bedrock principle of liberal Protestantism—the conviction that beneath the formalities of “organized religion” lies a natural substratum of human yearning and experience, one common to all the major spiritual traditions. This breezy description of religious equivalency represents one of many voices captured in Christian Smith’s Souls in Transition (2009), a magisterial study of the religious lives of college-aged, or “emerging,” adults in the United States. At first glance, the results of Smith’s survey might seem to confirm media reports that church affiliation is inexorably declining because young adults are “spiritual” not “religious.” Ironically, one of the interviewees acknowledged the trendiness of this way of framing things: “I’m spiritual, yes, which is now starting to sound like a cliché....” (159).
However, the widespread use of such language represents more than a simple fading away of organized religion. Indeed, the great accomplishment of Smith’s book—and of several other recent studies—is to show just how historically contingent are the individualism, moral relativism, and therapeutic consumerism of mainstream American culture. A vague, inchoate “spirituality” is not something natural that remains once religion has retreated. Our current religious settlement is not the inevitable outcome toward which previous history tended. Rather, it is a product of particular ways that generations of men and women have chosen to talk about their own lives—to frame them morally and theologically.
Those seeking resources to understand the historical forces and intellectual premises behind the contemporary framing of religion need not search for long. Critical accounts of contemporary assumptions about belief have been pouring off the presses in the past decade. Many of these studies share a suspicion of what Charles Taylor has called the “subtraction” story of secularization—an account that assumes that the secular is what remains after all that is historically contingent (and therefore unscientific or parochial or obsolete) has fallen away. In various ways, these six recent books reject the premise that American politics and culture are neutral on matters of religion. They argue that it is arbitrary and coercive to exclude religion from the academy or to deprive young men and women of sophisticated resources for thinking about their spiritual lives.
The most notable re-evaluations of religion’s place in the modern world have been big historical tomes. Charles Taylor’s A Secular Age (2007) insists that secularization is not primarily a matter of declining church attendance or religious affiliation. Rather, the cultural frameworks within which individuals and communities make sense of belief have themselves been radically transformed. How did religion morph from a fact of life—something deeply embedded in everyday experience—to a seemingly dispensable set of inwardly held beliefs? As Taylor rightly notes, many modern believers experience secularization less as a loss of faith than as a recognition that their belief is merely one option among many. Indeed, within a pluralistic culture, even those who wish to preserve traditional ways of life, such as the Amish, will seem to be exercising a form of individual freedom or personal preference. Avoiding nostalgia for a lost past, Taylor recognizes that assumptions about belief are rooted in pervasive social and cultural practices. His goal is not to lament an irrecoverable past but rather to open up space for debate. A Secular Age gently questions pervasive assumptions about religion by showing the complex historical and moral reasons for their appeal. The attraction of atheism, for instance, lies less in incontrovertible philosophical arguments than in its seemingly brave and even heroic ethical stance.
If Taylor’s Secular Age seeks to loosen the grip that modern assumptions have on contemporary minds, Brad Gregory’s The Unintended Reformation (2012) aims to demolish such assumptions altogether. A defense of the Catholic tradition, Gregory’s book argues that modern science and secular politics have not discredited any of the major claims of medieval theology. Rather, by the time of the Enlightenment, the religious conflicts precipitated by the Reformation merely made it seem that Christian revelation had been discredited. Hoping to bypass bloody confessional disputes, Enlightenment thinkers turned to natural theology. The God they found, however, was a poor imitation of the transcendent God of scripture and the church. A distant “first cause,” the God of the Enlightenment could easily be rendered superfluous by naturalistic explanations. Thus, argues Gregory, the entire modern enterprise came to be based on an erroneous conception of God and Christian revelation. The God that modern scientists have allegedly dethroned is not the Christian God.
Gregory goes beyond Taylor in insisting that the institutions of modern life are radically arbitrary and coercive—and thus lacking legitimacy. It is no surprise to Gregory that modernity’s founding principle of “reason alone”—itself a response to the futility of Protestantism’s “scripture alone”—has led to intellectual and religious dead ends. Gregory finds an emptiness at the heart of modern public life. State power and consumer pleasures—not a substantive vision of the good—bind modern liberal societies together. By contrast, Taylor interprets the history of modernity as a complex ebb and flow of social, cultural, and intellectual forces. A student of Hegel, Taylor is more inclined than Gregory to learn from modern history—to acknowledge that the historical process draws attention to truths that would otherwise not have been apparent. Thus, while Gregory contends that modern religious tolerance represents a fundamental betrayal of Jesus’ uncompromising moral message (of which the Christian civilization of the Middle Ages is purportedly the logical outcome), Taylor suggests that modernity has made clear the “radical unconditionality” and otherness of the Gospel’s proclamation of human dignity in ways that a Christian civilization could not (see Taylor 1999, 17).
Despite these differences, Gregory and Taylor concur that modern Western culture has unnecessarily closed itself off to transcendent sources of value. Contrary to popular perception, the loss of such sources was not a simple “falling away” of religious inheritances. Rather, it resulted from willful decisions to construe reality in certain ways rather than others.
Few recent books have explored the intellectual consequences of a secular framing of reality more eloquently than Paul Griffiths’s Intellectual Appetite (2009), a theological meditation on human knowing. The modern academy, suggests Griffiths, speaks of knowledge and scholarship as if they were the objective, detachable products of an impersonal process. According to Griffiths, such language obscures the ways in which the modern research university is engaged in a very old project—that of forming the inquirer’s “intellectual appetite.” No less than ancient and medieval catechists, the modern university cultivates in students and researchers habits of knowing. Not so long ago, the modern university was committed to a Weberian vision of the scholarly life as an ascetic calling. The relentless search for new and highly specialized empirical knowledge required the researcher to sacrifice his personal need for meaning. More recently, universities have become “aware that there is no unanimity within their walls about what intellectual appetite is and how it should be formed” (17). Despite this lack of consensus, or perhaps because of it, the academy continues to socialize its inductees, encouraging them to value novelty above wisdom and possessive careerism above shared wonder.
The crucial distinction Griffiths draws is between studiousness, a patient attention to objects through which divine beauty and goodness shine, and curiosity, an insatiable lusting after empty spectacles. Griffiths argues that the quest for novelty is ultimately nihilistic, since there can be nothing sui generis in the created order. Paradoxically, the quest for novelty condemns the scholar to tedious and futile repetition, whereas a humble, stuttering witness to a given reality opens up onto wonders that are ever new. One of the book’s most provocative chapters critiques modern notions of intellectual property. Griffiths argues that intellectual objects cannot, by definition, be owned, since, unlike material commodities, an intellectual good can be shared with others without being diminished. The entire modern academic enterprise, he suggests, operates under a regime of artificial scarcity.
In critiquing “curiosity,” Griffiths may underestimate its value. To what extent is the spirit of curiosity an inevitable product of economic specialization and media saturation? To what extent has the restless curiosity of the scientific enterprise opened up new forms of human flourishing—including the ability of millions of young men and women to pursue higher education? Is it possible to return to a purely contemplative model of learning when the material benefits of curiosity—of instrumentalized knowledge—are so readily apparent? Regardless of the answers to these questions, Griffiths rightly highlights the limitations of ways of knowing rooted exclusively in a quest for novelty and instrumental power. The modern university’s production of knowledge is not and cannot be neutral, since intellectual regimes inevitably promote particular intellectual virtues—particular ways of cultivating the “intellectual appetites.”
If this is true, then how are American educational institutions shaping young people’s “desire to know”? Here, one can turn to Warren Nord’s Does God Make a Difference? (2010), a book that sums up a career of rich reflection on religion and education. According to Nord, contemporary secular education stunts the intellectual development of its students by prohibiting a frank and open discussion of religion in the classroom. A survey of textbooks in several disciplines leads Nord to conclude that American classrooms arbitrarily exclude religion from areas of study to which it is vitally relevant—history after 1700, science, politics, literature, and art. Such exclusion is not “neutral,” nor is it a logical application of the Constitution, which simply prohibits the state from endorsing or promoting any particular religion. Rather, such exclusions reflect an active bias against religion—a laïcité that goes far beyond a disestablishment of the churches.
Nord defends the teaching of religion in public schools on secular, not spiritual, grounds. Shallow introductions to religion, suggests Nord, trap students in the present, leaving them unable to understand the vast majority of cultures, befuddled by the beliefs of their neighbors, and inarticulate about the weightiest existential and moral questions. An arbitrary exclusion of religion from the curriculum inhibits self-reflection and leaves disciplinary assumptions unchallenged: “Political scientists often assume that the truth is to be found in the scientific method they employ rather than in the (normative) ideological, philosophical, and political beliefs and values of the politicians, voters, and writers they study—and, as a result, they don’t teach students to think politically so much as they teach them to think scientifically about politics” (211). The effect of all this is to lock students into an exceptionally narrow and cramped mental space, to deprive them of the tools they need to understand their world and flourish as reflective adults.
Other studies suggest that the effects of such deprivation are quite pervasive. According to Christian Smith’s Souls in Transition, a lack of substantive religious reflection deprives young men and women of vital resources they need to narrate their own experience. Drawing on interviews with a wide range of young adults, the book reveals individuals groping, often unsuccessfully, to understand their lives and their wider world. Many of Smith’s interviewees protest, for instance, that they have “no regrets” for any of their choices—a claim that departs substantially from Jewish and Christian understandings of sin and offers few possibilities for developing complex moral or communal narratives. Equally striking is the radically subjective approach to religion found among the young adults interviewed. As Smith suggests, their world consists entirely of individual experience and emotion—of isolated selves trying to do what is “right for them” and to get along with one another (41–52). The conclusion Smith draws is admonishing without being moralistic: the culture offers young people poor resources for making sense of their lives.
Such impoverishment does not necessarily indicate a declining influence of religion on American public life. According to Smith, media claims that today’s young adults are more “spiritual” but less “religious” than previous generations are greatly exaggerated (295–96). Indeed, the strength of Smith’s book lies in its “thick description” of religious vocabularies; it resists any simple analysis of American religious culture. Nonetheless, it does hint at a broader failure: “if communities of other adults who care about youth wish to nurture emerging adult lives of purpose, meaning, and character—instead of confusion, drifting, and shallowness—they will need to do better jobs of seriously engaging youth from early on and not cut them adrift as they move through their teenage years” (299). Absent such engagement, argues Smith, young adults will be unable to understand, let alone resist, the individualistic and therapeutic assumptions they absorb from the wider culture.
What, then, is to be done? How can educators and parents promote a deeper and broader engagement with questions of meaning? How can secular education do justice to the full complexity of human experience, both past and present? How can faculty model ways of knowing that go beyond the instrumental mastery of disciplinary methodologies?
Confessing History (2010)—a collection of essays exploring “the Christian faith and the historian’s vocation”—argues that the way forward lies less in sweeping institutional change than in the scholar’s quotidian callings as writer, believer, teacher, citizen, and churchgoer. Together, the essays invite the reader to ponder what it might look like to pursue a scholarly vocation with well-formed intellectual appetites. The book’s autobiographical essays, for instance, attest to the power of humility as an intellectual virtue. They provide eloquent models of the “stuttering wonder” that Griffiths holds up as the proper response to the mystery of creation. As Una M. Cadegan puts it in her essay “Not All Autobiography is Scholarship: Thinking, as a Catholic, about History,”
...mystery itself is not, in some sense, mysterious, if by “mysterious” we mean something that tries to keep itself from us, keep us guessing and stumbling. Instead, mystery is very near, always waiting to ambush us, in the most mundane of our tasks, because we deal with the stuff of which the gracious mystery at the heart of the world is made. (59)
Insofar as it follows Jesus’ command to love God and one’s neighbor, the historian’s calling is no different than that of other Christians. As Beth Barton Schweiger writes in “Seeing Things: Knowledge and Love in History,” “Christian historians should set aside the often unyielding standards of professional norms in order to foster relationship with their peers, students, and the people in their books” (76).
While applauding the pioneering work of the generation of Christian historians that came to prominence in the 1970s and 1980s (George Marsden, Nathan Hatch, Mark Noll, etc.), some of the essays in Confessing History imply that those historians did not sufficiently challenge the assumptions of the historical profession. Indeed, some contributions yearn for a more perfect—and allegedly less accommodating—union of historical inquiry and Christian faith and practice. William Katerberg, for instance, seeks to shift priority from “‘own sake’ knowledge” to “loving intersubjective relationships.” In “The ‘Objectivity Question’ and the Historian’s Vocation,” he urges the historian to seek “history in service of life,” to acknowledge the “tradition-enacting, critical-memory function that history (heritage) plays in the communities in which they live” (120, 117). Meanwhile, Christopher Shannon challenges the tyranny of the monograph and argues that all history books (whether they acknowledge it or not) are “morally charged narratives.” Most of what goes on in the academy, argues Shannon in “After Monographs: A Critique of Christian Scholarship as Professional Practice,” is little more than an ongoing attempt to legitimize “the modern secular world” (183). Shannon suggests that Christian historians might best challenge this legitimacy by telling explicitly providentialist stories and by critiquing the equally providentialist assumptions of secular monographs.
But is such “postmodern” meta-reflection more likely to yield fruit than the direct, vigorous engagement with the historical profession and the impressive methodological rigor modeled by the monographs of Marsden and Noll? To follow Simone Weil, are not patient, self-effacing habits of attention—habits that may require years of training and the relinquishing of one’s own need for meaning—also acts of love, even prayer? (1951, 105–116). History, after all, offers abundant examples of the tragic, often unintended, consequences of humanity’s urgent need for meaning—its use of “history in service of life.”
It is not clear whether the calls in Confessing History for a more subjective and “relational” history are any less an accommodation to trends within the academy than the work of previous generations of Christian historians. As the introduction explains, many of the contributors undertook graduate studies in the mid-1990s—the moment when critiques of objectivity and Weberian rationality were at their peak. Several of the essays raise now-standard questions about the secular modernity that informs the university, highlighting its hypocrisy, its Eurocentrism, and its hegemonic instrumentality. Like Brad Gregory, Shannon reminds us that the modern liberal experiment has a shoddy moral record—that it has tolerated massive violence, exploitation, and alienation. In this view, the very fact that the modern secular regime disavows or represses its historically contingent foundations is evidence of an arrogated power.
Such a monolithic depiction of the “modern secular world” offers too easy a target. Modernity is a product of social, economic, and cultural transformations that go far beyond intellectual moves made during the eighteenth century. And within the modern world, the liberal Enlightenment has always been something of an embattled faith. However, it is a faith that, to many, has had the voice of moral authority—an authority based not on metaphysical or epistemological certainty but rather on an awareness of the fallibility of human judgment, the goodness of ordinary life, the dangers of political power, and the fragility of flesh. At its best, the modern liberal tradition invites us to engage in acts of moral sympathy, to imagine ourselves in the place of another. Within complex systems of social and economic interdependence, the liberal tradition has fostered awareness of the myriad and indirect ways that we are bound to our fellow human beings—as well as the potentially disastrous ways that attempts to secure personal meaning or communal belonging have deprived individuals of their potential for flourishing.
Here, Michael Kugler’s contribution to Confessing History, “Enlightenment History, Objectivity, and the Moral Imagination,” offers a sensitive exploration of the moral imagination of Enlightenment historians and works to counterbalance triumphalist Christian readings of liberalism’s alleged failures. According to Kugler, the Enlightenment moral imagination was exceptionally diverse; it ranged from the Baron d’Holbach’s blunt atheism to Christian accounts of the moral sentiments. But it was rooted in a critique of the dangers of “religious mastery”—“the human tendency to turn worship and theology into mastery of talk about God and of ourselves” (143). Though the liberal tradition has been too quick to see theocrats lurking everywhere, its critique of “religious mastery” has nonetheless uncovered new dimensions of Jesus’ message. To its vision of religious tolerance we owe our ability to speak and write and think so freely. In seeking to outflank the Enlightenment, do Christian scholars not risk the same bad faith—the same elision of moral sources—of which they accuse the modern academy?
Matthew Lundin is Assistant Professor of History at Wheaton College.
Other Works Cited
Taylor, Charles. “A Catholic Modernity.” In A Catholic Modernity? Oxford: Oxford University Press, 1999.
Weil, Simone. “Reflections on the Right Use of School Studies with a View to the Love of God.” In Waiting for God, translated by Emma Craufurd. New York: G. P. Putnam’s Sons, 1951.