Wilfred M. McClay
The challenge of Christopher Shannon.
The University of Scranton Press may be one of the least well-known academic publishers in the United States. But for my money, it has stepped right into the big leagues by deciding to reissue, in a revised edition, historian Christopher Shannon’s extraordinary 1996 book Conspicuous Criticism: Tradition, the Individual, and Culture in American Social Thought, from Veblen to Mills, originally published by the Johns Hopkins University Press. (The subtitle of the new edition is slightly altered.) Scranton is thereby performing a public service, and a courageous one, at a time when economic pressures are forcing university presses to become very nearly as bottom-line conscious as commercial houses. A book like Conspicuous Criticism will never be a bestseller. But one dares to hope that with this new edition, Conspicuous Criticism will, after a decade of languishing in the shadows, emerge from its status as a bit of an underground classic, with a following among young Christian intellectuals in particular, and at last begin to get the kind of respectful attention across the intellectual spectrum that it deserves.
Conspicuous Criticism: Tradition, the Individual, and Culture in American Social Thought, from Veblen to Mills (New Studies in American Intellectual and Cultural History)
Christopher Shannon (Author)
The Johns Hopkins University Press
224 pages
$33.99
Shannon’s book remains as fresh today as when it appeared, an unusually penetrating and challenging rebuke to the social-scientific outlook on human existence. The social sciences, he argued, have used the anthropological idea of “culture” to unsettle the very basis of everyday life, promoting “a destabilization of received social meanings.” Critical social-scientific analysis, which so presents itself as the heroic antidote to the ravages of “the market,” is in fact in “the vanguard of extending the logic of commodification to the most intimate aspects of people’s lives.” If Shannon is right, the most celebrated critiques of modern American society, from Veblen to Mills, are pernicious failures, launched in the name of a thin and debased understanding of “culture,” and imposing an obsolete and misleading apparatus for thinking about society and culture. If Shannon is right, we will need to change, and change radically, the way we are doing things in the study of human life and thought, if we expect to proceed beyond the current impasse.
Small wonder, then, that the book was almost completely ignored by the relevant tribes of academics and social critics when it appeared. What else were they likely to do with a book that calls them on the carpet, along with everything they do? For all that we academics claim to relish provocative analyses and paradigm-shifting arguments, the truth of the matter is much less flattering to our amour propre. Such claims are often little more than a rhetorical flourish, or even a self-deluding fantasy. We pretend to love change, and may even believe that we do. But in practice, we have little patience for it, particularly if we are the ones required to do the changing.
As a matter of brute fact, there is no force in the institutional world of ideas more powerful than the inertia of business-as-usual, the familiar pattern of expectations revolving around the core activities of paper-giving, journal-editing, lecture-giving, conference-attending, monograph-publishing, and hiring and tenuring, all under the surprisingly powerful conforming influence of peer review. Even the advent of postmodernism and its avatars has for the most part taken place in a smooth, untroubled, institutionally conservative manner, changing very little about this core structure. Notwithstanding the roiling that always seems to be occurring on the academy’s surface life, or the constant charges of political and cultural radicalism coming from the outside, or the faculty’s proud boasts of “transgressivity” and willingness to “think the unthinkable,” the truth of the matter is that the academy is one of the most procedurally conservative institutions in modern life. By challenging the professional canons, and the assumptions behind them, a book like Shannon’s took a position that is almost unassimilable, hence more easily ignored than engaged.
What, indeed, is our age likely to do with an author who put forward an argument for “the recovery of necessity,” at the very same time that the techno-utopian computer wizard Ray Kurzweil, perhaps reflecting more faithfully the regnant moral theology of high modernity, is assuring us that “The Singularity”—the moment when man escapes entirely from the yoke of biological necessity—”is near”? What can our age make of an author who thinks that the most pressing political question before us today is not the increase of political “participation” but the recovery of the meaning of politics itself, as an avenue for the expression of genuine human freedom, and an escape from the relentless “instrumentalization” of life? An author who argues (much like the philosopher Charles Taylor) for a renewal of our appreciation of ordinary life, but who remains snappishly suspicious of any attempts to over-theorize such a move, contending—astonishingly, to modern ears—that “acceptance of ordinary life requires an acceptance of waste” and requires resistance to the transformation of ordinary life into a “locus of meaning”? Who admonishes us that “all things do not exist to be read,” and that “experience does not have to be written to be valid”?
Let’s stop there for a moment. What, you may ask, could Shannon possibly mean in opposing the exaltation of “meaning”? How can one object to “meaning”? Isn’t this something approaching a modern sacrilege? Isn’t “meaning” precisely that thing for which we are told “modern man” is perpetually questing? Yes, precisely so. But I think it may help flesh out Shannon’s point to consider how it is embodied in a literary example drawn from Walker Percy’s novel The Moviegoer—itself a story of a questing modern man, the book’s narrator, whose aspirations are diluted and diverted into his tendency to exalt the “textualization” of experience in movies.
The narrator is addicted to the movies because it is only when he sees something in the movies that he can feel it to have been validated as “real.” When he and his girlfriend go to see Panic in the Streets, a 1950 movie filmed partly in the same New Orleans neighborhood where they are seeing the film, they emerge from the darkness of the theater with the certitude that the neighborhood is now “certified”:
Nowadays when a person lives somewhere, in a neighborhood, the place is not certified for him. More than likely he will live there sadly and the emptiness which is inside him will expand until it evacuates the entire neighborhood. But if he sees a movie which shows his very neighborhood, it becomes possible for him to live, for a time at least, as a person who is Somewhere and not Anywhere.
Or consider a passage earlier in the book, in which the narrator observes a honeymooning young couple wandering the French Quarter of New Orleans. They seem unhappy, anxious, aimless, sensing something wrong, something missing—until they spot the famous actor William Holden walking on the street. The young man is able to offer Holden a light for his smoke, and in this brief, impersonally friendly interaction with the hyper-real figure of Holden, a radiant source of “certification” itself, everything suddenly changes for the young man and his wife:
He has won title to his own existence, as plenary an existence now as Holden’s… . He is a citizen like Holden; two men of the world they are. All at once the world is open to him… . [His wife] feels the difference too. She had not known what was wrong nor how it was righted but she knows now that all is well.
Holden has turned down Toulouse shedding light as he goes. An aura of heightened reality moves with him and all who fall within it feel it. Now everyone is aware of him… .
I am attracted to movie stars but not for the usual reasons. I have no desire to speak to Holden or get his autograph. It is their peculiar reality which astounds me.
Shannon’s account of things speaks directly to the condition that Walker Percy has so penetratingly described, a mad compulsion to grasp hold of textual “meaning” as a shield against the “emptiness” of everyday life, a shield which is itself a chief cause of the very emptiness it would counteract, much like the compulsions of a man who takes drugs to alleviate the pains of his drug-taking. Shannon’s challenge to our ways of thinking about culture takes us much deeper, then, than a mere intellectual critique of social-scientific ideas and techniques. It is also an exploration of the ways in which those ideas and techniques have insinuated themselves into the most intimate crevices of our souls.
I would probably never have become aware of Shannon’s work myself, had I not been asked to review Conspicuous Criticism for the Annals of the American Academy of Political and Social Science. I had never so much as heard Shannon’s name before, nor had I heard of the book, then only recently published. But something intrigued me about the title, and so I accepted the invitation. The book itself proved to be an utterly fresh and compelling critique of the social sciences, by means of a close and searching reading of a variety of the most influential social-scientific writers of the early-to-mid-20th century, from Veblen to C. Wright Mills.
Such was my introduction to the work of one of the most original of the rising generation of U.S. cultural and intellectual historians. Although a historian by training, Shannon is very much at home in the precincts of social theory, and his work is profoundly informed by his Roman Catholic convictions and commitments. In addition, he has the kind of interdisciplinary versatility and range that were the glory of the American Studies movement at its best. These traits come together in a most unusual way in him. His perspective on the larger subject of modern American culture is difficult to describe adequately. I suppose it would come closest to the mark to say that he has been deeply influenced by the critiques of the Enlightenment launched by Alasdair MacIntyre and others in the same line, critiques that have been especially effective in opening up the problems of “community” and “tradition” in modern America, and in identifying the enterprise of social science, as now practiced, as the most dangerous foe of those things, and indeed, of the very insights it ostensibly seeks.
To put it more bluntly, Shannon thinks it entirely possible that the enterprise of social science is inherently self-defeating—useful in identifying the essential preconditions of social order, but profoundly unhelpful in the end, because it does so by means of a vocabulary that, ironically, makes it impossible to believe in the legitimacy of that social order. Such language may, in effect, rob us of the wherewithal to buy back what we never should have sold in the first place. In that sense, Shannon’s argument reminds me of the witticism attributed to Karl Kraus, to the effect that “Psychoanalysis is itself the disease of which it would be the cure.” For Shannon, the reification and subsequent problematizing of “culture” is itself the great iatrogenic disease of our times, the error at the heart of the social-scientific method, and the source of the very social woes that the social sciences have proved so incapable of curing.
Shannon is, of course, not content merely to critique social science. For him, the roots of the problem represented by social science stretch back much further, to the antinomies and emphases inherent in our modern, Protestant culture. Looking at the effects of the broadly liberal social-scientific outlook on American intellectuals, as seen and understood through their most influential texts, Shannon finds, again and again, the same basic premises: modernist, liberal, rationalist, individualist, “enlightened,” anti-traditionalist, anti-authoritarian, cosmopolitan, and culturally (if not theologically) Protestant.
As a committed Protestant myself, I found myself wanting to quarrel with him about his sweepingly negative view of Protestantism, which seemed to me in need of qualification, and which included many elements that are just as vividly present in the present-day condition of American Catholicism. But I could not help but be stimulated by the boldness of his argument, and by the number of times that he found the mark in ways that more seasoned scholars covering the same ground (myself included) had failed to do. When I began editing a book series in American intellectual history for Rowman and Littlefield, I naturally sought out Shannon to see if he had another project in the wings.
Indeed he did, and the result was his second book, A World Made Safe for Differences, which captured the interest not only of historians but also of a broad range of social scientists, such as the communitarian sociologists Robert Bellah and Amitai Etzioni. To oversimplify greatly, what Shannon did with this second book is demonstrate how the postwar social-scientific understanding of “culture,” while appearing to endorse cultural and individual diversity, in fact imposed an imperial standard of behavior and cultural organization that was even more rigid than the standards it replaced, and all the more pernicious for failing to acknowledge its imperial designs. Hence the book’s title is an ironic one, since it points to the ways that cultural or individual “difference” was reduced to a commodity that was, in fact, fully commensurable with all those things from which it “differed.”
The resulting book read like a cross between the cultural criticism of the Frankfurt School and the constructive impulses of communitarian Catholic social thought—and in fact, Shannon’s work helps one to see that these two stances may not be as far apart as appears at first glance. Indeed, it would not be too far from the mark to label Shannon as a Catholic variant upon the vision of the late Christopher Lasch, the eminent historian who was Shannon’s teacher at the University of Rochester, and from whose spirit of moral and intellectual critique of modernity he continues to derive inspiration.
As I have already intimated, Shannon’s argument is difficult to relate to existing ideological camps. It is radically conservative, not only in its high general regard for tradition but also in its unhesitant condemnation of the American abortion license and its skepticism about the jettisoning of traditional sexual morality. In other respects, however, it recalls the Frankfurt School (and poststructuralist) critique of universalism and liberal toleration, as controlling regimes that are all the more insidious for their refusal to declare themselves as such, and their self-serving charade of “value neutrality.” Its critique of Ruth Benedict’s anthropological view of culture is conjoined, brilliantly, to a critique not only of American modernizing arrogance in Vietnam but also of one of the most scathing American critics of the Vietnam intervention, Frances FitzGerald. There was a deep consensus, Shannon argues, undergirding the relationship between the liberal establishment and the radical counterculture, a consensus that consistently inhibited the emergence of genuine alternatives.
Both of Shannon’s books deserve to make a considerable mark, and could serve even to reorient some of our national discussion of the problem of “community” and the need to recover a sense of the authority of tradition. But Conspicuous Criticism remains especially worthy of reconsideration, precisely because its publication marked the emergence of a voice that has yet to be adequately heard and confronted.
Skeptics will say that Shannon is merely giving us, at bottom, yet another critique of liberal modernity. But I think this pigeonholing underestimates the uniqueness of his achievement. He is relentless in pointing to the ways in which other critics of modernity have merely reshuffled its premises without seriously challenging them, let alone departing from them. It is both radical and conservative, combining a powerful attack on bourgeois liberalism and consumer capitalism with a ringing defense of the place of religion and tradition (and particularly traditional religion) in contemporary society. Writing with moral passion and critical verve, Shannon identifies the forces that isolate the individual in modern society and counters more than a century of efforts by “progressive” intellectuals to displace tradition in favor of a humanism that actually diminishes humanity in the name of freeing its potential.
In a sense, one could say that Conspicuous Criticism is a call to reinstate traditional relations to God, nature, tradition, and the common good. But it makes that call in a most untraditional way. It is not in any way a paean to nostalgia nor a brief for conservatism. Instead, it arises out of a keen sense of necessity, an awareness of the inadequacy of critical discourse, and the unsustainability of the unassisted modern project in all its triumphalist finery. It is a recognition that, when the road forward leads only to a dead end, or over the side of a cliff, the most urgent business at hand is to trace the way back.
Wilfred M. McClay teaches history and humanities at the University of Tennessee at Chattanooga, and is the editor of the forthcoming Figures in the Carpet: Finding the Human Person in the American Past (Eerdmans).
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Lauren F. Winner
Faith in the suburbs.
A few weeks ago, I visited a church in a locale I’ll call Levittown. The building was mid-century churchy: stained glass windows; deep, dark wooden pews; prominent pulpit and altar; upright piano on a dais. But about twenty minutes into the service, something decidedly contemporary caught my eye: a giant (should I say venti?) Starbucks cup sitting proudly on the piano. How’s that for contemporary iconography? I wonder if it was a paid product placement.
The Suburban Christian: Finding Spiritual Vitality in the Land of Plenty
Albert Y. Hsu (Author)
IVP
220 pages
$14.19
Starbucks is an icon of suburbia, of course, even if the great coffee institution did start in Seattle, and it is fashionable to decry suburban living. Indeed, one of the few things agrarians and urbanites share is their utter horror of the suburbs, whose gated communities and starter mansions are poison for the soul. Even suburbanites themselves often engage in anti-suburb diatribes, albeit a tad sheepishly.
Two new books propose to redirect the conversation. David Goetz, a former editor at Leadership Journal, and Albert Y. Hsu, an editor at InterVarsity Press, ask what a spirituality of suburbia, a spirituality for people who drive mini-vans and tend manicured lawns (or pay someone else to tend them), might look like.
Suburban life, if pursued unheedingly, “obscures the real Jesus,” writes Goetz in Death by Suburb. “Too much of the good life ends up being toxic, deforming us spiritually.” But if obscured, Jesus is there somewhere, and Goetz’s book aims to help suburbanites find him in the ocean of lattés, in the aisles of Pottery Barn, and in the bleachers at the soccer field: “You don’t have to hole up in a monastery to experience the fullness of God. Your cul-de-sac and subdivision are as good a place as any.”
Goetz identifies eight “environmental toxins” that plague suburbia and offers a spiritual practice to purge each toxin from your system and help you realize that “even in suburbia all moments are infused with the Sacred.” By packaging his insights in this self-helpy formula—7 habits, 8 practices, 40 days to a more authentic Christian life—Goetz obviously opens himself up to criticism: this blueprint recapitulates some of the very problems of the suburban mindset that he is trying to offset. But I suspect he knew what he was doing, and chose the idiom to convey a subversive message to his target audience.
Consider environmental toxin #8, for example: “I need to get more done in less time.” Do you constantly wish you had more time—more time to catch up on email, get to the grocery store, pay your bills, please your boss, maybe even take your wife out to dinner? Consider keeping the Sabbath, a discipline sure to reconfigure the understanding and inhabiting of time for all those who faithfully practice it. (Scripture offers us a similarly counterintuitive antidote for the related sin of credit card debt: if you want to get out of debt, start tithing. Giving money to the church won’t get our Visa bills paid, but there is no surer way to escape being owned by money than giving it away.)
Environmental toxin #6: “My church is the problem.” Goetz has no patience for Americans’ pernicious church-hopping: “Only in relationships that permit no bailing out can certain forms of spiritual development occur.” Rather than switch churches because your pastor said something you disliked or the new church plant down the street has a livelier youth group, practice the discipline of “staying put in your church.” This manifestly countercultural advice cuts to the very heart of America’s restless anomie.
Environmental toxin #3: “I want my neighbor’s life.” Has life in the suburbs turned your skin permanently green with envy and taught you to covet the Joneses’ cars, careers, and Ivy League-bound kids? Try developing “friendship with those who have no immortality symbols.” That is, stop hanging out with your rich neighbors, and instead find “ways to be with the poor, the mentally disabled, the old and alone … essentially, all those who don’t build up [your] ego through their presence.” When you hang out with less wealthy people, you “begin to compare [yourself] to a different kind of neighbor,” and then you experience not envy but gratitude.
The point here is well-taken, but it still finds us measuring our worth against other people. And the examples Goetz offers underscore how hard it is for middle-class Americans to practice downwardly mobile sociability. His model of social “kenosis” is the writer Barbara Ehrenreich, who emptied herself by focusing her gaze on maids and waitresses. But Ehrenreich gazed at maids and waitresses because—on assignment for Harper’s for articles that became the book Nickel and Dimed—she was working undercover as a maid and waitress herself. It is worrying indeed if investigative journalism is the principal channel through which suburbanites can “face the humanity of another kind of person.”
Albert Y. Hsu’s The Suburban Christian finds in suburban living a deep spiritual longing. People come to the suburbs, Hsu says, because they are looking for something, a job or affordable housing or good public schools (or, less charitably, mostly white public schools). Like Goetz, Hsu insists that you don’t need to live on a farm or in the inner city to live an authentically Christian life. Nevertheless, “the suburban Christian ought not uncritically absorb all the characteristics of the suburban world.”
One excellent chapter teases out what follows from suburban reliance on cars. (Did you know that the average commuter spends three weeks a year commuting?) As a consequence of our driving dependence, says Hsu, the elderly who can’t drive are marginalized. Policy makers don’t prioritize public transportation. Indeed, we often don’t build sidewalks; as Bill Bryson has observed, “In many places in America now, it is not actually possible to be a pedestrian, even if you want to be.”
Alongside Goetz’s suggestion that we stay put in our churches through thick and thin, Hsu urges us to recover the parish mindset—that is, to go to the church down the block and join in what God is doing there, rather than shopping for the perfect fit and winding up at a church two suburbs away.
Consumerism goes hand in hand with suburban living. How can we “consume more Christianly”? Shop in locally owned stores; create holiday rituals that don’t revolve around gift-giving; regularly fast, not just from food, but also from media, new technology, and new clothes; buy organic, fair-trade coffee produced by companies that don’t destroy rain forests. (And if you agree with the skeptics who find the “fair-trade” crowd self-deluded, there are plenty of other ways to become a more discriminating consumer.) A basic guideline for simple living, says Hsu, is “to live at a standard of living that is below others in your income bracket. If you can afford a $400,000 house, live in a $250,000 one instead. Or, if you can afford a $250,000 house, live in a $150,000 one.”
In recent years we’ve seen a flourishing of books that take a fresh look at what might be called our “living arrangements.” The works of Wendell Berry and Albert Borgmann; books such as David Matzko McCarthy’s The Good Life: Genuine Christianity for the Middle Class, Eric Jacobsen’s Sidewalks in the Kingdom: New Urbanism and the Christian Faith, and T. J. Gorringe’s A Theology of the Built Environment: Justice, Empowerment, Redemption—these and many others examine the built-in assumptions of our ways of life and their often unintended and unexplored consequences. Add Goetz and Hsu to that growing stack.
Neither of these books pretends to offer the last word on the subject of suburban Christianity. They raise more questions than they answer—questions, for example, about the effects of suburban development on the landscape that may have attracted us to the suburbs in the first place. It would be salutary to consider suburban gender narratives, the ways that suburban living shapes our understandings of masculinity and femininity, and to probe the deep economic structures that make suburbia not only possible but seemingly necessary. What about the labor relations we practice in our suburban homes, homes so often kept clean by someone who can’t afford to live in the suburbs? What vision of redeemed creation do we encode when we build houses that aren’t designed to last more than 75 or 100 years—or tear down 30-year-old homes to build bigger ones?
Still, Hsu and Goetz have offered a welcome alternative to tiresome and self-righteous preaching about the spiritual superiority of agrarian or urban life. For Christians living in suburbia—and for those of us who share in the sins of suburbanism from our perches in the country or the city—these provocative yet loving books may prove invaluable.
Lauren F. Winner is the author most recently of Real Sex: The Naked Truth About Chastity (Brazos).
Christopher Shannon
A squandered heritage regained.
Long before the current clergy sex abuse scandal, a significant portion of American Catholics had already come to identify themselves as survivors. Viewed rhetorically, the response to this all-too-real current crisis follows the script of an earlier abuse scandal of somewhat more questionable veracity: Catholic education. American Catholics who came of age in the 1960s like to identify themselves, for better or for worse, as the people who were beaten by nuns. As comedy or tragedy, this story has been American Catholics’ chief contribution to late 20th-century American popular culture, as witnessed by the broad appeal of stage productions such as Do Black Patent Leather Shoes Really Reflect Up?, Sister Mary Ignatius Explains It All for You, Nunsense, and Late Night Catechism.
God and Man at Georgetown Prep: How I Became a Catholic Despite 20 Years of Catholic Schooling
Mark Gauvreau Judge (Author)
Crossroad
192 pages
$23.00
In God and Man at Georgetown Prep, Mark Gauvreau Judge writes as a survivor not of abuse, but of neglect. Coming of age in the 1970s, Judge missed out on the gory/glory days of tough-guy priests and ruler-wielding nuns. Drawing on the theological spirit, if not the anglophile cultural posturing, of the conservative Catholic William F. Buckley’s classic God and Man at Yale, Judge exposes and indicts the functional atheism that has shaped Catholic educational institutions in the decades following the Second Vatican Council (1962-65). Judge’s book is an unabashed plea for Catholics to recover the world they have lost and reclaim the birthright they have sold for the material comforts and cultural respectability of mainstream, middle-class American life.
As an approximate generational contemporary of Judge, I can attest that he knows the world of which he writes. Growing up in the Washington-Baltimore area, the son of a successful journalist who wrote for National Geographic magazine, Judge found himself in an élite Catholic educational milieu that fully embraced the liberal interpretation of Vatican II. Judge’s account of his education gives me new appreciation for provincialism: reform came a bit later to upstate New York, so I was spared the worse excesses of vanguard Catholic liberalism. Judge began his Catholic education at Our Lady of Mercy grammar school, run by the Sisters of Mercy in Potomac, Maryland; he continued on at Georgetown Preparatory School, the Jesuit prep school founded by John Carroll, first bishop of Baltimore, in the 1780s.
From 1850 to 1950, Catholic schools stood as the single most important marker of Catholic separatism in America. But by the 1970s, Catholic education had become a pale imitation of an already bland liberal humanitarianism. In one example, Judge cites We Follow Jesus, a third-grade religion book used at Mercy in the 1970s, which retells the Gospel story of Martha and Mary with Jesus simply saying “Now, Martha, do not worry too much about dinner; just do the best you can.”
If the Sisters of Mercy watered down the faith, Georgetown Prep directly undermined it. After centuries on the front lines of the Church’s war with modernity, the Jesuits had finally gone native. Pierre Teilhard de Chardin, the French Jesuit censured by Rome for his efforts to synthesize Catholic theology and Darwinian evolution, became an intellectual hero. Teilhard’s displacement of the cross for a progressive vision of humanity evolving toward an “Omega Point” in history fit all too neatly with New Age spirituality. At Georgetown Prep in the 1970s, Eastern mysticism trumped Catholic theology and situational ethics replaced traditional Catholic moral teaching, particularly in matters of sex.
Alas, Judge found more of the same at the non-Jesuit Catholic University of America in Washington, D.C., an institution founded to promote a national presence for Catholic intellectual life in America but now in open revolt against the teachings of the Church. Judge attended Catholic U. in the 1980s, at the height of the controversy surrounding Father Charles Curran, the moral theologian who lost his position for supporting the right of Catholics to express faithful dissent from the Church’s teachings on sexual ethics, particularly the ban on artificial birth control reaffirmed in the 1968 encyclical Humanae Vitae. Curran lost his battle but clearly won the war of popular opinion, with most students and faculty rallying to his side in the name of academic freedom.
Judge looks back on these developments with a heavy heart, but he concedes that the near apostasy of his Catholic educational institutions caused him little distress at the time. Like many young American males of his generation, he was less concerned with theology than with sex, drugs, and rock ’n’ roll. His drug of choice was alcohol, which he was able to control for some time through the alcoholic/workaholic discipline of an upwardly mobile East Coast professional. After graduating from Catholic U., Judge began a successful career in journalism, writing on popular culture, politics, and religion for mainstream outlets such as The Washington Post and left-of-center weeklies such as The Progressive and In These Times.
Despite this professional success, Judge eventually realized that he had lost control of his life to alcohol. Sparing us the details of his collapse and recovery, Judge simply states that he drank too much, did stupid things, and overcame his alcoholism with the help of Alcoholics Anonymous. In a similarly refreshing manner, he insists that most of the best times of his life involved alcohol in one way or another. Drinking with friends, staying up all night talking, laughing, listening to music and dancing—these are good things. Looking back on his recovery, Judge sees Alcoholics Anonymous as at least as much of a problem as alcohol itself. Rooted in the tradition of Protestant conversion narratives, the 12-step program of A.A. found one of its earliest advocates in Father Ed Dowling, a Jesuit priest who saw in it principles similar to the Spiritual Exercises of Ignatius of Loyola. According to Judge, in recent decades A.A. has, like Catholic schools, largely rejected its Christian roots; like other popular therapies, the secularized 12-step program has become an end in itself. To Judge’s credit, he refuses to define himself in terms of his disease.
Still, God and Man at Georgetown Prep is a conversion story of sorts—but a distinctly Catholic conversion story. Judge never officially left the Church, and he presents the “reversion” to his childhood faith less as a turn from sin to salvation than from indifference to commitment. The turning point in Judge’s life came not with his recovery from alcoholism but with the death of his father from cancer. Here again, Judge writes refreshingly against genre expectations. His father’s death leads not to emotional trauma but to an intellectual and spiritual awakening: “My father had been dead for several months before it dawned on me that he’d been a Catholic.”
Judge knew, of course, that his father had always attended Mass faithfully, but only by going through his father’s book collection after his death did he realize that his father had been a serious intellectual Catholic. Judge’s twenty years of Catholic education had failed to impress upon him the possibility that being Catholic had anything at all to do with the intellectual life. Catholicism was rules, doctrines, and Mass on Sunday. Exploring his father’s book collection, Judge discovered the intellectually rich and challenging Catholicism of G. K. Chesterton, Jacques Maritain, Josef Pieper, and Dietrich von Hildebrand. After reading the books that had shaped his father’s mid-century Catholicism, Judge came to a new self-understanding: “I am a member of a generation of Catholics raised after Vatican II who was cheated out of a Catholic education.”
Members of that generation will share in Judge’s delight at the recovery of his Catholic intellectual heritage. Catholics and non-Catholics alike will find in his account a model for an intellectual life firmly rooted in the particularities of one faith tradition, yet determined to speak to the world in a common language. In particular, Josef Pieper’s writing on hope as a historical virtue and his major cultural works, Leisure: The Basis of Culture and In Tune With the World: A Theory of Festivity, provide a philosophical framework sorely lacking in contemporary historical and cultural studies. Judge sees in the intellectual world of mid-century Catholicism not lockstep conformity to particular doctrines but rather an expansive affirmation of the beauty and goodness of God’s creation. Beginning in the 1960s, Judge contends, liberal American Catholics severed this affirmation from orthodoxy and thus reduced it to a kind of “humanism within the limits of the Democratic Party alone!”
This political dimension of the recent history of American Catholicism plays no small part in Judge’s story. In the work of a Washington-based journalist, it is hard to see how it could not. From father to son, the Judge family seems to have followed the now familiar trajectory from New Deal Democrat to Reagan Republican. Critical of crass free-market materialism and the Wal-Martization of American life, Judge nonetheless takes as his contemporary Catholic intellectual guides the solidly neo-conservative George Weigel and Richard John Neuhaus. The war on terrorism simply carries on the work of the war against communism; the real evils of communism/terrorism seem to excuse the real evils of the alternative regimes America has supported in the name of democracy. If liberal Catholics have shamelessly used the “consistent life ethic” argument advanced by Cardinal Joseph Bernardin to make abortion and capital punishment equivalent evils, conservatives have used opposition to the greater evil of abortion as license to support a whole range of lesser political evils clearly condemned by their erstwhile hero, John Paul II. Catholicism at its best has never fit neatly into American cultural and political categories. Even as Judge points his reader to a more classical Catholicism, he may provide some Catholics with ammunition for a political battle that, in the terms presently operative, is simply not a Catholic fight.
Christopher Shannon is assistant professor of history at Christendom College. His book Conspicuous Criticism: Tradition, the Individual, and Culture in Modern American Social Thought, has just been reissued in a revised edition by the University of Scranton Press.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Julia Vitullo-Martin
The formative history of suburbia.
At a reunion held a few years ago by my husband’s family outside Baltimore, my brother-in-law, an architect, suggested we explore Guilford, a section of Northeast Baltimore where their Italian immigrant grandfather had done stonework which he regarded as the finest of his career. An elderly relative wanted no part of the expedition. Yes, grandpa had been very proud of his stone houses, walks, walls, and porter lodges, she recalled. But Guilford prohibited any Italian from moving in. It was “restricted.” No Italian, Jewish, or black families need apply. My young, well-educated Italian American in-laws—bankers, professors, lawyers—pondered the unwelcome idea that their hard-working grandfather had treasured having built houses that he himself had been forbidden to buy.
Yet for the decades between the Civil War and the Great Depression—the first heyday of suburban development in America—most upper- and upper-middle-class prime residential developments routinely discriminated in a fashion we now regard as reprehensible. A family’s having the money to buy a house wasn’t sufficient. It also had to be the right color and ethnicity, and attend the correct church. Deeds carried restrictive covenants that set forth a series of proscriptions that bound both buyer and seller, as well as subsequent owners. In addition to Guilford, Maryland, restrictive covenants governed such famous developments as Forest Hills Gardens and Great Neck Hills in New York, Colony Hills in Massachusetts, Park Ridge in Illinois, the Country Club District in Kansas City, Palos Verdes in California, and hundreds of others across the country.
Some of the restrictions, particularly in the days before zoning, made eminent sense: no slaughterhouses, for example. No oil refineries, iron foundries, coal yards, hen houses, or reform schools. Some restrictions were a matter of taste, but surely enhanced property values, such as landscaping and set-back requirements. Other restrictions doubtless contributed to the deadening of suburbia that is so much criticized today: no stores, no theaters, no restaurants. But these are really policy considerations, having to do with personal preferences. Do you want to live in a quiet, serene, fairly uniform haven, or do you want to live in a lively, dense, urbane neighborhood? People of good will can and do disagree, and make different choices.
The pernicious restrictive covenants—not declared unenforceable by the Supreme Court until 1948—had to do with race and religion. The most desirable developments were confined to white, Anglo-Saxon Protestants, preferably Episcopalian, with an understood hierarchy for everybody else—which meant that upwardly mobile Americans seeking the most desirable housing as a reward for their newfound wealth, education, and success were usually blocked if they were anything but white Protestants.
Now this little-remembered but immensely important practice has been given its own history by Robert M. Fogelson, a professor of urban studies and history at the Massachusetts Institute of Technology. To drive home just how extensive the practice was, Fogelson tells the story of an incident that occurred in Los Angeles in 1948. Singer Nat King Cole, one of the most successful entertainers of the 20th century, bought a 12-room house for $85,000 in Hancock Park, a restricted area. Hancock Park’s wealthy doctors, lawyers, and businessmen organized to keep Cole out. When the Supreme Court struck down restrictive covenants as unenforceable by the state, they decided to buy him out, telling him they did not want any undesirables moving in. “Neither do I,” said Cole. “And if I see anybody undesirable coming in here, I’ll be the first to complain.”
One of the most important intellectuals setting the stage for restrictive covenants, writes Fogelson, was Frederick Law Olmsted, Sr., long renowned as this country’s greatest landscape architect, designer of the finest American parks, most famously Central Park in New York City. Esteemed as a liberal who opened public spaces to the masses and as an innovator who devised transverses to separate traffic from pedestrians, Olmsted was also a class-conscious aristocrat who saw degradation and deterioration all around him in the 1860s—“the unmistakable signs of the advance guard of squalor.”
His solution was separation—to be applied to people in suburbs much as he had applied it to traffic in Central Park. Separate the bad from the good, the noxious from the clean, the tasteless from the tasteful. (This he called the “law of progress,” which would enhance the “cleanliness and purity of domestic life.”) In addition, ensure that the separation becomes permanent via agreements among property owners. “Suppose I come here,” he asked, writing about a suburban tract, “what grounds of confidence can I have that I shall not by-and-by find a dram-shop on my right, or a beer-garden on my left, or a factory chimney or warehouse cutting off this view of the water? If so, what is likely to be the future average value of land in this vicinity?” To emphasize its importance, he italicized his final sentence: “What improvements have you here that tend to insure permanent healthfulness and permanent rural beauty?”
In fearing change, 19th-century Americans were hardly being frivolous. As Fogelson points out, the late 19th century was a time of widespread civil disorder, brutal industrialization, financial panics, and unpredictable real estate markets. Elegant buildings were demolished and replaced by taller, uglier ones. He quotes a Unitarian minister in Cambridge, lamenting that “the want of permanence is one of the crying sins of the age,” and that Americans “are always getting ready to live in a new place, never living.”
As Olmsted noted, the point of his ideas was to ensure “tranquility and seclusion” and to prevent the “desolation which thus far has invariably advanced before the progress of the town.” On these matters he was a man of genius who set out the principles that still guide the best development: roads should be curvilinear, fitting into rather than destroying natural surroundings; a very few should handle through traffic, the others should be local; they should be beautifully landscaped, as should the front of all homes; the entrances to property should be distinctive, set off by wooden gates or stone lodges—stone being a crucial element used in all Olmsted designs.
Indeed, Guilford, Maryland, was a preeminent Olmstedian development, designed by the Olmsted Brothers, a Boston firm set up by the revered man’s sons. Laid out in 1903, Guilford was more stringently restricted than almost any predecessor development. A separate document of 23 pages banned nuisances on all lots, and banned businesses and multi-family housing on all but a few. Setbacks were required at the rear and sides of the houses as well as the front. A design review process allowed the Guilford Park Land Company to reject plans for “aesthetic or other reasons,” and to consider whether the house was in “harmony” with its neighbors. No house or lot could be occupied “by any negro or person of negro extraction,” nor did the company sell to Jews “of any character whatever.” In other words, ethnicity trumped class. A distinguished Jewish scholar at nearby Johns Hopkins University, for example, would have been blocked from purchasing. These exclusions became part of the marketing campaign.
What Fogelson thinks of all this he pretty much keeps to himself, which is a disappointment. As a historian accustomed to casting a cold eye on the human condition, Fogelson in his younger years wrote on race, violence, riots, crime, and the disintegration of cities. His 2001 book, Downtown: Its Rise and Fall, 1880-1950, treats the destructiveness of the American pattern of separating business from residences. He concludes that the fall of downtown in the mid-20th century was due to America’s remaking of itself as a nation of suburbs—a bourgeois utopia defined (so he suggests in his new book) as much by what it excluded as by what it included.
So how are our once-restricted bourgeois utopias doing today? Pretty well, actually. Guilford is still gorgeous, as are Forest Hills, Great Neck, and Bel-Air, to name just a few. For some reason, Fogelson doesn’t mention what is obvious to anyone walking through these neighborhoods: the restrictive covenants governing physical amenities like landscaping remain. But the neighborhoods are now a vibrant mixture of ethnicities and probably religions. He does target Palos Verdes Estates, which is only one percent African-American and two percent Hispanic, even though its mother city of Los Angeles hasn’t had a white majority since at least 1990.
What does this mean? Have we really not made progress, even though the Italians and Jews who were closed out in the 20th century now own many of the most beautiful suburban houses in America? Does the paucity of blacks and Hispanics in Palos Verdes reflect intractable injustice and discrimination? Or will they be following the groups ahead of them, much as the Italians followed the Irish, and the Irish followed the Germans, who followed the English? Fogelson doesn’t tell us, leaving us to draw our own conclusions about the meaning of it all.
Julia Vitullo-Martin is a senior fellow at the Manhattan Institute in New York.
Kathryn Long
History and ‘The End of the Spear’.
Elisabeth Elliot once commented that she wrote her novel, No Graven Image, as a way to deal with her experiences on the mission field through fiction, especially the three years she spent among the Waorani, then known as “Aucas.” Margaret Sparhawk, the heroine of Elliot’s book, is a sincere young woman who struggles with the challenging and unexpected complexities she encounters in her efforts to live out the gospel among an indigenous people. Through Margaret’s experiences, Elliot explores the idea that there is much more to missionary work than meets the eye, in fact, a great deal more than the folks back home ever imagine.
Elliot’s novel offers a helpful cautionary note to viewers of The End of the Spear, a feature-length film about the story with which Elisabeth Elliot is most often associated and which she did much to memorialize: the 1956 killings of her husband and four other missionaries by Waorani warriors in the rainforests of Ecuador. Elliot’s 1957 book Through Gates of Splendor, along with a Life magazine photo essay and wide coverage in both the secular and the Christian press, made the deaths of Jim Elliot, Peter Fleming, Ed McCully, Nate Saint, and Roger Youderian the defining missionary martyr story for American evangelicals during the second half of the twentieth century.
The End of the Spear tells this story from new points of view, those of Nate Saint’s son Steve and one of the Waorani, named Mincayani. Mincayani (played by Louie Leonardo) is a composite character drawn from the life histories of several Waorani warriors but closely associated with the real-life Mincaye, one of the men who speared the missionaries. The film follows Steve (played as a boy by Chase Ellison and as an adult by Chad Allen) and Mincayani from Mincayani’s childhood experiences of tribal violence in the 1940s and Steve’s loss of his father in 1956 to a dramatic moment of confrontation and reconciliation as adults in the 1990s. Along the way, it portrays the love between a father and his son and recreates the efforts by Nate Saint and his friends to contact the Waorani that led to their deaths. It also brings the story up to date by sketching the subsequent peaceful contact by three women and a child who lived among these same people. The four were Steve’s Aunt Rachel Saint (Nate’s older sister, played by Sara Kathryn Bakker); a young Waorani woman named Dayumae (Christina Souza), who was Rachel’s language informant; Elisabeth Elliot (Beth Bailey); and Elliot’s small daughter Valerie (Laura Mortensen).
In the screenplay, Dayumae tells the Waorani, who turn out to be members of her extended family, about God’s Son, “who was speared and did not spear back.” The American women live the message by not seeking to avenge their slain family members and by nursing the Waorani through a polio epidemic. Young Steve Saint, his mother, and his sister ride out the epidemic with the tribe and make other visits. The Waorani choose to embrace Christianity and end their revenge killings. Rachel Saint lives the rest of her life among the Waorani. In 1994, when she dies, the tribe invites Steve to take her place, an invitation he and his family accept. Then comes the film’s climax.
The End of the Spear does a number of things well. Moviegoers, including evangelicals familiar with the story, learn that the name of this people-group is “Waorani” (or “Waodani”) and not “Aucas,” a Quichua word meaning “savages” and used as a slur.1 The movie is fast-paced and intense, not least in its violence, and offers some beautiful aerial scenes of rivers and forests in Panama, where it was made. The focus on Mincayani enables the screenwriters to provide some context for the Waorani who killed the five missionaries, including the high homicide rates and patterns of revenge killings that characterized their culture. The choice of the Embera, an indigenous group in Panama, to play the Waorani (except for the lead characters) reflects a desire to offer some authenticity to the portrayal of indigenous people. Although the explicit Christian message is muted, the film tells a new generation about five young men who cared enough about the Waorani to risk their lives.
The film has gotten mediocre reviews in the secular press, some because of the film’s message-based content, some (correctly, I think) for the lack of character development and other weaknesses in the script. For example, the word “missionary” is not mentioned until well into the movie, leaving the uninitiated to wonder why in the world the hotshot young pilot and his friends want to find and meet the elusive and hostile indigenous people. Even so, the film grossed about $4.3 million on its opening weekend, ranking eighth in U.S. box office receipts. Evangelicals, by and large, have responded positively. The main controversy among conservative Christians has centered not on the film itself but on the choice of gay activist Chad Allen for the dual roles of Nate Saint and the adult Steve Saint.
My own unease with the film has a different source. It goes back to the complexity of missionary reality. There is both much more—and sometimes less—to the history of missionaries and the Waorani than meets the eye in the film. In its effort to inspire and entertain, the film presents a story with all the complexities removed. In doing so, it also employs a great deal of fictionalization. The movie is honest about this, although most people will miss the disclaimer: at the very end of the closing credits, a brief statement acknowledges composite characters and fictionalized incidents. Of course, audiences recognize that history on the big screen is almost always fictionalized. At the same time, however, the film opens with the words, “From a true story,” and that phrase is prominent in advertising. The movie is based on the dramatized documentary, Beyond the Gates of Splendor, released in 2005. Unfortunately, in an attempt to appeal to a commercial audience, The End of the Spear loses much of the documentary’s charm and other strengths while sharing its weakness of glossing over large portions of the past fifty years.2
Much of the fictionalization in The End of the Spear is done to make the plot, which focuses on Steve and on Mincayani, correspond to the larger narrative of the missionaries’ deaths and the Waorani embrace of peace/Christianity. In essence, the screenplay adds the legend of these two characters to the familiar, and in some circles almost mythic, story of the five missionaries. Historical connections do exist, but not as the film presents them. As a boy the real Steve Saint spent many school vacations with his Aunt Rachel and the Waorani, but he became an influential participant in Waorani history only at about the point where the film concludes. He and his family did not relocate permanently to Ecuador as the movie implies, but lived there for about a year between 1995 and 1996. They have maintained their involvement through extended visits. Saint’s autobiography, published in connection with the movie and bearing the same title, reflects this. The real Mincaye, who is Dayumae’s half-brother, participated in many of the historical events narrated in the film and was one of the “Palm Beach” killers. However, other Waorani warriors played more prominent roles, which led to the creation of a composite character.3
Given the plot, it is logical that the Steve Saint character is introduced as an eight-year-old, frightened by the risks his dad is taking. The real Steve turned five the same month Nate Saint was killed. In the film, Steve secretly radios Aunt Rachel to find out from Dayumae how to say “I am your friend, your sincere friend” in the Waorani language. A phrase is given which the boy carefully repeats and writes down. He teaches his father, and these will become his father’s last words, as well as the words Steve later uses to reach out to Mincayani.
The historical record shows that the men did try to get phrases from Dayumae, including ones that they thought meant, “I like you; I want to be your friend.” Yet one real-life complication is that the Waorani are a kinship-based society and had no words in their language for friend or friendship. Contrary to the movie, Dayumae spoke no English. As a young teenager, she fled the violence of her people to become a peon or virtual slave on a hacienda at the edge of the rainforest. There she spoke lowland Quichua, the language of the Indians around her. When Jim Elliot visited her to learn phrases in Wao tededo, the Waorani language, he did not realize that she spoke a version of her native tongue corrupted by Quichua influences. Elliot was fluent in Quichua, but neither he nor the others would know that Wao tededo bore no relationship to that language. The End of the Spear certainly portrays the complete inability of the missionaries to communicate with the Waorani during their first peaceful encounter. Even so, “I am your sincere friend” is an invented theme that obscures the vast cultural divide between the Waorani and the missionaries who wanted to meet them.
More curious is the way the film depicts two events that have been the subject of controversy and criticism over the years: the shooting of a Waorani man during the attack on the beach and the circumstances surrounding the polio epidemic that struck the Waorani in 1969.4
Both the movie and historical accounts agree that the men took guns when they established their base camp in Waorani territory. They thought the Indians would flee if shots were fired, but they also believed that firing, even into the air, should be a last resort. The search party that arrived after the attack found a bullet hole through the plane window and some signs of struggle. Later, when the Waorani who participated began to talk about what happened, it became clear that a bullet from one of the pistols fired in the melee either hit Dayumae’s brother Nampa in the head or grazed his head. Nampa, who was one of the attackers, died sometime later—from a few weeks to more than a year—the time frame is unclear. Accounts generally connect his death to the attack, reporting that Nampa died of the bullet lodged in his head or from an infection related to the wound, though this, too, is disputed.5
Since 1974, critics have charged Rachel Saint and her sending agency, the Summer Institute of Linguistics (now SIL International), with trying to conceal the gunshot and Nampa’s death in order to make the missionaries look more heroic. In fact, neither Saint nor SIL denied the shot. The End of the Spear plays into the critics’ hands by offering only the slightest visual nod to the shooting of Nampa. One scene during the spearings shows an arm pointing a pistol to the sky, while another reaches around to knock it down so the pistol fires horizontally rather than vertically. There is no indication that the shot hit anyone.
Most accounts appear to suggest that the shot was accidental, in the context of a struggle, but we may never know for sure. Expressing a note of ambiguity might have added to the power of a film emphasizing forgiveness, reconciliation, and ending violence. The Waorani have known about Nampa’s wounding and death all along; perhaps American moviegoers should have been given the same opportunity. In the end, some Waorani still found the overall decision by the missionaries not to use their weapons in self-defense a significant witness to the potential of Christianity as a mechanism for ending violence.
The choice of the 1969 polio epidemic as a turning point in The End of the Spear seems particularly odd. Steve Saint was eighteen at the time, and, contrary to the movie, not present. No foreigner (non-Waorani) was there except for Rachel Saint, and her role was a mixed one of sacrifice, bravery, and a hard-headedness that cost dearly the very indigenous people she loved. In the movie version, the Aenomenane, downriver tribal enemies of the peaceful Waorani, arrive with their sick seeking help. The illness is diagnosed as polio and a six-week quarantine is imposed on the village. The quarantine includes young Steve, still not more than nine or ten, his mother, sister, Rachel Saint, and Elisabeth Elliot. The missionary women, along with Dayumae, Kimo, and other Waorani, demonstrate love for enemies by caring for the polio victims. They improvise wooden teeter-totters to rock the victims and help them breathe. Meanwhile, Steve’s nemesis, Mincayani, continues to resist the way of peace. He hunts game and throws it away rather than feed traditional foes. As the epidemic passes, old animosities dissolve and a peaceful kingdom dawns. “The teeter-totters had stopped and with them the cycle of revenge.” From this point, the film skips twenty-five years to Rachel Saint’s death in 1994, when Steve is challenged to pick up the mantle from his fallen aunt.
All this may make for good cinema, but it is deceptive history. The polio outbreak occurred at a time when Rachel Saint, a few of her colleagues, and a handful of Waorani believers were engaged in a controversial effort to find and relocate formerly hostile groups of Waorani scattered across their vast traditional territories. Saint and her colleagues pushed the relocation because they feared the Waorani would not survive hostile encounters with oil crews who were exploring their territory. They also believed that consolidating the groups would facilitate Christianization. The Waorani who responded to these efforts did so because they wanted spouses, trade goods, and peace (which also pretty much summed up their understanding of Christianity).
In September 1969, when polio first appeared, there were approximately 250 Waorani crowded in or near Tiwaeno, the clearing where Dayumae’s family first met Elisabeth Elliot and Rachel Saint in 1958. (Elliot left the Waorani in 1961.) About 60 percent of them were from two waves of newcomers who had arrived within a little more than a year. Some had already faced contact illnesses such as severe respiratory infections in the new location and were weak due to food shortages. Sixteen people died of polio, all from among the newcomers. About the same number were left handicapped, some of them taken to outside hospitals or clinics for treatment. Two people were speared in revenge killings, and one of the perpetrators died mysteriously shortly after a spearing.
Rachel Saint has been criticized for dragging her feet on immunizing the recent arrivals. More important, after polio was diagnosed she ignored doctors’ advice to immediately immunize because she was afraid that adverse reactions would lead to violence. For three weeks, as the disease spread, Saint refused to allow a missionary doctor and a nurse to fly into the clearing because of the danger of spearing. They finally came anyway and were the ones who designed makeshift treatments like the teeter-totters. Saint worked to exhaustion caring for the sick. She also risked her life, even breaking spears, to enforce the Christian ethic of peace by confronting warriors bent on revenge killings after polio victims died. A few Waorani converts did care for their former enemies, reinforcing the association of Christianity and peace. Nonetheless, it was a difficult time. Baï, an influential warrior, left with members of one group. He called Tiwaeno “a place of death.”6
The crisis highlighted Rachel Saint’s unwillingness to let any other outsider live and work with “her” tribe, even while population influx overwhelmed her attempts to serve as sole resource—medic, missionary, linguist—among the people. In partial response to perceived shortcomings, during the next five years SIL would add four more staff members to the Waorani “team.” Two of them, Catherine Peeke and Rosi Jung, worked with informants to translate the New Testament into Wao tededo. Another, Pat Kelley, served as a literacy instructor and developed reading materials, while Jim Yost did an anthropological field study of the Waorani, the beginning of a series of important research projects. Yost and his wife, Kathie, who arrived with toddler Rachelle and had two more children during the eight years of their assignment, were the first nuclear family of outsiders to live among the Waorani.
Although these individuals almost never appear in any of the well-publicized stories, they invested significant portions of their adult lives helping the Waorani face the pressures of increasing contact with the outside world. They gave them tools—Scripture and literacy—for an indigenous Christianity. They helped negotiate Ecuadorian citizenship and land disputes and worked with missionary nurses to train native health care promoters. Meanwhile, tensions increased between SIL in Ecuador and Rachel Saint, along with outside criticism of the organization. As reporter Amy Rogers noted in the Pittsburgh Post-Gazette, the film “glosses over great accomplishments and simmering controversies” when it fast-forwards through this period.7
In the climax of The End of the Spear, Mincayani takes the adult Steve to the beach where his father was killed. The Waorani warrior digs up a tin cup and a battered photo of Steve that Nate Saint carried in his plane (the photo having survived forty years in the rain forest).
“They didn’t shoot us,” Mincayani says, part of an intense exchange.
“Your father was a special man. I saw him jump the Great Boa [a Waorani spirit guarding the afterlife] while he was still alive.”
A flashback shows what has been alluded to earlier: angelic figures above the riverbank and light flooding the area as the men died. Radiance comes down and touches the dying Nate Saint. “I speared your father,” Mincayani confesses to Steve. Mincayani points a spear at his own chest and urges Steve to use it. In a moment of long pent-up rage, Steve wants to do just that. Yet Saint draws on the deeper power of forgiveness and faith. “No one took my father’s life. He gave it.” Mincayani and Steve are reconciled and find peace.
In 1989, 33 years after the killings, several Waorani converts who participated in the spearings began reporting that they had seen figures, or lights, and heard singing above the riverbank after the men were killed. Apart from this reference, which the film accepts without question, the rest of the final scene is fictionalized. The actual Steve Saint and Mincaye were never estranged. This is the stuff of Hollywood, and a perfect way to end a missionary drama for the folks back home. The deaths of the five missionaries in 1956 became an archetypal narrative of missionary sacrifice and heroism for evangelicals in the United States and around the world. As believers, we respond to an apparent sequel that is just as dramatic and unambiguous.
The reality is more complicated. The challenge of reconciliation for the Waorani was never with the missionaries or their family members; it involved finding a way to end the bloody vendettas among themselves and to coexist with former enemies. The End of the Spear vividly conveys the Waorani as agents of violence. The movie is less successful in portraying efforts by the Waorani themselves to make peace once the gospel was introduced.8 Nor is the theme of reconciliation extended to missionary relationships. Such deeply committed and determined women as Elisabeth Elliot and Rachel Saint found it easier to forgive their loved ones’ killers and live among them than to get along with each other. The film also avoids the painful issue of children forgiving fathers who abandoned them in order to risk and ultimately lose their lives, even for the best of motives.
In the final voiceover, Steve Saint speaks movingly of the gain out of loss that has come to his family. Mincayani has lived to be a grandfather, and a grandfather not only to his own children’s children but also to Steve’s. The movie is silent on the complicated gains and losses—along with peace—experienced by the Waorani since some first embraced a form of Christianity almost fifty years ago. This includes the struggle to retain their cultural identity in the face of enormous pressures, while at the same time not remaining frozen in the past. It does not recognize the quiet efforts of others in addition to Rachel Saint—Peeke and Jung, for example—to help the Waorani survive in the modern world and to embrace a Christianity that means more than simply, “Thou shalt not kill.”9 They, too, have experienced gains and losses, seldom neatly balanced. For all their imperfections, they have tried to model Christianity in the midst of the beauty and the mud and the bugs and the dailiness of jungle living.
No one movie can do it all. And good cinema is often inaccurate history, though perhaps the bar should be higher when narratives explicitly concern God’s work in the world. The producers of The End of the Spear sought the authenticity associated with a true story without the difficulties that real history also brings. They gave the audience a stirring account, but not as much as we needed to know.
Kathryn Long is associate professor of history at Wheaton College. She is writing a book on the history of Waorani/missionary contact.
1. “Waorani” is a plural or collective noun; “Wao” refers to one person and is the adjectival form. To avoid awkwardness in English prose, I have chosen to use Waorani throughout as both noun and adjective.
2. Beyond the Gates of Splendor (Bearing Fruit Communications, 2002; Twentieth Century Fox Home Entertainment, 2005); available on DVD.
3. Steve Saint, The End of the Spear (Tyndale, 2005). Two fairly romanticized sources that include some of the Waorani history depicted in the film are Ethel Emily Wallis, The Dayuma Story (Harper & Brothers, 1960); and Wallis, Aucas Downriver: Dayuma’s Story Today (Harper & Row, 1973).
4. Although it was written in the late 1970s, the most credible summary of criticism published in English, including these two issues, remains chapter 9, “The Huaorani Go To Market,” in David Stoll, Fishers of Men or Founders of Empire? The Wycliffe Bible Translators in Latin America (Zed Press, 1982). Stoll was no friend of Wycliffe or the SIL, but he did extensive research and refrained from the kind of spurious charges that have since been made. A Spanish source for extensive information on Waorani history, as well as criticisms against American Christians who have worked among the people, is Miguel Angel Cabodevilla, Los Huaorani en la Historia de los Pueblos del Oriente (CICAME, 1999). The quality of some of the interviews and materials Cabodevilla has compiled is uneven, but his editorial voice generally is perceptive and balanced.
5. All English accounts are translations from versions of the story told by various Waorani eyewitnesses. Some early variations may have resulted from the fact that Rachel Saint was still learning the difficult Waorani language and misunderstood some elements of the accounts. Saint’s account appeared in an epilogue to the 1965 edition of Wallis, The Dayuma Story; see also an abridged version by Saint, “What Really Happened, Told for the First Time” in Decision, January 1966, p. 11. Anthropologist James Yost and writer John Man conducted extensive interviews in April 1987 with Geketa, the leader of the spearing party, including a visit to the site of the attack. According to a translated transcript, Geketa indicated that the bullet entered just above Nampa’s eye and lodged there. An edited version of Geketa’s story appeared in the SIL film, produced in 1988, “Tell Them We Are Not Auca We Are Waorani,” where Geketa stated, “The man [missionary] shot Nampa.” Steve Saint reports a version where a Waorani woman grabbed the arm of a missionary with a gun. “Nampa was grazed by a gunshot and fell down hard.” He died a year later “while hunting.” See Saint, “Nate Saint, Jim Elliot, Roger Youderian, Ed McCully, and Peter Fleming, Ecuador, 1956: A Cloud of Witnesses,” in Susan Bergman, ed., Martyrs: Contemporary Writers on Modern Lives of Faith (HarperSanFrancisco, 1998), p. 151. The account in Saint’s book, The End of the Spear, is similar.
6. Wallis, Aucas Downriver, p. 100.
7. Amy Rogers, “Ecuadoran tribe transformed after killing of 5 missionaries,” Pittsburgh Post-Gazette, January 8, 2006. Among others, I was interviewed for this article. After a painful dispute, the SIL asked Rachel Saint to leave the Waorani work in 1976. She did and later retired from the organization to return to Ecuador as a private citizen and live among the Waorani.
8. For Waorani agency and the missionary contribution, see James S. Boster, James Yost, and Catherine Peeke, “Rage, Revenge, and Religion: Honest Signaling of Aggression and Nonaggression in Waorani Coalitional Violence,” Ethos, Vol. 31, No. 4, pp. 471-494. Some isolated violence has continued.
9. These have included U.S. missionaries and Latin American nationals representing Mission Aviation Fellowship, the Plymouth Brethren, HCJB World Radio, and the Christian & Missionary Alliance.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Thomas Albert Howard
Is a Catholic out of place on Wheaton’s faculty?
The Wall Street Journal did evangelical higher education and, just maybe, the task of Christian unity a favor when it published a front-page story on the plight of Joshua Hochschild (January 7, 2006). A philosophy professor at Wheaton College, Hochschild was dismissed from the faculty after converting to Catholicism. The president of Wheaton, Duane Litfin, ruled Catholic theology incompatible with Wheaton’s statement of faith, to which all faculty assent at the beginning of their careers and renew upon signing their annual contracts, a customary practice at many evangelical colleges.
L’affaire Hochschild, as we might call it, is but the latest manifestation of a simmering conflict of opinion over how evangelical colleges should posture themselves toward the future. In many respects, the episode at Wheaton mirrors another celebrated incident from the 1980s, when the literary critic Thomas Howard (no relation, oddly enough) was obliged to resign from Gordon College in Wenham, Massachusetts (my home institution) after converting to Catholicism. Like Hochschild, Howard wistfully boxed his books, but his departure raised more questions than it settled. Hochschild’s departure raises similar questions.
Is an evangelical liberal arts college (i.e., not a seminary and not a church), and one that prides itself on intellectual engagement, served by a statutory environment that effectively excludes all Catholics, and indeed most non-evangelical Christians, from the faculty ranks? Having commendably avoided the seductions of secularism in the 20th century, do evangelical colleges—such as, say, Wheaton, Taylor, Gordon, and Westmont—now suffer from another problem: superattenuated retrenchment, a defensiveness increasingly unbecoming in a world in which many evangelicals look upon the legacy of Mother Teresa about as favorably as that of Billy Graham? By refusing even the possibility of a single Catholic faculty member—including self-described “born again” Catholics or those with deep sympathy for Protestant theology—are evangelical colleges failing to take seriously the biblical mandate for Christian unity? Would the prospects of genuine ecumenical work be improved if evangelicalism’s best and brightest had a chance to rub shoulders with a Catholic scholar or two during their college years?
While the Wall Street Journal article prompts such questions, it misconstrues an important aspect of the contemporary Christian intellectual scene. Its author attributes the firing of Hochschild to a “new orthodoxy” sweeping through church-related higher education, a novel vigilance to uphold the religious mission of Christian colleges. This rings true in some respects, especially for mainline Protestant or Catholic institutions trying to recover religious identities compromised by historical forces analyzed trenchantly in James Burtchaell’s The Dying of the Light (1998).
But the problem with many evangelical colleges is not necessarily the dying of the light, but rather hiding it under a bushel, a determined attachment to the certainties of a subculture derived from fairly recent historical experience at the expense of new, promising opportunities for theological depth and ecumenical engagement. Indeed, the phrase “new orthodoxy” for many evangelical scholars today, far from conjuring up strictures in hiring, will call to mind Thomas Oden’s recent book, The Rebirth of Orthodoxy: Signs of New Life in Christianity (2003). A Methodist theologian at Drew University dispirited by the trajectory of liberal Protestantism, Oden has long called for a “new ecumenism,” not the ecumenism in which social concerns often edged out doctrinal considerations, but a unity built around Nicene Christianity, a robust doctrine of the church, and reengagement with a shared apostolic and patristic heritage. Oden’s call for an orthodox ecumenism—one that elides while still recognizing the significance of 16th-century conflicts—has been borne out in numerous scholarly projects in recent years. The cumulative impact of these efforts on evangelical thought and culture has been estimable.
Consider, for example, the trends analyzed in Colleen Carroll’s The New Faithful: Why Young Adults are Embracing Christian Orthodoxy (2002). A journalist interested in the religious climate among young people today, Carroll documents the enormous interest in ancient, liturgical Christianity among younger, educated evangelicals—sometimes leading to conversion to Orthodoxy or Catholicism, more often leading to greater attentiveness to tradition and ecclesiology, almost invariably leading to criticism of stale Protestant-Catholic polemics and a weariness with the fearmongering anti-Catholicism that has pervaded much of twentieth-century evangelicalism.
And this brings us to the rub. The Hochschild case at Wheaton has a recognizable generational-cum-theological aspect, a conflict between those who want to circle the wagons around 20th-century evangelical doctrinal formulations (above all, a pinched definition of biblical inerrancy increasingly qualified or disavowed by evangelical theologians), encoded pointedly in faith statements, and those who believe that the fullness of Christian expression predates and transcends the wisdom of the last few generations. Put differently: on the one hand, younger faculty and many students (with some sympathetic administrators and trustees) increasingly feel that if evangelical institutions do not broaden their faith statements in the direction of orthodoxy (in Oden’s sense), they risk intellectual narrowness and impoverish students’ ability to act upon Scripture’s ecumenical mandate. On the other hand, many senior administrators, such as those at Wheaton, and many trustees (with some sympathetic faculty and students), regard tampering with existing faith statements as a dive onto the slippery slope of secularism. If colleges alter their faith statements, President Litfin of Wheaton writes in his recent book Conceiving the Christian College (2004), the ultimate destination is “entirely predictable”: “the institution will wind up just another formerly religious school, basically secular in reality if not in name.”
To be fair, Litfin’s worries are not unfounded: the evangelical schools that make up the Council for Christian Colleges and Universities (CCCU) don’t need to look very far to find examples of once-Christian colleges long estranged from their original mission. As I have become acquainted with robustly Christian institutions and those living off the capital of a former glory, I’m persuaded that the future lies with the former, not the latter. Judicious hiring practices and faith statements therefore remain of abiding importance, not only to ensure a clear mission but—and one can argue this on liberal grounds—to nourish a rich institutional diversity in higher education. CCCU colleges have contributed greatly to this diversity, not by “celebrating diversity” in the abstract, but by being attentive to their actual mission.
And yet—and yet. As Carroll’s The New Faithful and other analyses suggest, we are living in a new era. Not only are the anathemas, divisions, and stereotypes of the 16th-century breach breaking down all around us at last, but also the fundamentalist-modernist controversies of the early 20th century, which account for much of the embattled sense of present-day evangelicalism, are increasingly remote from current challenges. If one uses political clout, publishing notice, and church attendance as barometers of cultural authority, evangelicals are now in the driver’s seat with respect to certain aspects of American society. (It bears remembering that power corrupts, as Lord Acton famously said, and power that retains an embattled sense of powerlessness, is, well, … Acton would have something pithy to say about this too.)
Signs of the new understanding across formerly hostile lines abound, and for some the catalogue below will be familiar, but it’s important for Christian educators, not just scholars, to take these developments into consideration as they consider the future. Particular importance should be attributed to the following:
• The watershed of the Second Vatican Council (1962-65), especially the Council’s Decree on Ecumenism (Unitatis Redintegratio), which made clear that “both sides were to blame” for the “crisis” of the 16th century, that “truly Christian endowments” exist outside the Church of Rome, and that greater cooperation with “separated brethren” was a theological necessity.
• A massive shift in opinion over the past few decades among evangelicals in their attitudes toward Catholics, from viewing them as apostates and threats to the American way of life to partners promoting a “culture of life”—or what Timothy George of Beeson Divinity School has memorably described as an “ecumenism of the trenches.”
• The historic papacy of John Paul II and his efforts toward mutual understanding, expressed best in Ut Unum Sint (“That they may be one”), in which he even suggested that the Petrine Office is “open to a new situation” to promote ecumenical progress.
• Tremendous theological rapprochement, especially on the crucial doctrine of justification, the major point of contention during the Reformation. For many, the signing of the Joint Declaration on the Doctrine of Justification in 1999 by representatives of the Lutheran World Federation and the Roman Catholic Church signaled the beginning of a hitherto unimaginable theological era.
• A greater understanding of the “catholicity” of the Reformation itself, as promoted by leading American Protestant theologians such as Carl Braaten and Robert W. Jenson in their book, The Catholicity of the Reformation (1996).
• The shift of Christianity’s center of gravity from the Atlantic North to the Global South and an attendant necessity for cooperation between Catholics and Protestants, lest 16th-century-style conflicts repeat themselves to the detriment of a compelling witness.
• The willingness of key thinkers, Protestant and Catholic, to point out the closing gap between former divisions. Witness Karl Lehmann and Wolfhart Pannenberg’s The Condemnations of the Reformation Era: Do They Still Divide? (1990) and Mark Noll and Carolyn Nystrom’s Is the Reformation Over? (2005), extensively treated in these pages.
• Numerous collective efforts of unofficial theological cooperation, most notably in this country in the Evangelicals and Catholics Together initiative spearheaded by Richard John Neuhaus and Charles Colson.
• Greater efforts among some evangelical colleges to foster ecumenical understanding. The faculty at Gordon College, I’m proud to say, has begun relationships with St. Anselm’s College, a neighboring Benedictine institution, and Hellenic College, a Greek Orthodox college, for the purpose of conversation and mutual instruction.
• The considerable influence of what we might call the “Taizé ethos” among young people, and this French community’s commitment to serve as “a concrete sign of reconciliation between divided Christians and separated peoples.”
What do these developments add up to for evangelical higher education in general and for the maintenance of exclusionary faith statements in particular? How should schools proceed prudentially in this heady climate, faced with partisan A, who would like to abolish all faith statements in the name of academic freedom, and partisan B, who would reify current arrangements in perpetuity? Both partisans, I should reiterate, have good cause for their arguments and both recognize that genuine principles of intellectual integrity are at stake, not to mention practical concerns about alumni loyalty, faculty morale, student recruiting, and the like.
Let me suggest two provisional measures, which, although perhaps not entirely pleasing to the partisans, might at least create the necessary space for greater dialogue and understanding.
First, schools might consider establishing a working study group to examine the current faith statement, its rationale, and the intervening theological developments since its inception. For many evangelical colleges, present-day faith statements precede the aforementioned developments; many reflect lingering overtures to fundamentalist positions in the fundamentalist-modernist controversies of the twentieth century, which played out in a general Protestant milieu of entrenched anti-Catholicism. Such a study group could read together various documents, and present findings and recommendations to the faculty, administration, and board of trustees. Conversation often produces more conversation, and questioning can lead to questioning, but you have to start some place. Redoubled stasis is rarely the hallmark of dignified purpose.
Second, in light of the aforementioned developments, schools might consider an “exception clause” to current hiring practices. While maintaining current faith statements, this would recognize that certain scholars exist, Catholics but also Orthodox or mainline Protestants, who, while unable to sign the current statement, would not only respect but sympathetically engage the mission of the institution and offer themselves as valuable conversation partners. Such an exception clause could even be designed restrictively, requiring for any given candidate the assent of the faculty senate and key administrators. This would not swing open the flood gates, but it might create the statutory possibility of, say, a Catholic of good will donning cap and gown on evangelical campuses, countering the tendency of such institutions to become, as someone quipped, coddling cocoons of the like-minded. Students would benefit enormously, for they would have the opportunity to hear the actual idiom of a “separated” brother or sister, and they would thereby gain greater understanding of the distinctiveness of their own faith. All too easily, I have discovered, evangelical students can finish their undergraduate years with misperceptions of Catholicism inherited from their subculture largely intact. He who knows only one, knows none, the poet Goethe said about language. The same applies to expressions of the faith; young evangelicals need encounters with non-evangelical Christians not just to understand “the other” but to understand themselves.
What is more, an exception clause would amount to a principled measure on behalf of the institution. In the history of higher education, indeed in the history of most institutions, statutory changes are provoked, belatedly and awkwardly, by crisis and controversy. One could readily imagine, for example, a faculty member converting to Catholicism and then suing a college for discrimination if forced to leave her post. Presently, the courts might give preference to the institution in such a case, but this is a trend in jurisprudence that colleges shouldn’t bet the farm on. Or, a college might act unbecomingly from pecuniary interest alone, after, for instance, some future study demonstrated that a significant number of ecumenically inclined (“new faithful”) parents were withholding their children and dollars from evangelical institutions for fear of a narrow education. (Alas, from anecdotal evidence, I know this is already taking place—and I have even heard promising young evangelical scholars express a preference not to teach at evangelical colleges, fearful of too restricted a range of theological outlook.)
Instead, it’s better to proceed upon theological principle and an astute reading of the times. For Christian educators, the virtue of prudence might sometimes demand an impassioned defense of the status quo, but this is not an inexorable law. What appears as a high-minded defensive strategy from present seats of authority risks appearing as unimaginative and narrowly preservationist from the standpoint of the future. And this precisely is the challenge (and opportunity) of evangelical educational leadership today. In addressing the challenge, it bears remembering that the task of leadership is not simply to express the loyalties of one’s constituents, but also to educate these loyalties in the direction of more capacious understanding and deeper propriety. Such actions might prove unpopular in the short term, but right action and popularity have always had a strained relationship.
The present challenge is especially pertinent to the current generation of educators who stand proudly in the reform or neo-evangelical tradition, associated with figures such as Carl Henry and Harold J. Ockenga. These thinkers, it will be remembered, rose to the occasion in the mid-20th century to challenge American fundamentalism for shortchanging the life of the mind and downplaying the need for social engagement. Carl Henry’s The Uneasy Conscience of Modern Fundamentalism, in particular, stands out as a signpost of forward thinking in an otherwise uninspiring time for conservative Protestantism. Thankfully, great strides have been made with respect to intellectual seriousness and social engagement since the mid-20th century—and it bears noting that already in 1947 Henry spoke of a “truly spiritual ecumenicity” and the need to reconnect American evangelicalism to “the Great Tradition” of historic Christian orthodoxy.
Nevertheless, evangelical higher education still has an uneasy conscience to reckon with. The issue today is different from but perhaps not altogether unrelated to the one that Henry and Ockenga faced. In short, it’s a failure to understand that cultural authority necessitates greater magnanimity toward others, and that Christ’s words about Christian unity remain an imperative, not an option. Evangelicals need not fear the occasional non-evangelical Christian scholar in their midst, treating her like an infection to be excised. Rather, evangelicals should and can develop the institutional self-confidence to play the role of magnanimous host, recognizing in fact that there are certain crucial “other” voices that they should want among them. Indeed, what reflective evangelical parent in America today would not want the future Flannery O’Connor, G. K. Chesterton, or J. R. R. Tolkien to instruct their children?
But at an even deeper level, evangelical institutions should question the wisdom of current arrangements because they work against one of evangelicalism’s strengths: taking seriously the Great Commission. In Jesus’ high priestly prayer in John 17, He prays explicitly for the unity of the church: “I ask not only on behalf of these, but also on behalf of those who will believe in me through their word, that they may all be one.” Unity, and the fellowship it presupposes, makes truth attractive to those outside the fold; for, as Christ continues, “The glory that you have given me I have given them, so that they may be one, as we are one, … so that the world may know that you have sent me and have loved them even as you have loved me” (NRSV).
In the final analysis, the decision to welcome sympathetic Catholic scholars in the house of evangelical education should flow from the heart of the Gospel itself: from the evangelical concern about the Great Commission. Evangelism divorced from ecumenism, rightly understood, vitiates the cause it putatively serves. Evangelical liberal arts colleges are neither missionary agencies nor churches; they are not, in other words, on the front lines in proclaiming the gospel, baptizing and making disciples. But they are seats of intellectual growth, where young people can learn to think seriously and theologically; where ideas can be exchanged and improved upon; and where divisions within the church’s history might be understood and, with grace, overcome. Without an occasional flesh-and-blood Catholic on the faculty, this task is enormously compromised. And herein lies the cause of a new uneasy conscience.
Thomas Albert Howard is Associate Professor of History and the Director of the Jerusalem & Athens Forum at Gordon College. He recently published Protestant Theology and the Making of the Modern German University (Oxford, 2006).
Sarah Hinlicky Wilson
A female apostle? Impossible!
The Bible on your shelf doesn’t actually exist. No exact original of it is to be found in Greek, Syriac, or any other ancient language. It is, instead, the product of hundreds of compiled parchments and papyri, containing big blocks of text or little bits of it, some ancient, some more recent, some ancient but recently discovered. Along the way they got copied into uncials and minuscules, dubbed with names to inspire novels (Codex Sinaiticus; Philoxeniana), and are now signified by sigla in the clearinghouse Nestle-Aland Novum Testamentum Graece.
It would seem to be a straightforward work of science to sort, date, and judge each of the texts, and in many ways it is. There are rules for comparing scraps of majuscules and scraped palimpsests. The scriptural scholars of bygone eras help contemporary ones through their own questions scrawled in the margins of their Bibles. And it’s not too hard to recognize and correct the ever-so-slightly incorrect transcriptions of some sleepy monk in a tomb-cold scriptorium. But the letter is not copied alone; the spirit and the meaning are copied with it. Text critics are inevitably exegetes, and aspiring exegetes must also be text critics. This is Eldon Jay Epp’s basic principle for biblical studies.
Which brings us to Romans 16:7, embedded in the oft-overlooked collection of greetings to various Christian luminaries at Rome. Here Paul hails his “relatives who were in prison” with him, “prominent among the apostles” and “in Christ before” he was. This impressive pair is Andronicus and his coworker. The latter is sometimes called Junia—thus the KJV, every other English translation up till the 1830s, and nowadays the NRSV. The lion’s share of recent English Bibles, though, give the name Junias, with the –s on the end. The RSV specifies Andronicus and Junias as “my kinsmen” and “men of note among the apostles”; the Good News generously adds a footnote after Junias suggesting the name “June”; the NIV—most widely read of all contemporary versions—offers no footnoted alternative to Junias at all. The matter at stake in the choice of names is the simple question asked of everyone upon entry into this world: Is it a boy or a girl?
Until about a hundred years ago, the consensus was universal. Junia was a woman. Every church father, without exception, thought so. Even John Chrysostom, not exactly famous for positive thoughts about the female sex, commented, “How great the wisdom of this woman must have been that she was even deemed worthy of the title of apostle.” However, in a curious twist of fate, the church a millennium and a half later concluded not that her wisdom was so great, but that, if she was indeed worthy of the title of apostle, then she wasn’t a she at all. The very liberal vanguard that exalted the historical-critical study of the Bible found the leadership of a woman unthinkable, and so made Junia into Junias, a man—even though there is not a single record of the name Junias anywhere in ancient Rome.
The switcheroo from female to male was possible, in the first place, because the apostle’s name appears only once, in the accusative form “Junian.” (Exegetes need to be not only text critics, but first-rate grammarians as well.) The suffix –n is found on both masculine and feminine nouns. The one textual clue to help choose between them, in this case, is an accent mark: a circumflex would signal the masculine name, an acute the feminine. The earliest manuscripts carry no accents at all, but when medieval scribes do start inserting accent marks in their fresh copies, they always choose the acute and never the circumflex. The only variant that they display is the name Julia. Even this mistake is telling: Julia is another woman’s name (in fact, the most popular Roman name for women), and probably first appeared when one of those notoriously sleepy scribes skipped ahead to Romans 16:15 and borrowed the name from there. The grammatical and even accidental choices of the medieval copyists reveal the whole interpretive tradition behind them.
And yet the conviction that an apostle must be a man begat great scholarly ingenuity. Sometimes Roman surnames were contracted to shorter forms; an example is Patrobas in Romans 16:14, which is short for Patrobios. Junias, then, was proposed as a contraction of the attested Roman surname Junianus. There isn’t the slightest shred of evidence that this is what happened, yet somewhere along the way the contracted-Junianus theory turned into a sure thing. Epp documents how the idea grew from conjecture to certainty in its own kind of scribal-transmission error. It culminated in the 1927 Nestle Greek New Testament, where the distinctly masculine version of the name, complete with circumflex, was offered as the definitive and undisputed reading—even claiming the oldest unaccented texts in its defense! Only in 1998 did the standard Nestle-Aland and United Bible Society editions replace the masculine with the feminine name. As yet, few English translations reflect the correction.
Is it really possible that plain textual evidence could be so obscured by plain bias? If John Chrysostom, of all people, allowed that Junia could be a woman and an apostle at the same time, could the progressive leaders of the twentieth century be guilty of such blatant prejudice? Epp cites the report of Bruce Metzger’s Textual Commentary to the UBS Greek New Testament (2nd ed.), from as recently as 1994, explaining the dispute about the name:
Some members, considering it unlikely that a woman would be among those styled “apostles,” understood the name to be masculine Junias, thought to be a shortened form of Junianus (see Bauer-Aland, Wörterbuch, pp. 70f.). Others, however, were impressed by the facts that (1) the female Latin name Junia occurs more than 250 times in Greek and Latin inscriptions found in Rome alone, whereas the male name Junias is unattested anywhere, and (2) when Greek manuscripts began to be accented, scribes wrote the feminine Junia.
In other words, the Junia reading had textual evidence on its side; the Junias reading had none; yet until less than a decade ago, the latter still won the day.
Could Paul have called a woman an apostle? He certainly did not use the term lightly. He was compelled to defend his own apostolicity, as the last and untimely born, to the disciples of Jesus, whose friendship with the Lord automatically granted them apostolic status. It can only be the highest of Pauline praise to call Andronicus and Junia prominent among the apostles.
That he was capable of applying this praise to a woman is suggested not only by the textual evidence but by the context of Romans 16 as well. A woman and deacon by the name of Phoebe is entrusted with the letter itself. Seventeen men are greeted along with eight women (omitting Junia), but of the twenty-five, seven of the women are described as contributing the most to the churches, while only five men receive that distinction. Prisca is listed ahead of her husband Aquila, and in two places (vv. 6 and 12) four of the women are said to have “worked very hard,” the same verb Paul uses to describe his own apostolic ministry in 1 Cor. 4:12, Gal. 4:11, and Phil. 2:16.
The early church thought that Junia the woman was an apostle, yet remained indifferent to the implications of her status. The modern church disbelieved the apostolicity of any woman, and so ignored the hard evidence. Between the textual and contextual witnesses, in the interplay of exegesis and grammar, Epp draws the reasonable conclusion that there was indeed a female apostle named Junia. But, he notes, “human beings carry out not only textual criticism and interpretation, but implementation as well, and that makes all the difference.”
Sarah Hinlicky Wilson is a doctoral student in systematic theology at Princeton Theological Seminary.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine. Click here for reprint information on Books & Culture.
- More from Sarah Hinlicky Wilson
Susan Wise Bauer
Why Paul would have flunked hermeneutics.
Sometime this past year, I was reading Sumerian poetry (for work, not for pleasure) when I came across a 4,000-year-old epic describing the Sumerian paradise, a garden city free of evil and sickness where
the raven utters no cry …
the lion kills not,
the wolf snatches not the lamb,
unknown is the kid-devouring wild dog.1
If this doesn’t bring you up short, turn to Isaiah 11, where the prophet tells us that when the Messiah returns, the wolf will live with the lamb, the lion will eat straw like the ox, and that the earth will be full of the knowledge of the Lord as the waters cover the sea. The words in which Isaiah describes the great hope of the believer, the words that inform John’s own vision of the new heavens and earth: those words don’t seem to have originated with—well, with God.
This is the opening dilemma of Peter Enns’ Inspiration and Incarnation: Evangelicals and the Problem of the Old Testament. The uniqueness of the Old Testament as a piece of literature has been seriously dented by the discovery of more and more ancient texts that predate (and anticipate) biblical forms. Creation story, flood story, prophecy, proverb: all of these were in use in Mesopotamia long before the first biblical book was penned.
So how can we claim that the Old Testament—and it alone from all the texts of that pre-Christian age—is divine communication from God to man? It’s an interesting question, but it turns out to be small potatoes compared with the next problem that Enns, professor of Old Testament at Westminster Theological Seminary, sets before us: It seems as though the Old Testament was also puzzling for Matthew and Luke and Paul. In fact, from where we sit, it looks as though the apostles were lousy at exegesis.
Enns gives us a number of startling New Testament passages that use the Old Testament by wrenching the original words violently out of context and even altering them. For example, Matthew 2 tells us with confidence that Jesus’ trip down to Egypt as a boy (and his eventual return to Galilee) fulfilled Hosea 11:1, “Out of Egypt I called my son.” But Hosea 11:1 is simply describing the Exodus; it is a passage, Enns points out, which “is not predictive of Christ’s coming but retrospective of Israel’s disobedience.” In other words, Matthew is shamelessly proof-texting, in a way that would get any student enrolled in Practical Theology 221 (Expository Skills) sternly reproved.
Or consider Paul’s use of Isaiah 59:20 in Romans 11, where he winds up an argument by announcing, “And so all Israel will be saved, as it is written: ‘The deliverer will come from Zion.’ ” But Isaiah says something quite different: “The Redeemer will come to Zion,” he tells us.
Changing the words of Scripture to suit your own purposes? Paul wouldn’t get past the first week of New Testament 123 (Hermeneutics) like that. He is breaking every rule of thoughtful evangelical scholarship, which holds that the proper way to approach inerrant Scripture is with careful grammatical-historical exegesis: painstaking analysis of each word of the Scripture and its relationship to other words, the setting of the sentence in the verse, the verse in the chapter, the chapter in the book, and the book in the historical times of its composition.
Of course Paul breaks those rules, Enns says; they are our rules, not Paul’s. Inspiration and Incarnation offers us passages from such extrabiblical texts as the Wisdom of Solomon and the Book of Biblical Antiquities in order to show that, far from doing something extraordinary and super-apostolic, Paul and Matthew were doing exactly what most of their contemporaries did. Both apostles had been trained by the scholars of their day, the so-called “Second Temple” period, to come to a text looking for the “mystery” beneath the words: the deeper truth that an untrained reader might not see. Both of them came to the Old Testament already convinced that they knew what that mystery was: the incarnation, death, and resurrection of God in Jesus Christ.
Paul knows, by faith, that this truth underlies all of the Old Testament. He knows that it will be in Isaiah; he looks for it in the 59th chapter, and—as we might expect—he finds it. And if he has to change a preposition or two to make this “mystery” clear to the rest of us, he is not violating any sort of interpretive rule. His own principles of exegesis allow him to “read into the prophet’s words,” as Enns puts it, what he “already knew those words were really about.”
This is exactly the kind of exegesis that terrifies most evangelicals. The man who admits that meanings can be “read into” Scripture stands on the fabled slippery slope, right above a sheer drop-off, while below him churns a sea of relativism, upon which floats only a single overloaded lifeboat, captained by a radical feminist gay & lesbian & transgender activist who is very anxious to make the final decision about who gets pitched overboard.
Nevertheless, Enns is willing to plant his feet on the slope and stand there long enough to ask two disturbing questions. The first is this: Are we really saying that the apostles used an interpretive method that was not particularly inspired, and which in the hands of many Second Temple scholars led to enormous distortions of the original texts? And that this “mishandling” of the Old Testament produced, somehow, an inspired and trustworthy New Testament? Enns’ answer to this is an unequivocal yes. “This makes revelation somewhat messy,” he writes, “but … it would seem that God would not have it any other way. For the apostles to interpret the Old Testament in ways consistent with the hermeneutical expectations of the Second Temple world is analogous to Christ himself becoming a first-century Jew.”
In other words, the God who spoke to man through Christ also speaks to man through Scripture, and in much the same way: he enters into our world and uses our own cultural patterns to reveal himself. We cannot insist that there is a separate, ahistorical, all-divine message in any part of the Bible that somehow triumphs over all contemporary thought and custom. This, Enns writes, is a modern version of the ancient Docetic heresy, which held that Christ only seemed human. “What some ancient Christians were saying about Christ,” he writes, “… is similar to the mistake that other Christians have made (and continue to make) about Scripture: it comes from God, and the marks of its humanity are only apparent, to be explained away.”
Which leads Enns to the next disturbing question. If Paul and Matthew use Second Temple techniques to interpret the Old Testament, should we follow their example—beginning with what we know to be true, and taking our interpretation from there?
This question gets a conditional yes: as long as we begin with the same central mystery as Paul and Matthew, the “reality of the crucified and risen Christ, [which is] both the beginning and the end of Christian biblical interpretation.” This reality, not the method which we use to affirm it, should be at the center of our doctrine of inerrancy.
This means, unfortunately, that we cannot cling to the comforting notion that grammatical-historical exegesis is a kind of high road to truth. Like the Second Temple exegesis of Paul and Matthew, it is a method—the method produced by our own time and place. Like the Second Temple exegesis, it can produce both truth and error. “Our own understanding of the Old Testament—and the gospel—has a contextual dimension,” Enns writes. “As subjective as this sounds, it is nevertheless inescapable… . If any of this is troublesome, it may be because we have not adequately grappled with the implications of God himself giving us Scripture in context.”
Well, of course it is going to be troublesome, and Enns, who knows the evangelical community well, is perfectly aware of it. But Inspiration and Incarnation makes clear that Scripture, like the Incarnation itself, is a scandal: like Christ crucified, a stumbling block to the wise. It takes ancient and unreal images, like the lion and the lamb together, and demands that we look back on them with faith in the resurrection of Christ. It claims, against all common sense, that this faith will transform the dead pictures into a living hope. It is loaded with problems and imperfections. And it is the Word of God, which means that we must engage in as much prayer as study of Hebrew vocabulary, as much faith as reading up on the history of the ancient world, as much charity (something remarkably lacking in most of the debates over how to read Scripture) as Greek grammar. It means that when an evangelical scholar like Enns—teaching in an evangelical seminary, a faithful member of his local church—writes, “There do not seem to be any clear rules or guidelines to prevent us from taking [the process of interpreting Scripture] too far,” we must recognize this as an honest and truthful statement of the difficulties rather than an open door to chaos. It means, in the end, that we must take incarnation seriously.
Do we know what we are saying when we stand in an American church on a Sunday morning in 2006 and recite, “He was conceived of the Holy Spirit, born of the virgin Mary, suffered under Pontius Pilate, was crucified, dead, and buried”? This polished, grammatical, creedal acknowledgment, transmitted to us via centuries of church tradition, of liturgy and Advent custom and carols, of Bible-school illustration and triumphant hymnody, has scrubbed up and made deceptively commonplace the essential weirdness of God becoming man.
I believe in the Incarnation, but then on the other hand I have never had to stand face-to-face with a grimy, troublemaking, blue-collar worker who claims to be God.
I do have to stand face-to-face with the Old Testament and its excessive, contradictory, harsh, alien texts. Enns encourages us to recognize the Old Testament for what it is: the anteroom of the Incarnation, the practice ground where we are brought nose-to-nose with the true difficulty of believing that God ever came to earth.
Susan Wise Bauer is writing a history of the world for W.W. Norton.
1. Trans. Samuel Kramer in History Begins at Sumer: Thirty-Nine “Firsts” in Recorded History, 3rd ed. (Univ. of Pennsylvania Press, 1981), p. 144.
- More from Susan Wise Bauer
John Wilson
A few days ago, Wendy and I were in upstate New York for a literary festival at Houghton College. I shared the program with the poet Julia Kasdorf, author of two excellent collections from University of Pittsburgh Press, Sleeping Preacher and Eve’s Striptease; Tim Stafford, who spoke both about journalism and about historical fiction and read from his first-rate novel about the abolitionist movement, Stamp of Glory; Justin Niati, an African journalist who was forced to flee the Congo more than a decade ago after he exposed corruption in his native land and who currently is an assistant professor of French at Houghton; and a number of student writers.
At one session I spoke to students about “The Role of the Journal in Culture.” That may sound rather grandiose, but it’s a subject that anyone who edits a publication resembling BOOKS & CULTURE needs to keep in the back of his mind. And now and then, something occurs to move it up to the front burner.
In its issue of April 3, The New Republic featured as its cover story an essay-review by Damon Linker, “Without a Doubt: The Christianizing of America,” ostensibly occasioned by Richard John Neuhaus’ Catholic Matters: Confusion, Controversy, and the Splendor of Truth (Basic Books). Linker was until recently an editor at First Things, where Neuhaus is famously editor-in-chief. The essay, posted on TNR’s website on March 24, is very long—16 pages in the printer-friendly version I read—and very strange.
Some of its constituent parts, to be sure, are all too familiar. Linker’s fevered warnings against the “offense that Neuhaus’ political theology gives to American pluralism and civility” are of a piece with Kevin Phillips’ American Theocracy: The Peril and Politics of Radical Religion, Oil, and Borrowed Money in the 21st Century (Viking) and dozens of other exercises in apocalyptic huffing and puffing. Linker tells us again and again how “radical” Neuhaus’ program is, but he never gets around to saying what terrible things will happen when theocracy is established. The excommunication of Garry Wills?
Then there are the staples of anti- Catholic propaganda—above all the notion that Catholic faith entails a blind submission to the authority of the Church, a “comprehensive and hermetically sealed religious ideology that will definitively insulate [the believer] from doubt.” Evidently Linker has not read, for example, the Introduction to Christianity written by Joseph Cardinal Ratzinger in the late 1960s, in which Ratzinger observes that
both the believer and the unbeliever share, each in his own way, doubt and belief, if they do not hide from themselves and from the truth of their being. Neither can quite escape either doubt or belief; for the one, faith is present against doubt; for the other, through doubt and in the form of doubt. It is the basic pattern of man’s destiny only to be allowed to find the finality of his existence in this unceasing rivalry between doubt and belief, temptation and certainty.
And certainly we’ve heard, ad nauseam, about the need for “traditionalist believers” to “adapt to modernity by embracing at least some degree of liberalization,” though unlike some more forthright players in this conversation—John Shelby Spong comes to mind—Linker never makes clear precisely what adaptations are required.
What makes the essay odd, even a little creepy, is its personal dimension. Here is a piece by a man who was until very recently in the bosom of First Things, portraying Richard John Neuhaus as the Cardinal Richelieu of the Religious Right, a megalomaniac whose machinations imperil American society with “the threat of sacralized revolutionary violence.” (Students of rhetoric may want to ponder the way in which Linker suggests that Neuhaus suffers from pathological delusions about his own world-historical significance even as the essay is casting him as a Catholic version of Osama bin Laden.)
I happen to agree with much of Linker’s critique of the November 1996 First Things symposium, “The End of Democracy?” But Linker’s essay is as intemperate as anything in that symposium. I’ve never met Linker, nor have I talked with anyone at the magazine about his article, but it seems to be driven by a personal animus that has little to do with the issues at stake.
If the primary theme of Linker’s essay is “The Catholicizing of America” (not, as the subtitle has it, “The Christianizing of America”), evangelicals play a part in his narrative as well. You may recall a TNR piece by Franklin Foer, apropos the Alito nomination, published shortly before Foer was named as TNR’s new editor (“Brain Trust,” November 14, 2005). “In 1994,” Foer begins, “the eminent evangelical historian Mark Noll wrote a scorching polemic about his own religion called The Scandal of the Evangelical Mind.” Pause there for a moment to note Foer’s utter cluelessness. Noll’s “religion”— my religion, Richard John Neuhaus’ religion—is Christianity. Evangelicalism is not a religion, and no one with more than a journalist’s crash-course briefing on the subject would think otherwise. Remind me to write a piece sometime about the scandal of Franklin Foer’s mind, taking Foer as representative of the very bright people who routinely pontificate these days about evangelicals and religion in America. The scandal is that such intelligent, generally well-educated folk are as ignorant of the subject as they are confident in their pronouncements.
It was in that piece that Foer laid out the relationship between Catholics and evangelicals for the benefit of TNR’s readers, describing what he called “the reality of social conservatism: Evangelicals supply the political energy, Catholics the intellectual heft.” And again: in the culture wars, “evangelicals didn’t just need Catholic bodies; they needed Catholic minds to supply them with rhetoric that relied more heavily on morality than biblical quotation.”
Now here is Linker on the same topic:
Countless press reports in recent years have noted that much of the religious right’s political strength derives from the exertions of millions of anti-liberal evangelical Protestants. Much less widely understood is the more fundamental role of a small group of staunchly conservative Catholic intellectuals in providing traditionalist Christians of any and every denomination with a comprehensive ideology to justify their political ambitions. In the political economy of the religious right, Protestants supply the bulk of the bodies, but it is Catholics who supply the ideas.
And later in the essay we hear of “Neuhaus’ first tentative attempt”—in The Naked Public Square—
to solve the problem of the evangelicals by developing an alternative way for them to talk about religion in public. Instead of referring to their personal religious experiences, they would adopt a nondenominational “public language of moral purpose,” as well as learn to make more sophisticated, intellectually respectable arguments about American society and history, democracy and justice, culture and the law.
The problem of the evangelicals! Is that really how They talk about Us? There’s too much confusion here, as Bob Dylan said; it’s hard to know where to begin. In general, the figures most readily identified with the Religious Right—Jerry Falwell, Pat Robertson, James Dobson, Tim LaHaye, et al.— have been negligibly influenced by Catholic thought. Among evangelical intellectuals, Catholicism is much more influential than it was a generation ago, but it is only one stream among many shaping public discourse among evangelical élites, and certainly not on a par with the Reformed tradition represented by thinkers such as Nicholas Wolterstorff, Richard Mouw, and many others. Hard as it may be for Foer and Linker to grasp, evangelicals are not entirely dependent on crumbs from the Catholic table.
And so on. For a corrective to Foer and Linker on this score, a good place to start is A Public Faith: Evangelicals and Civic Engagement, edited by Michael Cromartie (Rowman & Littlefield/Ethics and Public Policy Center).
Who knows what really goes on in the offices of First Things? It was my good friend Jody Bottum, after all (who left The Weekly Standard to become editor of First Things a year or so ago), who introduced me to “The Inquisitor,” a lay Catholic hitman featured in a series of novels in the 1970s written by Martin Cruz Smith under the pen name Simon Quinn. If I were Damon Linker, I’d watch my back. Perhaps in those labyrinthine chambers, where bottles of single-malt Scotch are no doubt more numerous than Bibles, Father Richard John Neuhaus mocks his intellectually challenged evangelical allies while planning the coup d’état that will turn the United States into a Catholic theocracy once and for all. Politics—and religion too—makes strange bedfellows.
- More from John Wilson
Mark Noll
Books like Rough Crossing raise a large, important, and tragically enduring question.1 It is whether the United States’ historical profession to be a land of liberty should be taken seriously. Ask generations of willing immigrants—ask adherents of countless religious minorities persecuted for one reason or another in their homelands—ask entrepreneurs beyond counting, the numberless founders of voluntary faith-based organizations or the great legion of American purveyors of print—and the answer must, on balance, be a conclusive yes. But concentrate on the history of African Americans, and the answer is not nearly so obvious. Is their experience an exception that proves the rule, or is it a deadly fly poisoning a hypocritical ointment?
Rough Crossings: Britain, the Slaves and the American Revolution
Simon Schama (Author)
Ecco
496 pages
$5.84
Simon Schama’s Rough Crossings joins a gathering stream of publications that will doubtless grow larger in the approach to March 25, 2007, which marks the 200th anniversary of the final enactment of the British Parliament’s “Bill for the Abolition of the Slave Trade.”2 For a comparison germane to Schama’s purposes, it is pertinent to note that the United States Congress also enacted a ban on slave trading in 1807. Yet the effect of the American action was only marginal. For decades after 1807, the British naval squadrons that were dispatched to the west coast of Africa to enforce Parliament’s action regularly interdicted slave ships bound with their human cargo to North America.
This book is presented as the story of American slaves who during the American Revolution escaped from “the Sons of Liberty” in order to find refuge, manumission, and at least some support from the British “tyrants.” More than the subtitle might suggest, however, it is also about the zealously persistent British reformers who championed abolition and expended great energy in pursuing that goal on three continents. In its first half the key figure is Granville Sharp (1735–1813), a Quaker-influenced evangelical whose lifelong anti-slavery advocacy began in 1765 when on the stoop of his physician-brother’s surgery in London he encountered Jonathan Strong, a West Indian slave brutally beaten by his master to within an inch of his life. In the book’s second half Schama’s lens is trained on John Clarkson (1764–1828), a young naval lieutenant who in 1791–1792 persuaded over 1,000 Loyal blacks, who were living in Nova Scotia and New Brunswick after being rescued from their bondage in the United States, to emigrate to the west coast of Africa, where a British philanthropic company hoped to create a new life for liberated slaves in Sierra Leone. Clarkson, like Sharp, was an earnest evangelical. For him, as for Sharp, there exists an abundance of published and archival source material, which Schama exploits very well.
Although the quantity of surviving sources makes it easier for Schama to depict the white philanthropists who promoted antislavery—as well as their white opponents—he nonetheless delivers a great deal of compelling reading on the freed slaves for whom the philanthropists mobilized. Their stories make up the most memorable parts of the book. Some of these accounts document levels of perseverance and integrity beside which the martyrologies and heroic deeds of the American search for “life, liberty, and the pursuit of happiness” pale by comparison.
There was, for example, Shadrack Furman of Accomack County, Virginia, who in 1781 went to work for the British after his house was destroyed by Continental forces. After supplying information to General Cornwallis, Furman was captured by Americans, given 500 lashes, beaten about the head so that he lost most of his sight, had his leg chopped almost off with an ax, and was left to die in a field. But Furman recovered and went on to serve aboard a British ship and as an agent on land in the last days of the war. As an indication that emancipation by the British was never an unmitigated blessing, it took Furman several years of scrounging in London to convince a Court of Loyalist Claims that he was, in his mutilated body, who he said he was, and to be awarded a pension of £8 per year (or about the annual wage of a young servant).
One of the individuals whose biography encompasses all of the rough crossings of Schama’s title was Mary Perth, who enters the story as a 36-year-old slave in Norfolk County, Virginia. Shortly after Lord Dunmore, the last royal governor of the colony, issued a declaration on November 7, 1775 that promised emancipation to all slaves who escaped to British lines, Perth and the other slaves on her plantation took up the governor’s offer. After harrowing transits and many narrow escapes, she was eventually taken with other Loyal blacks to Shelbourne in Nova Scotia, where she was promised civil rights and a parcel of land.
But delivery on these promises proved spotty, and Perth was one of the blacks who responded when, in the fall of 1791, at an African Methodist chapel in Birchtown, Nova Scotia, John Clarkson delivered a moving appeal for migration to Sierra Leone. After surviving a storm-ravaged winter voyage to West Africa, she set up as a shopkeeper in Freetown, the new colony’s main settlement, where, in 1793, she voted in an election for neighborhood committeemen and so became one of the first women anywhere in the world to exercise an electoral franchise. In 1794 her shop was looted when a French naval squadron occupied Sierra Leone during the early days of the great international war between France and Britain. In the aftermath of this occupation, Perth exerted herself to extend hospitality to Zachary Macaulay, the governor of the colony, who had replaced John Clarkson and his benevolent rule with a much harsher regime. When the French ships began their cannonade, she led children from one of the colony’s schools through the rainforest to a neighboring village of Temne tribespeople. Macaulay and Freetown’s white residents were much less at home outside the colony, but when he too arrived in the village, Mary Perth made him tea and insisted that he spend the night in one of the settlement’s few beds.
David George’s story is even more notable. He was a slave near Savannah, Georgia, when shortly before the American Revolution he experienced a dramatic conversion. While still enslaved, George helped found perhaps the first black Baptist church in what became the United States. When the war began he escaped for his life and liberty to the British and was eventually taken with other black Loyalists to Nova Scotia. There he persevered through intense local persecution by whites resentful of the crumbs that were being distributed by British colonial officials to a few of George’s fellow blacks, and he organized probably the first black Baptist church in what would become Canada. He too responded to the Sierra Leone appeal, and eventually became a kind of right-hand man for John Clarkson in shepherding the migrants during their voyage and their first months in the new colony, where he established what was probably the first black Baptist church in Africa.
When Clarkson was recalled to London in early 1793, he took George with him. Schama’s verve as a writer, which sparkles throughout the book, catches the poignancy of that journey: George, “who had in his time lived with slaves, Indians and British soldiers, and who had tramped, run and slogged through swamps and creeks, snowdrifts and river ice, was now to be faced with the high hats, white bonnets and rosy cheeks of Home Counties Baptists.”
In England, George was befriended by John Rippon, one of the era’s great Baptist leaders. Under his patronage, George wrote up an account of his life for a Baptist periodical that Rippon edited, which means that more is known about George than almost any of his fellows. Sadly, when they were both in England, George fell out with Clarkson when the latter was dismissed by the directors of the Sierra Leone Company for arguing too abrasively for the rights of the Nova Scotia blacks. Schama’s tone toward George cools after this point in the story, which is regrettable since in episodes that Schama only sketches, George would later argue fiercely with Governor Macaulay on behalf of the blacks’ political rights, and also for what had become their distinctive Christian beliefs and practices.3
One final story suggests the ironies at work in an age where, as in our own, the promotion of “freedom” was as haphazard as it was ardent. Henry Washington, a slave whose patronymic came from his owner, had also come over to the British after Lord Dunmore’s declaration. (In Schama’s colorful phrase, he “deserted General George for King George.”) This Washington was part of the Nova Scotia band recruited by Clarkson, and he became one of the leaders among the Sierra Leone settlers. But when Governor Macaulay tried to crack down on blacks who insisted that the Sierra Leone Company keep the promises Clarkson had made in Nova Scotia, Washington took part in a mini-revolt against the British. After that revolt was put down, this former slave left Sierra Leone to seek his livelihood in the African wilds, and from that point he is lost to recorded history.
Schama, the author of big and lively books on Dutch civilization, Rembrandt, the French Revolution, and (for the BBC) the whole of British history, is a terrific writer. Here he is, for example, on Halifax, Nova Scotia, when the liberated American slaves arrived:
There was the North British Club, where the Scots could rub their chins, exchange gloomy intelligence about the shocking state of trade and shake their heads at the follies of the world. There was the Salt Fish Club, where the Anglo-Irish could speak their piece about the Scots and pass the decanter. There were prayers and blusters, wagers and seductions. It was like most other eighteenth-century commercial towns in the British Atlantic empire: greedy, gossipy, and parochial, with eyes much bigger than its stomach.
And yet for all such stylistic skill, and despite the moral depths this story plumbs, Rough Crossings suffers from one nearly fatal weakness. That weakness is Schama’s inability to fathom the religion that drove philanthropists like Sharp and Clarkson and that was, if anything, even more vital to many of the freed slaves at the heart of his story. On the evangelical enthusiasms displayed in different forms by the whites and the blacks, Schama’s tone throughout is wearily patronizing. For instance, Jonas Hanway, an energetic Londoner who supported countless philanthropies, is described as representing “a certain kind of busy, charitable Englishman.” Schama mentions one of Hanway’s tracts—with a very long title that begins Advice to a Farmer’s Daughter …—and then says he wrote “many more that all said more or less the same thing: abhor vice, pray, get up early.”
And here is Schama’s concluding paragraph to an exquisitely narrated account of mob actions by white Nova Scotians against the black Loyalists in the summer of 1784:
Among those who had lost his house in the riot was the Baptist pastor David George, described by a local merchant, Simeon Perkins, as “Very Loud,” and who had persisted in preaching to his flock in the Shelbourne meeting house, even while the mob surrounded it with flaming torches, threatening to burn it to the ground. But then David George was not one to abandon his faith, for while the Lord was with him, he feared no evil.
Most egregiously, the emotion-packed meetings that the twice-migrated blacks held in Sierra Leone’s five or six Methodist and Baptist chapels—each and every night!—elicit from Schama no commentary, except that the noise could sometimes be heard by the whites on board ship in the harbor or in their own lodgings in the town. Such treatments are invariably clever, and they do not exactly dismiss the motives that drove the book’s key actors. But neither do they give those motives their due.
And so we are left with a hole in the middle of an engaging, important, and very readable book. Which is a real shame, since if there was (and is) any possibility of saving the “language of liberty” from the cant and hypocrisy it has so often endured in modern Western history, it must surely come from the depth of religion that inspired the lives of Granville Sharp, John Clarkson, Mary Perth, and David George.
Mark Noll is McManis Professor of Christian Thought at Wheaton College and the author most recently of The Civil War as a Theological Crisis, just published by the University of North Carolina Press. In July he will assume a new position in the History Department at the University of Notre Dame.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
1. Other noteworthy recent volumes that raise the same question include James B. Bennett, Religion and the Rise of Jim Crow in New Orleans (Princeton Univ. Press, 2005); Edward J. Blum, Reforging the White Republic: Race, Religion, and American Nationalism, 1865-1898 (LSU Press, 2005); and Michael O. Emerson and Christian Smith, Divided by Faith: Evangelical Religion and the Problem of Race in America (Oxford Univ. Press, 2000).
2. Two recent books that, like Schama’s volume, effectively popularize material that has been much discussed in professional historical work include Adam Hochschild, Bury the Chains: Prophets and Rebels in the Fight to Free an Empire’s Slaves (Houghton Mifflin, 2005); and Steven M. Wise, Though the Heavens May Fall: The Landmark Trial That Led to the End of Human Slavery (Da Capo, 2005), which treats the 1772 ruling by British Chief Justice Mansfield that a West Indian slave, James Somerset, could not be enslaved on English soil.
3. A valuable book that Schama misses, which documents George’s conflicts with Macaulay, is Grant Gordon, From Slavery to Freedom: The Life of David George, Pioneer Black Baptist Minister (Baptist Heritage in Atlantic Canada, 1992).