Do not be proud of the fact that your grandmother was shocked at something which you are accustomed to seeing or hearing without being shocked. It may be that your grandmother was an extremely lively and vital animal, and that you are a paralytic.
—G. K. Chesterton, As I Was Saying

How but in custom and in ceremony Are innocence and beauty born? Ceremony’s a name for the rich horn, And custom the spreading laurel tree.
—W. B. Yeats, “A Prayer For My Daughter”

“Seven and a half hours of mild, unexhausting labour, and then the soma ration and games and unrestricted copulation and the feelies. What more can they ask for?”
—Mustapha Mond in Huxley’s Brave New World

I remember the first time I noticed the legend “cultural instructions” on the brochure that accompanied some seedlings. “How quaint,” I thought, as I perused the advisory: this much water and that much sun, certain tips about fertilizer, soil, and drainage. Planting one sort of flower nearby keeps the bugs away but proximity to another sort makes bad things happen. Young shoots might need stakes, and watch out for beetles, weeds, and unseasonable frosts …

The more I pondered it, the less quaint, the more profound, those cultural instructions seemed. I suppose I had once known that the word “culture” comes from the capacious Latin verb “colo,” which means everything from “live, dwell, inhabit,” to “observe a religious rite” (whence our word “cult”), “care, tend, nurture,” and “promote the growth or advancement of.” I never thought much about it.

I should have. There is a lot of wisdom in etymology. The noun “cultura” (which derives from colo) means first of all “the tilling or cultivation of land” and “the care or cultivation of plants.” But it, too, has ambitious tentacles: “the observance of a religious rite,” “well groomed” (of hair), “smart” (of someone’s appearance), “chic, polished, sophisticated” (of a literary or intellectual style).

It was Cicero, in a famous passage of the Tusculan Disputations, who gave currency to the metaphor of culture as a specifically intellectual pursuit. “Just as a field, however good the ground, cannot be productive without cultivation, so the soul cannot be productive without education.” Philosophy, he said, is a sort of “cultura animi,” a cultivation of the mind or spirit: “it pulls out vices by the roots,” “makes souls fit for the reception of seed,” and sows in order to bring forth “the richest fruit.” But even the best care, he warned, does not inevitably bring good results: the influence of education, of cultura animi, “cannot be the same for all: its effect is great when it has secured a hold upon a character suited to it.” The results of cultivation depend not only on the quality of the care but the inherent nature of the thing being cultivated. How much of what Cicero said do we still understand?


In current parlance, “culture” (in addition to its use as a biological term) has both a descriptive and an evaluative meaning. In its anthropological sense, “culture” is neutral. It describes the habits and customs of a particular population: what its members do, not what they should do. Its task is to inventory, to docket, not to judge.

But we also speak of “high culture,” meaning not just social practices but a world of artistic, intellectual, and moral endeavor in which the notion of hierarchy, of a rank-ordering of accomplishment, is integral. (More etymology: “hierarchy” derives from words meaning “sacred order.” Egalitarians are opposed to hierarchies in principle; what does that tell us about egalitarianism?) Culture in the evaluative sense does not merely admit, it requires judgment as a kind of coefficient or auxiliary: comparison, discrimination, evaluation are its lifeblood. “We never really get near a book,” Henry James remarked in an essay on American letters, “save on the question of its being good or bad, of its really treating, that is, or not treating, its subject.” It was for the sake of culture in this sense that Matthew Arnold extolled criticism as “the disinterested endeavour to learn and propagate the best that is known and thought in the world.”

It is of course culture in the Arnoldian sense that we have primarily in view when we speak of “the survival of culture.” And it is the fate of culture in this sense that I will be chiefly concerned with in this essay. But it would be foolish to draw too firm a distinction between the realms of culture. There is much confluence and interchange between them. Ultimately, they exist symbiotically, nurturing, supplementing, contending with each other. The manners, habits, rituals, institutions, and patterns of behavior that define culture for the anthropologist provide the sediment, the ground out of which culture in the Arnoldian sense takes root—or fails to take root. Failure or degradation in one area instigates failure or degradation in the other. (Some people regard the astonishing collapse of manners and civility in our society as a superficial event. They are wrong. The fate of decorum expresses the fate of a culture’s dignity, its attitude toward its animating values.)

The problem with metaphors is not that they are false but that they do not tell the whole truth. The organic image of culture we have inherited from Cicero is illuminating. Among other things, it reminds us that we do not exist as self-sufficient atoms but have our place in a continuum that stretches before and after us in time. Like other metaphors, however, it can be elevated into an absurdity if it is pushed too far. Oswald Spengler’s sprawling, two-volume lament, The Decline of the West, is a good illustration of what happens when genius is captivated by a metaphor. Spengler’s book, published in the immediate aftermath of World War I, epitomized the end-of-everything mood of the times and was hailed as the brilliant key to understanding—well, just about everything. And Spengler really is brilliant. For example, his remarks about how the triumph of scepticism breeds a “second religiousness” in which “men dispense with proof, desire only to believe and not to dissect,” have great pertinence to an age, like ours, that is awash in new-age spiritual counterfeits. Nevertheless, Spengler’s deterministic allegiance to the analogy between civilizations and organisms ultimately infuses his discussion with an air of unreality. One is reminded, reading Spengler, of T. S. Eliot’s definition of a heretic: “a person who seizes upon a truth and pushes it to the point at which it becomes a falsehood.”

That said, for anyone who is concerned about the survival of culture, there are some important lessons in the armory of cultural instructions accompanying a humble tomato plant. Perhaps the chief lesson has to do with time and continuity, the evolving permanence that cultura animi no less than agricultural cultivation requires if it is to be successful. All those tips, habits, prohibitions, and necessities that have been accumulated from time out of mind and passed down, generation after generation: How much in our society militates against such antidotes to anarchy and decay!

Culture survives and develops under the aegis of permanence. And yet instantaneity—the enemy of permanence—is one of the chief imperatives of our time. It renders anything lasting, anything inherited, suspicious by definition. As Kenneth Minogue observed earlier in this volume, “The idea is that one not only lives for the present, but also ought to live thus.” We want what is faster, newer, less encumbered by the past. If we also cultivate a nostalgia for a simpler, slower time, that just shows the extent to which we are separated from what, in our efforts to decorate our lives, we long for. Nostalgia (Greek for “homesickness”) is a version of sentimentality—a predilection, that is to say, to distort rather than acknowledge reality.

The political philosopher Hannah Arendt dilated on one essential aspect of the problem when she argued against taking an instrumental, “self-help” approach to culture. “What is at stake here,” Arendt observed in “The Crisis in Culture,”

is the objective status of the cultural world, which, insofar as it contains tangible things—books and paintings, statues, buildings, and music—comprehends, and gives testimony to, the entire recorded past of countries, nations, and ultimately mankind. As such, the only nonsocial and authentic criterion for judging these specifically cultural things is their relative permanence and even eventual immortality. Only what will last through the centuries can ultimately claim to be a cultural object. The point of the matter is that, as soon as the immortal works of the past became the object of social and individual refinement and the status accorded to it, they lost their most important and elemental quality, which is to grasp and move the reader or the spectator over the centuries.

The “objective status” of the cultural world to which Arendt appeals is precisely the aspect of culture we find hardest to accommodate. If there is a “nonsocial and authentic criterion” for judging cultural achievements, then what happens to the ideology of equivalence that has become such a powerful force in Western societies? Are we not committed to the proposition that all values are social? Isn’t this part of what social constructionists like Richard Rorty mean when they say language goes “all the way down”? That there is no self, no value, no achievement, no criteria independent of the twitterings of fashion, which in turn are ultimately the twitterings of social power?


The attack on permanence comes in many guises. When trendy literary critics declare that “there is no such thing as intrinsic meaning,” they are denying permanent values that transcend the prerogatives of their lucubrations. When a deconstructionist tells us that truth is relative to language, or to power, or to certain social arrangements, he seeks to trump the unanswerable claims of permanent realities with the vacillations of his ingenuity. When the multiculturalist celebrates the fundamental equality of all cultures—excepting, of course, the culture of the West, which he reflexively disparages—he substitutes ephemeral political passions for the recognition of objective cultural achievement. “A pair of boots,” a nineteenth-century Russian slogan tells us, “is worth more than Shakespeare.” We have here a process of leveling that turns out to be a revolution in values. The implication, as the French philosopher Alain Finkielkraut observed, is that

the footballer and the choreographer, the painter and the couturier, the writer and the ad-man, the musician and the rock-and-roller, are all the same: creators. We must scrap the prejudice which restricts that title to certain people and regards others as sub-cultural.

But what seems at first to be an effort to establish cultural parity turns out to be a campaign for cultural reversal. When Sir Elton John is put on the same level as Bach, the effect is not cultural equality but cultural insurrection. (If it seems farfetched to compare Elton John and Bach, recall the literary critic Richard Poirier’s remark, in Partisan Review in 1967, that “sometimes [the Beatles] are like Monteverdi and sometimes their songs are even better than Schumann’s.”) It might also be worth asking what had to happen in English society for there to be such a thing as “Sir Elton John.” What does that tell us about the survival of culture? But some subjects are too painful. Let us draw a veil …

“The history of philosophy,” Jean-François Revel observed in The Flight from Truth (1991), “can be divided into two different periods. During the first, philosophers sought the truth; during the second, they fought against it.” That fight has escaped from the parlors of professional sceptics and has increasingly become the moral coin of the realm. As Anthony Daniels observed in his essay for this volume, it is now routine for academics and intellectuals to use “all the instruments of an exaggerated scepticism … not to find truth but to destroy traditions, customs, institutions, and confidence in the worth of civilization itself.” The most basic suppositions and distinctions suddenly crumble, like the acidic pages of a poorly made book, eaten away from within. “À rebours” becomes the rallying cry of the anti-cultural cultural elite. Culture degenerates from being a cultura animi to a corruptio animi.

Aldous Huxley’s Brave New World may be a second-rate novel—its characters wooden, its narrative overly didactic—but it has turned out to have been first-rate prognostication. Published in 1932, it touches everywhere on twenty-first-century anxieties. Perhaps the aspect of Huxley’s dystopian—what to call it: fable? prophecy? admonition?—that is most frequently adduced is its vision of a society that has perfected what we have come to call genetic engineering. Among other things, it is a world in which reproduction has been entirely handed over to the experts. The word “parents” no longer describes a loving moral commitment but only an attenuated biological datum. Babies are not born but designed according to exacting specifications and “decanted” at sanitary depots like the Central London Hatchery and Conditioning Centre with which the book opens.

As with all efforts to picture future technology, Huxley’s description of the equipment and procedures employed at the hatchery seems almost charmingly antiquated, like a space ship imagined by Jules Verne. But Huxley’s portrait of the human toll of human ingenuity is very up-to-date. Indeed, we have not—not quite, not yet—caught up with the situation he describes. We do not—not quite, not yet—inhabit a world where “mother” and “monogamy” are blasphemous terms from which people have been conditioned to recoil in visceral revulsion. Maybe it will never come to that. (Though monogamy, of course, has long been high on the social and sexual revolutionary’s list of hated institutions.) Still, it is a nice question whether developments in reproductive technology will not soon make other aspects of Huxley’s fantasy a reality. Thinkers as different as Michel Foucault and Francis Fukuyama have pondered the advent of a “posthuman” future, eagerly or with dismay, as the case may be. Scientists busily manipulating DNA may give substance to their speculations. It is often suggested that what is most disturbing about Brave New World is its portrait of eugenics in action: its vision of humanity deliberately divided into genetically ordered castes, from a few super-smart alpha-pluses down through a multitude of drone-like Epsilons who do the heavy lifting. Such deliberately instituted inequality offends our democratic sensibilities.


What is sometimes overlooked or downplayed is the possibility that the most disturbing aspect of the future Huxley pictured has less to do with eugenics than genetics. That is to say, perhaps what is centrally repellent about Huxley’s hatcheries is not that they codify inequality—nature already does that effectively—but that they exist at all. Are they not a textbook example of Promethean hubris in action? It is worth stepping back to ponder that possibility.

In the seventeenth century, Descartes predicted that his scientific method would make man “the master and possessor of nature”: are we not fast closing in on the technology that proves him right? And this raises another question. Is there a point at which scientific development can no longer be described, humanly, as progress? We know the benisons of technology. Consider only electricity, the automobile, modern medicine. They have transformed the world and underscored the old observation that art, that techne, is man’s nature. Nevertheless, the question remains whether, after two hundred years of breathtaking progress, we are about to become more closely acquainted with the depredations of technology. It would take a brave man, or a rash one, to venture a confident prediction either way. For example, if, as in Brave New World, we manage to bypass the “inconvenience” of human pregnancy altogether, should we do it? If—or rather when—that is possible, will it also be desirable? Well, why not? Why should a woman go through the discomfort and danger of pregnancy if a fetus could be safely incubated, or cloned, elsewhere? Wouldn’t motherhood by proxy be a good thing—the ultimate labor-saving device? Most readers will hesitate about saying yes. What does that tell us? Some readers will have no hesitation about saying yes; what does that tell us?1

As Huxley saw, a world in which reproduction was “rationalized” and emancipated from love was also a world in which culture in the Arnoldian sense was not only otiose but dangerous. This is also a sub-theme of that other great dystopian novel, George Orwell’s 1984, which ends with the work of “various writers, such as Shakespeare, Milton, Swift, Byron, Dickens,” being vandalized by being translated into Newspeak. When that laborious propaganda effort is finally complete, the “original writings, with all else that survived of the literature of the past, would be destroyed.” The point is that culture has roots. It limns the future through its implications with the past. Moving the reader or spectator over the centuries, in Arendt’s phrase, the monuments of culture transcend the local imperatives of the present. They escape the obsolescence that fashion demands, the predictability that planning requires. They speak of love and hatred, honor and shame, beauty and courage and cowardice—permanent realities of the human situation insofar as it remains human.

The denizens of Huxley’s brave new world are designed and educated—perhaps his word, “conditioned,” is more accurate—to be rootless, without culture. When a relic of the old order of civilization—a savage who had been born, not decanted—is brought from a reservation into the brave new world, he is surprised to discover that the literary past is forbidden to most of the population.

“But why is it prohibited?” asked the Savage. In the excitement of meeting a man who had read Shakespeare he had momentarily forgotten everything else.

The Controller shrugged his shoulders. “Because it’s old; that’s the chief reason. We haven’t any use for old things here.”

“Even when they’re beautiful?”

“Particularly when they’re beautiful. Beauty’s attractive, and we don’t want people to be attracted by old things. We want them to like the new ones.”

Huxley’s brave new world is above all a superficial world. People are encouraged to like what is new, to live in the moment, because that makes them less complicated and more pliable. Emotional commitments are even more strictly rationed than Shakespeare. (The same, again, is true of 1984.) In the place of emotional commitments, sensations—thrilling, mind-numbing sensations—are available on demand through drugs and motion pictures that neurologically stimulate viewers to experience certain emotions and feelings. The fact that they are artificially produced is not a drawback but their very point. Which is to say that the brave new world is a virtual world: experience is increasingly vivid but decreasingly real. The question of meaning is deliberately short-circuited. “You’ve got to choose,” the Resident World Controller for Western Europe patiently explains to the Savage,

“between happiness and what people used to call high art. We’ve sacrificed the high art. We have the feelies and the scent organ instead.”

“But they don’t mean anything.”

“They mean themselves; they mean a lot of agreeable sensations to the audience.”

If this seems like a prescription for arrested development, that, too, is part of the point: “It is their duty to be infantile,” the Controller explains, “even against their inclination.” Promiscuity is encouraged because it is a prophylactic against emotional depth. The question of meaning is never pursued beyond the instrumental question of what produces the most pleasure. Socrates told us that the unexamined life is not worth living. Huxley (yet again like Orwell) pictures a world in which the unexamined life is the only one available.

Huxley’s imagination failed him in one area. He understood that in a world in which reproduction was emancipated from the body, sexual congress would degenerate into a purely recreational activity, an amusement not inherently different from one’s soma ration or the tactile movies. He pictured a world of casual, indeed mandatory, promiscuity. But he thought it would develop along completely conventional lines. He ought to have known that the quest for “agreeable sensations” would issue in a pansexual carnival. In this area, anyway, we seem to have proceeded a good deal further than the characters who inhabit Huxley’s dystopia.

In part, the attack on permanence is an attack on the idea that anything possesses inherent value. Absolute fungibility—the substitution of anything for anything—is the ideal. In one sense, this is a product of what the philosopher Michael Oakeshott criticized as “rationalism.” “To the Rationalist,” Oakeshott wrote in the late 1940s, “nothing is of value merely because it exists (and certainly not because it has existed for many generations), familiarity has no worth and nothing is to be left standing for want of scrutiny.” The realm of sexuality is one area where the effects of such rationalism are dramatically evident. It was not so long ago that the description from Genesis—“male and female created he them”—was taken as a basic existential fact. True, the obstinacy of sexual difference has always been a thorn in the side of utopian rationalism. But it is only in recent decades that the engines of judicial meddlesomeness, on the one hand, and surgical know-how, on the other, have effectively assaulted that once permanent-seeming reality.


For an illustration of how sexual politics has been enlisted in the attack on permanence, consider the recently acquired habit of using the term “gender” when we mean “sex.” This may seem an innocent, nearly a euphemistic, innovation. But it is not innocent. It issues not from any residual sense of modesty about sexual matters but from a hubristic effort to reduce sex to gender. The term “gender” has its home in grammar: it names a certain linguistic convention. Sex describes a basic biological division. As the columnist George Will noted recently, the substitution of “gender” for “sex” is so widespread because it suggests that sexual differences are themselves a matter of convention—“socially constructed” and therefore susceptible to social deconstruction: susceptible to being “erased by sufficiently determined social engineers.” A powerful legal tool in the campaign to substitute gender for sex is Title IX, which celebrated its thirtieth anniversary in May 2002. Written to prohibit discrimination on the basis of sex, it has, in the hands of what Will calls “Title IX fanatics,” become a legal bludgeon that is wielded to deny the reality of sexual differences. It has already been used to gut the athletic programs of hundreds of schools and colleges across the country; the next target, Will suggests, will be the curriculum: if a college has an engineering department, it must also have proportional representation of the sexes—sorry, the genders—in that department. Anything less would be an insult to the ideal of equality.

A more florid example of sexual fungibility at work is the explosion of interest in—indeed, the incipient normalization of—“gender reassignment surgery” and other adventures in sexual plasticity. A glance at the personal ads of any “alternative” newspaper—to say nothing of internet sex sites—will reveal a burgeoning sexual demi-monde where the “transsexual,” “pansexual,” and “virtually sexual” heartily compete with more traditional promiscuities.

Nor are such phenomena confined to such “help wanted” venues. Headline from a California newspaper last summer: “San Francisco is about to embark on another first in the nation: providing health care benefits for city workers undergoing sex-change procedures.” “Oh, well,” you say: “It’s California, what do you expect?” Here’s another headline: “Britain’s free health care service should provide sex-change operations for transsexuals because they suffer from a legitimate illness, a court has ruled.” Not to be left behind, The New York Times Sunday magazine recently ran a long and sympathetic cover story about a “transgendered” thirteen-year-old who, though born as a girl, has lived for the last several years as a boy.

Real-life transsexuals are what we might call the objective correlative of an increasingly prominent strand in our culture’s fantasy life. Consider, to take just one example, the British artists Dinos and Jake Chapman. Their signature works are pubescent female mannequins studded with erect penises, vaginas, and anuses, fused together in various postures of sexual congress. The thing to notice is not how outrageous but how common such items are. The Chapman brothers are not a back-alley, plain-brown-wrapper phenomenon. Their works are exhibited in major, once staid, galleries like the Royal Academy in London and the Brooklyn Museum in New York. They are “transgressive,” all right. But the point is that the transgressions they announce have been to a large extent domesticated and welcomed into the mainstream. It would be bootless to multiply examples—readers will doubtless have lists of their own. Hardly anyone is shocked anymore, but that is a testament not to public enlightenment but to widespread moral anaesthesia. (The question of aesthetics, of distinctively artistic achievement, does not even enter the calculation: what does that tell us?)

What we are seeing in sexual life is the fulfillment, in some segments of society, of the radical emancipatory vision enunciated in the 1960s by such gurus as Herbert Marcuse and Norman O. Brown. In Eros and Civilization Marcuse looked forward to the establishment of a “non-repressive reality principle” in which “the body in its entirety would become … an instrument of pleasure.” The sexual liberation Marcuse hailed was not a fecund liberation. As in Brave New World, children do not enter into the equation. The issue is pleasure, not progeny. Marcuse speaks glowingly of “a resurgence of pregenital polymorphous sexuality” that “protests against the repressive order of procreative sexuality.” A look at the alarmingly low birth rates of most affluent nations today suggests that the protest has been effective. When Tocqueville warned about the peculiar form of despotism that threatened democracy, he noted that instead of tyrannizing men, as past despotisms had done, it tended to infantilize them, keeping “them fixed irrevocably in childhood.” What Tocqueville warned about, Marcuse celebrated, extolling the benefits of returning to a state of “primary narcissism” in which one will find “the redemption of pleasure, the halt of time, the absorption of death; silence, sleep, night, paradise—the Nirvana principle not as death but as life.” What Marcuse encouraged, in other words, is solipsism, not as a philosophical principle but as a moral indulgence, a way of life.

It is often said that we are entering the “information age.” There is doubtless some truth in that. But what does it mean? The shocking bulletins appear with clocklike regularity: students seem to know less and less history, less and less mathematics, less and less literature, less and less geography. In May 2002, Diane Ravitch bemoaned the “truly abysmal scores” high-school seniors made in an American history examination: only one in ten did well enough to be considered proficient in the subject. The week before, some other report had bad news about other students and some other subject. A look in the papers today will reveal yet another depressing finding about the failure of education.


Welcome to the information age. Data, data everywhere, but no one knows a thing. In the West, at least, practically everybody has instant access to huge databases and news-retrieval services, to say nothing of television and other media. With a few clicks of the mouse we can bring up every line of Shakespeare that contains the word “darkling” or the complete text of Aeschylus in Greek or in translation. Information about contract law in ancient Rome or yesterday’s developments in microchip technology in Japan is at our fingertips. If we are traveling to Paris, we can book our airline ticket and hotel reservation online, check the local weather, and find out the best place to have dinner near the Place des Vosges. We can correspond and exchange documents with friends on the other side of the globe in the twinkling of an eye. Our command of information is staggering.

And yet with that command comes a great temptation. Partly, it is the temptation to confuse an excellent means of communication with communications that are excellent. We confuse, that is to say, process with product. What Eric Ormsby observed about contemporary librarians in his essay for this volume goes for the rest of us: our fascination with means has led us “to ignore and neglect the ends.”

That is not the only confusion. There is also a tendency to confuse propinquity with possession. The fact that some text is available online or on CD-ROM does not mean that one has read and absorbed its contents. When I was in graduate school, there were always students who tended to suppose that by making a Xerox copy of some document they had also read, or half-read, or at least looked into it. Today that same tendency is exacerbated by high-speed internet access. We can download a veritable library of material to our computer in a few minutes; that does not mean we have mastered its riches. Information is not synonymous with knowledge, let alone wisdom.

This is not a new insight. At the end of the Phaedrus, Plato has Socrates tell the story of the god Theuth, who, legend has it, invented the art of writing. When Theuth presented his new invention to the king of Egypt, he promised the king that it would make his people “wiser and improve their memories.” But the king disagreed, claiming that the habit of writing, far from improving memories, would “implant forgetfulness” by encouraging people to rely on external marks rather than “the living speech graven in the soul.” Well, none of us would wish to do without writing—or computers, come to that. Nor, I think, would Plato have wanted us to. (Though he would probably have been severe about television. That bane of intelligence could have been ordered up specially to illustrate Plato’s idea that most people inhabit a kind of existential “cave” in which they mistake flickering images for realities.) Plato’s indirect comments—through the mouth of Socrates recounting an old story he picked up somewhere—have less to do with writing (an art, after all, in which Plato excelled) than with the priority of immediate experience: the “living speech graven in the soul.” Plato may have been an idealist. But here as elsewhere he appears as an apostle of vital, first-hand experience: a realist in the deepest sense of the term.

The problem with computers is not the worlds they give us instant access to but the world they encourage us to neglect. Everyone knows about the studies showing the bad effects on children and teenagers of too much time in cyberspace (or, indeed, in front of the television set). It cuts them off from their family and friends, fosters asocial behavior, disrupts their ability to concentrate, and makes it harder for them to distinguish between fantasy and reality. I suspect, however, that the real problem is not so much the sorry cases that make headlines but a more generally disseminated attitude toward the world.

When I entered the phrase “virtual reality,” the Google search engine (at last count, 2,073,418,204 pages indexed) returned 1,260,000 hits in 0.12 seconds. There are many, many organizations like the Virtual Reality Society, “an international society dedicated to the discussion and advancement of virtual reality and synthetic environments.” Computer simulations, video games, special effects: in some areas of life, virtual reality seems to be crowding out the other variety. It gives a whole new significance to Villiers de L’Isle-Adam’s world-weary mot: Vivre? Les serviteurs feront cela pour nous. (“Living? Our servants will do that for us.”)

The issue is not, or not only, the digital revolution—the sudden explosion of computers and email and the internet. It is rather the effect of such developments on our moral and imaginative life, and even our cognitive life. Why bother to get Shakespeare by heart when you can look it up in a trice on the internet? One reason, of course, is that a passage memorized is a passage internalized: it becomes part of the mental sustenance of the soul. It’s the difference between a living limb and a crutch.

It used to be said that in dreams begin responsibilities. What responsibilities does a virtual world inspire? Virtual responsibilities, perhaps: responsibilities undertaken on spec, as it were. A virtual world is a world that can be created, manipulated, and dissolved at will. It is a world whose reverberations are subject to endless revision. The Delete key is always available. Whatever is done can be undone. Whatever is undone can be redone.

Of course, as the meditations of Huxley in the 1930s and Marcuse in the 1960s suggest, computers and the internet do not create the temptations of virtual reality; they merely exacerbate those temptations. They magnify a perennial human possibility. Human beings do not need cyberspace to book a vacation from reality. The problem is not computers or indeed any particular technology but rather our disposition toward the common world that culture defines. When we ask about the survival of culture and the fortunes of permanence, we are asking about the fate of that common world. In many respects it is a political question—or, more precisely, a question regarding the limits of politics. When Susan Sontag, in the mid-1960s, championed the “new sensibility” she saw erupting across American society, she rightly observed that its representatives “have broken, whether they know it or not, with the Matthew Arnold notion of culture, finding it historically and humanly obsolescent.”

What exactly is the “Matthew Arnold notion of culture” that Sontag and her cadre of hip intellectuals rejected as outmoded and irrelevant? For one thing, as we have seen, it is culture understood as a repository of mankind’s noblest spiritual and intellectual aspirations: “the best,” as Arnold put it, “that has been thought and said in the world.” The “Matthew Arnold notion of culture” is thus a hierarchical idea of culture—a vision of culture as a “sacred order” whose majesty depends on its relevance to our deepest cares and concerns.

A second feature of the “Matthew Arnold notion of culture” is its independence—what Arnold summed up in the term “disinterestedness.” Criticism achieves disinterestedness, Arnold said,

by keeping aloof from what is called “the practical view of things”; by resolutely following the law of its own nature, which is to be a free play of the mind on all subjects which it touches. By steadily refusing to lend itself to any of those ulterior, political, practical considerations about ideas …

Understood in one way, Arnold’s ideal of disinterestedness—with its emphasis on “a free play of the mind on all subjects”—might seem to be a prescription for moral quietism or frivolous aestheticism. What rescues it from that fundamental unseriousness is Arnold’s unwavering commitment to truth and honesty. The business of criticism, he said, is to know and propagate the best, to “create a current of true and fresh ideas,” and “to do this with inflexible honesty.” It tells us a great deal about the state of culture that Arnold’s demanding ideal of disinterestedness is not merely neglected but actively repudiated by many influential academics and intellectuals today.

A third feature of the “Matthew Arnold notion of culture” is its immediacy, its emphasis not on virtual but on first-hand experience. “Here,” Arnold noted, “the great safeguard is never to let oneself become abstract, always to retain an intimate and lively consciousness of the truth of what one is saying, and, the moment this fails us, to be sure that something is wrong.” The “Matthew Arnold notion of culture,” then, comes armed with a sixth sense against the seductions of the spurious, the attractions of the ersatz.

Ultimately, what Sontag had against Arnold’s view of culture was its earnestness, its seriousness. When she celebrated the Camp sensibility, she did so largely because in Camp she found a nimble ally in her effort “to dethrone the serious.” Her praise of pop culture, pornography, and the pullulating ephemera of the counterculture must be understood as part of her battle against seriousness as traditionally defined. We have here that curious compact of moral levity and grim self-absorption that has characterized so many partisans of “advanced” opinion from Oscar Wilde on down to our own time. Redacted by the political passions of the 1960s, that strange compact resulted in the vertiginous relativisms that have overpopulated the academy, the art world, and other bastions of elite culture throughout Western society.

Part of what makes those relativisms vertiginous is their inconsistency. What we see in contemporary culture is relativism with a vengeance. It is a directed, activist relativism, forgiving and nonjudgmental about anything hostile to the perpetuation of traditional Western culture, full of self-righteous retribution when it comes to individuals and institutions friendly to the West. It incubates what Mark Steyn described above as “the slyer virus”: “the vague sense that the West’s success must somehow be responsible for the rest’s failure.” It is in effect a sort of secularized Jansenism: we are always in the wrong, not in the eyes of God but in the eyes of the exotic Other as imagined by us.

It has long been obvious that “multiculturalism” is an ornate synonym for “anti-Americanism.” It is anti-Americanism on a peculiar moralistic jag. Its effect has been to pervert institutions hitherto entrusted with the preservation and transmission of our spiritual, political, and intellectual heritage. The institutions persist, but their purpose is stymied. Wherever we look—at our schools and colleges, at our churches, museums, courts, and legislatures—we see well underway a process of abdication: a process whereby institutions created to protect certain values have been “deconstructed” and turned against the very things they were meant to preserve.

Consider what has happened to the judiciary. In any society that enjoys the rule of law, courts are a custodian of permanence. The task of judges is to uphold the laws that have been passed down to them, not make new ones. But as Robert Bork has shown—and as we see all around us—the American judiciary has to an extraordinary extent become the “enemy of traditional culture.” On issues from free speech and religion to sexuality, feminism, education, and race, the courts have acted less as a defender of the law than as an avant-garde establishing new beachheads to promulgate the gospel of left-liberal enlightenment. The recent attempt by the Ninth Circuit Court of Appeals in California to declare the Pledge of Allegiance unconstitutional because it includes the phrase “under God” is one of the more risible efforts in this campaign. The overall effect has been to inure society to rule by diktat, a situation in this country that is as novel as it is ominous. “It would,” Judge Bork observes, “have been unthinkable until recently that so many areas of our national life would be controlled by judges.” One again recalls Tocqueville’s warning about democratic despotism. Only now it is not the sovereign but the judiciary that

extends its arms over society as a whole; it covers its surface with a network of small, complicated, painstaking, uniform rules through which the most original minds and the most vigorous souls cannot clear a way to surpass the crowd; it does not break wills, but it softens them, bends them, and directs them; it rarely forces one to act, but it constantly opposes itself to one’s acting; it does not destroy, it prevents things from being born; it does not tyrannize, it hinders, compromises, enervates, extinguishes, dazes, and finally reduces each nation to being nothing more than a herd of timid and industrious animals of which the government is the shepherd.

The attack on permanence is a failure of principle that results in moral paralysis. Chesterton once defined madness as “using mental activity so as to reach mental helplessness.” That is an apt description of a process we see at work in many segments of our social and intellectual life. It is not so much a version of Hamlet’s disease—being sicklied o’er with the pale cast of thought—as an example of what happens when conscience is no longer animated by principle and belief.

Item: Friday, May 17, 2002: “Hamas Founder Says Suicide Attacks Will Continue.” Really? And what about us: what do we have to say about that abomination? Mostly, we wring our hands and mumble about restarting the “peace process.” In a recent column, Linda Chavez reported on an episode of National Public Radio’s “All Things Considered” in which a group of second- and third-generation Palestinian Americans living in Northern Virginia were interviewed. If you had been thinking of taking a holiday there, you may wish to reconsider, or at least be sure that your life insurance premiums are paid up. As Ms. Chavez noted, the sentiments expressed could have come from Hamas. “It doesn’t matter who dies,” said one young boy who idolizes the suicide bombers, “just as long as they’re Israeli.” His mother blames Israel: “They’ve made him violent and hate them.” His father swells with paternal pride: “If his time has come, he will die, regardless of where he is. But at least he will die for a cause. I will live the rest of my life being proud of him.” What about the rule of law? Forget it. American democratic values? Don’t make me laugh. What we have here, Ms. Chavez observes, is “a reflection of our new multicultural America, where young people are taught that one’s allegiance to one’s ethnic group takes precedence over allegiance to the United States or adherence to democratic values.” Thus it is, as David Pryce-Jones observes in his essay for The Survival of Culture, that “contempt for democratic institutions was translated into contempt for the moral values that had underpinned those institutions.”

When immigrants become American citizens, they take an oath of allegiance. Among other things, they must “absolutely and entirely renounce and abjure all allegiance and fidelity to any foreign prince, potentate, state, or sovereignty of whom or which [they] have heretofore been a subject or citizen.” But such promises are only so many words to a population cut adrift from the permanent values enshrined in America’s political principles. The fault lies with the elites who no longer respect and stand up for those principles. “No taxation without representation” is a splendid demand. But so is “no immigration without assimilation.” Where is the simple imperative that one live up to one’s oaths or face the consequences? If one becomes an American citizen, then one must become an American citizen, with the rights and duties pertaining thereto. If that proves too onerous, perhaps citizenship should be revoked and a one-way ticket to elsewhere provided. Such drastic measures would not be a sign of excessive rigor but an example of beneficence in action. It is kindness to stymie the forces of anarchy. By supporting the permanent values that undergird society, such enforcement would be a vote for civilization against chaos.

Since September 11, questions about the survival of culture have naturally taken on a new urgency. The focus suddenly shifted away from the airier purlieus of cultural endeavor to survival in the most visceral sense. The murderous fanatics who destroyed the World Trade Center, smashed into the Pentagon, and killed thousands of innocent civilians took the issue of multiculturalism out of the fetid atmosphere of the graduate seminar and into the streets. Or, rather, they dramatized the fact that multiculturalism was never a merely academic matter. In a sense, the actions of those terrorists were less an attack on the United States than part of what Binyamin Netanyahu called “a war to reverse the triumph of the West.” We are very far from being in a position to assess the full significance of September 11 for the simple reason that the detonations that began that day continue to reverberate and destroy. A battle of wills, a contest of values, was initiated or at least openly acknowledged on September 11. It is much too early to predict the course of that conflict.

September 11 precipitated a crisis the end of which we cannot see. Part of the task that faces us now is to acknowledge the depth of barbarism that challenges the survival of culture. And part of that acknowledgment lies in reaffirming the core values that are under attack. Ultimately, victory in the conflict that besieges us will be determined not by smart weapons but by smart heads. That is to say, the conflict is not so much—not only—a military conflict as a conflict of world views. It is convenient to command the carrier battle groups and cruise missiles; it is essential to possess the will to use them and the faith that our cause, the cause of culture, is the best hope for mankind. Mark Steyn put it well: “If we are as ashamed as we insist we are—of ourselves, our culture and our history—then inevitably we will invite our own destruction.” The horrifying slaughter of September 11 tempts us to draw a line around that day and treat it and its immediate consequences as an exceptional case. There is a deep sense, however, in which the terrorist attacks underscore not the fragility of normality but the normality of fragility. This is a point that C. S. Lewis made with great eloquence in a sermon he preached at Oxford in 1939. “I think it important,” he said,

to try to see the present calamity in a true perspective. The war creates no absolutely new situation: it simply aggravates the permanent human situation so that we can no longer ignore it. Human life has always been lived on the edge of a precipice. Human culture has always had to exist under the shadow of something infinitely more important than itself. If men had postponed the search for knowledge and beauty until they were secure, the search would never have begun.

We are mistaken when we compare war with “normal life.” Life has never been normal. Even those periods which we think most tranquil, like the nineteenth century, turn out, on closer inspection, to be full of crises, alarms, difficulties, emergencies. Plausible reasons have never been lacking for putting off all merely cultural activities until some imminent danger has been averted or some crying injustice put right. But humanity long ago chose to neglect those plausible reasons. They wanted knowledge and beauty now, and would not wait for the suitable moment that never comes. Periclean Athens leaves us not only the Parthenon but, significantly, the Funeral Oration. The insects have chosen a different line: they have sought first the material welfare and security of the hive, and presumably they have their reward.

Men are different. They propound mathematical theorems in beleaguered cities, conduct metaphysical arguments in condemned cells, make jokes on scaffolds, discuss the latest new poem while advancing to the walls of Quebec, and comb their hair at Thermopylae. This is not panache: it is our nature.

Lewis’s meditation is by turns cheering and sobering. On the one hand, it testifies to the heartiness of culture, which is the heartiness of the human spirit. Sonnets in Siberia, mathematical formulae in the besieged fortress. There is no time when cultural instructions are not pertinent. On the other hand, Lewis’s meditation reminds us that culture, and the humanity that defines it, is constantly under threat. No achievement may be taken for granted; yesterday’s gain may be tomorrow’s loss; permanent values require permanent vigilance and permanent renewal.

What lessons may we draw from these Janus-faced conclusions? One is that it is always later than you think. Another is that it is never too late to start anew. Our French friends have lately taken to disparaging the “simplisme” of America’s foreign policy. In their subtlety they ignore the fact that most important truths are—I use the adverb advisedly—terribly simple. Our complexity is much more likely to lead us astray than any simplicity we may follow.

In Notes Towards a Definition of Culture, T. S. Eliot observed that “If any definite conclusions emerge from this study, one of them surely is this, that culture is the one thing that we cannot deliberately aim at. It is the product of a variety of more or less harmonious activities, each pursued for its own sake.” “For its own sake.” That is one simple idea that is everywhere imperiled today. When we plant a garden, it is bootless to strive directly for camellias. They are the natural product of our care, nurture, and time. We can manage that when it comes to agriculture. When we turn our hands to cultura animi, we seem to be considerably less successful. The historian John Lukacs has just published a gloomy book called At the End of an Age. He argues that “we in the West are living near the end of an entire age,” that the Modern Age, which began with the Renaissance, is jerking, crumbling irretrievably to its end. I believe Lukacs is precipitate. After all, prophecies of the end have been with us since the beginning. It seems especially odd that an historian of Lukacs’s delicacy and insight would indulge in what amounts to a reprise of Spengler’s thesis about the “decline of the West.” How many times must historical “inevitabilities” be confounded before they lose their hold on our imaginations?

Where Lukacs is on to something, however, is in his meditations on the ideology of progress. Science does not deserve the scare quotes with which Lukacs adorns it; far from it. But it is true that much that we have taken for progress looks, with the passage of time, more and more dubious. Our stupendous power has accustomed us to say “yes” to every innovation, in manners and morals as well as in the laboratory. We have yet to learn—even now, even at this late date—that promises of liberation often turn out to conceal new enchantments and novel forms of bondage. Our prejudice against prejudice tempts us to neglect the deep wisdom of tradition and time-sanctioned answers to the human predicament. The survival of culture is never a sure thing. No more is its defeat. Our acknowledgment of those twin facts, to the extent that we manage it, is one important sign of our strength.

  1.   A recent article in The Wall Street Journal reported on the new popularity of using continuous birth-control pills or other methods to suppress women’s menstrual cycles. The article quoted one obstetrician-gynecologist who, noting that most women in primitive societies had many more pregnancies than women today, argued that stopping monthly periods “gets women to a more natural state.” Really?

This article originally appeared in The New Criterion, Volume 20 Number 11
Copyright © 2023 The New Criterion