Every “Age of Enlightenment” proceeds from an unlimited optimism of the reason . . . to an equally unqualified skepticism.
—Oswald Spengler, The Decline of the West

History is strewn with the wrecks of nations which have gained a little progressiveness at the cost of a great deal of hard manliness, and have thus prepared themselves for destruction as soon as the movements of the world gave a chance for it.
—Walter Bagehot, Physics and Politics

Day by day, month by month, doubt by doubt, law and order became fascism; education, constraint; work, alienation; revolution, mere sport; leisure, a privilege of class; marijuana, a harmless weed; family, a stifling hothouse; affluence, oppression; success, a social disease; sex, an innocent pastime; youth, a permanent tribunal; maturity, the new senility; discipline, an attack on personality; Christianity . . . and the West . . . and white skin.
—Jean Raspail, The Camp of the Saints

At least since Oedipus met King Laius at the narrow fork where the road from Delphi to Thebes split off to Daulis, the image of a crossroads has signaled a dramatic and morally fraught turning point.1 A lot of ink has been spilt pondering the question whether the turn is something fated or the result of a choice. Robert Frost, in his famous poem, took the road less traveled by, and that “made all the difference.” Could Oedipus have done likewise? Or did Sophocles load the dice against poor Oedipus? Aristotle, in the Poetics, says that Oedipus’s fate was the result of a flaw for which he bore the guilt, though not, exactly, the responsibility.

That is the way things go with the apparently inexorable dialectic of hubris, infatuation, and nemesis (yielding in the end to catharsis and resolution, not that it did Oedipus much good).

Contrasting the “indulgence” shown by liberal intellectuals towards the gritty realist Theodore Dreiser and the severity shown by them to Henry James, the literary critic Lionel Trilling spoke of “the dark and bloody crossroads where literature and politics meet.” In “our world of impending disaster,” Trilling said, confronting that crossroads was unavoidable for anyone attending to “reality in America,” though he remained somewhat elliptical about the burdens of the different courses being mapped. What Trilling calls “the ultimate question”—“of what use, of what actual political use” are James’s novels?—is a puzzle that might baffle Oedipus. Still, Trilling’s memorable formulation has a significance that transcends his discussion of Dreiser and James. His “dark and bloody crossroads” also stands before us, posing different and perhaps even more ultimate questions.

One of the more beneficent and also enigmatic early Greek goddesses is Hecate, the only daughter of the titans Perses and Asteria. Among other things, the three-faced goddess stands as a talisman at crossroads, junctions whose alternative routes may mark not only alternative destinations but also alternative destinies. “Ora vides Hecates in tres vertentia partes,” Ovid writes in his poem on the Roman calendar of festivals, “servet ut in ternas compita secta vias”: “See Hecate’s faces turned in three directions that she may guard the branching crossroads.”

Ovid’s description is upbeat. But for some time past, the mood in the West, among intellectuals, anyway, has listed towards the gloomy. Hecate, if she is on hand at all, seems to glower more than guard. In the 1940s, Cyril Connolly sighed that it was “closing time in the gardens of the West.” A couple of decades before that, in the immediate aftermath of the shambles that was World War I, Oswald Spengler gave definitive expression to the temperament of the times in his sprawling Teutonic threnody Der Untergang des Abendlandes, “The Decline of the West.”

At some point the West really will decline. Is it finally happening? And what is this “West” that is “declining,” “closing,” facing a momentous crossroads? An incomplete but not inaccurate shorthand is “Christendom.” I know that the term strikes an antique note in America circa 2022. When was the last time you heard someone utter it without invisible scare quotes? (When was the last time you heard someone utter it at all?) But a decaying moral vocabulary—consider the recent career of words like “virtue,” “manliness,” “womanly,” or “respectable”—is part of the ominous cargo we are called upon to bear. As for “Christendom,” it names a dispensation in which the individual possesses intrinsic moral worth, defined not only by the Christian tradition itself but buoyed also by those traditions, predominantly classical and Judaic, which flowed into and helped nurture and define that baggy creation we call “the West.”

One bleak route on the crossroads we face involves the willful throttling of those mighty currents. Angry at the Gyndes River for sweeping away and drowning one of his sacred white horses, Cyrus decided to punish the river by having his slaves cut 360 channels into it, stanching its flow to a trickle. This we have done to ourselves, applying mental tourniquets to the arteries that fed us from the past in order that we might gambol undisturbed in distracted present-tense ignorance. An illustrative case in point is the Princeton classics department, where woke educationists panting for relevance recently jettisoned the requirement that its students learn Latin or Greek, never mind both. At the same time, they publicly celebrate the fifty-seven varieties of racial-trans-wonderfulness that have become the focus of academic obsession. It is a situation that is as absurd as it is malignant. In The Present Age (1846), Kierkegaard described the jaded spirit that “leaves everything standing but cunningly empties it of significance.” That is where we are today: occupying a husk of decadence assiduously emptied of vitality. Princeton, Yale, Harvard, and the rest of the querulous educational establishment are sodden with money but spiritually and intellectually bankrupt. They continue to look like educational institutions: leafy walks, imposing libraries, impressive buildings. But most of the activities they sponsor are inimical to real education, inciting thousands of puny Cyruses to divert and stymie the waters of tradition in order to polish the mirror of their narcissism.

Dynamism & deviancy

It is often said that a distinguishing feature of Western civilization is its “dynamism,” its tendency to change, to foment opposition and critical self-reflection. By contrast, ancient Egyptian, Chinese, and other cultures sought longevity through stasis. Among other things, the element of percolation that is part of the West’s identity inscribes a certain irony into its dna. Often, seeds of its perpetuation blossom into more or less fundamental challenges to its legacy. Just so, Marxism and its many allotropes and subsidiaries are deeply anti-Western and at the same time typical products of that peculiar dynamism that defines the West. The ideologies of “wokeness” and identity politics belong here: blunt, untutored manifestations of the “transvaluation of all values” that Nietzsche spoke about in one of his agitated moods. Perhaps this self-canceling aspect of the West is part of what James Hankins and Allen C. Guelzo had in mind when they noted (in “Civilization & tradition,” the first installment in this series of essays on “Western civilization at the crossroads”) that “Civilization is always threatened by barbarism, and the greater threat often comes more from within than from without.” The political philosopher James Burnham made a similar point when he argued that “Suicide is probably more frequent than murder as the end phase of a civilization.” That the pathology may be self-generated is more an admonition than a consolation.

The historian Arnold Toynbee spoke in this context of the “barbarization of the dominant minority.” When a society is robust and self-confident, Toynbee suggested, cultural influence travels largely from the elites to the proletariats. The elites furnish social models to be emulated. The proletariats are “softened,” Toynbee said, by their imitation of the manners and morals of a dominant elite. But when a society begins to falter, the imitation proceeds largely in the opposite direction: the dominant elite is coarsened by its imitation of proletarian manners. Toynbee noted a growing “sense of drift,” “truancy,” “promiscuity,” and general “vulgarization” of manners, morals, and the arts. The elites, instead of holding fast to their own standards, suddenly begin to “go native” and adopt the dress, attitudes, and behavior of the lower classes. Flip on your television, scroll through social media, look at the teens and pre-teens in your middle-class neighborhood. You will see what Toynbee meant by “barbarization of the dominant (or, rather, ‘once-dominant’) minority.” One part of the impulse is summed up in the French phrase nostalgie de la boue. But it is not “mud” that is sought so much as repudiation.

The social scientist Charles Murray, writing about Toynbee in The Wall Street Journal back in 2001, noted how closely the historian’s analysis fit developments in contemporary America. To a large extent, it is a matter of failed or discarded ideals. “Truancy and promiscuity, in Toynbee’s sense,” Murray writes,

are not new in America. But until a few decades ago they were publicly despised and largely confined to the bottom layer of Toynbee’s proletariat—the group we used to call “low-class” or “trash,” and which we now call the underclass. Today, those behaviors have been transmuted into a code that the elites sometimes imitate, sometimes placate, and fear to challenge. Meanwhile, they no longer have a code of their own in which they have confidence.

What we are talking about is the drift, the tendency of our culture. And that is to be measured not so much by what we permit or forbid as by what we unthinkingly accept as normal. This crossroads, that is to say, is part of a process, one of whose markers is the normalization of the outré. Senator Daniel Patrick Moynihan described this development as “defining deviancy down.” It is, as the late columnist Charles Krauthammer observed, a two-way process. “As part of the vast social project of moral leveling,” he wrote,

it is not enough for the deviant to be normalized. The normal must be found to be deviant. . . . Large areas of ordinary behavior hitherto considered benign have had their threshold radically redefined up, so that once innocent behavior now stands condemned as deviant. Normal middle-class life then stands exposed as the true home of violence and abuse and a whole catalog of aberrant acting and thinking.

Hilaire Belloc espied the culmination of this process in Survivals and New Arrivals (1929):

When it is mature we shall have, not the present isolated, self-conscious insults to beauty and right living, but a positive coordination and organized affirmation of the repulsive and the vile.

The sage of Ecclesiastes informs us that “there is nothing new under the sun.” But there are many aspects of our culture today that put that claim to the test. Michael Anton, in his contribution to this series in December, provides an inventory of developments that, taken together, support his contention that our situation is (as he said in his title) “unprecedented.”

In some ways, Anton’s list reads like a variation on the litany offered in the passage from Jean Raspail’s Camp of the Saints (1973) that I quote as an epigraph. That dystopian novel imagines a world in which Western civilization is overrun and destroyed by unfettered third-world immigration. It describes an instance of wholesale cultural suicide in which demography is weaponized and deployed as an instrument of retribution. Conspicuous in that apocalypse is the feckless collusion of white Europeans and Americans in their own supersession. They faced an existential crossroads. They chose extinction, laced with the emotion of higher virtue, rather than survival.

Although published in the early 1970s, Camp of the Saints sounds a distinctly contemporary note. Ibram X. Kendi is not actually a character in Raspail’s book. But he might have stepped right out of its pages. “The only remedy to racist discrimination,” Kendi has written, “is antiracist discrimination. The only remedy to past discrimination is present discrimination. The only remedy to present discrimination is future discrimination.” In other words, Kendi advises race-based discrimination today, race-based discrimination tomorrow, race-based discrimination forever. Compare Raspail:

Now, it’s a known fact that racism comes in two forms: that practiced by whites—heinous and inexcusable, whatever its motives—and that practiced by blacks—quite justified, whatever its excesses, since it’s merely the expression of a righteous revenge, and it’s up to the whites to be patient and understanding.

This racial spoils-system is one giant totem looming over the crossroads we face. Another is the anti-sexual sexual hypertrophy that has become such a curious feature of our cultural landscape. Back in 1994, Irving Kristol wrote an important essay called “Countercultures.” In it he noted that “ ‘Sexual liberation’ is always near the top of a countercultural agenda—though just what form the liberation takes can and does vary, sometimes quite widely.” The costumes and rhetoric change, but the end is always the same: an assault on the defining institutions of our civilization. “Women’s liberation,” Kristol continues,

is another consistent feature of all countercultural movements—liberation from husbands, liberation from children, liberation from family. Indeed, the real object of these various sexual heterodoxies is to disestablish the family as the central institution of human society, the citadel of orthodoxy.

In Eros and Civilization (1955), the Marxist countercultural guru Herbert Marcuse provided an illustration of Kristol’s thesis avant la lettre. Railing against “the tyranny of procreative sexuality,” Marcuse urged his followers to return to a state of “primary narcissism” and extolled the joys of “polymorphous perversity.” Are we there yet? “Be fruitful, and multiply,” the Book of Genesis advised. Marcuse sought to enlist a programmatically unfruitful sexuality in his campaign against “capitalism” and the cultural establishment: barrenness as a revolutionary desideratum. Back then the diktat seemed radical but self-contained, another crackpot effusion from the academy. Today, it is a widespread mental health problem, accepted gospel preached by teachers, the media, and legislators across the country. As I write, the National Women’s Law Center has just taken to Twitter to declare that “People of all genders need abortions.” How many things had to go wrong for someone, presumably female, to issue that bulletin? “All genders,” indeed. I recall the observation, attributed to Voltaire, that “Those who can make you believe absurdities can make you commit atrocities.”

In “The Catholic Tradition and the Modern State” (1916), the historian Christopher Dawson wrote that “It is not liberty, but power which is the true note of our modern civilization. Man has gained infinitely in his control over Nature, but he has lost control over his own individual life.” I think this is true. And there is a political as well as a technical or scientific dimension to the phenomenon Dawson describes.

In the West, what we have witnessed since the so-called “Progressive” movement of the 1910s and 1920s is the rise of a bureaucratic elite that has increasingly absorbed the prerogatives of power from legislative bodies. In the United States, for example, Article I of the Constitution vests all legislative power in Congress. For many decades, however, Americans have been ruled less by laws duly enacted by their representatives in Congress and more by an alphabet soup of regulatory agencies. The members of these bodies are elected by no one; they typically work outside the purview of public scrutiny; and yet their diktats have the force of law. Already in the 1940s, James Burnham was warning about the prospect of a “managerial revolution” that would accomplish by bureaucracy what traditional politics had failed to produce. Succeeding decades have seen the extraordinary growth of this leviathan, the unchecked multiplication of its offices and powers, and the encroaching reach of its tentacles into the interstices of everyday life. We are now, to an extent difficult to calculate, ruled by this “administrative state,” the “deep state,” the “regulatory state.”

The location of sovereignty

When in September 2020 the World Economic Forum at Davos announced its blueprint for a “Great Reset” in the wake of the worldwide panic over covid, a new crossroads was uncovered. Never one to let a crisis go to waste, the Forum offered an extensive menu of progressive, i.e., socialistic imperatives. Here at last was an opportunity to enact a worldwide tax on wealth, a far-reaching (and deeply impoverishing) “green-energy” agenda, rules that would dilute national sovereignty, and various schemes to insinuate politically correct attitudes into the fabric of everyday life. All this was being promulgated for our own good, of course. But it was difficult to overlook the fact that the wef plan involved nothing less than the absorption of liberty by the extension of bureaucratic power. “Of all tyrannies,” C. S. Lewis wrote,

a tyranny sincerely exercised for the good of its victims may be the most oppressive. It would be better to live under robber barons than under omnipotent moral busybodies. The robber baron’s cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end for they do so with the approval of their own conscience.

The social commentator Joel Kotkin, noting the class element of this process, has recently traced the emergence of a “neo-feudalist” impulse throughout the West. Drawing on the managerial apparatus of the administrative state, this twenty-first-century form of feudalism lacks the regalia of its medieval precursors. But it operates by enacting an equally thoroughgoing agenda of dependency.

The political scientist John Marini may not have coined the term “administrative state,” but he has done more than anyone to plumb its Stygian depths and anatomize its assaults on liberty. Drawing on Marini’s work, the Claremont Institute’s Glenn Ellmers has recently articulated the essential tension between constitutional government, which is based upon a broadly shared common sense, and its would-be replacement: technocratic government, which is populated by an unaccountable elite, “experts” whose leading characteristic is faith in their own prerogatives. This “permanent government,” Ellmers writes, “is a powerful force”:

It has established its own legitimacy apart from its political or constitutional authority, within the ranks of both political parties and the courts. Bureaucratic rule is defended as essential to solving, in a non-partisan way, the problems of modern government and society. But the bureaucracy has become a political faction on behalf of its own interests. Moreover, the party that defends progressivism and elite authority is increasingly open about politicizing the last vestiges of non-partisan government, including the Justice Department and federal law enforcement. As their power has grown, these defenders of administrative government are increasingly unable to understand, let alone tolerate, anyone who fails to recognize the legitimacy of the administrative state.

As I have noted elsewhere, it is important to acknowledge that this progressive project is as much a Republican as a Democratic pursuit. Despite some rhetorical differences, both parties are fully paid-up worker bees in this hive. The “pragmatic” cover they give themselves is the supposed “complexity” of modern life and complication of contemporary governance. Who but they are equipped to manage the machinery of the state, the minutiae of government? Ellmers nails it: “The pursuit of progress, social justice, and equity becomes for them the moral equivalent of constitutional authority.” Welcome to Davos.

Challenging the hegemony of experts requires a widespread reassertion of that “common-sense morality” to which the Founders appealed but which is regarded by our would-be masters as an impediment to utopia. A key problem, as Ellmers sees, is that

what is left of public morality is now understood in terms of “values,” or subjective preferences based only on individual will. Even in the small handful of healthy institutions in civil society, the political and civil rights of the ordinary citizen rest upon a precarious foundation, threatened and undermined by the powerful claims of social progress.

Those claims are all but unchallengeable, as anyone who has dared to question the dominant narrative on issues from “climate change” to covid policy to race, abortion, and identity politics will know. As I have put it in other essays, the question revolves around the location of sovereignty. Who rules? The people, articulating their interests through the metabolism of ordinary politics? Or the bureaucratic elite, who claim to discern the inevitable direction and goal of history and are prepared to marshal the coercive power of the state to prevent anyone or anything from cluttering up that highway to enlightenment? This is another description of the crossroads we face.

Can & may

In the preface to Thoughts and Adventures, a collection of essays published in 1932, Winston Churchill asks whether the world is “heading for the cross-roads which may lead to the two alternative Infernos,” the Cities of Destruction, brought about by the engines of war, and the Cities of Enslavement, brought about by the engines of totalitarian ideology. To both, he notes, “Science holds the keys.”

Some of the essays in that collection are light jeux d’esprit—“Hobbies,” for example, and “Painting as a Pastime.” Some are historical reflections, edged with political insight. But at least two bear directly on the dark crossroads Churchill adumbrates: “Shall We All Commit Suicide?” and “Fifty Years Hence.” Both are remarkably prescient. Churchill speculates about the morally fraught side of technological development, a development that had accelerated suddenly within the lifetimes of his parents. Churchill dilates especially upon two things: the implications of modern technology for the practice of war and its political implications, particularly in the hands of totalitarian despots who begin by vacating the central moral assumption of Christendom, that the individual possesses intrinsic—Thomas Jefferson might have said “unalienable”—moral worth.

“The story of the human race is War,” Churchill acknowledges. Indeed, our species has never been notable for its pacific qualities. But it was not until the twentieth century that “War really began to enter into its kingdom as the potential destroyer of the human race.” That potential was bequeathed to humanity by science. Writing in the 1920s, Churchill forecasts practically the entire twentieth-century menu of martial innovation. He imagines extraordinary developments in communications—wireless telephones, for example, which he suggests might render face-to-face meetings all but superfluous—as well as fearsome developments in chemical and biological warfare. He also notes new feats of mobilization, logistics, and brute kinetic power. How extraordinary that “it is today already possible to control accurately from the bridge of a battle cruiser all the power of hundreds of thousands of men, or to set off with one finger a mine capable in an instant of destroying the work of thousands of man years.” Churchill even foresees the awesome power of nuclear energy. Might it not, he asks, be someday possible that a device no larger than an orange could destroy an entire city block or “blast a township at a stroke”? The truth is, he writes, that “Mankind has never been in this position before. Without having improved appreciably in virtue or enjoying wiser guidance, it has got into its hands for the first time the tools by which it can unfailingly accomplish its own extermination.”

Churchill also understood the chilling possibilities inherent in genetic manipulation, “startling developments . . . just beyond our finger-tips in the breeding of human beings and the shaping of human nature.” Will it not soon be possible, for example, to produce people who possess “admirable physical development” but whose “mental endowment [is] stunted in particular directions”? “Christian civilization” recoils from such vistas, Churchill says, but “there is nothing in the philosophy of Communists to prevent their creation.”

In some ways, these are by now commonplace facts about modern life. But Churchill understood that at issue was not simply the technological development of new instruments of destruction and enslavement, but a changed attitude toward humanity itself. In his book The Abolition of Man (1943), C. S. Lewis underscores this point. Science is about “conquering nature,” Lewis acknowledges. But every victory increases the domain of that which we treat as “mere Nature,” i.e., something to be mastered. “It is in Man’s power,” Lewis writes, “to treat himself as a mere ‘natural object’ and his own judgments of value as raw material for scientific manipulation to alter at will.”

The frightening moral developments that writers as different as C. S. Lewis, Churchill, and Christopher Dawson foresaw have borne poisonous fruit only with recent technical innovations. But the attitudes, the view of nature and humanity that is presupposed by such developments, have their roots in the intellectual revolutions of the sixteenth and seventeenth centuries, the revolutions of “the new science,” of Copernicus and Galileo, of Bacon, Newton, and Descartes. In seizing the freedom to determine itself, humanity at the same time began to assert its freedom to take charge of nature.

It would be difficult to overestimate the importance of this shift. Since the time of Aristotle, science had been essentially a contemplative matter; truth was conceived as a stable order that man strove to behold. But in the modern age, science became fundamentally aggressive, inseparable from the project of grasping and manipulating nature—including man’s nature—according to human designs. Instead of opening himself up to nature’s secrets, man now strove to reconstruct those secrets by active intervention. Francis Bacon’s celebrated declaration to the effect that “knowledge is power” typifies the new approach.

Lewis makes the arresting observation that “there is something which unites magic and applied science while separating both from the ‘wisdom’ of earlier ages.” It is this: “For the wise men of old,” Lewis notes,

the cardinal problem had been how to conform the soul to reality, and the solution had been knowledge, self-discipline, and virtue. For magic and applied science alike the problem is how to subdue reality to the wishes of men: the solution is a technique; and both, in the practice of this technique, are ready to do things hitherto regarded as disgusting and impious—such as digging up and mutilating the dead.

Lewis goes on to compare Bacon with Marlowe’s Dr. Faustus, whose chief aim is not knowledge (as is sometimes claimed) but power. “A sound magician,” Faustus boasts, “is a mighty god.”

Nowhere was this new approach more dramatically crystallized—or more systematically worked out—than in the philosophy of Descartes, who can with some justice be considered the architect of modern science. In a famous passage near the end of the Discourse on Method (1637), Descartes heralds a “practical philosophy” that, unlike the speculative philosophy of scholasticism,

would show us the energy and action of fire, air, and stars, the heavens, and all other bodies in our environment, as distinctly as we know the various crafts of our artisans, and could apply them in the same way to all appropriate uses and thus make ourselves the masters and owners of nature.

To make ourselves the “masters and owners of nature”: that is the goal. The index of knowledge here is not accurate theory but power and control. Like the artisan, we really know something when we know how to make it. And among the benefits that Descartes envisioned from his new philosophy was a more efficacious medicine: by understanding the principles of nature, Descartes wrote, we might hope to control man’s physical nature, freeing him from “an infinitude of maladies both of body and mind, and even also possibly of the infirmities of age.”

The staggering success of modern science and technology—including modern medical technology—underscores the power and, in some respects, the truth of Descartes’ vision. What we call the modern world is in a deep sense a Cartesian world, a world in which humanity has exploited the principles outlined by Descartes to remake reality in its own image.

But there is, as Churchill, C. S. Lewis, and others have pointed out, a dark side to this mastery. Increasingly, one confronts the fear that humanity, despite its technical prowess, may be enslaved by its dominion over nature. “If man chooses to treat himself as raw material,” Lewis writes, then “raw material he will be.” Nor is this fear confined to the effects of our science, to the lethal arsenal of weapons and pollutants that human ingenuity has scattered over the face of the earth. Equally (if more subtly) fearsome is the crisis in values that modern science has helped to precipitate. Committed to the ideal of objectivity, of treating everything as “raw material,” modern science requires that the world be silent about “values,” about meaning in any human sense, for it requires that everyday experience be reduced to the ghostly, “value-free” language of primary qualities and mathematical formulae. While this language has given man great power over the world, it cannot speak to him of his place in the world.

Hence, as Nietzsche saw with devastating clarity, the other side of our commitment to the ideal of objective truth is nihilism. “All science,” Nietzsche wrote in On the Genealogy of Morality, “has at present the object of dissuading man from his former respect for himself, as if this had been nothing but a bizarre conceit.” Since the truth for Nietzsche—as for us as well—means first of all objective, scientific truth, everything that binds us to the world as wanting, lacking, embodied creatures (sexual attraction, beauty, pain, hunger, respect) must be denied the title of truth. It was precisely this insight that led Nietzsche to insist that “the value of truth must for once be experimentally called into question.”

Many people are worried about the ethical implications of genetic engineering. They read about cloning or “harvesting” embryos for genetic material and wonder whether we have not started firmly down the path described by Aldous Huxley in his 1932 novel Brave New World or foreseen by Churchill in “Fifty Years Hence.” Today we cull certain biological material from so-called “dispensable” embryos; tomorrow might we not have factories for the production of children carefully segregated according to genetic endowment?

But if many people worry about what genetic engineering portends, others face that crossroads and worry primarily about what public anxiety over such scientific research will mean for the progress of science. Such people are not necessarily insensitive to ethical issues, but for them the search for scientific truth is ineluctable. Public opinion might delay the march of progress. It will never entirely derail it. So (they argue) it behooves us to pursue science wherever it leads. If we don’t, someone else will, and we in the West are better equipped than anyone to deploy new technologies wisely and humanely. To oppose the application of genetic engineering (the argument goes) is to be a latter-day Luddite, railing impotently against a technology whose effects might be painful at first but will ultimately be liberating.

It is a mistake casually to reject either side of the argument: those who worry about genetic engineering, or those who worry about the worriers. Consider the plus side. The therapeutic promise of genetic engineering is more than enormous: it is staggering. No one who has seen somebody suffer from Parkinson’s Disease or any of the many other horrific ills that the flesh is heir to can be deaf to that promise.

Of course, any powerful technology can be put to evil purposes as well as good ones. In this sense, one might say that technology is like fire. It is neither good nor bad in itself. It is good when used appropriately for good purposes, bad when used inappropriately or for evil purposes. It would be pleasing to think that we could apply some such calculus to determine the moral complexion of a particular application of genetic engineering.

It is not at all clear, however, that the moral quandaries with which genetic engineering (for example) confronts us can be solved by such a calculus. Part of the problem is that the creed—familiar to us from Marxism—that “the end justifies the means” seems particularly barbarous when applied directly to human reality, as it is in genetic engineering. Are all embryos potential candidates for “harvesting,” or only certain embryos? And what about newborns, another good source of genetic material? Are certain infants to be regarded as potential “raw material” for genetic experimentation? Which infants? It is easy to conjure up a nightmare world in which some human beings are raised for spare parts (and Kazuo Ishiguro did just that in a recent novel). Already in certain parts of the world, the bodies of executed criminals are raided for kidneys, corneas, and other body parts. Why not extend the practice? According to some news reports, the Chinese already have.

In his essay “D’Holbach’s Dream: The Central Claim of the Enlightenment,” the Australian philosopher David Stove asked a relevant question. Granted the enormous contribution of science and technology to human happiness in the last few hundred years, how can we be sure that period was “a typical specimen of the effect of scientific progress on human happiness”? Perhaps, Stove suggests, it was “a lucky accident” and we have now “returned to the state, historically the more usual one, in which any progress that knowledge makes does not much increase either human happiness or misery? Have we entered a period in which scientific progress will enormously increase misery?” No one knows the answer to these questions, but it would be silly to dismiss them out of hand.

My own belief is that humanity, as Churchill discerned, is at the crossroads of an awesome moral divide. Recent advances in the technologies of artificial intelligence and genetic engineering—cloning, stem-cell research, and the like—confront us with moral problems for which we have no ready-made solution. Perhaps the biggest problem concerns the nature of the technologies involved. When we look back over the course of technological development, especially in the last couple hundred years, it is easy to be a technological optimist. Science and technology have brought us so many extraordinary advances that one is tempted to close one’s eyes and take a leap of faith when it comes to the subject. No doubt science and technology have brought us many destructive things, but who except the hermits among us would willingly do without the conveniences—including lifesaving conveniences—they have bequeathed us? It is impossible, I think, for any rational person to say “No” to science and technology. The benefits are simply too compelling. Indeed, as Churchill noted already back in the 1920s, mankind has become so dependent upon the fruits of scientific progress that “if it stopped or were reversed,” there would be a “catastrophe of unimaginable horror.” We have, Churchill wrote, “gone too far to go back. . . . There are too many people maintained, not merely in comfort but in existence, by processes unknown a century ago, for us to afford even a temporary check.”

But can we afford to acquiesce and issue an indiscriminate “Yes” to scientific progress? Churchill went on to warn that “It would be much better to call a halt in material progress and discovery rather than to be mastered by our apparatus and the forces which it directs.” Was he right? “There are secrets,” he continued, “too mysterious for man in his present state to know, secrets which, once penetrated, may be fatal to human happiness and glory.” Already, “scientists are . . . fumbling with the keys of all the chambers hitherto forbidden to mankind.” Yet “without an equal growth of Mercy, Pity, Peace and Love, Science herself may destroy all that makes human life majestic and tolerable.”

It would be difficult to formulate sentences more thoroughly at odds with the technological spirit of our age. But can we simply dismiss Churchill’s concerns? Are there not lines to be drawn, limits to be respected? If so, where do we find the criteria for drawing those lines and limits? There is no simple or pat answer to such questions. Perhaps the one thing that is certain is that we are operating here in a realm beyond certainty. No one will come up with a formula that can be successfully applied to all cases.

There are two dangers. One is the danger of technophobia: retreating from science and technology because of the moral enormities it makes possible. The other, perhaps more prevalent, danger is technophilia, best summed up in the belief that “if it can be done, it may be done.” There are many things that we can do that we ought not do. But whence does that “ought” acquire its traction and legitimacy? As science and technology develop, we find ourselves wielding ever greater power. The dark side of power is the temptation to forget its limitations. Lord Acton was right to warn that “power tends to corrupt, and absolute power corrupts absolutely.” That observation has relevance in the world of science and technology as well as in politics. None of us, of course, really commands absolute power. Our mortality assures that for all of us—rich and poor, famous and obscure—life will end in the absolute weakness of death.

But the exercise of power can be like a drug, dulling us to the fact of our ultimate impotence. It is when we forget our impotence that we do the most damage with the power we wield. At the end of his book Main Currents of Marxism (1978), the Polish philosopher Leszek Kolakowski observed that “The self-deification of mankind, to which Marxism gave philosophical expression, has ended in the same way as all such attempts, whether individual or collective: it has revealed itself as the farcical aspect of human bondage.” It would be a mistake to think that Marxism has a monopoly on the project of self-deification. It is a temptation as old as mankind itself. The Greeks called it hubris. And the Book of Genesis warns us about such hubris with the story of the serpent’s promise to Eve: “Ye shall be as gods.”

If that seems hyperbolic, consider Yuval Noah Harari, the Davos-friendly, bestselling author of pop-philosophy books warning—or crowing (it’s not always easy to tell)—that human beings are just about to exceed their shelf life and need to be replaced by something better. “We are really acquiring divine powers of creation and destruction,” he said in a recent interview. “We are really upgrading humans into gods.” “Upgrading.”

Because humans are now “hackable animals,” Harari argues, they will soon grow out of their attachment to outmoded ideas such as the worth of the individual and free will. Such ideas were OK for people like Locke, Rousseau, and Thomas Jefferson, he admits, but thanks to technological advances we’ve gone beyond all that. “That’s over,” he says about the idea of free will.

It’s all quite breathtaking. “The most important question in 21st-century economics,” Harari writes in the best tech-mandarin style, “may well be what to do with all the superfluous people.” There will be so many “superfluous” people, you see, because only a small portion of humanity will be “upgraded” to be “superhumans” who will “enjoy unheard-of abilities and unprecedented creativity, which will allow them to go on making many of the most important decisions in the world.” People, it goes without saying, like Yuval Noah Harari.

All of which is to say that modern technology has upped the ante on hubris. Our amazing technological prowess seduces many people into thinking we are or, with just a bit more tinkering, might become “as gods.” The first step in that process is to believe that one is exempt from normal moral limits: that “if it can be done, it may be done”—i.e., the capacity to do something brings with it the moral sanction to do it. It is a foolish thought, a dangerous thought. But it is a thought with which we will all find ourselves having to contend as we continue to surprise ourselves with our strange cleverness. It is part of the crossroads at which the West finds itself today.


1. The online version of this essay includes material omitted from the printed version because of space.


This article originally appeared in The New Criterion, Volume 40 Number 10, on page 4
Copyright © 2022 The New Criterion | www.newcriterion.com
https://newcriterion.com/issues/2022/6/highways-to-utopia-12601
