For I dipt into the future, far as human eye could see,
Saw the Vision of the world, and all the wonder that would be;

Saw the heavens fill with commerce, argosies of magic sails,
Pilots of the purple twilight dropping down with costly bales;

Heard the heavens fill with shouting, and there rain'd a ghastly dew
From the nations' airy navies grappling in the central blue;

Far along the world-wide whisper of the south-wind rushing warm,
With the standards of the peoples plunging thro' the thunder-storm;

Till the war-drum throbb'd no longer, and the battle-flags were furl'd
In the Parliament of man, the Federation of the world.

There the common sense of most shall hold a fretful realm in awe,
And the kindly earth shall slumber, lapt in universal law.

—Alfred, Lord Tennyson, “Locksley Hall” (1842)

In the first week of September 2001, the kindly earth, lapt in universal law, was gathered in South Africa, yakking incessantly, shrieking hysterically, but slumbering nonetheless. In a novel or a movie, it would have seemed too pat, the juxtaposition too obvious. But real life is not so squeamish as the professional scenarist, and so the weekend of September 8, 2001 found the representatives of the civilized world locked in intense negotiations with the planet’s preeminent thugs over what recompense the West should make for its evil legacy.

Underneath the surface controversy, the United Nations Conference Against Racism, Racial Intolerance, Xenophobia and/or Related Intolerance in Durban had an impressive unanimity on the key points. Everyone agreed that the West was guilty as charged, disagreement being confined only to the appropriate remedy. Dismissing apologies for colonialism and slavery as a pathetic attempt to cop a plea, Robert Mugabe’s government—taking time out of its hectic schedule of terrorizing white farmers—called on Britain and America to “apologize unreservedly for their crimes against humanity.” There was a big split on the slavery-reparations front between the African-American bloviators, who wanted whitey’s payments to go to individuals, and the African presidents, who thought it would be more convenient if the West just dropped off one big check at the presidential palace. The Organisation of African Unity demanded that reparations for the Hutu slaughter of the Tutsis should be paid—by the Americans.

But the jingling of the guinea helps the hurt that Honor feels,
And the nations do but murmur.

Instead of slapping their knees and weeping with laughter and wondering how the after-conference cabaret was ever going to top this, the Europeans and North Americans were at pains to agree with their chastisers. The British, French, Dutch, Italians, Spanish, Germans, and Portuguese were happy to concede European colonialism was wrong, but unhappy at having their case formally presented by the Belgians as current holders of the EU’s rotating presidency. Having been remarkably inept and corrupt imperialists, the Belgians understandably find consolation in the leftist theory that the murkier bits of their history owe more to the general, inherent iniquities of colonialism than to their specific failure to be any good at it. Prostrate as they were, their European colleagues had enough residual rump professional imperialist pride to resent Brussels’s willingness to overegg the self-abasement pudding.


Otherwise, the West’s current leaders—Britain’s Tony Blair, France’s Lionel Jospin, Canada’s Jean Chrétien, et al.—side with their tormentors on the sins of their fathers, but feel that, as impeccable multiculturalists, they themselves should be cut some slack. Mr. Chrétien was all for the conference shining a useful spotlight on the most wicked racist sexist societies on earth but was a bit taken aback to find that the first country caught in the beam was his own. Matthew Coon Come, chief of Canada’s “First Nations,” told UN delegates that he and his fellow natives were victims of a “racist and colonial syndrome of dispossession and discrimination” and only last year had been savagely attacked by “white mobs” acting at the behest of the government in Ottawa. The crowd applauded wildly. Needless to say, the Canadian government had paid for Mr. Coon Come to fly to Durban to explain how oppressive it is. In Britain, your average Tory backbencher likes to spend his lunch hours hanging upside down in a bondage dungeon being flayed by Miss Whiplash for fifteen quid plus a small tip. Enlightened opinion has, as in so many other areas, “Federalized” this important adjunct to government. Mr. Coon Come provides essentially the same service as Miss Whiplash, though at rather greater cost. The United States, though it’s spent years being damned by bien-pensants for refusing to pay its UN dues, was happy to pony up 25 percent of the tab for Durban.

And it was cheap at the price. As the conference bogged down in the precise degree and number of anti-Israeli slurs the final communiqué could accommodate (“Okay, you can have two ‘racist Zionist murderers’ but take out the ‘Jew bloodsuckers’”), the chancelleries of the West devoted most of their attention to calibrating the status of their emissaries, like a dowager duchess obsessing over placement at dinner. Though many nations were represented at the highest level—“Today the Presidents of Nigeria, Latvia, Togo, and Cuba, among others, addressed delegates on the topics of racism, slavery, and reparations,” reported The New York Times without giggling—some European nations downgraded their delegations to “mid-level.” Canada toyed with downgrading from “mid-level” to “low-level,” but eventually decided against it: the low-level diplomat is the gunboat of modern international relations and not to be deployed lightly.

The United States, for its part, downgraded from low-level to complete withdrawal, much to the regret of other delegates such as the Reverend Jesse Jackson, President-for-Life of the People’s Republic of Himself. According to Jesse and Co., only by remaining at the table can we “influence the debate.” Take the Syrian Foreign Minister, whose position is that the Holocaust is “a Jewish lie.” Okay, we can probably never get him to accept that six million Jews were murdered but, if we’d joined the Norwegian delegation in all-night negotiations, we might have been able to persuade Damascus to accept a compromise position acknowledging that, oh, eight or nine hundred may have died, mostly troublemakers who were asking for it. This would represent what the Rev. Jackson, the UN bigwigs, and Norway call “progress.”

Fortunately, it wasn’t all negative. The delegates came up with a delicious new term for advanced capitalism—“techno-racism”—and there was a rapturous ovation for Fidel Castro, who was introduced as the leader of “the most democratic country in the world.” Late on the Saturday, the UN’s Mary Robinson declared the Conference a grand success and everyone went home. Less than seventy-two hours later, three hijacked airliners were flown into the Pentagon and the World Trade Center.

The left were the first to draw the connection between the UN Conference and Ground Zero, even before the dust had settled. What happened, said various professional grievance-mongers, was a reaction to America’s decision to walk out in Durban. It then emerged that the nineteen hijackers—wealthy Arabs, mostly Saudi—had been planning their attack for years, while living openly in the United States and other Western societies.

But, of course, in broader terms the left is correct: Durban leads inevitably to the rubble of lower Manhattan. If we are as ashamed as we insist we are—of ourselves, our culture and our history—then inevitably we will invite our own destruction. If Western civilization is really something to apologize for, then surely the sooner all our cities are flattened the better it will be for the world. In that sense, aside from anything else, September 11, 2001 was a call to moral seriousness. We know now what is at stake.

Around the world, there are certain societies that function and those that don’t. The ones that work have certain things in common. The ones that don’t reject some or all of those features, disdaining explicitly the rule of law, property rights, free expression, and representative government, thereby additionally depressing economic activity, technological innovation, foreign investment, education, and the arts. For the people running these loser states, this grim soiled laundry list is necessary to maintain the regime. For those observing from the West, it is in the nature of multicultural man, generous to a fault, to regard these aberrations as just another “alternative lifestyle”—lesbianism, vegetarianism, totalitarianism, whatever. I recall the Duke of Edinburgh being lectured by some African President-for-Life or other, explaining how his country had learned from the mistakes of the British: having a lot of different parties simply meant you wasted too much time arguing with each other. Under his nation’s evolved form of democracy, lots of different views were still allowed, but now they were brought together within one party, which was much more effective. How seductive it sounds: for many starry-eyed Western progressives, the Afro-Marxist economic illiterate is the latterday version of Rousseau’s noble savage, his very inability to master the rudiments of competent public administration a triumphant confirmation of his cultural integrity. For the realpolitik crowd, it’s all you can expect from these countries. And, for a vast mass of others in between, Tennyson’s “Parliament of man” and “Federation of the world” are so worthy in and of themselves, such a restrained vision of an attainable Utopia, “the common sense of most,” that we are willing to pretend the Foreign Minister of Syria is no different from the Foreign Minister of Luxembourg or New Zealand. If this polite fiction was ever worth observing, it is so no longer.

The good news was that, in the immediate aftermath of September 11, it quickly became clear that there was no serious antiwar movement—just a few aging Ivy League slogan-parroters whose tired tropes failed to spark even on campus. Noam Chomsky will never be a threat to anyone because, when he warns that the Pentagon programs are being “implemented on the assumption that they may lead to the death of several million people in the next couple of weeks,” he’s being too self-evidently ridiculous for all but his most gullible patrons. Week after week, poll after poll showed that those opposed to the war numbered no more than 5 percent. True, they were overwhelmingly concentrated in our most prestigious institutions—Berkeley, National Public Radio—but the inability of the elites to rouse the masses to their tattered banner speaks well for Tennyson’s “common sense of most,” at least in the United States. Not for the first time, one appreciates the importance of the popular will as a brake on the inclinations of the elite.

This is one of the great strengths of … well, not democracy per se, but of our culture and its particular kind of political temperament. In the Middle East, the nominally pro-American sewer regimes sell themselves to the State Department on the basis that they’re a restraint on the loonier inclinations of their people. And there’s a measure of truth in that. Insofar as public opinion can be determined in the Arab world, it seems that the majority believe that it was Mossad that destroyed the World Trade Center. “Only the Jews are capable of planning such an incident,” Imam Muhammad al-Gamei’a said, “because it was planned with great precision of which Osama bin Laden or any other Islamic organization or intelligence apparatus is incapable.” This sounds like what the Great Satan’s grade-school teachers might call a self-esteem issue: we Muslims just aren’t smart enough to pull off a Jew stunt like this.

Not so, say 30 percent of Greeks. It was the American government that destroyed the World Trade Center, because they were looking for an excuse to invade Afghanistan! Of course! Washington was just waiting for an opportunity to seize a country uniquely blessed in supplies of premium-grade rubble.

The Greeks! The font of our civilization! Evidently, it’ll take a lot more than the British Museum returning Lord Elgin’s plunder to get them their marbles back. But we forget that Greece, like Spain and Portugal, was a dictatorship a quarter-century ago. When Americans look back on the Seventies, it means Jimmy Carter, the Partridge Family, flared pants. In Southern Europe, it means Franco, Salazar, and the Colonels. The exceptionalism of the American experience can be measured against not just Cuba, Rwanda, and Iraq, but also Greece and Austria and France. Continentals, for their part, are happy to acknowledge America as an aberration, though in their own snide way: if you talk to the French, for example, the fact that America had Prohibition seems far more bizarre to them than that Italy had Fascism; it’s apparently more irrational to have a Communist scare (America) than Communism (East Germany).

In 1999, Thomas Friedman of The New York Times unveiled his “Golden Arches” theory of history—that no two countries in which McDonald’s operated had ever gone to war with each other. A few weeks later, NATO began bombing Belgrade, though narrowly missing the local McDonald’s franchise. Friedman’s thesis is insufficient: today, Europeans eat Big Macs and listen to Britney Spears, just as a century ago Americans listened to Caruso and flocked to The Merry Widow. Or to put it more bluntly: even under the Taliban, the Afghans played cricket. But a common taste isn’t the same as a common culture, as Friedman should have known. Given the choice between its booming westernized economy and a blood-soaked wasteland, Yugoslavia eagerly chose the latter. The critical soundtrack is not Britney, but the deeper rhythms of the culture playing underneath.

The rest of the world is reluctant to acknowledge this. Mr. Mugabe says Britain is to blame for Zimbabwe. The Arabs say America is to blame for the Middle East. And Britain and America don’t disagree, not really. The Durban Syndrome—the vague sense that the West’s success must somehow be responsible for the rest’s failure—is a far slyer virus than the toxic effusions of the Chomsky-Sontag set, and it has seeped far deeper into the cultural bloodstream.


At its most benign, Durban Syndrome manifests itself in a desire not to offend others if one can offend one’s own instead. We saw this after September 11 in the incessant exhortations from government, public service announcements, the nation’s pastors and vicars, etc., that the American people should resist their natural appetite for pogroms and refrain from brutalizing Muslims. Ninety-nine-point-nine-nine-nine percent of Americans had no intention of brutalizing Muslims but they were sporting enough to put up with being characterized as a bunch of knuckledragging swamp-dwellers, understanding that diversity means not just being sensitive to other peoples but also not being too sensitive about yourself. Similarly, at airports across the continent, eighty-seven-year-old grannies waited patiently as their hairpins were confiscated and their bloomers emptied out on the conveyor belt, implicitly accepting this as a ritual of the multicultural society: to demonstrate that we eschew “racial profiling,” we go out of our way to look for people who don’t look anything like the people we’re looking for.

This is what we’re fighting for—the right not to tolerate any intolerance of our tolerance. As Keith Windschuttle pointed out last month in these pages, Silvio Berlusconi, the almost implausibly pro-American media magnate currently serving as Italy’s prime minister, wandered deplorably off-message when he suggested that “we must be aware of the superiority of our civilization, a system that has guaranteed well-being, respect for human rights and—in contrast with Islamic countries—respect for religious and political rights, a system that has as its values understandings of diversity and tolerance.” Poor old Berlusconi found himself scorned as a pariah by his chums in the EU, though no one seemed able to single out anything in his statement that was not, in fact, true. Nevertheless, the Belgian Prime Minister apologized profusely to Islam and promised to send the guy for sensitivity training. Despite including the multiculti buzz-words of “diversity” and “tolerance,” Berlusconi had made the mistake of assuming said diversity and tolerance derived in some way from our Western heritage. Best to keep it simple: our tolerance derives from our diversity, our diversity from our tolerance, that is all ye know and all ye need to know. And so Her Majesty’s viceroy, the Rt. Hon. Adrienne Clarkson, Governor-General and Commander-in-Chief of Canada, stood on the deck of HMCS Preserver a few weeks after September 11 and told her forces, as they set sail for the Gulf, that they were fighting not for queen and country—perish the thought—but for “tolerance” and “multiculturalism.”

As it turns out, even these virtues may be somewhat vieux chapeau. Helena Kennedy is a leftwing thinker, celebrity lawyer, and Chair of the British Council who goes under the stage name Baroness Kennedy of the Shaws, QC. Promoting a forthcoming lecture called “Cultural Conundrums in the Brave New World,” Lady Kennedy opined that it was too easy to go on about “Islamic fundamentalists.” “What I think happens very readily,” she said, “is that we as Western liberals too often are fundamentalist ourselves. We don’t look at our own fundamentalisms.” Her interviewer asked what exactly she meant by Western liberal fundamentalism. “One of the things that we are too ready to insist upon,” replied Lady Kennedy, “is that we are the tolerant people and that the intolerance is something that belongs to other countries like Islam. And I’m not sure that’s true.”

If Lady Kennedy is saying that we’re too tolerant of our own tolerance, she might be on to something: there is something a little preposterous about a world in which the Western politician’s first reaction to the slaughter of thousands by Islamic fundamentalists is to buzz his secretary and demand she find him a mosque to visit so he can demonstrate his respect for Islam. But in fact Lady Kennedy seems to be saying that our tolerance of our own claims to tolerance is making us intolerant of other people’s intolerance: when we moan about Islamic fundamentalism, we forget how offensive our own fundamentalisms—votes, human rights, drivers’ licenses for women—must be to others. The baroness says we’re too tolerant of ourselves, and she’ll tolerate anything but that.

The British Council was founded to promote British culture around the world, but that was then and this is now. Today, explained Lady Kennedy, it’s all about “mutuality—an exchange without anybody saying which set of values is more important.” In the multicultural West, our values are that we have no values: we accord all values equal value—the wittering English feminist concerned that her tolerance is implicitly intolerant or the Sudanese wife-beater and compulsory clitorectomy scheduler. That’s why Lady Kennedy likes “globalization.” It gives her “warm feelings that are about sharing,” she said. “I feel often when I meet with women, you know, from other parts of the world a shared sense of, somehow an experience of … of … womanhood.”

Baroness Kennedy has got her priorities right. I am woman, hear me roar! Say it loud, I’m black and proud! We’re here, we’re queer, get used to it! The one identity we’re not encouraged to trumpet is the one that enables us to trumpet all the others: our identity as citizens of a very particular kind of society, built on the rule of law, property rights, freedom of expression, and the universal franchise. I am Western, hear me apologize! Say it loud, I’m Canadian and cowed! We’re Brits, we’re shits, awf’lly sorry about that!

As an example of how formalized this routine now is, consider the Congressional Resolution that came before the House of Representatives a couple of months after 9/11. On the face of it, it was an unexceptional measure to inaugurate “Native American Month.” But, along with various unobjectionable platitudes, it included the observation that “Native American governments developed the fundamental principles of freedom of speech and separation of powers in government, and these principles form the foundation of the United States Government today.”

This is a reference to the Iroquois Confederation, which, according to the multiculturalists, America’s Founding Fathers used as the blueprint for the U.S. Constitution. It’s not just that the white man stole the red man’s land but his very system of government, too. To call this a revisionist theory would be to credit it with a degree of scholarship. There’s not the slightest evidence that consideration of how any particular group of Indians governed themselves had any bearing on how the American colonists wished to govern themselves—nor, come to that, that the Indians in question even formulated for themselves any such “separation of powers.” The principal Indian contribution to the founding of the American Republic was, sad to say, getting whupped in the French and Indian Wars. The New England colonists who fought under Lord Amherst stood in Montreal on September 8, 1760 and watched the French flag come down the flagpole and half a continent change hands; they returned to their towns with a new sense of themselves and their potential. In Japan after the war, the Americans were insistent that Emperor Hirohito cut out all that nutty stuff about being a direct descendant of the sun goddess. But it’s no more irrational than our determination to pretend that a group of Anglo-Celtic settlers who’d been governing themselves for a century deliberately chose to adopt an Iroquois constitution.

The republic’s founders were, I’m afraid, British subjects animated by certain eighteenth-century English theories about liberty, themselves deriving from the principles of common law and Magna Carta. It is not “Eurocentric” to make such an obvious point. Indeed, “Europe” was noticeably antipathetic to these ideas and in many ways still is. That’s why, while America still has only the same yellowing parchment it started out with two centuries ago, the continent has lurched through its Third Reichs and Fourth Republics and wholesale constitutional rewrites every generation. The U.S. Constitution is not only older than the French, German, Italian, Belgian, Greek, and Spanish constitutions, it’s older than all of them put together. The ideas of a relatively small group of Englishmen on the rule of law and responsible government have been responsible for centuries of sustained peaceful constitutional evolution in America, Britain, Canada, Australia, New Zealand, India, Barbados, Mauritius. True, of those Englishmen, America got the exceptional talents, as is clear from a casual comparison of The Federalist Papers and an equivalent opus called Canada’s Founding Debates. But generally, around the world, the likelihood of living your life unmolested by the arbitrary cruelties of government is inversely proportional to how far the state departs from Anglo-American theories of liberty.

This is not an argument about “Englishness,” whatever that is. I doubt whether more than one in a thousand Americans could name England’s “national day”—St. George’s Day is unobserved even in dear old Blighty—and certainly no New Yorker has ever seen an English parade wending its way up Fifth Avenue in national dress. English nationalism is very nearly an oxymoron (at least anywhere outside the soccer terraces): the English have traditionally preferred to submerge their own ethnic identity within larger forces, inventing Britishness as a means to make the Scots, Irish, and Welsh feel more at home—and eventually the Indians, Basutos, and Fijians, too. By the late 1940s, a quarter of the world’s population were “British subjects” and, formally, no distinction was made between a Londoner and a Malay. “Civis Britannicus sum,” as Lee Kuan Yew, the former Prime Minister of Singapore, likes to say, not entirely tongue in cheek. American and British nationality were the first non-ethnic citizenships of the modern world, a concept other countries have adopted only recently and reluctantly: until a few years ago, third-generation Turkish workers in Germany were unable to become citizens.

Every revolution—France’s, Russia’s, even Iran’s—claims to be exportable, but some values seem more universally applicable than others. That’s why the framework that the Founding Fathers devised to unite a baker’s dozen of small homogeneous colonies on the Atlantic coast proved strong enough to expand across a continent and halfway round the globe to Hawaii. That’s why the British have successfully exported Westminster constitutions to Belize, Papua New Guinea, and India, the world’s largest democracy, mainly Hindu but with a minority population of 150 million Muslims (that’s some minority) who to their credit have no interest in the fetid swamp of militant Islamism in which so many of their co-religionists elsewhere are festering. Of the world’s fifty most free nations, half were once ruled by Britain. That’s the sort of thing most countries would boast about, not teach in schools as a shameful legacy of oppression.

That said, this is not an argument in favor of the “Anglosphere,” a concept much promoted by John O’Sullivan and others but which these days is confined mainly to the battlefield. The “Partnership of Nations,” as America and its allies were billed in Afghanistan, consisted on the ground of the United States plus the British, Australian, and New Zealand SAS (special forces commandos) and somewhere a little further back Canada’s JTF2 (being semi-French, Canada is a semi-detached member of the Anglosphere). All these states are British-derived and, on the face of it, suggest a working version of Winston Churchill’s dream of a grand reconciliation between the United States and the British Empire in some new configuration. But these days what these countries share is a common culture that, officially, recoils from the idea that they have a common culture. We’re multiculturalists now, and the salient point about multiculturalism is that it’s a unicultural phenomenon, existing almost entirely in the Anglo-American world.

Young Britons, we’re told by Tony Blair and the other Europhiles, now think of themselves as European—they eat pasta, they drink Perrier, they like nothing better than to curl up with a good EU harmonization directive on the permitted curvature of bananas, they wear regulation Euro-condoms, etc.

Similarly, Australians, according to their new orthodoxy, think of themselves as Asians. This was the essence of the republican case in the 1999 referendum on the monarchy: it was inappropriate to have an English queen presiding over a country with so many Vietnamese restaurants. As it transpired, not all Australians were up to speed on the new orthodoxy and on referendum day Her Majesty won handily. Australians, the republicans assured us, wanted an elected head of state. Now they’ve got one. To paraphrase Tony Blair, she is the people’s queen now.

Canadians, meanwhile, think of themselves as … well, they’ve yet to come up with a word for it, but it sure as hell isn’t “British” or “American.” In the last thirty years, no other country has worked so hard to upturn the realities of both history and geography.

And in America, Congress now wants us to believe that the U.S. Constitution is based on that of the Iroquois Confederation, but with “Iroquois” whited-out and “United States of America” scribbled on top by Ben Franklin. The Congressional Resolution is an unusually literal confirmation of a general rule: these days the surest sign that you share the Britannic inheritance is your willingness to reject it.

Since September 11, “the survival of culture” has been not just a theoretical proposition. Though the demolition of the Twin Towers was seen as an assault on Western modernity—on skyscrapers, Wall Street, jet travel—Islamofascism is no friend to Eastern antiquity, either. A few months earlier, the Taliban cheerfully blew to smithereens the great stone Buddhas that had loomed over some of the world’s earliest trade routes for two millennia. Multiculturalism was invented to make amends for “cultural imperialism,” for the idea that the West in taking its ideas to the world had somehow obliterated all the other cultures out there. But this couldn’t be further from the truth.

Relaxed about its own middlebrow contributions—Wodehouse, Gilbert & Sullivan—English culture has been fascinated with the glories of the past. No other civilization has gone to such extraordinary lengths to reconstruct its predecessors—sending out professors to scramble in the dust for the pharaohs’ tombs; reconstructing baroque instruments so that the music of the period can be heard as it sounded at the time; building an amphitheater in the English countryside so that schoolchildren can enjoy Euripides and Sophocles—in the original Greek. Much of what survives from the ancient world would have been lost forever without our respect for the past as the building blocks of our own culture. As it happens, willful cultural vandalism is more characteristic of the new age: crucifixes in urine, dung-smeared madonnas, yawn, yawn, been there, done that. Had some waggish Taliban scooped up the rubble from those Buddhas and entered it for Britain’s Turner Prize, I’ve no doubt it would have won and been hailed by the judges for the way it wittily deconstructs traditional notions of religious iconography.

And, while it’s not as dramatic as blowing statues sky-high, when Congress is willing to collude in a fiction about the foundation of America’s institutions, it’s embarking on the same process, removing some of those building blocks from our civilization. Once begun, selective demolition is hard to control. Until relatively recently in Canada, many natives went to “residential schools” run by the Christian churches on behalf of the federal government. They learned the same things children learned in other schools: there was a map on the wall showing a quarter of the globe colored red for the Queen-Empress’s realms; there was Shakespeare and Robert Louis Stevenson, and “Dr. Livingstone, I presume”; there was not a lot about the Iroquois Confederation. No doubt, as in any other school system, there were a number of randy teachers and sadistic brutes.

In the Nineties, a few middle-aged alumni came forward to claim they’d been “abused” while at the residential schools. How did the churches react? Here is Archbishop Michael Peers, the Anglican Primate of Canada, making his first public statement on the matter in 1993: “I am sorry, more sorry than I can say,” he said, “that in our schools so many were abused physically, sexually, culturally, emotionally.”

At that point, there was not one whit of evidence that there was any widespread, systemic physical or sexual abuse in the residential schools. There is still none. But His Grace had lapsed reflexively into a tone that will be all too familiar to anybody who’s attended an Anglican service anywhere outside of Africa or the Pacific isles in the last thirty years. In the Sixties, “Peter Simple,” the great satirist whose work appears in The Daily Telegraph, invented a character called Dr. Spacely-Trellis, the “go-ahead Bishop of Bevindon,” whose every sermon on the social issues of the day reached a climax with the words, “We are all guilty!” Riddled with self-doubt and an enthusiastic pioneer of the peculiar masochism that now afflicts the West, the Anglican Church has for years enjoyed the strange frisson of moral superiority that comes from blanket advertising of one’s own failures. It was surely only a matter of time before some litigious types took them at their own estimation.

So, in the wake of Archbishop Peers’s sweeping declaration of his own guilt, more victims spoke up—dozens, hundreds, totaling eventually some 15,000 “survivors” with some 5,000 claims of damages. Though none has yet been tested in a court of law, by 1999 the costs merely of responding to the charges were threatening to bankrupt not just several Protestant and Catholic dioceses but the entirety of both churches throughout Canada. Yet still the clergymen felt it would be bad form to defend themselves. A United Church of Canada employee, John Siebert, spent six years researching the history of residential schools and their impact on native culture and pointed out several helpful facts: Native children were not forced to abandon their own beliefs and become Christians; in 1871, before the first residential school ever opened, 96 percent of Canada’s Indians identified themselves as either Anglican or Catholic. When, over the years, the Federal Government and the churches wanted to close residential schools, it was the Indian bands (the tribal councils) that wanted to keep them open. … ah, but there’s no point even going on. The defendants weren’t looking for a defence, only a way to plea-bargain themselves into oblivion.

So Mr. Siebert’s former employers at the UCC wrote to the papers, indignantly dissociating themselves from his position, facts notwithstanding: “It is the position of the United Church that the national residential schools system was an integral part of a national policy intended to assimilate First Nations people into the dominant Euro-Canadian culture,” they said. “There are simply too many stories of the pain and cultural loss experienced by survivors of the residential schools system to conclude that this policy and its expression in the residential schools system represents anything but a profound failure in the history of the relationship between First Nations and non-First Nations peoples.” With defendants like this, who needs plaintiffs? The Canadian Government, a co-defendant, prepared for an optimistically-priced out-of-court settlement of some $2 billion, split between 15,000 “survivors” of “crimes” never recognized by any court.

Nonetheless, “pain and cultural loss” are categories worth separating. Is it possible even the horniest vicars could sodomize 15,000 kids? Well, no. Ninety percent of claims are for the vaguer offence of “cultural genocide,” a crime we’ll be hearing a lot more of in the future. “Cultural genocide” is similar to traditional forms of genocide—such as being herded into ovens or hacked to pieces with machetes—but with the happy benefit, from the plaintiffs’ point of view, that you personally don’t have to be killed in order to have a case. All you need are blurry accusations, historical resentments, and a hefty dose of false-memory syndrome. Against craven clerics like the Anglican Church, that’s more than enough.

“Follow me,” said Christ, “and I will make you fishers of men.” But that’s a tad strong, isn’t it? Much easier to concede that, yes, doing the Lord’s work for over a century is, indeed, cultural genocide, and the only question is whether our victims will accept a postdated check. Residential schools did not forcibly “convert” their charges, most of whom had been Christians for several generations. There have been Christian Indians on this continent for four centuries. The first complete Bible published in North America was a 1663 translation into Algonquian, for the Indians around Boston and Roxbury. In a theological sense, the essentially temporal expression of Christian spirituality has for the most part been complementary to rather than a displacement of the fundamentally spatial nature of Indian spirituality. The Iroquois may not have drafted the U.S. Constitution but they could easily have written the King James Bible. But Archbishop Peers and his chums not only have no interest in being fishers of men, they’re desperate to unhook and throw back many of those reeled in centuries ago, preferring to abandon them to the secular hell of identity-group welfare.

Onward, Christian soldiers, retreating into oblivion. As for “cultural genocide,” if there’s any going on these days, it’s the genocide of the Britannic inheritance—in North America, in the Antipodes, in Blair’s Britain. A couple of generations back, governments thought they were doing native children a favor by teaching them the English language, the principles of common law and Magna Carta, and the great sweep of imperial history—that by doing so they were bringing young Indians and Inuit “within the circle of civilized conditions,” enabling them to become full participants in the modern state. It’s only half a century ago, but that’s one memory the Government of Canada will never recover.

No civilized society legislates retrospectively: if you pass a seatbelt law in 1990, you don’t prosecute people who were driving without them in 1980. Likewise, we should not sue the past for noncompliance with the orthodoxies of the present. We are the accumulations of our past, in its wisdom and folly, and to repudiate it is a totalitarian act, never more explicitly captured than in Pol Pot’s proclamation of “Year Zero.” The reason why the American Revolution succeeded and the French, Russian, and almost all others failed is precisely because it resisted the “Year Zero” approach. Today, though, almost every Anglo-American institution instinctively opts for Pol Pot Lite. The Anglican Primate of Canada is not saying he personally has done anything wrong, but is merely casually defaming the memory of generations of his predecessors. Tony Blair apologizes for the Irish potato famine, thereby slandering his predecessors, many of whom from Queen Victoria down acted with great compassion and generosity. And, as we know, Bill Clinton has for years been too busy apologizing for the sins of his predecessors to apologize for any of his own: “I cannot tell a lie. My slave-owning predecessor George Washington did cut down that cherry tree.”

And once their authentic culture has been discredited, the Anglosphere governments are happy to spend time and money inventing an entirely different one. A decade ago, to counterbalance its membership of the (British) Commonwealth, Canada prevailed upon France to set up a French version of the organization. The Quai d’Orsay, for its usual devious reasons, was happy to oblige. The “Francophonie” does everything the Commonwealth does but in French. Or rather it would do it in French if enough of its members knew how to speak it. The Commonwealth is made up of former colonies of Britain, bound by ties of language, legal system, or, if all else fails, cricket. But Paris gave the old globe a spin and decided it didn’t have enough former colonies, so the Francophonie was expanded to include nations that had a vague, unspecified connection with French. And not just nations. The Francophonie’s members include Poland, the Czech Republic, Louisiana (really), and for all I know the express croissant shop at the Rockefeller Center subway station.

I can’t speak for Louisiana, but otherwise many of the government leaders are of the genocidal strain. Real genocide, that is, not the “cultural” variety. I’m sure President Mutilata of Nogo enjoys his trips to Francophonie summits, even if the Champs-Élysées Wal-Mart has a disappointing range of machetes. But it’s hard to see what Canada gets out of it. Discussions on the perennial problem of human rights proceed according to ritual. The Swiss suggest holding an open press conference at which the demonstrators could make their case. Burundi then files a countermotion suggesting the troublemakers should be clubbed and disembowelled and their genitals hung out on the balcony railings. Then, just when the summit seems hopelessly split from top to toe like a Tutsi villager, the Canadians draft an ingenious compromise proposing that they all leave quietly by the back door and return to their hotels for a quick shower and cocktail before the Acadian cultural gala. “Once again we see Canada acting as a force for good in its traditional role as honest broker,” says the prime minister, and everybody goes home.

Comparing Quebec and Mauritius with Paris-administered territories such as Chad and Mali, you could make a compelling case that the best and least brutal guarantor of the French language and culture on the planet has been the British Crown. But the argument hardly seems worth the effort. The real question is why Canada feels it has to go to such expensive lengths as inventing a completely bogus international body to demonstrate its multicultural bona fides.
For the British, the European Union is scarcely less perverse. One day before September 11, the playwright Harold Pinter gave a speech at the University of Florence. After the usual stuff about America deliberately bombing civilian targets in Yugoslavia and rejecting Kyoto and the International Criminal Court and being “an authentic rogue state” and “the most dangerous power the world has ever known” and “a brutal and malignant world machine” that has to be “resisted,” the great man tossed in a witticism that delighted his audience. A propos Mr. Bush’s frequent references to “freedom-loving people,” Pinter remarked, “I must say I would be fascinated to meet a freedom-hating people.”

Actually, it’s not that difficult. The chaps who run the EU, if not actively freedom-hating, certainly regard it as a frightful nuisance. The principle underpinning the EU is not “We, the people” but “We know better than the people”—not just on capital punishment and the single currency, but on pretty much anything that comes up. Not so long ago, Jean-Pierre Chevènement, then France’s Defence Minister, insisted that the United States was dedicated to “the organized cretinization of our people.” As a dismissal of American pop culture—Disney, MTV—this statement is not without its appeal, though it sounds better if you’ve never had the misfortune to sit through a weekend of continental television. But the reality is that no one is as dedicated to the proposition that the people are cretins as M. Chevènement and the panjandrums of the new “Europe.” The EU is organized on this assumption. If, like the Danes and now the Irish, they’re impertinent enough to tick the wrong box in referenda on deeper European integration, we’ll just keep re-asking and re-re-asking the question until they get it right. It’s not really about left or right in the sense of political alternatives so much as a permanent European governing class with very tight rules of admission.

After September 11, Tony Blair declared he had a twin-track commitment—to Mr. Bush’s war on terrorism and to deeper integration within the EU. But the two are irreconcilable. Had Osama bin Laden waited till 2005 to attack New York and Washington, Mr. Blair could not have dispatched his task force: he would be bound into a common European foreign and defence policy. Would the EU offer frontline troops to Mr. Bush? In the Gulf War, Belgium refused even to fulfill its pre-existing contracts to sell the British ammunition because they didn’t wish to fall out with Saddam. There is a level of basic incompatibility here which the British persist in deluding themselves about. To London’s Europhiles, Britain is obviously “part of” Europe. But, in the age of jet travel, cellphones, wire transfers, and the internet, we’re less bound by physical proximity than ever. Yet Britain for the first time in history has chosen to be imprisoned by geography and to disconnect itself from its culture.

If the hope is that small-time African losers like Robert Mugabe will stop beating up on Britain, I fear Mr. Blair is likely to be disappointed. As long as we are ashamed of ourselves, there’ll always be something to apologize for. Touring Africa in 1998, Bill Clinton began by saying sorry for America’s role in the Cold War. In what was described by aides as “a spur of the moment rumination,” he said

Very often we dealt with countries in Africa and other parts of the world based more on how they stood in the struggle between the United States and the Soviet Union than how they stood in the struggle for their own people’s aspirations to live up to the fullest of their God-given abilities.

Is that really what we did wrong in Africa? Isn’t it the case that the only thing the West has to apologize for in the Cold War is that it was too indulgent of Kwame Nkrumah, Julius Nyerere, and postcolonial Africa’s other founding frauds and simply stood by as they beggared the continent with their uniquely virulent strain of tribal Marxism?

But, of course, on the president’s tour or at the Durban Conference, that’s the one item that’s never on the agenda. If Cold Warriors and old-school imperialists and Anglican missionaries were to take a leaf out of Mr. Clinton’s book and apologize for things they had nothing to do with, they might try this line: We apologize not for inflicting Western values but for inflicting all the anti-Western values, all of which, paradoxically, are also Western. Until Islamic fundamentalism came along, all the noisiest anti-Western ideologies were developed in the West, again mostly in Anglo-America. (Capitalism itself, as the French Eurocrats used to lecture Mrs. Thatcher, is “an Anglo-Saxon fetish.”) Whether you’re a liberal democracy or a moribund dictatorship, you’re operating to a Western template. Ever since Karl Marx sat in the Reading Room of the British Library writing Das Kapital, great Western thinkers have been obsessed with discovering the flaw in capitalism, a kind of negative Holy Grail for the knights of progressivism. For Marx, capitalism functioned only by exploiting the proletariat. But the proletariat got richer and bought homes in the suburbs. So the next generation of Marxists turned their attention to “colonialism”: capitalism functioned only by looting the West’s imperial possessions. But the West decolonized in the Fifties and Sixties, and it didn’t get any poorer, only the colonies did. So the Marxists invented “neo-colonialism”: capitalism functioned by informally exploiting the nominally independent developing world. But the dramatically differing rates at which developing economies developed in Asia, Africa, and Latin America seemed to have little to do with external forces and a lot more to do with obvious local factors.

By the time the UN met at Durban, the grievance-mongers were down to slavery: Europe and America had built their wealth on the slave trade. By this theory, the United Kingdom, which was first to abolish slavery—in the British Isles in 1772 and throughout the Empire in 1833—ought to be an economic basket case, while the Sudan, Mali, Niger, Sierra Leone, Ghana, and the Ivory Coast, to name just a few of the countries in which slavery is currently practiced, ought to be rolling in dough. Instead, of course, large parts of the post-colonial world are more impoverished than they’ve ever been. Fifty years ago, Uganda was a net food exporter. Today, it can’t feed itself. The average Egyptian earns less now than he did when the British left—that’s not adjusted for inflation, but in real hard pounds.

Jesse Jackson and John Conyers have no interest in suing Mali for slavery reparations, because those poor chumps have a per-capita income of six hundred bucks. Like all the rest of the West’s anti-Western theorists, they don’t dispute that capitalism works but only why it works. To those of us of a less pathological bent, it seems obvious that, rather than “exploiting” people, it invites citizens to exploit their own potential. Some will develop computer software and become billionaires (Bill Gates). Some will make a nice living as professional race-baiters and corporate shakedown artists (the Rev. Jackson). Some will clean up cranking out ridiculous theses for lucrative niche markets (Noam Chomsky), all the while bemoaning the system that keeps them in the style to which they have become accustomed. Only our society generates enough cash to fund such a wide range of fatuities. Even Osama bin Laden owes his wealth to U.S. investment in Saudi Arabia. That the son of a spectacularly rich building contractor should have wound up literally digging himself his own personal hole in the ground in the Hindu Kush is in its way a poignant emblem of the Middle East’s perverse misunderstanding of modernity.

This then is the paradox of the most successful culture in history: the “Anglo-Saxon fetish” and its attendant liberties have enabled more people to live their lives in freedom, health, and material comfort. Yet at the same time no other culture works so hard to deny its achievements and its heritage, to insist there must be a catch, there’s gotta be an alternative. There isn’t. The Anglophone culture has succeeded because it is in its way an anti-culture—a culture of individual rights, not collective rights or group rights (which the Euro-left have always been partial to) or identity politics (in which form collective rights have made a critical beachhead here). A “Federation of the World” is a nice fancy, but it will inevitably be a fretful, Durbanized, Jessefied world:

Knowledge comes, but wisdom lingers, and
     I linger on the shore,
And the individual withers, and the world is
     more and more.

Knowledge comes, but wisdom lingers, and
     he bears a laden breast,
Full of sad experience, moving toward the
     stillness of his rest.


This article originally appeared in The New Criterion, Volume 20 Number 6, on page 4
Copyright © 2020 The New Criterion | www.newcriterion.com
https://newcriterion.com//issues/2002/2/the-slyer-virus-the-wests-anti-westernism