In Fuguing Wake

    (Amy Clampitt, for you)

    Casual. Flitting skin off cucumbers over
    a wide metal bowl, catcher for the harvest
    from the summer market, that cosmos
    of virtue — jug bands, frailed banjos, picked
    mandolins, tomatoes as a toddler sees them,
    mitts of stars — I feel a comet trail of shiver
    where my sternum is, that bone high-school
    posters flat out to a Roman sword, but
    dug up horrors prove as dropping bombs.

    I know it’s nothing serious, which worries.
    I’m known to be pulled along and flipped
    in fuguing wake. It’s good this happened at the sink,
    at evening, with time before the white flock orchestra
    descends the grass drifts through the shrub maze
    of picknickers and their pickneys on Tanglewood’s
    lawn. Shed bound! Music in hand and jaw muscles;
    the old estate spellcast as something sacral near
    Kripalu’s mount where stone hips cleave.

    I’ll be there in the back with maples oaks or birches,
    X-legged with Txakolina for as long as I can bear,
    cut jeans on the blue quilt gifted by a poet from Oaxaca
    to embrace my then small children when I wasn’t
    home and here I am not home again, months here
    in this cottage in this rumple-range terrain,
    summer homestead of phantasmal ease: Shakespeare
    by starlight, Martha Graham’s wraithings,
    mill blood museums, high-priced downward dogs,

    and all year, inns in mansions, British-eyed
    manors, valleys with asemic oratory
    patterned with what private cars elided
    from Manhattan — big gauge rail.
    Once, I swashed about a bend as if in winter
    on a sled, found origin: a high rock face
    bird-ferned, no kerb just bushing then
    road so thin my silver comet must have
    singed that low stone fence stacked

    with logic like dry wood, marker for the sheer
    fall to an olive river chilling twenty feet
    below, just lazy, just shiftless, not wanting to
    be Hudson or Mackenzie, Zambezi, Ob,
    Huang He, Purus or Nile, unambitious but
    with hint in watermark it often spites, cussing
    sotto voce like my petty Rio Grande does
    through what we scale-exempt Jamaicans call a gorge.
    Look good in the bushing — the bevel cut for rail.

    This flesh in its not-green greenness
    comets me in slow arc to a beach in Anguilla,
    a place rolled and sprinkled with hotels,
    empire’s latest quest. Cane nor cotton
    rooted here despite keratoconic oversight:
    Oh the whipped when healed have keloids,
    so the catting doesn’t work so well if those
    who love them don’t know braille.
    Bless fingerprints (we tell our food by skin).

    In this eel island’s shallows, the skin of recollection
    peels, everts leisure, flips the calendar of ease.
    Winter is Antillean frolic season. Here, August blisters
    quick-quick in air too damn sun-strong. The combers
    by the like-it’s-Eid-in-the-Sahara-by-religion-closed hotel
    obscene. My swim wake churns the clear-clear
    surf. To translucence it is spoiled. Who dreamed
    this, this shiver of villas, prompts of fins …
    this pastiche of lateen sails? Good stranding here. Alone.

    It feels ablative. Is there such a word?
    I’m shy to pare my clothes. Though I understand
    this multilingual archipelago as accident …
    tectonics, coral … science … moosh and voof;
    though I know plumb flight from here to
    my own island’s all-inclusive-cuffed north coast
    is 900 mash-mash miles, I feel indulgent,
    touristic, bathing naked here in Windsor Castle’s sea.
    Is it politics? Asemics? Ghost marks on my skin?

    There’s heat etch on my corneas as I drift
    purse eyed, slow turn in this gourd flesh tinted
    sea as cool as — each cliché is intaglio
    and there’s always room and need to shave —
    as cool as … as cool as, as cool as one can be
    while getting palmed by that translucence one rasps,
    is frilling longwise into waves collapsing in a
    wide bowl in the Berkshires, basin-filled hill province
    of the baronesque, of country folk and writers:

    Hawthorne, Wharton, Melville, Du Bois.
    In this unremodeled kitchen cooked
    a modest poet of indecent skill, Clampitt
    who bloomed late and in profuse
    language, her labyrinths from this simple
    home breath-hitching as the gardens
    Beatrix Jones Farrand baroqued for
    Wharton at The Mount.

    For now and months to come I’m Berkshire too.
    I’m bass lake, escarpment stroll, bay horses
    grazing long necked, red barn on the rise;
    as well the grandfathered split-rail fence,
    facets glinting, the village with the lancing church,
    the felt common. I’m the hulled town
    streetpill-glumed; the plein air philharmonic,
    the market bounty-gemmed, the—
    did a fox skurch by my window or a cat?

    On a hike a black dog rushed me while
    its owners stayed off pace and I knew
    I’d kick it if it lunged and it did and I did not.
    A statement is a story, a salad of precisely
    mandolined ingredients, dressed up;
    and what if I were celeried, mustarded,
    mayonnaised? Things I do not like.
    At Tanglewood, mosquitoes will chorus,
    picknickers will gnat and I will camp

    on my alfombra, the story of me here
    saladed, tremble like a kite on sound’s
    current, belly in halfing light to galactic fireflies,
    the cosmos, way far up in the stuttering
    consolations, swoon, the shake at breastbone
    in the kitchen the work of self-yeast
    as much as sparked comet, a rise to infinite
    in gratitude, forgiveness, in terror, in faith,
    high as a cello in the red realm. Liszt.

    The Rise of Decline

    Encompassed with domestic conspiracy, military sedition, and
    civil war, they trembled on the edge of precipices in which, after
    a longer or shorter term of anxiety, they were inevitably lost.

    EDWARD GIBBON, THE DECLINE AND FALL OF THE ROMAN EMPIRE

    In a widely noted coincidence, the Declaration of Independence and Adam Smith’s The Wealth of Nations were both documents of 1776. In a less well-known accident of fate, the Declaration of Independence dovetailed with the publication of the first volume of Edward Gibbon’s The Decline and Fall of the Roman Empire, yet another monument of Enlightenment thought, though not one that created a new nation or established a new method of understanding political economy. Gibbon’s inquiry into the history of an ancient empire passed quickly over its rise. He focused instead on the centuries of decline preceding the fall. These were contrasting endeavors: Jefferson’s excitement about the rights and the liberties of a newborn republic and the learned British historian’s inquiry into the logic of decline in the most paradigmatic of empires. Gibbon’s autumnal epic was peripheral to the creation of a nation. The proper companion for the birth of the American republic, which was destined to be prosperous, was a philosophy designed to unlock the wealth of nations. For a long time, that was the obvious conclusion to draw. Like it or not, Adam Smith, the Scottish economist and philosopher, was a patron saint of the American republic.

    Nearly two and a half centuries later, the coincidences of 1776 strike us differently. Even if Gibbon is not being revisited in American intellectual life, his preoccupation with decline is now ubiquitous, a settled thesis in search of fresh evidence. Donald Trump may be the face of American decline, not just the president for four years but the monster waiting to finish the work of destruction that he began. Joe Biden may be the face of American decline, America’s oldest president when he was elected and clearly not the FDR he was made out to be after the election. Decline is now etched into the American landscape — its poverty, its inequality, its racism, its weakening manufacturing base, its debts, its environmental crises, its rusting bridges and pockmarked roads, its moneyed anomie, its decadent elites, its unending polarization, its despair, its surliness. And decline is etched into American foreign policy, too, most vividly in the images of the ignominious retreat from Kabul, of partners and innocents abandoned, of theocrats and terrorists resurgent in the age of American flight.

    When will the barbarians arrive at the gates? Or are those of us within the gates the barbarians? When will the once-mighty city fall? And what will the ruins look like after the fall? These are the questions of the hour, and they are surging across the political spectrum.

    Before addressing these questions, let us take a closer look at Gibbon and the subsequent career of the declinism that he pioneered. Enduring and worthwhile as the decline of empires is for historians, it is an elusive subject. An aesthetic of decline can certainly be captured — the Goths rampaging through the streets of Rome, Constantinople besieged and then erased by Ottoman armies, Tsar Nicholas II and his family executed outside of Yekaterinburg — and made into the whole of the story. But it is not the whole of the story. It is immensely difficult to explain imperial decline or to disentangle its constituent elements, which usually include individual mistakes, financial ruin, military defeat, intellectual confusion, and the loss of legitimacy. No decline is ever preordained. More carefully considered, the fall of empires can appear accidental. Nicholas II was not obligated to blunder into World War I; many of his ministers tried to argue him out of the mobilization that brought Germany into the war; the celebration of the three-hundredth anniversary of Romanov rule in 1913 was enormous and heartfelt and implied a degree of popular backing that could have enabled the monarchy’s survival for decades to come. Until it happened in 1917, Lenin did not expect to see revolution in his lifetime. Such are the contingencies that complicate every generalization that can be drawn from the annals of failing powers. Decline is not inevitable. It is a consequence of human choice.

    Gibbon’s history begins with the Roman empire at its apex. Its splendor in the second century CE was manifold: Roman law, Roman military superiority, and Roman engineering, but also an intellectual culture of balance and beauty. For Gibbon, religious toleration was a precondition for Rome’s flourishing. It produced “not only mutual indulgence but even religious concord.” Religion and philosophy worked hand in hand: “in their writing and conversation the philosophers of antiquity asserted the independent dignity of reason; but they resigned their actions to the commands of law and custom.” A capital of religion, philosophy, and politics, the city of Rome was “the common temple of her subjects,” the heart of a “polite and powerful empire” and in its architecture attuned to the “liberal spirit of public magnificence.” Rome united the rulers and the ruled. Gibbon attributes a literary cast to Roman political order — “the love of letters, almost inseparable from peace and refinement.” The second century CE was — and Gibbon generalizes here about all ascendant empires — “an age of science and history.” (By history, he meant that empires in their golden age produce first-rate historians.) Finally, Gibbon identified a spirit, a set of principles and practices, that ornamented Rome’s politics in its heyday: “that public courage which is nourished by the love of independence, the sense of national honor, the presence of danger, and the habit of command.”

    So what undid all this stability and excellence? Gibbon supplies three reasons for the Roman empire’s decline and fall. Of these the first is probably the most familiar. It is what we would call overreach. In Gibbon’s words, those entrusted with the habit of command “gradually usurped the license of confounding the Roman monarchy with the globe of the earth.” The empire became too big, too difficult to defend. It attracted too many enemies, diminishing even the colossal city of Rome. A second cause of decay was the destruction of the Roman Senate that was achieved by the emperor Septimius Severus circa 193 CE. Gibbon bluntly asserts that he was “the principal author of the decline of the Roman empire,” the figure who cleared the path to full-out despotism. A third cause was to be found in what Gibbon called manners and in what we might call values. The commitment to “a rational freedom” dissipated. Several millennia before social media, Gibbon noted that “a crowd of critics, of compilers, of commentators, darkened the face of letters, and the decline of genius was soon followed by the corruption of taste.” The result was “an opulent but feeble country,” though it took time for Rome’s adversaries to see through the imposing edifice. Roman power had been slowly hollowed out, and by Romans.

    Ancient Rome is famously redolent of decadence, and Gibbon details the affinity between political decline and cultural decadence. Of the late-imperial Romans, Gibbon writes that “as long as they were indulged in the enjoyment of their baths, their theaters, and their villas they cheerfully resigned the more dangerous cares of empire to the rough hands of peasants and soldiers.” More pivotal than the love of leisure and luxury, though, was the intellectual corruption that Gibbon detected in the aging empire, “when the human mind was depressed by civil and religious slavery.” This slavery was reflected in “the decline of the arts” and “the gradual decay of discipline and courage.” A fall of character, we might say. In a turn of events more intellectual than military in nature, “the model of ancient freedom was suffered to sink into oblivion.” Yet the most striking aspect of Rome’s decay, as narrated by Gibbon, is the fatalism of the Romans themselves. Their “minds [were] afflicted by calamity and contempt for mankind.” In natural disasters such as earthquakes, “their fearful vanity was disposed to confound the symptoms of a declining empire and a sinking world.” Catastrophism took root. Beneath its dark star, the collapse of the empire, when it came, must have been a relief, a deliverance from the burdens of a rational freedom and a blissful surrender to anxieties that had long ago been internalized as inevitabilities. A culture of decline preceded the fall, a dark mood, a proliferating suspicion of apocalypse.

    The American Gibbon is Henry Adams, a virtuoso pessimist and a connoisseur of political and cultural decline in the late nineteenth and early twentieth century, when by every empirical standard the United States was a rising power. Democracy, his satirical novel of 1880, was a tale of the decline of manners, and dedicated to the proposition that democracy itself, and the race for money and status, had undone a once glorious republic. The American experiment was no longer what Jefferson had hoped it would be. It was not much more than a sham. Adams also intuited — with genuine brilliance — the ethical danger in the cult of the “dynamo,” the veneration of technology. He examined the German guns on display at the Chicago World’s Fair in 1893 and started to picture a war along the gruesome lines of World War I; he envisioned civilization moving backward. His memoir, The Education of Henry Adams, was an ironic homage: it portrayed a country endowed by its Creator with an inalienable talent for wasting its promise — from the magisterial John Adams (his great-grandfather) to the formidable John Quincy Adams (his grandfather) to the diminutive Henry Adams, proof positive of the decline of genius.

    World War I precipitated the decline and fall of many European empires. It validated the seductions of decline, popularizing a largely unreadable book by Oswald Spengler, an eccentric German intellectual. In Der Untergang des Abendlandes, or The Decline of the West, Spengler concerned himself less with empire than with civilization. What fascinated him was the efflorescence and then the decay of civilizations over time — architectural, literary, and religious constellations that rose up, blossomed, and were replaced by other architectural, literary, and religious constellations. Spengler devised a “morphology” of civilizations, a map of their natural life cycles. For him, civilizations were organisms that would grow for a while, then mature, then sicken, then die. Unlike Gibbon, who was precise, elegant, spare in style, Spengler was oracular and vague and obtrusively mystical. What he captured best in the multiple volumes of his popular work was the allure of decline, the fascination with catastrophe, the hypnotizing effect of the car crash from which one cannot avert one’s eyes. He had “the imagination of disaster,” in a phrase of Henry James’s, a twilight sensibility that was complicit with catastrophe.

    Regardless of how many or how few people read Spengler’s books, regardless of what they actually learned from him, he was immortalized in the adjective “Spenglerian.” Spengler tried to keep his distance from National Socialism, but Hitler and his acolytes had a Spenglerian view of the world; they read him as a forerunner. Germany faced one abyss in the East, the Soviet Union, and another in the West, the Anglo-American democracies. By its own lights, fascism was an effort to “save” a faltering Western civilization, to restore it to health by excising the cancers of internationalism, of finance capitalism, and of Bolshevism, which in practice meant excising Jews from the European body politic. Ironically, the emergence of fascism also inspired a Spenglerian pessimism among its opponents. The decline of the West, the sinking of Europe, had arrived with Mussolini’s and Hitler’s deviation from democracy, the barbaric irrationalism and the boundless militarism of the fascists. Once decline was accepted as a given, its political functions could vary. Declinism was broadly fashionable in the art and culture of the 1920s. In Europe and the United States, the specter of decline was hard to escape after the stock market crash of October 1929.

    Spengler’s books showed up in the knapsack of George Kennan when, as a student, he traveled to Europe in the 1920s. Kennan admired Spengler and Adams, and he considered them profound because of their pessimism. As someone who idolized the agrarian Protestantism of nineteenth-century America, Kennan had the nostalgia that often shadows the intuition of decline. The longing backwards glance becomes the nervous look ahead. Kennan was also an enthusiastic reader of Gibbon and in fact extrapolated his theory of containment, the American Cold War strategy, from The Decline and Fall of the Roman Empire. Kennan applied its lessons to Stalin’s Soviet Union, which was rapidly expanding at the end of World War II, an empire following in the footsteps of the Red Army. Dwelling on the theme of imperial overreach, Kennan noted risks for Moscow, and not just for the West, in a massive Soviet imperium. So let this empire grow brittle, Kennan advised. Contain its growth where this growth is unwanted: there is no need to go beyond containment. The risks of conflict are too great, whereas the potential for self-destruction, the likelihood of Soviet decline, is written in the stars.

    Kennan did not limit his lessons from Gibbon to the Soviet Union. Kennan loathed the commercialism of America, its materialism, its cultural vulgarity. He especially resented what he construed as the depredations of multiculturalism. He regarded the Vietnam War as an instance of imperial overreach and was convinced that postwar American culture had fallen into chronic decadence. Kennan, who lived a long life, spoke out against NATO expansion in the 1990s and against the Iraq War of 2003, lamenting that Americans had come to be seen as the world’s “spiritual dunces.” The country that won the Cold War was not a serious country, in his mandarin opinion. Gibbon had tied Christianity to the decline of the Roman Empire. Kennan’s Spenglerian conservatism convinced him that a diminishing Christian piety marked the moral decline of the United States. He ended his life a despondent outlier in the American foreign-policy establishment. Where his colleagues heard the martial music of international engagement, he heard the tolling of imperial decline.

    Kennan’s declinism was most in vogue nationally in the 1970s. Expressing a widespread sentiment on the American right, Ronald Reagan would later claim that Washington had been defeated in Vietnam by its own fecklessness. The United States had forgotten the can-do optimism of Franklin Roosevelt and the muscular anti-communism of Harry Truman. For those on the left, the cognate for decline was the Vietnam War itself, the mendacity of the Johnson administration, the imperial instinct that went back to the Spanish-American War of 1898, to the Mexican War of 1848, and to the theft of land from the Native Americans. In Vietnam, the United States was declining into the worst version of itself: the Cold War disingenuously imposed on the Soviet Union by Truman was nothing but a detour from America’s democratic potential. The scapegoat for the left was the liberal establishment. It had promised competence and progress and delivered incompetence and decline. It had used American idealism to betray American idealism, democratic rhetoric to mask imperial intent, all in the name of a lost war. Even within the government, decline was on the agenda in the Vietnam era. Henry Kissinger kept trying — with unknown results — to get President Nixon to read Spengler.

    In the 1970s, the imagination of decline went beyond Vietnam. The energy crisis of the early 1970s halted the postwar economic boom, the trente glorieuses or thirty years of prosperity, as the French refer to them, and launched a period of truncated social mobility and declining real wealth for the American middle class. Journalists avidly reported on the decline of American cities. The seaminess of Times Square in the 1970s was a metaphor not quite for Roman decadence, more for the depleted landscapes that connote an empire’s fall. The aura of decline inspired the cinema of the 1970s, whether it was the psychopathic rage in Martin Scorsese’s Taxi Driver (set in and around Times Square) or the emptied-out, traumatized community in The Deer Hunter. Parts one and two of The Godfather align the decline of the Corleone family with the nation’s gathering losses in the 1970s, loss of purpose, loss of strength, loss of integrity. In the late 1970s, President Jimmy Carter did not just suffer from the conventional setbacks of political life. He gave his infamous “malaise speech” in 1979, invoking the purgatory of decline, of a deflation of spirit, to which the United States had apparently condemned itself. It was a country that needed to be diagnosed, and in the 1970s the diagnoses were legion.

    A bit like a pandemic, the attraction to decline comes in waves. By the 1980s, the Vietnam wave had crested. The Soviet Union was Brezhnev’s and no longer Stalin’s. It was ever more enmeshed in the decline that Kennan (channeling Gibbon) had foreseen. China was changing after Deng Xiaoping came to power in 1979 but was not yet a contender for great-power status. In the 1980s, the barometer of decline was Japan, an American ally and an economic behemoth. Japan’s future could look a lot brighter than that of the divided, confused, unproductive United States. Japan as Number One was the title of an influential book of the late 1970s and early 1980s. In 1987, the historian Paul Kennedy published The Rise and Fall of the Great Powers: Economic Change and Military Conflict from 1500 to 2000, retracing the Gibbon-inflected dilemma of imperial overreach. Kennedy identified a sweet spot of technological innovation, productive capacity, wealth, and military prowess that equals a great power’s rise. Over-extension, though, is hard to avoid, he argued, and over-extension shifts wealth into unproductive military assets. The economy that had supplied the great power in the first place loses steam, inviting the aggression of enemies. Rise engenders fall. Four years after Kennedy put out his book, the Soviet Union fell apart in a perfect confirmation of Kennedy’s rise-and-fall model. But the intended audience for his book had not been Soviet. It had been American.

    The America of 1987 was rescued from inklings of decline by the Cold War’s sudden end. The vanquished Soviet Union shed a soft, attractive light on the United States. Having retired its malaise, Washington could share its wealth, the marvels of the American system, with the world. It could export the ingredients of its success: the rule of law that regulated commerce, the freedoms that underpinned the American university, the entrepreneurial spirit of Silicon Valley, the multiculturalism that made America America. A blend of free-market capitalism and democracy promotion became the “Washington consensus” of the 1990s, which bespoke a kind of optimism or, more precisely, a will to capitalize on the ascendance of the United States. Without a peer competitor, Washington sought partnerships with everyone under the sun of globalization: with Europe, with Russia, with China. Those recalcitrant areas whose tribalism evaded the Washington consensus could be brought into line with engagement, with diplomacy, and if necessary with military force. The collective sense of power and right enabled the “humanitarian interventions” of the 1990s.

    Even in this period of general optimism, decline was never forgotten. It silhouettes many of the big ideas that emerged after the Cold War. Francis Fukuyama’s The End of History and the Last Man was surprisingly melancholic. To win the political argument, as Fukuyama was certain the United States had done by the 1990s, was to lose the bracing headwinds of struggle. The last man would be no Achilles or Odysseus, no Pericles full of vigor and vitality. The last man would be a solitary consumer lulled into complacency by the absence of ideological strife. Or the last man would be a technocrat, a political manager pressing the buttons and pulling the levers of the machine known as liberal democracy. The End of History was an essay on the spiritual perils embedded in liberal democracy, though Fukuyama did not fret about excessive leisure and luxury, the slide into decadence and from there the descent into despotism. Nor did Fukuyama take up Spengler and regard liberal democracy as just the latest turn of the civilizational wheel. Liberal democracy was — to a certain extent unfortunately — destined to conquer. It was not long before history eluded the ending that Fukuyama wished to give it.

    Samuel Huntington was another melancholic of the 1990s. If the title of his book The Clash of Civilizations and the Remaking of World Order sounds gung-ho, Huntington was no cheerleader of the West. In fact, The Clash of Civilizations is quite Spenglerian. The West’s decline, according to Huntington, is its loss of conviction and inability to defend itself, perhaps even to know itself. No amount of military power could mask the West’s cultural decline, as Huntington portrays it: universities that are no longer teaching the canon of Western civilization, a rampant commercialism, the blinders of a callous universalism, a failure to see that there is such a thing as civilization at all. Enamored of the “Washington consensus,” the West has gradually usurped the license of confounding itself with the globe of the earth. Thus might an inner cultural emptiness and lack of curiosity conspire with an appetite for imperial overreach. The West is claiming the globe of the earth as its dominion, unprepared for the immense civilizational backlash that this arrogance will engender. Much like Kennan before him, Huntington could combine ideas from Spengler with those from Gibbon into an ominous synthesis. Decline was the connecting link.

    Fukuyama’s and Huntington’s were voices from the political center. To their right, decline had a much more forthright post-Cold War exponent. Patrick Buchanan will haunt the history of the twenty-first century United States: he is the tragic Henry Adams, pining for past glory, reincarnated as a cable-TV farce. Buchanan had worked in the Nixon White House. He ran for President in 1992, losing to George H.W. Bush in the primaries. Losing suited him: it proved his point. At the convention the theme of his fire-and-brimstone speech was that Americans were losing their country. In Buchanan one can find another mingling of Gibbon and Spengler. The Spenglerian element was cultural or moral decline, civilizational self-destruction — achieved, according to Buchanan, by immigration, multiculturalism, and the embrace of homosexuality. The Gibbonesque element was, as usual, imperial overreach. The Cold War was over, Buchanan reminded his audience, and the Cold War’s imperial temptations had to be set aside. Only a return to republican virtue and to traditional Christianity and to the fantasy of an ethnically homogeneous (read: white) society could silence the siren song of a great power’s decline.

    The White House beyond his reach, Buchanan charted the decline of the United States in books. Their elaborate titles tell his story: The Great Betrayal: How American Sovereignty and Social Justice Are Being Sacrificed to the Gods of the Global Economy (1998); A Republic, Not an Empire: Reclaiming America’s Destiny (1999); The Death of the West: How Dying Populations and Immigrant Invasions Imperil Our Country and Civilization (2002); State of Emergency: The Third World Invasion and Conquest of America (2006); and Suicide of a Superpower: Will America Survive to 2025? (2011). These are not good books. They are bigoted, hyperbolic, sensationalist, dogmatic, and studded with celebrations of the Confederacy and nineteenth-century European imperialism. Yet they are historically more important than either The End of History or The Clash of Civilizations, as they so perfectly anticipated the presidency of Donald Trump. George H.W. Bush eventually lost his argument with Buchanan, at least within the Republican Party. Buchanan’s books reveal Trump’s political secret, which is his sensitivity to the appeal of decline to the American electorate. “Make America Great Again” is the slogan of a declinist.

    The causes of decline are many, according to the contemporary right. Foremost is the claim that the country has lost its moral compass. The problem is secularism, or a post-religious culture that is revolutionizing the triad of sex, family, and gender. A Christian or Judeo-Christian tradition through which chastity until marriage and stable gender roles were publicly endorsed and enshrined in law has been demolished. Without this tradition, they contend, the country cannot go on. It will cease to be what it is, having abnegated its moral self, and it will come to political ruin in the process. The decadence of ancient Rome is relevant for them. Closer in time is the storied decadence of the Weimar Republic, too exhausted by its transgressions to defend itself from Hitler and his National Socialists. In 2017, the Orthodox Christian conservative Rod Dreher published The Benedict Option: A Strategy for Christians in a Post-Christian Nation. America as such is a lost cause for Dreher, a pagan desert, Babylon. A retreat to the available oases of Christian faith, the Benedict option, after the founder of Benedictine monasticism, is his recommendation for escape from the abyss.

    Another pillar of declinism on the right is socio-economic. It was one of Buchanan’s recurrent subjects in the 1990s and thereafter: that global capitalism benefits a small cadre of cosmopolitan elites and leaves everyone else behind. (This was the thesis of the left-wing blockbuster Empire, by Michael Hardt and Antonio Negri, which appeared in 2000, and thirteen years later of Thomas Piketty’s Capital in the Twenty-First Century.) Applied to the United States, the right-wing theory of socio-economic decline goes as follows. Business and foreign-policy elites colluded in the 1990s to compromise borders and to liberate capital flows so that they might grow richer. Manufacturing jobs leaked out to China and other countries. While the financial districts of American cities flourished, the nation’s heartlands were bled dry. This was the “carnage” to which Trump referred in his inaugural address. At issue was no neutral economic process. At issue was the malice of the American ruling class, which had happily sacrificed real progress to its own short-term cupidity. America’s death is the life of its craven, sybaritic, and post-national elites. The stage was being set in these narratives for a populist backlash.

    A third reason for decline in the new American conservatism is an alleged loss of nationhood. The conservatives watch with horror as a progressive elite elevates a critique of American nationhood into a “woke” religion, which tethers every facet of American identity to white supremacy, to patriarchy, to assertions of heterosexual privilege. The moment of national origin has been pushed back from 1776 to 1619, from heroism to exploitation, from the founding of the republic to the arrival of the first African slaves on the American continent. Even Abraham Lincoln belongs to this narrative of woe, as another white supremacist. Left to their own devices, according to the right, the progressives will turn every university, every museum, every magazine, every Hollywood studio, and every public school into a thorn in the side of American nationhood. Recall that the Soviet Union was not felled by a coup d’état and was never defeated on the battlefield. It just stopped believing in itself. It canceled itself, and a mere thirty years after the Soviet Union’s demise a similar inner collapse awaits the country that triumphed over it.

    The crumbling of nationhood has international echoes on the right. The ur-text here is Yoram Hazony’s book, The Virtue of Nationalism, which appeared in 2018. Hazony pins all foreign-policy errors on the neglect of nationalism. Empires homogenize and empires conquer, in Hazony’s reductive verdict. But nations hold back and nations nurture. The Soviet Union was an empire and went the way of empires. Nazi Germany, in Hazony’s bizarre reading of twentieth-century history, was an empire free from nationalistic tendencies. For nationalism, despite the many atrocities committed in its name, is for Hazony the stuff of human kindness. The Third Reich also went the way of empires. The European Union is an empire with an empire’s arrogance and dullness of mind. The United States is properly a nation, Hazony feels, though one that shows many signs of having metastasized into an empire. America at its worst is a progressive, faithless empire. America at its best is, well, Israel, a multi-ethnic Israel, as in the biblical Israel, a community with faith in its founding creeds, a nation that walks with God. Donald Trump’s “America first” foreign policy is admirable — in Hazony’s view — for its drive to reverse the post-national decline fomented by George W. Bush, Barack Obama, and the globalist elites who staffed their administrations and who justified the evils of their foreign policy.

    The right’s most ambitious declinist text is Patrick Deneen’s Why Liberalism Failed, also from 2018. Deneen belongs to a coterie of Catholic conservatives who, like Rod Dreher, vociferously lament the fading of Christianity. For Deneen, this is no temporary disturbance of American manners. It is the malfunctioning of the system — a malfunctioning that Deneen describes in the past tense because liberalism has already failed. American liberalism, the Jeffersonian venture launched in 1776, cracked up because it gave individuals too much liberty. With their unbounded rights, Deneen believes, Americans accomplished only a transvaluation of values, a dissolution of community, a degradation of their own dignity, and the travesty of an environment-destroying capitalism. Deneen wavers between a populist conservatism, an incipient political movement, and a utopian dream of organic community knit together by faith, and economic equality, and the curbing of appetites. The future is hazy in Deneen’s writing. What he really seems to care about is cataloguing America’s day-to-day decline.

    Meanwhile the American left is not to be outdone by the American right in its intimations of the end. Some of the gloomy analysis overlaps; much does not. There is a shared conviction that American political economy is out of whack. Manufacturing jobs are less the issue on the left, whose villain is not so much China as Wall Street and a predatory economic elite that has fixed the rules of the game. Since the government slavishly serves the one percent, the labor movement has been eviscerated and politicians routinely side with the plutocratic malefactors, giving them the assistance of the courts and the legislatures, and of lenient tax policies. Elizabeth Warren and Bernie Sanders both speak as if something can still be done. They are not quite apocalyptic, but both are heavily invested in the theme of decline. For them, we are on a deteriorating path towards economic and social entropy. Absent systemic change, the progressives insist, this is the path to doom. Not exactly nostalgic and not exactly pragmatic, progressive leaders such as Warren and Sanders are consistently declinist in their attitude toward the country before their eyes.

    For the past several decades, the academic left has tied class to race and gender. Its project can be literally progressive: the expansion of rights along a historical continuum — the New Deal welfare state and Great Society, the abolition of slavery and the civil rights movement, the successive feminist movements of the nineteenth and twentieth centuries. Barack Obama liked to say “from Seneca Falls to Selma,” and he had Martin Luther King’s phrase about the arc of the moral universe bending toward justice sewn into the Oval Office carpet. Obama’s own presidency was meant to demonstrate the reality of progress: he was the change we had been waiting for, a biracial man with a foreign-sounding name who would save the country from the regressive policies of George W. Bush and would do what he could do to help out poor and working-class Americans, to advance women’s and gay rights and to heal the country’s racial wounds. The victory of Obama’s Vice President in the election of 2020 revived the progressive hope that had once been the halo around Obama’s head. But even when Obama was in office, the left was moving on. In doing so, it ran up analytically against the inverse of progress, which is decline.

    On race, the left-wing case for decline is tricky. That there has been progress is incontrovertible, but it has too often been brief and fitful — the heyday of the civil rights movement, or the twelve years of Reconstruction. To say that the civil rights movement has been betrayed is not to admit decline so much as to acknowledge the size and the scope of the countervailing pressures — the never-ending momentum in favor of white supremacy, its constant reinvention and repurposing. The 1970s witnessed Richard Nixon’s Southern Strategy. The 1980s witnessed Ronald Reagan’s racially motivated assault on the welfare state. The 1990s witnessed Bill Clinton’s prejudiced criminal justice reform. The months after Hurricane Katrina witnessed George W. Bush’s indifference to the suffering of African Americans. The Obama years witnessed one episode after another of racist police brutality, and 2016 witnessed the election of Donald Trump. Trump’s contempt for black Americans is plain and unapologetic, and it claims the allegiance of half the country. For the bleak picture developed on the left, America’s racial history can be said to culminate in the career of Donald Trump.

    On gender, the fragility of progress similarly invites pessimistic scenarios. The fragility emanates primarily from the Supreme Court’s perceived role as the defender of abortion rights and of gay marriage, though state legislatures in red states have become more activist in recent years. Now that it has tipped in the wrong direction, the Supreme Court can render everything null and void. An alternate reactionary future is hardly inconceivable. In 2016, the year of Donald Trump, a television adaptation of Margaret Atwood’s The Handmaid’s Tale explored the reversibility of women’s rights via a Christian patriarchy of Orwellian hue, a 1984 for the twenty-first century. Without implementing Atwood’s dystopia, Trump traduced everything that progressives cherish concerning gender: he appointed almost no women to senior cabinet positions in government; he regularly demeaned women and mocked their physical appearance; his administration did not respect the rights of transsexuals; it ceased speaking out on behalf of women’s rights internationally. In the figure of Mike Pence it signaled a disapproval of homosexuality. The Handmaid’s Tale could almost be viewed as a roadmap of the GOP’s intentions. A falling away from gender equality, the progressives warn, is terrifyingly close at hand.

    In American foreign policy, schemes of decline cross party lines. The idea that 9/11 sponsored imperial overreach is widespread on the left and the right. Trump rode this idea to the White House. For the left, the domestic consequences of an activist, even interventionist American foreign policy issue in depleted budgets and in techniques of surveillance and military control that may have been officially intended for counter-insurgency overseas but have been unofficially deployed on American streets. The killing of George Floyd, it is said, can be traced back to a criminal war on terror, the empire’s core corrupted by the chaos projected onto its periphery. In a recent book pointedly titled Humane: How the United States Abandoned Peace and Reinvented War, Samuel Moyn indicts America’s military interventions abroad as imperialism. A veneer of humanitarian moralizing covers an insatiable lust for power — the old Chomsky point. The collateral damage is hidden from view by drone warfare and by a population and an intelligentsia inured to the human costs of American foreign policy. As with the Vietnam War — from a left-wing point of view — the violence of today’s declining empire is made worse by the futility of its aspirations. To survive, the United States needs to end its addiction to neo-colonial “forever” wars. And there are many on the right who wholeheartedly agree.

    On climate change, however, the left is singular in its prophecies of decline. Trump’s Republican Party pretends that the problem does not exist. This denial sharpens a left-leaning panic about climate change as an existential threat to cities, to communities, to human life as we know it. Al Gore’s inconvenient truth has taken on the proportions of an overwhelming truth. The nineteenth-century left had placed its bets with industrialization, which if intelligently guided would bestow abundance and happiness upon the proletariat. The twenty-first-century left cannot afford this optimism: the rape of the planet has come about through industrialization, through mass consumption, through the insanity of unchecked economic growth. Adam Smith’s wealth of nations has turned to dust. A Green New Deal might thread the needle, containing climate change and granting equality in one blow, but the details and the feasibilities of such a vast plan are sketchy. On the left, the foreboding is more vivid than the sense of agency. So far, it is the poor, the less powerful, the migrants and minorities of the world and of the United States, who are paying the price for climate change. Throw in a global pandemic and Spengler’s musings on civilizational decline are a genteel footnote to the future we have before us. What is the decline of the West compared to the decline or the extinction of the human race?

    In our savagely polarized country, then, there is one widely shared perception: that it is almost closing time. Declinism is trending. It is also self-fulfilling and mutually reinforcing. Each side has persuaded itself that the other is a vehicle of decline: the next terrible chapter is only a mid-term or presidential election away. The rising level of antagonism has attracted a third school of declinism, call it the centrist school, which posits political polarization itself as the catalyst of democratic decline: a “democracy deficit” in the America once rhapsodized by Walt Whitman as the beau ideal of openness, freedom, and the common man. This interpretation has a pre-history extrapolated from the electoral maps of 2000 and 2004, parts of which were red and parts of which were blue. In 2008, Sarah Palin was a sign of the coming red-blue war, though the race between John McCain and Barack Obama was polite, as was the race between Mitt Romney and Obama in 2012. Those were not duels to the death. Obama even campaigned against the diminishment of America into red and blue quadrants.

    Trump disagreed. He applied his cunning in marketing and branding to a movement based on his authoritarian personality and on a series of vague promises. Trump also applied his rhetorical gifts — crude but effective — to the demonization of his opponents, constructing a trap into which his enemies quickly jumped. They responded to his invective by demonizing him and his followers. Hillary Clinton dismissed Trump sympathizers as “deplorables.” The vituperation proliferated. Trump profited enormously from this degradation of political culture and from the melodrama of polarization that gave the media endless stories that wrote themselves. And the vortex of polarization, once entered, has been punishingly difficult to leave. It could justify impatience or rage against the procedures of democracy, which rest on a degree of civility, of give-and-take, of reasonableness, and on the acceptance that the defeat of one’s party is never an ultimate defeat. Setbacks happen, and they can be endured because not living with them leads either to revolution or to civil war. The Democratic Party struggled to stay civil as long as Trump was in office, and it succeeded. Biden ran sincerely as a Democrat and as a democrat in 2020. He did well as both. As a referendum on politics, the 2020 election illustrated Trump’s limitations and democracy’s fortitude. The less polarizing candidate came out ahead.

    Then came the putsch of January 6. Even those who had been wringing their hands about polarization and its declinist overtones were shocked by what happened: the Capitol stormed, the confirmation of Biden’s victory delayed, a violent mob on the loose within the halls of power, a President watching delightedly on television after having told this same mob to go and “fight like hell” for him. The Republicans responded as they have usually responded to Trump’s disgraces: briefly horrified, they stuck with the boss. Trump was impeached but not convicted. Sizable numbers of Americans now believe that the election was stolen, and were Trump to come back as a candidate in 2024 it would be as a would-be autocrat. Trump is the symptom, it is often pointed out. Polarization is the disease. In an extraordinary essay in The Washington Post in September, Robert Kagan diagrammed the death of the republic. “The events of Jan. 6,” Kagan writes, “proved that Trump and his most die-hard supporters are prepared to defy constitutional and democratic norms, just as revolutionary movements have in the past.” The Republican Party is their accomplice, Bolsheviks who wrap themselves in the Stars and Stripes. How Democracies Die was the name of a book in 2018 by two political scientists which, when published, sounded abstract and speculative, a bit like a seminar subject. For Kagan, democracy’s death may be three years away. Driven by rival expectations of decline, by the swooning pessimism of the left and the right, polarization might be the accelerator of actual decline. Trump the believer in decline and Trump the raven of decline is also Trump the guarantor of decline, the Captain Ahab for the doomed Pequod of American democracy.

    In 1978, Václav Havel wrote an essay on the political agency that had magically persisted in communist Czechoslovakia. Havel knew that people were living under despotism and that they were supposed to conform to the script written for them. Yet they had discovered countless subtle ways to expose the falseness of the power above them, and to resist it. Their small acts of honesty might blossom over time into something else, Havel speculated. His essay was called “The Power of the Powerless,” and in the years since it was written it has given heart to dissidents the world over. It described a political sphere that is accessible to those even in a tyranny, the inner freedom that may require courage to express and is noble and liberating when expressed. There are limits, Havel insisted, to how far even totalitarians can reach into the human spirit. Havel had never seen a Czech democracy with his own eyes. He knew only Soviet control, and 1978, when the Soviets were celebrating America’s humiliation in Vietnam and preparing to invade Afghanistan, was not an auspicious year for dissidents living under communism. Havel’s essay is astonishing not just for its trust in rational freedom, but also for its utter lack of anguish. He wrote confidently about the ultimate erosion of dictatorship in Czechoslovakia, when there was no empirical evidence to support such democratic assurance.

    Is such democratic assurance available to us? We live in a country that is superbly well defended and rich, a country blessed with institutions and with rights that ensure bona fide civic freedoms, that has an effervescent cultural energy, and that is nevertheless beguiled by the possibility of its own decline. We prefer to believe in the powerlessness of the powerful, of ourselves, that is. The repetition of the declinist hypothesis has established it as a trope of our political culture, and tropes are hard to shake off. Declinism also meets the psychological needs of the pandemic, mirroring the pandemic’s reversals of fortune, its confirmation of threat and fear, and its curtailing of liberties that prior to Covid were taken for granted. It may seem paradoxical to speak of the élan of declinism, but we are witnessing it. Since we are going under, we should discuss only the reason why we are going under. Gloom invites the consolation of further gloom. The gloom will only pass when the Untergang has reached its climax and that which we worried about losing is finally lost.

    The pervasive sensation of crisis in America cannot be boiled down only to a mood or an attitude. It follows also from a coruscating convergence of real-life problems. Declinists of different stripes have correctly recognized the terrible disparities between rich and poor, urban and rural, and the unequal opportunities and prejudice facing many minority groups; the undercurrent of violence in American life; the oligarchic nature of much of the American business elite and of portions of the political elite; the assault on facts and on intellectual and journalistic authority that has arisen from multiple technological revolutions, from our own cult of the “dynamo,” from post-everything intellectuals, and from an overabundance of confident men and women; the waning effectiveness of American foreign policy; and the terrible pressures of climate change, which require in response a willingness to plan and to sacrifice that does not come naturally to materialistic societies. Covid has trained a brutal spotlight on health care in the United States. The means were there to beat back the pandemic — the overall wealth, the government institutions, the scientific know-how, the heroic first responders — but the national will was missing as well as the organizational talent and the social trust. This is not the best of all possible worlds. It is also not the best of all possible Americas. To tie the country’s many crises into a bundle and to impute progress to this bundle, to regard our time as a period of upheaval after which the promised land is bound to be entered, would be absurd. There is no guarantee that any of the country’s many crises will be adequately dealt with, no hidden Hegelian machine that will produce a better democracy from the punishing antitheses of the present moment. Declinism is valid insofar as it consists in the recognition of actual problems.
But it honors the problems too much: it merges them into a fate and prefers the consequent fatalism to an educated hope. Strangely, a narrative of decline also soothes: it rescues us from our challenges, and it comforts us with the certitude that tomorrow’s problems will be worse than today’s. One need not be an optimist to tackle problems, but if good government is to be attempted, if intelligent policies are to be matched to pressing problems, this must be done with the rational and researched expectation that problems can be solved — that, in the language of the Federalist Papers, they can be solved through “reflection and choice” and not resolved through “accident and force.” It is declinism that favors accident and force. It does so by inhibiting reflection and choice.

    Declinist arguments are wielded for political effect at home. So, too, are they wielded for political effect internationally. They have become the mainstays of official Russian and Chinese messaging about American politics and American power. “American Decline” was the title of a piece in Russia Today, a state-run media enterprise in Russia, in 2018. “In America,” the article informs its readers, “the richest country in the world measured by raw GDP, children are getting sick from living in open pools of sewage.” A less vivid analysis surfaces in a recent interview on Frontline, Chinese state media, in which an expert declares that the United States “is declining because of its own internal contradictions, bad governance, inequality, injustice. But American elites don’t want to say so, so they blame it on China.” This is a misreading of American elites, who discuss American decline incessantly, blaming it squarely on the United States. The concordance between what decline-oriented American elites are saying and what Xi Jinping and Vladimir Putin say does not itself undercut the argument that the United States is in decline, nor are American pessimists the stooges of China or Russia. Yet the inward-looking tenor of the declinist narratives that are proliferating left, right, and center is costly in two respects. Too often they fail to incorporate comparative assessments that could moderate the sensation of decline: they tend to compare America at its best or America at its imagined best with the unvarnished America of the present day and leave it at that, missing the anomalous stability of American politics (even in recent years) and the frequent instability of other political traditions. And these narratives unwittingly amplify Chinese and Russian propaganda. Xi’s China and Putin’s Russia do not just wish to see American decline. They also wish to engineer it.

    Let us return to Gibbon. He can help to define political decline, which has a thousand different and mutually exclusive meanings in American debates. (Moral decline is notoriously in the eye of the beholder.) Gibbon’s three causes of Rome’s decline were imperial overreach, the crippling of the Senate, and the slipping away of rational freedom. The problem of imperial overreach may resonate today, as does Paul Kennedy’s association of debt with decline. But the United States is only imprecisely an empire. It lacks ancient Rome’s borderlessness and its appetite for conquest and occupation. The divisions of empire that Gibbon chronicles in his history, especially the empire’s separation into Eastern and Western halves and the transfer of the capital from Rome to Constantinople, have nothing in common with the United States as currently constituted. The United States is a geographically stable republic; Washington, D.C., is its indisputable capital; and it can determine the contours of its international commitments. It has no incentive to expand. Indeed, the United States faces no credible threats to the security of its homeland. No Roman general or Senator or Caesar could have imagined such a reassuring state of affairs.

    As for the destruction of the Senate, Trump is no Septimius Severus. Trump had four years to erect an American despotism. He spent them feuding with his own party. The Senate cheerfully defied Trump on Russia policy and in many other areas (such as health care). In 2018, Trump irritated enough voters for them to hand the House of Representatives to the Democrats. In the 2020 elections, which Biden won by some eight million votes, Trump’s party lost the Senate. In the Senate chambers, Trump’s Vice President calmly gaveled in a Biden presidency on January 7. Republicans in many state senates blocked Trump’s attempts to steal the election. Dozens of the January 6 rioters are now in jail. What the Senate represents — the spirit of mixed government, the skepticism toward executive power, the resistance to despotism in the making — has been damaged by the collapse of reasoned debate and by the slings and arrows of outrageous partisanship, but it is hardly dead. Trump’s presidency had the unwitting effect of elevating the Bill of Rights, making it more necessary, more urgent. Whether he runs and is reelected or not, Trump will kick sand into the machinery of American democracy until his last breath, but he will not become an American Caesar. He cannot become an American Caesar unless the country throws up its hands and starts to cry “Hail, Caesar!” in unison.

    Gibbon’s third cause, the divorce of a rational freedom from the life of the mind, is hard to measure. The religious tolerance that Gibbon lauded in pre-Christian Rome is active in twenty-first-century America. It does not appear to be in remission. America continues to possess reservoirs of quotidian tolerance, the invisible successful negotiations of a richly heterogeneous society. Reason holds sway in the sciences, in medicine, in the study of law and economics: these are not negligible intellectual pursuits. And yet the “independent dignity of reason,” as Gibbon described this necessary attribute of ancient Rome’s success, is by no means in good shape. It has been battered by the stupidity of social media, the fads and fun-house-mirror politics of American universities, the perspectivist thrust of identity politics, the political wars on science and public health, the will-not-to-know on the right, and the soap-opera rhythms and ideological crusades of the news media. Consider this comment in an article on cancel culture in The New York Times: “The idea of intellectual debate and rigor as the pinnacle of intellectualism comes from a world in which white men dominated.” It was uttered by Phoebe Cohen, a professor of geoscience at Williams College, who may not directly oppose a rational freedom. She — a scientist, a professor — merely opposes everything that underpins it. Gibbon writes of the emperor Commodus having “an aversion to whatever was rational or liberal.” Many political figures, ancient and modern, have had this aversion. When it is shared by the teachers of the young, we are in trouble.

    Gibbon’s most applicable insight concerns the fatalism of the late-imperial Romans. Our talk of decline fulfills itself, and usually results in intellectual muddle. Talk of decline stymies compromise, because nobody should compromise with a political opponent on the verge of cementing a nation’s decline. Worst of all, it inclines toward emotional paralysis, the obsession with decline shading into depression and, as Gibbon noted, into narcissism, into a “fearful vanity” through which we love by lamenting ourselves. Musing on Rome’s decline and fall, Gibbon characterized the human mind as depressed by civil and religious slavery — held down, weighed down, bogged down. Moving in circles, the encumbered mind is its own biggest obstacle. The Goths did not prevent the Romans from upholding their civil and religious freedom. The Caesars did not eliminate freedom of mind, not completely. In the unwinding of Rome’s decline and fall, mental slavery was self-imposed. It arose from a fashionable melancholia, a mass depression, and so the power of the powerful was misapplied or went unused. But in the United States the hour is not too late. There are no experts in the coming of apocalypse. Born in 1776, like Gibbon’s great and cautionary book, ours is a country awash in agency, with many roads and crossroads ahead of it. Our liberty is a truth that should still be self-evident. It is ours to lose, and it is ours to keep.

    Birthrights

    One morning in tenth grade, my Bible teacher started class by holding up a copy of The New York Times. He was the one we called Little Adler, to distinguish him from his older, taller brother, Big Adler, who also taught at the school. Little Adler was a good guy, at a place that was notably short on them. A modest, bearded man, slightly stooped, he was compassionate, he had a dry sense of humor, and he was the only teacher that I came across, in my ten years of yeshiva day school education, who told us that it was okay to ask questions — meaning fundamental questions, questions of belief.

    “Every story on the front page today,” he announced that morning, “is about the Jews.” Then he proceeded to point at them one by one, explaining why. Some were obvious. This was the year of the Camp David accords, and there were one or two articles about that. But the front page of the Times, back then, had eight or nine stories, and as he worked his way around the page, his reasoning became increasingly Talmudic. Nonetheless, in every case, he managed to find a way to connect the events in question to the fortunes of the Jewish people. “And,” he concluded, “you can do this every day.” Every day, in other words, one way or another, every story on the front page of the New York Times was ultimately about us.

    I grew up in a world that had a thick black line down the middle of it. On one side were us, the Chosen People, the “holy nation.” On the other side were them, the goyim. Each day in morning prayers we thanked the Lord for not making us Gentiles. On Saturday nights we recited the havdalah, the prayer that marks the close of the Sabbath. “Blessed are you O Lord our God king of the universe,” we said before the flickering light of a braided candle, “who distinguishes between sacred and profane, between light and darkness, between Israel and the nations, between the seventh day and the six days of creation.” It was an early lesson in grammatical parallelism.

    The goyim were inferior to us. They indulged their brutish appetites. They ate pig. They ate horse. They ate shrimp, which was practically like eating insects. They ate “creeping things that crawleth upon the ground.” They drank themselves blind. Oy, oy, oy, went the Yiddish ditty, shikker iz a goy, a drunkard is a Gentile. The maid was the goya; foolish pleasures were goyishe naches, Gentile delights; a dummy had a goyishe kopf, Gentile head. One night my father and I were watching a cop show. The detective’s friend had just gotten out of prison. “What can I get you?” the detective asked. “A bottle and a blonde,” the friend replied. “Of course,” my father said. “Why of course?” I asked. “Because that’s how a goy celebrates,” he said.

    The goyim hated us — every one of them, without exception. The only difference was whether they did so openly or not. Scratch a goy, my father would say, and you find an anti-Semite. Their hatred was eternal: it had existed since our beginning as a people, and it would persist until the coming of the Messiah. History did not progress but turned back on itself in an endless loop: persecution, redemption, persecution, redemption. The antagonists were not merely similar; they were identical, and had a name: Amalek. In the Book of Exodus, after Moses has led the Children of Israel in flight from Egypt — “when you were faint and weary,” he later reminds them — they are attacked by a tribe of that name. After the battle, Moses builds an altar and swears an oath: “The Lord shall be at war with Amalek for all generations.” It was Amalek whom we saw in history’s perseveration: Assyria, Babylon, Haman, Antiochus, Rome, the Crusades, the Inquisition, Chmelnitzki, the pogroms, Hitler, the Soviet Union, the Arabs. “In every generation and generation,” we sang at the Passover Seder, “they rise up to destroy us.”

    History, beyond that, was a blank. Of everything else that had happened to the Jewish people, or had been done by them — medieval Hebrew poetry, the life of the shtetl, Yiddish theater, the German-Jewish bourgeoisie; the ancient Jewish communities of Rome, Salonika, Alexandria; the Jews of Yemen, Morocco, Cochin; Baruch Spinoza, Moses Mendelssohn, Heinrich Heine — we were programmatically ignorant. Between the Exile and the State of Israel, history was one unchanging scene of persecution.

    At the same time, as Little Adler reminded us that morning, history was all about us, now as in the past. Wasn’t Christianity, after all, a bastard outgrowth of the Jewish faith? Wasn’t their Bible primarily stolen from ours, and the very messiah they worshipped a Jew? And Islam grew out of Christianity, and Hitler started World War II to exterminate the Jews, and Israel occupied the crossroads of the world, they couldn’t stop talking about it at the United Nations, and in the end of days the final battle would be fought on Mount Megiddo between Gog and Magog, symbolized by an eagle and a bear, which obviously referred to the United States and the Soviet Union.

    Our job was to keep the commandments. There were a lot of commandments (six hundred and thirteen delineated in the Bible, plus thousands more elaborated by the rabbis). There were things you couldn’t say and things you had to say, things you couldn’t do and things you had to do. Prayers morning, afternoon, and evening; blessings before sleep, after using the bathroom, before and after meals; holidays and fasts throughout the year. No mixing meat and dairy, no bread (or rice or beans or corn) on Passover, no using electricity, or playing music, or driving a car, or riding a bicycle, or handling money, or cooking, carrying, writing, tearing (I could go on) on the Sabbath. Boys and men wore yarmulkes and tzitzis, a tasseled undergarment. Women dressed modestly and covered their hair. To violate one of these precepts, which carried the force of taboo, was to commit an unthinkable act, an offense against the group as well as God. It was to mark yourself as other, beyond the pale, a kind of pollution. And in our tightknit world, with scores of families living in close proximity, you felt the eyes of the community eternally upon you.

    We were aware that there were other, non-Orthodox Jews — the Reforms and Conservatives, as we called them — but with them we had nothing to do. They were practically goyim themselves, with mixed-gender seating and prayers in English. Their children forgot who they were. Worse still were the traitors, the self-haters, the Jews who held us up to mockery before the world: Philip Roth, Woody Allen, Mel Brooks, all of whom our rabbi sermonized against. The dirtiest words in our lexicon were “assimilation” and, still worse, “intermarriage.” With those you completed the work of Hitler, though they wouldn’t save you from the next one. The German Jews had been assimilated, and look what had happened to them. My father had escaped with his parents from Czechoslovakia by the skin of his teeth in 1939, three days ahead of the Nazi invasion. “If you forget that you’re a Jew,” he liked to say, “the goyim will always remind you.”

    That was my world, unquestioned, until around the time I turned fifteen, the year that I had Little Adler. Then, browsing in the school library one day, I came across a book by Sigmund Freud. I was curious about psychology, and I had heard enough to be curious about him. The book was Civilization and Its Discontents. On the first page, I read this:

    I had sent [a friend] my small book that treats religion as an illusion.

    A few pages later, I read this:

    The derivation of religious needs from the infant’s helplessness and the longing for the father aroused by it seems to me incontrovertible.

    A few pages later still, I read this:

    The whole thing is so patently infantile, so foreign to reality, that to anyone with a friendly attitude to humanity it is painful to think that the great majority of mortals will never be able to rise above this view of life.

    And just like that, within the space of twenty minutes, the scales fell from my eyes. Of course the whole thing is ridiculous, I thought. Of course there is no God. How could I have ever believed any different?

    I kept the revelation to myself, but it must have leaked out of my skull like a kind of radioactivity, because soon my friends, then the teachers, figured out my secret. I had contracted — unspeakable word — atheism. My presence in the school became intolerable. I was permitted to finish the year, to avoid the stigma of expulsion, but only with the understanding that I wouldn’t be returning in the fall. Leaving meant giving up most of my friends. It meant transferring to public school, which was tantamount to stepping off the edge of the known universe. And it was the greatest thing that had ever happened to me. In Nabokov’s Invitation to a Beheading, the hero, in the final scene, is lying on the executioner’s platform, about to have his head chopped off, when

    with a clarity he had never experienced before — at first almost painful, so suddenly did it come, but then suffusing him with joy, he reflected: why am I here? Why am I lying like this? And, having asked himself these simple questions, he answered them by getting up.

    I had lived inside an iron cage that I’d mistaken for the limits of my world, and all I had to do to walk away was walk away. Before I did it, it was unimaginable. As soon as I had done it, it was inevitable.

    But reading Freud, in truth, was not the only or even the main event that levered me out of the world of my childhood. The previous summer, following in the footsteps of my siblings (who were considerably older and not as dug in to the Orthodox world), I had gone to a progressive Zionist summer camp on the banks of the Delaware River. For the first time, I experienced a way of being Jewish that was joyful and positive rather than scowling and dark. We had morning services, but they were creative and thoughtful, not rote. We celebrated the Sabbath, but as a day of peace and fellowship, not strictures and surveillance. We sang, we danced, we put on plays. There were girls, like at school, but they actually smiled at you. There were Jews of all kinds, but we were Jewish together as equals. My counselor, who was skipping college to move to a desert kibbutz, was a self-professed Maoist. My unit head was an intensely charismatic hippie rabbi. That summer opened many windows in my mind, and by the time I got back to yeshiva that fall, my consciousness was already in motion. Freud just gave me the intellectual push to head in the direction that my feelings were already pulling me.

    The camp was part of a Zionist youth movement, and from the day I left yeshiva till the year after college, the movement was the center of my life. I owe it more than I can say, but in retrospect it had more in common with my day school world than I would ever have admitted at the time. Instead of Orthodoxy, to structure our worldview, we had our ideology — a word that we used without a trace of irony and often with an edge of adolescent fervor. Under its aegis, we likewise cleaved the world in two, with Jews on one side and everybody else on the other. We talked incessantly about our Jewish identity, as if we had no other. We pronounced the words “assimilation” and “intermarriage” as reproachfully as they had in my synagogue. We didn’t stigmatize the Gentiles, but we knew our place was not with them. Our place was in Israel. Life in the diaspora, we told ourselves, was untenable as well as inauthentic. It had happened in Europe (nobody needed to ask what “it” was), and it could happen here. To be truly Jewish, you had to live a fully Jewish life, which meant, for us, not an Orthodox one, but one in an environment where everybody else was Jewish, too (the cop, the baker, the bus driver), just as everyone in France is French — where the national life was steeped in Jewish history, governed by Jewish rhythms, and conducted in the Jewish tongue. America, however hospitable, was not, we were sure, our true home. Our conduct was ruled not by the six hundred and thirteen commandments, but by one very big one: that we make aliyah, or move to Israel (the word means “ascent”), an event that we imagined as a personal transfiguration. Anything less was a failure.

    Why, after extricating myself from one belief system, did I throw myself into another? Mainly, the movement was just a wonderful environment in which to be young. It brought me friendships and community, a focus for my intellectual energy and an arena for my idealistic passion. But the movement and its ideology, I see now, also served some deeper psychic needs. As I moved out of Orthodoxy and into the American jumble — as I met Gentiles, for pretty much the first time, in high school and college, as well as Jews who truly didn’t give a damn about their Jewishness — Zionism gave me a sense of stability and certainty. It told me who I was, what I should do, and where I belonged and with whom. It gave me a system to structure not only my beliefs, not only my affinities, but also my decisions, my future. It solved the problems, in advance, that are raised by being young.

    Zionism also provided me with the righteousness and glamour of an oppositional stance with respect to American society — especially since our goal was not only aliyah but, ideally, aliyah to a kibbutz, a collectivist agricultural settlement, so that the opposition was not only to America per se but also to its shallowness and materialism. More deeply still, it provided me with a defense against otherness — above all, against the otherness within myself, the threat that it posed in the shape of forbidden desires, forbidden possibilities. The possibility, for example, of falling in love with a non-Jewish woman. Zionism enabled me, in other words, to evade the contradictions and complexities that went along with growing up not only Jewish but also American. Like Orthodoxy, it simply canceled the second term.

    I left Zionism, in my early twenties, as I had left Orthodoxy: for one intellectual reason and lots of psychological and existential ones. If the premise of Orthodoxy is the existence of the Jewish god, the premise of Zionism is that “it” can happen here. And at a certain point I realized that, no, it can’t happen here. There was anti-Semitism in America, some of it violent, some of it organized, some of it even both (this was the 1980s, the age of the rise of the right-wing militias), but there wasn’t going to be a Holocaust. America was a different kind of society than Germany, than any European country. Which meant that the argument for aliyah could not be exclusively negative. There had to be something drawing me to Israel as well as something pushing me there. Spending a year in the country after college, I discovered that there wasn’t — or at least, that there wasn’t enough. Israel was beautiful and charismatic, but it was also, ultimately, alien. Its culture was Jewish, but its culture wasn’t mine. The place was full of Jews, but not ones that I had a lot in common with. It wasn’t, after all, my home. My home, with whatever ambivalence, was America. My home was with other Americans.

    The question became, on what terms? The problems of being young were problems once again. Who was I, and what was my place in the world? Solutions began to arrive when I started paying attention to the parts of myself that I’d held in abeyance, the ones that didn’t fit the story that I’d learned to tell about myself. Above all, my love of literature, the fact that novels spoke to me more intimately, more stirringly, more persuasively, than anything else in my life — that it was in reading that I knew myself and felt myself most deeply. In high school, the one class that actually felt real, like it was about something that existed as more than a “subject,” an academic exercise, was the one in which I discovered Dostoyevsky and Camus. In college, as a science major, I’d hold a book beneath my desk in class. During my months on kibbutz, that year after college, I read my way through Nabokov and Kundera. I had been trying to tell myself something, and eventually I started listening. Instead of treating reading as a private passion while I gave myself to other things, things I didn’t ultimately have my heart in, I needed to make it the center. I needed to go where my heart was.

    Three years after I got back from Israel, I enrolled in a PhD program in English literature. To embark on the study of Western culture, in any of its aspects, is, as a Jew, to venture onto hostile ground. Anti-Semitism is foundational to Christianity and endemic to Western art and thought. In the English literary canon, it is famously present in Chaucer, Shakespeare, Marlowe, Dickens, T.S. Eliot. Anti-Semitic stereotypes and sentiments surprised me in Conrad, Hawthorne, Woolf, Waugh, Henry Adams, Henry James. They were waiting for me in Voltaire when I took a course on the Enlightenment; in Celine, when I studied for my oral exams. These were people who didn’t like me. But they also were not going to stop me. They were not going to stop me from claiming a right to the Western tradition. They were not going to stop me from feeding myself with its fruits. They were not even going to stop me from liking their work — and in some cases, loving it, studying it, and teaching it. That they would have sought to exclude me was not going to bully me into excluding myself. That they would have sought to deprive me was not going to shame me into depriving myself. They wanted — like the rabbis and the Zionists, in different ways — to keep me in the ghetto, but I wasn’t going to let them.

    Was I “colonizing” myself? No, I was educating myself. I was forming myself, with the freedom that America allowed me and the elements that it made available to me. I was rebirthing myself, by choosing a new set of forebears, a new inheritance. I was also choosing how I wished to be American, because there are as many ways as there are Americans. Besides, being colonized is not the worst thing, if you do it voluntarily — a lesson that I learned from literary history itself. Joyce, Faulkner, Rushdie: these and many other writers, by placing themselves under the tutelage of metropolitan cultures, had freed themselves from the parochialism, the mental confinement, the moral and aesthetic backwardness, of their places of origin. Better to be colonized like that than to remain forever captive to the group.

    As I moved out, internally, into a wider American space, I also moved out socially. In college, I had formed my first real friendship with a non-Jew, one of my freshman suitemates. He was Italian-Polish, from deepest Brooklyn, and had gone to an all-boys Jesuit high school. When I first met him, he was wearing a large wooden cross. Before long, he had taken it off. Our affinity was obvious: we were each in flight from intensely religious backgrounds and each voyeuristically curious about the other’s. But hanging out with him, as well as with a bunch of his high school friends who were also going to college in the city, allowed me to still play the Jew, to still mark my difference. As I moved through my twenties, what was new about my friendships was that I didn’t need to do that anymore. At a certain point, I realized that I no longer asked myself, when meeting someone new, whether or not they were Jewish. That, for me, was a happy day.

    So what is my Jewish identity now? I don’t practice at all (my observance of Passover consists of calling my sister, who is still Orthodox, and wishing her a happy Passover). I’m not affiliated at all. On Israel I’ve given up. But I still feel as Jewish as ever. I still am as Jewish as ever. I was formed as a Jew — my consciousness, my sensibility — and that isn’t ever going to change. I’m not half Jewish and half American; I am entirely both. I am also entirely a writer, a husband, a teacher (this last despite the fact that I no longer teach). I’m not Jewish in a way that any organized Jewish entity, of whatever kind, would approve of. And I couldn’t care less.

    This is not an essay about being Jewish — or rather, it is not only about being Jewish. It is about identity groups, and what it means to live inside one. For everything I saw growing up as an Orthodox Jew and, to a lesser extent, as a Zionist, I see in the formations — highly mobilized, politicized, and ideologized — that dominate our social space today. Just as Little Adler reminded me that it’s all about us, so have the ideologues of African-American identity — to take the most conspicuous example — instructed us that it’s all about black people. We are told that the American Founders undertook the Revolution out of fear of the abolitionist movement that was gathering strength in Great Britain (they didn’t, and it wasn’t); that the slaves built America’s wealth (all of it, not some of it); that blacks were exclusively responsible for abolition (never mind the Union army, the Radical Republicans, and Abraham Lincoln); that the blacks who came north during the Great Migration created the industrial boom in the Northeast and Midwest (of course, they were drawn by it); that blacks were exclusively responsible for the successes of the civil rights movement (never mind the whites who marched with them, the liberal Democrats, or LBJ); that modern urban policing grew out of Southern slave patrols (the first metropolitan police department was established in London); that black women are the “backbone” of the Democratic Party (even though they make up only about 10% of its voters) and “saved” Joe Biden in the general election (even though the decisive swing, from 2016, was in moderate whites); that Republican efforts at voter suppression, “a new Jim Crow,” are directed exclusively at black people (not Democrats as such — blacks, like the college students who are also targeted, being simply an efficient group to aim at); that the insurrection at the Capitol was driven by racism (rather than a stew of many right-wing hatreds) and that the police response, so different 
from what greeted Black Lives Matter protests, was, as well (even though the crucial difference, as any student of left-wing movements can tell you — remember Kent State? — was not black versus white but left versus right); and, in sum — to sweep away all inconvenient particulars — that racism is the single ruling factor in American history, the hidden hand in every institution and development.

    We are also told that the history of race relations in America is one unchanging scene of oppression. Nothing alters or alleviates it: not emancipation, not civil rights, not affirmative action. As in 2021, so in 1950, 1890, 1619. Like Amalek, the demon known as white supremacy appears to be immortal, returning under different guises but eternally the same. Progress is unattainable, because the enmity between the races is immutable, not a historical struggle so much as a metaphysical one: black versus white, or, when things get really Manichaean, “blackness” versus “whiteness.”

    We do not find these mental structures only in the area of race, of course. For contemporary feminist orthodoxy — at a time when women earn fifty-eight percent of bachelor’s degrees, sixty-four percent of master’s degrees, and fifty-six percent of doctorates — today is one of the worst times ever to be a woman in America, and America is one of the worst places in the world to be a woman. According to the Human Rights Campaign, one of the nation’s leading gay and lesbian advocacy organizations, the rights of LGBTQ Americans are under unprecedented assault — this in the wake of Obergefell and Bostock, decisions that removed the last significant legal disabilities from those groups. And in the world of identity now, just as in my early milieu, it is everywhere us versus them — with extreme prejudice toward them. All whites are racists. Masculinity is toxic. Working hard and showing up on time are white things (yes, that’s a leftist position today). Independence and competitiveness are male things. We loathed and scorned the goyim in my Orthodox community, but at least we kept it to ourselves. Now, public expressions of hatred for the other are not only acceptable but applauded. “I’m so done with white people,” a member of the Twitterati will announce, to a chorus of likes, or, “Is there anything that women can’t do better than men?” “White,” “male,” and “cis” (not to mention “Karen”) are terms of ridicule and abuse, and speak for themselves.

    And just as in my Jewish world, the group demands unswerving adherence to norms. Deviate, and you’re no longer part of the us. Barack Obama isn’t “really black” (because he “talks white”). Pete Buttigieg isn’t “really gay” (because he “acts straight”). Where integration was once the goal, now assimilation is the bugaboo. Even worse are the apostates: black intellectuals who question critical race theory, feminists who point out feminism’s gains, trans writers who challenge the official line on youth transition or the social construction of biological sex. But who is harmed by assimilation? Who is challenged by dissent? The self-appointed leaders — the demagogues, the “spokespeople,” the professional Jews, blacks, feminists, gays — who need to keep the walls up to protect their status and their gigs. Even more, the group itself: not the members of the group — the group. “The we closes its ranks to protect the space inside it, where the air is different,” Patricia Lockwood writes. “It does not protect people. It protects its own shape.”

    I grew up in a community that was still deeply scarred by the Holocaust. My father was far from the only former refugee in our synagogue, and my second-grade teacher was a survivor. Many people had lost family, including us. Even today, when I read about the Holocaust, I think, we should have plowed Germany under, after the war, and scattered its inhabitants to the four corners of the earth. So I understand why historically oppressed communities develop the mentality they do. And I also understand the appeal of that mentality, of group identity, to the individual, especially the young individual and especially in America. I don’t mean the America of the identitarian imagination, of oppression and restriction, but its opposite: the America of freedom and possibility, of mobility and flux. The America that says: you can be whoever you want, but you won’t get any help in figuring it out. Because freedom is disorienting, and ready-made identities are reassuring. All the more so now. Everywhere the word today — as the images fly by — is “authenticity,” “authenticity.” We speak of it because we feel the lack of it. But in this age of relativism, of radical skepticism, of anti-institutionalism, when the self is always up for grabs — my situation, as I stumbled out of Orthodoxy, but a thousand times worse — the identity group can claim an ontological solidity, an iron-girder foundationality, that nothing else can. It alone, of all formations, is legitimate. It alone possesses the power to tell you who you are. And that’s tremendously seductive, even if it’s who you’re not, or not completely, or not anymore.

    It is especially seductive for those in the process, in the concrete circumstances of their actual lives, of leaving the exclusive environment of the group. That is, for members of minority communities who are rising to join the elite: students of color at fancy private high schools and colleges; their elders in academia, at The New York Times or NPR, in Silicon Valley, at the major foundations and think tanks, climbing up the ranks in Washington, DC, in Hollywood, and so forth. They are the ones who need to constantly insist on their identity, to reaffirm their separateness against the fact of their participation. Which is why they are the ones who have been preaching the identitarian crusade. It is elites within the Hispanic community who say “Latinx”; the vast majority of Latinos hate the word, when they have even heard of it. It is elites within the black community who propagate the dogma of critical race theory; blacks on average are actually more moderate than the typical white Democrat. It is elites within the Asian-American community who inveigh against assimilation; most Asian-Americans are busily assimilating. But then so are the elites, of all groups — so especially are the elites — and it is only their bad conscience, or, more charitably, their understandable ambivalence, that leads them to imagine otherwise. I was at Columbia for one of the uprisings of African-American students. “Columbia is a plantation,” the protesters said. Columbia was not a plantation. It was an institution that was ushering them, often with generous financial aid packages, into the upper middle class. But that is a destabilizing thought.

    I am not suggesting that individuals from marginalized groups who find themselves in elite settings should just be grateful and keep their mouths shut. Nor am I suggesting that they all should do as I did, or that the process of finding your place in the wider world, for such an individual — call it integration, assimilation, or what you will — is ever less than highly fraught. What I am suggesting is that it’s a process that you go through as an individual — that you must go through, in some fashion — and that to cling to a collective identity, particularly in the artificially exacerbated forms in which they come to us today, is to evade your own reality. “The question of color,” James Baldwin wrote, “operates to hide the graver questions of the self.” America had told him what he was, he once remarked; he went to France, leaving blacks and whites alike behind, to find out who he was.

    To insist on cultural self-segregation is to limit one’s own possibilities. Who would I be if I had only studied Jewish sources, or only read Jewish writers? What is a young woman missing if, as many of them say they do today, she only reads female authors? You won’t read Shakespeare? Virginia Woolf did not share that prejudice, and would not have become Virginia Woolf if she had. Children should read books, we are told, about people who “look like them.” But as Glenn Loury, the black intellectual, recently remarked, everybody “looks like them” — that is, human. No, not everyone is fully free, not yet, but everybody’s mind is free. The only limits there are are the ones that you place on yourself, that you agree to accept from others. And culture, too, is free. People may discriminate, but books do not. They make themselves available to anyone who cares to read them. Ruth Simmons, the former president of Brown University and the first African-American to lead an Ivy League institution, was asked why a sharecropper’s daughter would decide to study French literature. “Because,” she said, “everything belongs to me.” You can’t choose where you’re from, but you can choose where you want to go. “So, in Macon County, Alabama,” Ralph Ellison wrote, “I read Marx, Freud, T.S. Eliot, Pound, Gertrude Stein and Hemingway. Books which seldom, if ever, mentioned Negroes were to release me from whatever ‘segregated’ idea I might have had of my human possibilities.”

    The author of Invisible Man did not stop being black, of course, any more than I stopped being Jewish. But he worked through to a way of being black in his own individual fashion. And in so doing, he enlarged the possibilities for every African-American who followed. A healthy identity, for the group as for the individual, is not rigid and immutable, but creative and ever-evolving. That is progress. That is liberation.

    Soloism

    “You are the music/ While the music lasts.” Whatever these words mean in Eliot’s Four Quartets, they have often been given new meaning in dance, and nowhere more so than in the solos choreographed by Merce Cunningham. His most radical, controversial, and profound contribution to choreography was to separate it from music — or so it seemed. The dance that the audience saw and the music that it heard were composed independently. The audible music, which often varied unpredictably at each performance, operated separately, sometimes like a hostile environment. What was easy to miss was that, whereas most dance responded to heard music, Cunningham’s dance embodied many unheard musics. It was a theatrical adventure without serious precedent: sometimes the dancers seemed to be at one with “the music of the spheres.”

    The neurologist Oliver Sacks often wrote, most hauntingly in Awakenings, of the effect of music on physical coordination. He turned this effect into a participle: some recovering patients feel that they are “musicked,” others feel “unmusicked.” The profound connection between music and movement that he describes is something that all lovers of dance surely recognize. But Sacks tends to write of music as something that enters the human from an exterior source, as a stimulating accompaniment. Cunningham took the process to another level: he and his dancers made music by dancing it. The originality of the approach can hardly be exaggerated.

    Cunningham died in 2009. Are the musics within his choreography surviving his death? April 16, 2019 would have been Cunningham’s hundredth birthday. Since he died not too long ago and presented his last big premiere on his ninetieth birthday, memories were fresh. The centennial evening was marked by “A Night of a Hundred Solos,” staged in three cities: London, Brooklyn, and Los Angeles, a hundred in each city, a different anthology in each case. If you knew Cunningham’s choreography, it was natural to regard all three events as retrospectives, spanning the fifty-six years of his extraordinary career, 1953 to 2009. I was among many who watched one performance live, the other two on screen.

    Yet much about the evenings was anti-retrospective. The dancers could have included alumni of Cunningham’s own company (several of whom are still in good shape), but they were excluded. Instead, the performers were chosen from a far wider range of ages, physiques, and technical skills than Cunningham himself had used. As a result, the performance opened up Cunningham’s work in new directions to dancers who had never performed his work in public before. As always after a choreographer’s death — and sometimes during her or his lifetime — there were moments when his spirit seemed entirely absent, and others when it was alive but in peculiar ways. Often enough his solos were marvelously present, with an astonishingly diverse assortment of dancers, newly musicked, offering a wide range of physical and expressive idioms.

    In a career devoted to experiment, Cunningham opened many doors to the future. (“New possibilities” was a favorite catchphrase of his.) What was uncanny about “Night of a Hundred Solos” is that, by creating a dance theater composed of solos alone, it anticipated the realm of social isolation that would become the new norm within less than a year.
Even when the stage contained several different solos at the same time, each of the dancers was isolated, focused, motivated — and sometimes conflicted, sometimes experimenting. For Cunningham, a solo had always been a process of self-discovery — and here was the process continuing after his death. The London, New York, and Los Angeles editions, each ninety minutes long, were quite different anthologies of Cunningham material, but each amounted to more than a collection of solos: all three were celebrations of the possibilities of soloism. A year later Cunningham solos were being taught on Instagram, where “mercetrust” now has over 39,000 followers. And each solo is a musical composition in physical form.

    Cunningham was a skilled and powerful partner, trained in ballroom genres from his early teens; and he was happy amid company. But he was also, by temperament, one of nature’s soloists: freest when performing alone. In his case, isolation really was splendid.

    He began studying modern dance at the age of eighteen in Seattle, with Bonnie Bird, who had studied with Martha Graham. Graham, notably in her masterpiece Primitive Mysteries in 1930, had been studying the dances of the Native Americans of the Southwest. This was an era when American anthropology was seizing the imaginations of many: it helped Americans to identify the traditions peculiar to their continent. Cunningham, in his second year of studying modern dance, discovered anthropology, which remained a lifelong source of interest — in particular the dances and culture of the Swinomish, a tribe of Native Americans that lives and fishes on the islands and coasts of northern Puget Sound, in northwestern Washington state. Whereas the dances of the Southwest tribes emphasized the collective, many of the dances of the Northwest are trance dances for soloists. As Bonnie Bird put it: “The Northwest Indian dance is spirit dancing, quite different from the group dancing of the Southwest — it’s always solo dancing, one dances only when one has ‘caught one’s song’ and is filled with the spirit that takes over, that invades one.”

    In 1939, Cunningham moved to New York, dancing with Graham’s company until 1946. He began presenting his own choreography in 1942. From the 1950s on, his own work became renowned and notorious for what remains the most radical move in modern choreography: making dance and music co-exist independently of each other. The word “song,” used by Bird to describe the Northwest Native American dances, is often used by Cunningham dancers, though few of us were aware of this while Cunningham was alive. Once you know, however, it seems obvious. Patricia Lent, who danced for Cunningham from 1982 to 1993 and was named by him as one of his Trustees, has observed that

    Merce (and all of us) “sang” the rhythm of phrases. It’s related to, but not the same as counting a phrase — a little more open, using words/sounds for the movements mixed in with counts & claps & snaps, in a way that indicates rhythm, dynamics and timing, etc. If a phrase has multiple meters or is non-metric, it can still be “sung.”

    The dancer Jeannie Steele was a member of the company between 1993 and 2005. Those dates mean that she belonged to what is known as Cunningham’s computer period, when he was generating choreographic ideas on screen before he taught them to the dancers. This often seemed to change the nature of Cunningham’s phrasing and all-over physicality — and yet Steele, speaking in 2019, recalled that “My head was full of songs and sonatas when we danced!” She added: “We didn’t hear those songs and sonatas; yet we saw them, even felt them.” When I asked Steele whether the music in her head was ever remotely like the music of John Cage or other Cunningham composers, she said, “No — it was not minimalist.” And she made a further point. When making a solo, Cunningham would sometimes say, “Rhythm is one thing, but phrasing is another. For phrasing, you have to listen to Billie Holiday.” He loved Holiday for her way of bending a rhythm within a phrase. Although Cunningham did not speak of Billie Holiday to most other dancers, he did encourage various forms of rhythmic play and (in some dancers, not all) rubato, the old musical technique of taking freedoms with tempo in performance, of delaying and accelerating for purposes of interpretation. BIPED, from 1999, the most generally acknowledged masterpiece of his final years, begins with a suite of five solos. The first was made on Glen Rumsey, whom Cunningham instructed: “This solo must always be one minute long — but you must take different parts faster and slower at each performance.”

    In 1940, when the young Cunningham was sent by Graham to study at the School of American Ballet, Lincoln Kirstein, the philosopher-historian of ballet and the school’s director, did not look kindly on the prospect. “Why would a modern dancer want to study ballet?” Cunningham’s reply was frank, simple, and accurate: “I really like all kinds of movement.” Indeed he did. Before he had taken up modern dance at the age of eighteen, before he had observed the Swinomish dances at the age of nineteen, he had trained extensively in tap and ballroom; and many of his subsequent dancers felt that tap was near the roots of his creativity. The radical Cunningham was not always averse to the traditional interaction of music and dance. From the 1970s on, many who took his technique classes remember the glee he took in the pianism of Pat Richter, whose repertory ranged from Scarlatti to Rodgers and Hammerstein.

    A part of the technical training that he gave to his dancers was to increase their speed and thereby the finesse with which they could play with rhythm. But if there was no pianist, he himself gave his dancers their music. The Cunningham dancer Alan Good, who danced for Cunningham from 1978 to 1994, said recently:

    It’s not true that we took class from Merce without music — Now that he’s gone, I crave a recording of it! Merce had that amazing way of snapping his fingers, calling, hitting radiators, tapping his foot, slapping his leg — incredible rhythms. By any stretch of the imagination, that’s music.

    This was Cunningham the jazz musician. Those Cunningham dancers with prior tap training often discerned a tap sensibility in the way he would teach them the rhythms and phrasing of a new dance. For Cunningham, rhythm was already an opening to music. Sometimes the rhythms were of exceptional complexity. To an outsider, Cunningham’s Native Green, from 1985, seems like one of his less physically rigorous pieces, with a marvelously rippling lyricism, and an emphasis — unusual for him — on connective flow throughout the limbs and body. (This “ripple” effect may have been Cunningham’s adaptation of the “release technique” developed by a number of choreographers who had worked with him in the past.) But it opened with a solo that Cunningham gave to Megan Walker, who has recounted how she was dealing with sequences all in different odd numbers: 9, 7, 5, 11, 13, 15, 17, then 13, 7, 5, 15, 17, 11, 9, then 11, 13, 9, 15.

    My experience dancing the solo was that the odd inorganic measure of count was done to break up the possibility of the movement becoming too “flowy” and predictable. The movement was very sequential within each body part but also between body parts. So the back might ripple — lower, middle, upper — but also have a sequential relationship to the entire body: torso, leg, arm, head, for example. Or head, body, arm, leg, or some other variation of body parts following one another. It was sensual but awkward and weird.

    There are many musicalities, inside dance and beyond it. For some dancers, movement must be absorbed as meter: they need to count. For others, it is a more jazz-like kind of rhythm: hence the tap response. And for others, movement is melody: they have to sing it. Cunningham allowed, and educated, his dancers to work in all these ways. A Cunningham solo need not stay rooted in one rhythm alone, while almost every larger Cunningham dance deploys a wide range of rhythms. There are works of polyphonic music in which three or more voices are heard, with different speeds but part of one encompassing rhythm. Cunningham, however, developed something more: an alternative polyphony in which multiple dancers go about their business onstage, each in a different meter. The music in a Cunningham piece is never part of the music we hear, except the sounds of the footwork.

    While the Cunningham dancer explores time, he or she explores space, too. The connection of time and space is as elemental to dance as it is to physics, but for the musician-composer John Cage it was especially so. Cunningham first encountered Cage in 1938-1939 in Seattle, when Cage, who was twenty-six, joined the faculty of the Cornish School where Cunningham was studying. In some classes Cage taught the dancers to consider dividing both space and time as they made dances, and to compose percussion music as they danced. Cage, a man of Diaghilevian enthusiasm for the latest modernism in music and art (his wife Xenia was a surrealist sculptor), encouraged Cunningham and other students to study new art. When Cunningham arrived in New York in 1939, he continued the process, befriending many artists. And when the Cages arrived in New York in 1942, initially staying with Peggy Guggenheim and Max Ernst, they intensified Cunningham’s absorption in new painting.

    When Cunningham began to present dance recitals — usually with Cage as the lone musician — there was always a large number of artists in the audience; virtually a clique. According to one anecdote told in a book about Willem de Kooning, in 1946 the art critic Harold Rosenberg stood up before curtain time at a Cunningham performance at Hunter College and in a booming voice announced, “There’s a stranger in the third row — throw him out!” Then, as the lights began to dim, he jumped up again. Looking around and not seeing the sculptor Ibram Lassaw and his wife in the audience, he protested, “The evening can’t start — the Lassaws aren’t here yet!” Elaine de Kooning told the same story, but about a Cunningham concert in 1954. Cunningham’s early dance recitals — avant-garde by the standards of both ballet and modern dance — appealed to the clubbish world of New York painters long before those painters attained fame and even longer before Cunningham’s work grabbed other dancegoers.

    I am reminded of those artist-dominated audiences whenever I see Cunningham’s Suite for Five, from 1956. (It incorporates some male solos from a 1953 work, and in 1958 Cunningham added a female solo to it.) This, as pure a dance as Cunningham ever made, is dramatic not because of any acting but because it abounds with formal and physical contrasts. There is nothing that resembles a plot: we seem to be watching an artist at work in the studio, trying out ideas. When we first see the lead man (originally Cunningham himself) he is standing still in one downstage corner but is facing along a further downstage diagonal. To open his first solo, he simply turns his upper body, from the waist, to face the side. Returning to his original stance, he slowly rises onto the balls of his feet (relevé in ballet) in a way that looks precarious, hovering on half-toe for the sake of its sheer vertical emphasis. His next sequence of steps takes him backwards along that diagonal. Sideways; upwards; backwards — the work has begun by addressing fundamental facts of space: you hardly notice that they are etched in long, firm phrases.

    In the same work, when the leading woman dances a solo, she is both artwork and artist. She has an early phrase in which she, balancing on one leg with her other leg extended behind her in a sculptural position, rotates from the ankle from side to side, so that her leg, torso, and arms maintain their fixed position while swaying as if in the breeze. The image always evokes one of Alexander Calder’s mobiles. (Calder had been an inspiration for New York artists since the 1920s and 1930s.) Moving forward towards us, she then turns and tilts her upper body to the side, and then extends her arms forwards as if making a long hoop, one arm above the other. With her legs facing in one direction and her arms in another, she is multidirectional. And then she parts those arms wide, as if opening a vertical zone of air. It’s as if she is sculpting space.

    In those solos and others, leading Cunningham dancers, although performing set choreography, seem like creative “action” artists, as vivid in their cool way as film of the more violent Jackson Pollock throwing paint on a canvas. It might seem surprising to describe Cunningham as an Abstract Expressionist: one intelligent account of his work, Roger Copeland’s Merce Cunningham: The Modernizing of Modern Dance, argues that Cunningham and Cage sided with their friends Robert Rauschenberg and Jasper Johns in rejecting the “action art” of the earlier generation of New York artists. But the issue is more complex than any either/or. Willem de Kooning, the only Abstract Expressionist to share Pollock’s degree of eminence, spoke to Cunningham of his admiration, and Cunningham valued his praise. There was a personal antipathy between the extremely heterosexual Pollock and Cage, whose private homosexuality Pollock seems to have spotted, but that did not stop Cunningham from recognizing Pollock’s importance. In an interview in 1970, Cunningham chose Pollock as the exemplification of the modern impulse in theatre:

    I think it is essential now to see all the elements of theater as both separate and interdependent. The idea of a single focus to which all adhere is no longer relevant. With the paintings of Jackson Pollock the eye can go to any place on the canvas. No one point is more important than another. No point necessarily leads to another.

    By the 1950s, Cunningham was a master of peripheral space and unorthodox stage geometries. Rune, from 1959, begins with the male dancer alone not only in an upstage corner but facing into it: in one of the most peculiar beginnings of any dance, he stands still there for a whole half-minute before moving. One basic law of Cunningham physics is, Wherever the dancer is facing, that is front. Cunningham often took conventional geometries and made surprising use of them. Perhaps no path across the stage is more orthodox than the diagonal, especially when crossing from an upstage corner to the downstage corner opposite. It was a favorite of the late nineteenth-century choreographer Marius Petipa, and it was also recommended by the old-school modern-dance guru Doris Humphrey in The Art of Making Dances. Cunningham uses it often — for example in a renowned solo made for Holley Farmer in Loose Time in 2002. But see what he does with it! For a start, the diagonal is the entire solo, which takes the better part of two minutes. (On the Hundred Solos night, the ballerina Sara Mearns danced it, barefoot, as she has on a number of subsequent occasions in several countries.) The soloist punctuates it with a furious series of steps (many of them jumps), attacked with maximum intensity, in which arms, legs, and spine keep changing position, all phrased in groups of threes. You hardly notice that her trajectory is a diagonal, because she keeps changing direction with each step. She is hurling thunderbolts, but she never fully releases their energy. Even though her energy is explosive, we are aware that her furnaces stay charged.

    Cunningham’s Neighbors, from 1991, starts with a lone man leaping along the opposite diagonal — from upstage left to downstage right — in three jumps of exciting impetus. But whereas a Petipa dancer might use such a line of jumps along that same diagonal to convey a single progress, as stations of a single journey, Cunningham changes the emphasis on each jump. The dancer takes the first jump facing in the direction of the jump. On the second, however, his torso pivots so that, though his legs have taken him in the same direction, he lands facing sideways. And on the third, his torso pivots yet further, so that he lands facing back up that same diagonal, looking in the direction from which he came. You watch the art disassembled into its basic elements, which are then re-combined in wholly unexpected and unprecedented ways.

    For many Cunningham dancers and observers, it is important not to attach meanings to steps, and generally they are right. Still, you cannot fully experience Cunningham’s work as drama unless you find some passages that lodge ideas in your head. Neighbors has not been danced since the early 1990s, but I have never forgotten that opening diagonal. (The dancer was Alan Good, marvelously in-the-moment.) Although I resisted the temptation to interpret it, the haunting effect of that sequence — with arrivals successively facing ahead, sideways, and backwards — began to suggest someone who addressed future, present, and past. Meanings began to insinuate themselves. After Cunningham’s death, his choreographic notes were opened to researchers at the New York Public Library, and, lo and behold, Cunningham prepared Neighbors with the working title of Hierartic Shaman. (The strange spelling “Hierartic” is consistently used over some weeks of notes.) The dancer who opened it with those jumps had begun life in Cunningham’s highly formal mind as a shaman, another of his anthropological reimaginations.

    Since Cunningham took the purity of pure dance into new zones, his dances have too seldom been considered as drama. Most Cunningham dances call for no dimension of acting. The pared-down lack of make-believe or role-playing was an important ingredient for many of his performers. And yet we would be wrong not to regard them as theater. John Cage liked to say that any street corner was theater anyway. A few Cunningham dances do have evident elements of dance acting — Crises (1960), RainForest (1968) — but they are so far from most forms of theater that people shy away from discussing their expressive aspects. I can only plead that to try to do so is to deepen the Cunningham experience. Let us consider only the beginning of Crises and the ending of RainForest.

    In Crises, a work revived this century, the curtain rises to reveal a dancer (originally the exceptional Viola Farber, whose performance was caught on film) in pure-dance conflict. She is unforgettably divided against herself. Two concurrent impulses are visible in her body at the same time: she slowly extends one leg to the side (développé à la seconde, to use ballet terminology), but above the waist her torso is convulsed by spasms. She could be a character out of Dostoyevsky (a writer whom Cunningham admired); she could be a case of Multiple Personality Syndrome. The lead male dancer (originally Cunningham himself) is gradually approaching her, but she is so locked in her own state, so possessed, that what she is doing seems like a solo. A duet does eventually ensue, but with this astounding opening image Cunningham has established how powerfully independent a woman can be. Crises is a dance for four women and one man, and all four women might take “I’ll be hard to handle” for their motto.

    Until its ending, RainForest is one of Cunningham’s many studies in change. We watch the first dancers leave the stage while others take their place. Everyone who enters is new to us, and is distinctly individual. There is no interest in any unison of identity. There are six in the cast: the fourth, fifth, and sixth dancers we see might each be from a different animal species. The title refers to two very different rainforests: that of the Congo described in the anthropologist Colin Turnbull’s book The Forest People and the one in the Pacific Northwest’s Olympic Peninsula, the magical realm to which the young Cunningham was introduced by his parents in his childhood. Nothing about Cunningham’s drama is a literal rendition of either locale: his imagination is powerful, but it is not narratively specific, so it also liberates ours. But then he adds two final twists to his wordless scenario. The sixth character is onstage when the original man strides back on. We were not expecting someone we knew. Has this succession of changing characters suddenly become La Ronde? Yet he is transformed: he is far more authoritative than before. Since RainForest has been a succession of highly charged encounters, we do not fully appreciate the change in him at first.

    Finally he is left onstage alone. The solo that ends RainForest is like nothing else we have seen here. He is on fire, furiously charging round the stage, swinging one arm like a propeller, full of abrupt changes of direction and focus. This is the role that Cunningham made for himself, and this is the solo in which he declares himself as seer and poet. Whereas he was once part of the RainForest world, now he has taken possession of it. The stage in RainForest famously abounds in the helium-filled silver balloons designed by Andy Warhol — but it is only now that a character commands this unpredictable vegetation with gestures that claim the air. If there is a line in poetry that comes to mind — it does invariably for me — it is Rimbaud’s “J’ai seul la clef de cette parade sauvage,” “Only I have the key to this wild parade.” (For John Ashbery, introducing his translation of Rimbaud’s Illuminations, Cunningham was one of Rimbaud’s heirs in his visions of the modern.)

    Cunningham’s dance theatre abounds with solos, but it is not anti-social. Some of his ensembles, even those with solos occurring in them, are like parties. How to Pass, Kick, Fall and Run (1968) combines vigorous athletic impetus with champagne celebration; Duets (1980) is a series of six couples each exhibiting a different idiom — “ballroomy,” said Cunningham, though no ballroom ever contained pairs like these; and Roaratorio (1983) is a (largely festive) wake. In a number of Cunningham dances, the entire ensemble — often more than twelve dancers — all take to the stage doing individual solos at the same time. A fine example is the end of CRWDSPCR, made for thirteen dancers in 1992. The work’s name is pronounced “crowd-spacer” or “crowds-pacer”: it is an intensely urban piece, with an ebullient finale that is the biggest moment of release throughout the whole work, yet with nobody connecting. Such scenes — “ensembles” is certainly the wrong word — are celebrations of contingency in human encounters, of diversity in social coexistence.

    Often the image of coexistence seems to become a peaceful, nothing-happening, everything-happening landscape populated by multiple species but undisturbed by man. These works — though several Cunningham alumni do not accept the term — are widely known as Cunningham’s “nature studies.” (Inlets, Inlets 2, Beach Birds, and Pond Way are prime examples.) In reality, they are studies in Zen. Some of the most striking soloist roles are for women. In Pond Way, from 1998, one woman slowly but unstoppably passes across the stage: she does not seem to notice that three men are dancing around her, dancing attendance on her, supporting her — as they claim her, she behaves with complete indifference, as if she has no need of them. Another woman dances a stationary solo that reminds me and others of a water lily. She steps into a balance, slowly allows an impulse to pass up her body like a wave, until her upper body and arms open into bloom; then she steps into the next balance, gradually opens upward again into bloom; and thence onward, flowering and re-flowering.

    BIPED, the sensational hit of Cunningham’s old age, had its premiere the year after Pond Way, and two weeks after his eightieth birthday. It begins with a suite of five brilliantly energized solos. Each swept to and fro across the stage; each drew a different irregular chart in a different rhythm. Whether you saw these dancers as humans or spirits — for many people BIPED is about transcendence or death — their range of vocabulary said much about Cunningham’s view of energy itself. He himself was increasingly confined to a wheelchair by then, but not so his idea of dance. Ten years later he celebrated his ninetieth birthday with Nearly Ninety, which abounded with trios, duets, and, yes, solos. One section was a series of no fewer than eight successive solos for different dancers. Every solo announced the dancer brilliantly, with marvelously differentiated uses of the head and eyes. Brandon Collwes, facing front as he began, slowly turned his eyes to his left while his left foot stealthily crossed over to the right, as if the action of his eyes might distract you from noticing that that foot was moving in the opposite direction. And every solo had a “but,” a striking change of topic. Rashaun Mitchell began with his feet planted far apart, slowly swinging his pelvis from side to side like a pendulum, but he ended with a stunning circuit of jumps around the stage, and each jump, hovering at its apex, struck a different shape.

    When is a solo a solo? Often in Cunningham it is hard to know whether a quintet is a quintet or five simultaneous solos, or whether a duet is a duet or two adjacent solos. And it is deliberately hard. Cunningham loved ambiguity and hated categorization. August Pace, in 1989, was an ensemble for fifteen dancers, but its duration was structured as a sequence of seven male-female duets. Intermittently there were marvelous supplementary passages that added extra layers of structural mystery around each duet: the fifteenth dancer, never seen in a duet but often partnered by several men at a time, was like the joker in the deck. Within each duet, when each couple met for partnering, they did so dramatically, with different kinds of support. In the final duet, the woman took exciting falls into the man’s arms.

    In most of each duet, however, woman and man stayed notably apart from each other, in separate but simultaneous solos, in separate meters and thus in separate spheres. It is no surprise to discover in Cunningham’s notes that the August Pace duets were largely planned as pairs of independent but simultaneous solos. In most choreography, the term pas de deux, literally “steps for two,” is a misnomer, with the man doing little but supporting the woman — but in Cunningham we often see separate pas given to tous les deux. The surprise is that several of his other duets involve sustained proximity between their two dancers.

    His penultimate work, XOVER, from 2007, consists almost entirely of couples. But while the opening and closing sections were ensembles made up of pairs, the long central section was one long, slow duet. It lasted seven minutes and thirty seconds, and it tested the duet form to a new kind of extreme. Maintaining a single tempo, the couple moved in a zig-zag path across the stage, and they never left one another’s side. But they were often at right angles to each other, connecting a series of statuesque positions in a steady legato sequence; and their eyes seemed never to meet. Their degree of contact and support seemed largely formal, though they showed signs of mutual awareness, even mutual curiosity. This was, it seemed, a picture of a couple politely cohabiting while maintaining separate lives of the mind: both beautiful and poignant. Beckett once remarked that Waiting for Godot was about symbiosis; the same can be said of the XOVER duet.

    To borrow a Sondheim line, many Cunningham duets seem to ask, “Are we a pair?” His solos, however, contradict another line from Sondheim: “Alone is alone, not alive.” In Cunningham, alone is alive. No soloist in Cunningham seems lonely, and many are freer when apart from others. And in an era of epidemiologically enforced isolation, Cunningham’s art of soloism seems more profoundly connected to human experience than ever before.

    Cunningham died in July 2009. He amazed many (though not all) by announcing in June of that year that he wanted his dance company to deliver a two-year posthumous tour and then close operations forever. Sure enough, the Cunningham company toured the world intensively in 2010-2011, giving more performances in 2011 than in any year since 1987. Thus December 2011 became the Cunningham Dance Company’s final month of performances. Three of them were at the Brooklyn Academy of Music. One of the works performed there was Split Sides, which had its premiere in that theater in 2003. An elaborate affair, it involved two musical scores (by the British rock band Radiohead and the Icelandic experimental group Sigur Rós), two décors, and two lighting plans. People in the audience, often celebrities, were invited onstage before each performance to shake dice, which determined the order of the scores, lighting, sets, and costumes. You never knew in advance how Split Sides would look or sound.

    The binary nature of Split Sides continued into one highly complex solo for the young Jonah Bokaer, who had to ride two different rhythms, both intricate, at the same time, often while hopping or balancing on one leg. After Bokaer left the company, Cunningham gave this solo to his newest soloist, the prodigiously talented Silas Riener. He had over three years of experience in it when he danced it in Brooklyn that December. He made an immediate sensation, with applause bursting out as he left the stage after his dance. Cunningham had been fascinated by the possibilities of balance since at least the 1950s: by one or more dancers balancing on one leg, adjusting balance while on one leg, hopping and/or tilting on one leg while maintaining an otherwise fixed position. Riener’s Split Sides solo, even more audacious and complex than Bokaer’s had been, has become Cunningham’s last and posthumous demonstration of the miracle of balance.
(The Brooklyn Academy captured his performance on video, which was quickly released and soon went viral on YouTube, where it remains an amazement to this day; type “Silas Riener Cunningham Split Sides.” Indeed, the Cunningham adventure continues on Instagram and Facebook, among other media, much consulted when the world’s dancers were at their most isolated from one another during the Covid years; and a Doubles Suite arranged by Cunningham in 2008 may be seen both in Tacita Dean’s sensuous film of the Cunningham company’s Craneway Event that year and in the DVD of the Cunningham Company’s closing performances at New York’s Park Avenue Armory at the end of December 2011.)

    Cunningham made Split Sides during his “computer period,” when he devised dances on a dance-adapted computer before teaching them to the dancers. (The dancers never saw the computer, just as they seldom if ever saw him using chance procedures, though his notes reveal that, just as he had said, he used dice or coins to determine many aspects of each work from the 1950s onward, often a different compositional aspect with each work.) He used the computer to delineate feats of physical coordination that had never been attempted before. It was sometimes possible to feel that this made the dancers look like robots — but what do any of us know of robots? Riener in Split Sides was many things: an alien from another planet, intensely human, wholly unpredictable, kinesthetically affecting. There are many moments in dance when, as we sit still watching, we nonetheless feel some part of us is dancing or being danced. This solo abounded in them, so that Riener’s initial staccato outbursts at first seemed inhuman, robotic — and then, such was his sustained control of physical energy, we were swept along by his impetus even while being amazed by it. I saw Riener perform it in Cunningham’s lifetime in 2008, and twice at the Brooklyn Academy in 2011, but there are still moments that astound me to watch on video. He chews up space as he travels backwards across the stage on one leg, and then, with the other leg stretched arrow-like behind him, he suddenly tips his torso over to the side: an invariably breath-taking feat. Watching, we feel that we are losing our balance with him — and then we realize that we are not. Like the amazing figure on stage, we are still in control.

    Cunningham himself had been a great dancer, often performing lead roles with jumps into his fifties, always onstage in at least one work given by his company every evening until he was seventy, sometimes returning to perform onstage into his eighties. At every stage of his career, however, there were important works in which he showed his faith in what his dancers could do without him. One such, in 1985, was named Doubles. One of the name’s implications was that Cunningham planned it with two casts, but gave each cast its own striking differences of inflection from the very first rehearsal. There were three male solos, one female solo (which opened the work), and a slow, recurring female duet — no touching — in the peripheries of the stage (which, Cunningham’s notes reveal, was originally planned as another solo). Each solo lasted several minutes in its own sustained meter; some of them seemed to have the physical spacing of chess pieces.

    The dancing was so powerful, each dancer was so individual, that you could reasonably see these solos as the heart of Doubles. Yet the Doubles world was larger than these thrilling individuals. As in August Pace, Cunningham created other ensemble dance material that conjured a fluid climate and context, like an urban park or piazza. These ebullient solos poured forth with force, pulse, and wonderfully heart-catching details (sudden changes of direction or focus, jumps whose apex caught the light, gestures of arms or legs that claimed the moment) while other people — including those soloists — came and went in various speeds and idioms, with the sublimity of a city square. What a piece of work is man! And how beautiful the democracies within which he or she can be free.

    Hardly a Day

    I.

    All time indeterminate now
    so this might be late or early
    and hardly a day in itself.

    Call it infernal nevertheless
    with my first move a descent
    into air thick with lamentation.

    I mean tension in the clock
    as it works towards sunrise
    and fear becomes natural law.

    *

    Starlings in the tree opposite
    are ghosts from hollow graves
    and green leaves denote death.

    Any loud chorus no longer is
    natural breaking such deep hush
    and human greetings anonymous.

    Forget also courtesies of touch
    its delicious extended sentences
    and bracing diamond texture.

    *

    Which reminds me to make clear
    after speaking droplets of spittle
    will stay airborne eight seconds.

    Long enough anyone might think
    to question what does and does not
    remain subject to our attention.

    If I mention my old man’s hands
    with their liver spots and arthritis
    is that worth the risk of expression.

    *

    Interest does inflate a little
    in the novelty of insults inflicted
    eg cuticles cracked with washing.

    Or specs fogged with breath
    funneled behind the face mask
    and if oniony never that good.

    Not to mention regions destitute
    where the homicide rate holds up
    that being too important to fail.

    *

    There is still noticing however.
    There is the shining eye-machine
    in the quiet remnants of life.

    Impatient I might be elsewhere
    but still favor neglected things
    and continue democratic in that.

    Littleness being one form of life
    on the bleak shore or otherwise
    inclined to seek the silent floods.

    2.

    Midday purgatorial stroll allowed
    and today’s high tide a black cut-out
    the harbor at this time exactly fits.

    Matching forms and the cormorant
    also adaptive in its timely practice
    vanishing then bobbing up well fed.

    An angel too it goes without saying
    there is one skims the silent flood
    inviting me on shores unknown to lie.

    *

    Which leaves things where exactly
    not a real down and dirty sandal
    puckering the pure water mouth.

    More likely a lavender yoga nut
    head-standing by the dry dock
    her world knowingly upside down.

    While current in the deep down
    never knows what odd beauty
    or obstacle might strike it next.

    *

    The port of entry long since gone
    hands pressing for free passage
    reefed sails and wonderful land.

    Now wind over the harbor it is
    and gusts worrying on hastily torn
    dull troughs and avid gleams.

    Meaning in point of fact waves
    lap-lapping to amuse themselves
    while higher levels bide their time.

    *

    Fresh out of sympathy cards
    things are that bad but easier
    than life with no horizon line.

    Although most it is true persist
    who never did master their iCal
    and here remain watchful enough.

    Except to be honest why bother
    every damn thing is wiped off
    in the re-sale eventually all of it.

    *

    At my back a peopled city made
    a desert place it is alas too true
    and DC remains just plain shite.

    Still I might point out that tanker
    in tune with old times at its berth
    accepting the Domino Sugars rush.

    Muscly as hell I don’t doubt it
    and at worst the harbor water
    opposes only mildly disparaging.

    3.

    Evening falling well that is still
    reliable although no bird tonight
    presumes to steer his airy flight.

    The heavenly host instead rises
    from fiery beds to speak to me
    but I would say that wouldn’t I.

    Fireflies that is courteous lights
    albeit mad dogs in their fashion
    frisky through the dusk advance.

    *

    Call it paradise lost or more like
    paradise impossible when claws
    scrabble to open my skylight eye.

    Also that dry note like cobwebs
    coughed into the boiler’s throat
    on the dot as darkness settles in.

    Until hello as usual the future
    meets me with low intelligence
    in the first two inches of gin.

    *

    Occasionally an ocean breeze
    stinking of fish could be worse
    just think of Gorilla Gang turf.

    I might also mention there are
    green peppers frying in the pan
    and notice one losing it to gold.

    But come on! That might well be
    enough to contemplate in one day
    before shutting down entirely.

    *

    Avernus which means bird-less
    a sulphur lake in all probability
    although not here so far as I know.

    Here birds doze on black water
    or blindly scavenge steered by
    piratical scouring of the Earth.

    Here my cat sits in the holly bush
    with her mouth already wide open
    to snaffle the wings as they unfold.

    *

    My country gods I left behind
    my soft approaches also
    and my dreams that fly the day.

    What hope remains my death
    must give and that is not
    to exaggerate in the slightest.

    Meanwhile I must fatten myself
    on forms without their bodies
    and if feathers and dust so be it.

    After the hurricane

    After the hurricane my father walked
    beside me in the woodland broken down;
    he’d known it as a child and now his wife,
    my mother, on the far side of the wall
    we came to in the end, the churchyard wall,
    had left him to complete his time alone.

    Wave after breaking wave of shredded leaf.
    Medusa roots. Spilt sap. A ruined nest.
    One yew tree torso with a wound as white
    as chicken flesh, but fringed with human red.

    He never spoke. She kept her silence too.
    I chose to keep the proof for later on,
    when my own turn would come to see my life
    had finished sooner than my heart allowed.

    The Bee Tree

    American linden also American lime
    of the family Basswood (tiliaceae un-
    pronounceable virtually for this layman)

    but mine opposite keeps that quiet
    and presents facts as they appear –
    being a handsome street-shade tree

    with elephant bark in hard scaly ridges
    and russet twigs wandering into green
    until, when flowering, it fills with bees

    giving a murmurous but once heard un-
    ignorable oozing Spring song to show
    the tree knew what its name was all along.

    *

    This morning everything I need
    to know the tree can tell me.

    Its stillness is perfect stillness.
    Its movement when wind blows

    is the dance of a soul before God.
    Think what a story I would miss

    if ever I looked away. The text
    of beauty. A late chance gone

    of boring a hole in my own skull
    and seeing what I imagine next.

    *

    Delicate still in its breezy morning
    the tree I now want to know as ‘her’
    with those fine swizzle-stick twitches.

    She is playing the part of Summer,
    flouncing her feathery evening boa
    and catching the eye of her paramour:

    me, in the frame of my window,
    watching to see if a rival appears
    in the black silk net of her shadow.

    *

    Loving a thing
    I know is when
    a becomes the.

    The tree in view
    I know is what
    I love to see.

    *

    One night and then for a week after
    a large green heron with no idea
    of the right and wrong place to live
    takes up residence in her branches.

    Only occasional eyes passing below
    look up and notice; mostly they see
    a heavy white shit-trail and assume
    that fell by magic out of nowhere.

    Eventually I realize my duty here
    is to grasp things as they appear,
    and see the puzzle is not what goes
    where, the problem is one of scale.

    *

    Wearing a bright orange hi-viz jacket
    the leaf-blower dismisses fallen leaves
    from left to right along the sidewalk,

    while the wind with no color or form
    orders them back in the other direction
    to settle beneath the tree once more.

    *

    I know this view
    of falling leaves,
    this pane of glass
    I wonder through

    in my nice house
    is what must pass
    for all there is.
    But time reserved,

    the sweeping up
    or under carpets —
    what of that?
    Today, e.g., while

    these last leaves
    release their tree
    I know in fact
    you’re off to town

    (the rarity! — so
    much denied)
    for nothing much
    except to stretch

    your sense of days
    from what it is
    to what it was
    or seemed to be,

    and if our luck
    had turned a jot
    I soon enough
    might then expect

    to hear your key
    scrape in the lock
    of life with me. But
    as it is I only see.

    *

    First thing and last thing
    this fickle time of year
    a skein of Canada geese fly
    over my roof and the tree

    squeezing their rubber horns.
    It is what I expect to hear
    but the tree looks up amazed,
    her dark fountain frozen.

    So much extravagant move-
    ment and all of it chosen!
    In the aftermath a breeze
    kicks off a different scene,

    but the most branches do
    is to stir very heavily,
    as a spar of flotsam will
    in the weak hands of the sea.

    *

    Like my love when she takes off her ring
    and lays it on the nightstand beside our bed,
    the tree places her last leaf on the sidewalk
    and stands completely naked wanting to play.

    *

    As the tree opens her long arms
    like a sinner appealing to heaven
    or a past master of levitation

    I hesitate to mention a starling
    given the variety of ways to look
    but clearly a starling settles in.

    The bird looks at the leaves left:
    four a sensible pumpkin brown
    the fifth still alarmingly green.

    Order now depends on relation:
    my love of course at the center
    and the leaves her small crown.

    What can the starling choose
    but to flitter away. So he angles
    his head to one side. Then does.

    *

    This late in Autumn
    she wears a contraption
    made of gold leaf,
    but even such glamour
    is never enough
    to keep her from harm.

    The faintest breath
    of disheveling air
    and she’s poor again,
    a shivering waif
    whose one ambition
    remains to be warm.

    *

    In one bad idea
    a buzz-saw rips
    in Domino Sugars

    across the harbor:
    surely the sound
    of a sugar tree sliced

    into saleable logs —
    there is that panic
    and saccharine crash.

    But this is a thought
    I keep to myself
    and the tree never has.

    *

    I plunge my hand
    inside the tree
    and neighbors think
    I’m a conjuror
    about to snatch
    a dove from a hat.

    It’s not like that.
    I’m testing heat
    and degrees of dark
    to see if they match
    the kind of box
    I have in mind.

    *

    In the absence of straight lines
    the story goes round and round,
    which brings comfort of a sort.

    Be patient is the best advice,
    which I must say is not enough
    but these days has to suffice.

    *

    Linden tree, linden tree
    when the bulk of Europe
    lay to rest in your shade

    and one traveler’s hat
    blew from the traveler’s head
    as he continued on his way,

    now tell me what I can see
    apart from the patch of shade
    where a pillow of brittle leaves

    shows in its vacancy
    how my own head might lie,
    linden tree, linden tree.

    *

    Millions of years pass:
    the sun drowns in ash,
    sap sinks underground,
    and the tree is empty –
    a remainder of itself
    but absent from itself.

    Then the glaciers return,
    the heavens open again,
    and creatures multiply.
    The tree is ready for this.
    Like a factory at dawn
    its stiff machinery whirrs

    and a faint tremor passes
    throughout the entire plant
    as the crowned head bows
    down to accept the weight
    of a full-throated blackbird,
    the first of a new species.

    Marcus Aurelius’ Workout Book

    To the memory of Christopher Nelson Lasch

    No image comes more quickly to mind when thinking about the ancient Stoics than that of the stone-cold busts from antiquity. Frozen in time, the taut and grim facial muscles secret away any feelings that might have roiled the hearts buried deep beneath the weighted folds of drapery. To be Stoic has long been dismissed as being devoid of strong passion, colorless. Recent studies and museum exhibits, thanks to ultraviolet lights and other tools, have now restored to collective memory the colors with which ancient artists painted their sculpted limestone and marble statuary. They were stone but not quite stony. In much the same way, the effusive new interest in the Stoics’ philosophy of life, popular and scholarly, can restore the vivid hues of the inner life as they depicted it, possibly even recalling us to elements missing in our own self-understanding.

    We can begin by adding to the stately visual imagery of the busts, such as those in the Capitoline Museum’s Hall of Philosophers, a more tempestuous scene. Let us close our eyes and see what colors flood them. Imagine liquid clouds billowing above the midnight ocean like a field of rosebuds all opening at once in the dark, or a blazing scarlet finale of underwater fireworks. Pigment spills into the waves, diffused into the mass of gray swirling and churning beneath evil skies and mad winds. Its shade more than hints at other horrors, such as casualties of naval battles tinting the sea or the effluvium of tribal whale hunts washing up on shore. But just as purple does not capture what we see, neither does red. It is a particular shade from the lesser-known terrain between red and blue — what we now think of as royal purple, but the ancient kind. For moderns, royal purple has more blue and borders on indigo or violet, but to the ancients it had more red and was closer to dark magenta, burgundy, or maroon, rather like the outer layer of a red onion. It was said to be the color of dried blood tinged with black. Someone else saw it as the color of a dark rose. And “Tyrian purple,” once you fix the color in your mind’s eye, is one of those things that starts turning up everywhere. You can find it at a paint store, and Lowe’s Valspar actually offers a red called “Stoic.” In Stoic lore, the color turns up in its very foundation legend, as we shall see.

    But first a word on why we are talking about Stoicism in the first place. Whether you think it deserves just a smiley or frowny face or some deeper response, Stoicism is plentifully back in fashion. That phrase alone should cause mild consternation, if not outright alarm. This is not because the Stoics cut an unattractive figure on the cultural horizon. Far from it. They are certainly dashing personages, some of them with cinematically exciting biographies, others with lasting intellectual achievements. Between their sturdy poses and their piercing aphorisms, they speak eloquently to current problems. And Stoicism has been revived before, notably in the bloody and turbulent early modern centuries in Europe, where it became widespread in thought and culture, and was known as neo-Stoicism. We are now witnessing another such revival — neo-neo-Stoicism, or the New Stoicism, or Modern Stoicism, to use the emerging appellation. But does “our” Stoicism resemble at all the ancient philosophy which it purports to model? And if it does, is that always a good thing? Is a new Stoicism what we need now?

    Right off, it is a charming contradiction that a school of thought so stringently opposed to the entire mentality of fashion, of surrender to the contemporary, should itself become chic. In a time characterized by dispersed attention and mass celebrity (now there is an innovation!), and besotted by goods of the material kind as measured in tweets, clicks, and the contents of your (real or digital) shopping cart, the focus of general aspiration in our society now is the momentous instant of trending, whether your own or that of someone taking your money. What business should Stoics have with trending — with time-traveling to a period of celebrity worship, insatiable consumerism, socially certified self-absorption, rampant status-anxiety, and image addiction — of faux friends, fickle followers, and impulse likes? Is a healthy dose of Stoicism what we need, or is its attitude of acquiescence and indifference the final ingredient to render our curdling democratic life truly sour, even poisonous? Given the divisions among us, it is fair to ask not only what, if we can get our hands on it, we might do to Stoicism — contort it beyond all recognition, probably — but also what Stoicism might do to us.

    Most connected now to the names of the triumvirate of philosophers of Roman imperial times — Epictetus, Seneca, and Marcus Aurelius — the Stoic school of thought originated earlier, in Greece around 300 BC, with Zeno of Citium, Cleanthes, and Chrysippus (the first three heads of the school), among others. Originally called the Zenonians after their founder, the Stoics got their name from the stoa poikilê, or “painted porch,” the colonnade and murals along the agora of Athens where they initially engaged in their philosophical discussions. In the tradition of Socrates, they taught that the best things in life are free, that virtue is paramount, and that our minds open onto the divine if we make the pathway. For the Stoics, the way to embody these truths was the development of good character and practices that keep us on track to make the most of our lives and pursue what is best. Their ethics, intermingled with their physics and logic, the other elements of the tripartite structure of their philosophical system, was all about goods (the less material kind).

    At its most stark, Stoicism can sound — and no doubt in the wrong hands can be — simply cruel. Epictetus advised the reader about what should matter by imagining a broken cup: “If you are fond of a specific ceramic cup, remind yourself that it is only ceramic cups in general of which you are fond. Then, if it breaks, you will not be disturbed.” He saw a direct parallel with losing a loved one: “If you kiss your child, or your wife, say that you only kiss things which are human, and thus you will not be disturbed if either of them dies.” (Recently in Liberties Leon Wieseltier posed powerful questions about how well this reasoning stands up against the loss of dear friends, and more generally against the nature of human bonds.)

    Such a credo seems the very worst candidate for reintroduction in these times, with so many forces already pitting people against one another. The confluence of individualism, consumerism, and the therapeutic culture has already taken the focus on the self’s desires and putative needs to rare heights. Even those who find great value in Stoicism, such as Martha Nussbaum, Richard Sorabji, and Nancy Sherman, have reservations about the tougher versions of Stoicism that prize such radical detachment. Yet the gentler versions of Stoicism guard against that very cruelty, so it is important to glimpse Stoics’ larger portrait of the world and our place in it.

    Stoicism’s origin comes to us through a legend with vivid imagery. Enter those billowing clouds spilling into the sea the mysterious shade of Tyrian purple (Latin, purpura) known not only as royal purple but also as Phoenician red, Phoenician purple, or imperial purple. It was named for the dye of that color made in the Phoenician port of Tyre, now a city in Lebanon. According to Diogenes Laertius, when the wealthy merchant Zeno of Citium was shipping a cargo of this expensive substance from his home in Phoenicia to Piraeus, the port of Athens, he was shipwrecked. In How to Think Like a Roman Emperor: The Stoic Philosophy of Marcus Aurelius, Donald Robertson, who is prominent in the current resurgence of Stoicism, explains the significance of the incident in Stoic lore: the dye, derived from fermented shellfish, was used to color the garments of emperors and kings. After this disaster Zeno supposedly had to live as a beggar in Athens. When he visited the Oracle of Delphi to ask what to do, Zeno was told that he should take on the color of dead men instead of shellfish.

    Returning to Athens, Zeno encountered the teachings of Socrates, which led him to study philosophy with Crates of Thebes, a Cynic philosopher, and then with members of the Academic and Megarian schools for the next couple of decades, before he founded his own school, the Stoa. Zeno saw the shipwreck as a blessing. Losing his fortune transformed his existence: it was the bouleversement that set him on the path to what really mattered in life. Five centuries later, Robertson points out, Marcus Aurelius still mentioned purple dye as a danger. He told himself that even though becoming emperor meant donning the color, he must not allow his very character to become royal purple but instead should imbue himself inwardly with the teachings of the traditional Stoic philosophers.

    What are the teachings that Marcus thought we should let pour into and transform our inner selves? It is not too hard for anyone to find out these days. There are Stoic websites. There are Stoic courses. There is Stoic Week. There are new editions and translations of major Stoic texts coming out at a rapid pace. There are innumerable self-help guides and scholarly tracts not just calling new attention to Stoic works but advocating Stoicism as a way of life. The quality of such offerings varies greatly, but fortunately some serious thinkers head the pack, including Massimo Pigliucci, John Sellars, and others, as well as Robertson. To dip our toes into the old-new wisdom, there are worse guides than Marcus Aurelius to introduce us to what those teachings were and worse guides than Donald Robertson to introduce us to Marcus Aurelius.

    As Robertson points out, “Know Thyself” meant, in the ancient inscription at Delphi, not just know who you are and what you are about, but also know your limits — the key limit being temporal finitude, the lifespan of a human being. Like all other human beings, you are mortal. Like all mortals, you will die. The oracle’s commandment of self-knowledge certainly did not mean that you should simply celebrate the wonder of you. Since “Know Thyself” has meant different things to different people at different times, we need to retrieve its ancient meaning so that ensuing questions about who you are and what you do have the addendum: given that my time here is finite.

    Drawing on the Socratic tradition and its creed of “Know Thyself,” the Stoics thus displayed a piercing awareness of their own mortality. Memento mori — remember that you must die — was not just a saying to them, but a practice. It was whispered into the ears of generals triumphant after battle as they paraded home from victory at war, lest pomp spawn pomposity. Stoic writings, such as Seneca’s letters, issue reminders that we must lead our lives with death always in view. The good life, what it means and how to attain it, meant not just how to live well but also how to die well. This meant fittingly (or according to nature), and excellently (or according to wisdom), and morally (or according to virtue). Living well as a human being meant cultivating our rational faculty, bestowed by nature, for reason allows us the capacity to make moral decisions.

    If any of us consult or observe the people we have known, in memory or in conversation, or in literature, film, and the other imaginative arts, we will recognize the sensibility or disposition that is Stoic, even if it is not made explicit. Recognizing it in others or even in ourselves, we might call it something else — principle, or a stubborn streak. It is the part of us, a faculty almost, that draws a line. It is the line that we will not cross. In Lonesome Dove, a wayward cowboy is brought to brutal justice by his own friends, who lament that he had lost sight of the line between right and wrong. Jake replies, in half-complaint, half-confession, “I didn’t see no line.” Call it by any of its other names: the Stoic sensibility taps an inner reserve that gives us not just the ability to draw a line but the resources to follow through. The Stoic voice within gives us the capacity to say no. Je refuse. As Bartleby says, “I prefer not to.” We can recognize the Stoic in those among us who stopped their problem drinking, or other dangerous habits, and stuck to it.

    Harder to recognize are those who never started. It is a truism that we can learn a lot about people not only from what they say about themselves but also what they do. We can also learn from what they do not say or do not do. Barbara Pym’s novel Quartet in Autumn chronicles the seemingly un-chronicle-worthy lives of four aging office coworkers, all of them dull and innocuous from the outside. One of the women stood out as the most colorless of all, an enigma when it came to what made her tick, since she seemed to have no close ties and so little gusto for life. But when she died she surprised everyone by leaving her home and all of her possessions to one of the men in the office. Recalling Stoicism’s rich interior life as a wellspring of actions taken and not taken brings with it possibilities for reading people very differently, especially those not prone to self-disclosure. In our time, when centrifugal forces already pull us apart, our philosophical amnesia risks rendering us unrecognizable to one another.

    Stoicism makes an appearance in daily encounters with the full range of demons that haunt us. We associate it with the broader tableaux of war and soldiery, where heroism and Stoicism seem often one and the same. While our inner struggles can seem insignificant in relation to those of wartime, the suffering they can involve, both for ourselves and others, belongs in this same arena of living and dying. The separation of everyday life from the exigencies of war makes us fail to take living seriously enough to marshal the resources necessary for the skirmishes at hand. Those skirmishes can be as deadly at the individual level as war is at the level of whole societies.

    Through its distractions and temptations, contemporary life tries to subvert our thoughts and feelings to its own uses. The activity of drawing lines only goes so far. It can be transposed as easily from meaningful pursuits to insignificant ones. Given the consumer culture, the drawing of lines can just end up dictating decisions about how to look and what to buy. It can debase the meaning of lines, reducing it to the familiar act of consumer choice. And competing sensibilities vie with the drawing of lines — or the forming of stringent judgments — of any kind. On the surface, consumerism is less suited to Stoicism than to Epicureanism, the ancient Greco-Roman philosophy of pleasure as the goal of life (though it prescribed limits on the grounds that pleasure would be unsustainable with excess and had to be distinguished from hedonism). Yet it remains to be seen whether Stoicism in its modern form will be any different. In the world capital of lifestyle fads, I mean America, Stoic self-help guides could easily go the way of Jane Fonda’s Workout Book or Dr. Atkins’ Diet Revolution or (may the day come soon) Goop.

    Again, it is fair to ask how far a new Stoicism can go in a society riveted on pleasure and entertainment as measured by individual impulse gratification. It could be just another pendulum swing, a stage in the “cultural contradictions of capitalism” that Daniel Bell observed between the Protestant work ethic of the nineteenth century and the consumption of the twentieth century, the former requiring sacrifice and saving and the latter an openness to spending freely. In the years since he wrote his classic work on the tension between the imperative of market growth and the acknowledgement of inherited limits, American society has only further embraced the-sky-is-the-limit conception of the good life to the point where the culture no longer poses a contradiction at all, because letting go has vanquished holding back. Could Stoicism serve as an attempt to reverse this, returning us to an ethos of humility and limits? And if it is time for Stoic practices of line-drawing, its revival is occurring in a context in which the moral understanding underlying the drawing of lines — the ethic of demarcation, we might call it — is scarce. Rigid line-drawing in the absence of moral considerations is a frightening prospect. Cruelty, to individuals and to groups, could easily result. We need to keep our eye on the line between the good uses and the bad uses of Stoicism.

    In our current quandaries — from the idiocy of today’s identity politics to the masochism of our treatment of the very land that we count on for our survival — we need to remember also the limits of Stoicism. If the new Stoics are the guides showing many people the way to serious moral examination, we should welcome them with open arms. But if they, like Larry McMurtry’s Jake, who saw no line, apply all of the trappings of Stoicism — the show of muscle and the glint of shield for the righteous cause — to the amoral void, we need to look elsewhere. We certainly do need the rediscovery of the strength and honor not just to face but also to fight and to prevail over our demons. Equanimity is not useful in a struggle, except perhaps tactically. Evil exists, as the Stoics would be quick to say, and we must face real-world foes, outer and inner. But if we do not start with what goes on within, we will be lost. We need an improved inwardness, one that takes us not only into the self, merely to fulfill it and to protect it, but also brings us back out, bettered and equipped for moral actions and social bonds.

    There are those who will say that no attempt to lessen the damages of our way of life can come while religions continue to exist. The “wars of religion” argument against religion, trotted out to suggest that irrationality is the cause of conflict and that religion is the obverse of rationality, ignores what we might call the peace of religions argument, which suggests that religion and rationality have often gone hand in hand in proposing an alternative to war. The new Stoicism that tries to suggest that Stoicism obliterates religious ways of thinking is certainly not recognizable as Stoicism. To assert that philosophy vanishes when religion arrives and religion vanishes when philosophy arrives would make no sense to the ancient Stoics, who believed in a cosmic deity (and could therefore exert a strong influence upon early Christianity) and held that rationality is compatible with religion because reason is itself divine.

    There is something profoundly comforting about Stoicism and at the same time something deeply troubling. While the emphasis is on control, the required discipline comes from an intimate acquaintance with the lack of control. It might be a useful plan for those already motivated to excess — for reining in prolific, wayward desire. But for those without wild and unexamined attachments, or with too developed a sense of discipline, it might offer less. It raises the question of whether some kind of spiritual spillage — of impassioned living, of fruitful chaos, of inevitable suffering — is necessary for an ideal of self-straitening, of strict control, to make sense. And the vicissitudes of feeling and experience against which we are Stoically warned are completely unpredictable. So rather than run away from all that we cannot control, perhaps we need to put some of it to better use. The passions may be less the problem than their objects, which vary in their worth. Intensity is compatible with many purposes: Stoicism itself was intensely propounded. Who wants a wan and colorless life? And could one honestly call such a life significant and rich? The Stoics proclaim the rigors of emotional control, but their writings — and just as importantly, their lives — were emblazoned with the colors of strong emotion. While they tried their best to give us ways to handle our feelings, in the process they revealed their intimate knowledge of them. They know what they are cautioning against, and they know that we know that they know. We should not confuse their misgivings about experience with testimonials for innocence.

    The books and articles evangelizing for a New Stoicism began to appear early in our new century. (It already feels so old.) For all its extravagance, or because of it, the film Gladiator was a colossally successful advertisement for Stoicism. In recent years the movement has mushroomed. Bestsellers such as Ryan Holiday’s The Obstacle Is the Way, blogs such as DailyStoic.com, and Facebook groups have many picking up ancient works of philosophy and giving them a try. One need only check online conversation forums such as Quora and Reddit to see the extent of the interest. As of this writing, for instance, the Stoicism Reddit forum lists 419,000 members. Marcus Aurelius has become an influencer.

    Dipping into digital Stoicism hints at how the New Stoicism is being received. One fellow wrote to confide that he was just sitting around all day, isolated and useless, indulging in bad habits that were leading him nowhere. He said he wanted to change but did not know where to start. The flood of responses was touching and sweet. Everyone wanted to help. They cited the Stoic notion of changing what is in your control, suggesting that he get a hobby, go on adventures, do things with friends, and fill his time with activity in order to distract himself from temptation and replace unconstructive pursuits. He put up resistance. What hobbies? What adventures? What friends? They chimed in with all kinds of suggestions from hiking to harmonica. To their credit, many of the interlocutors suggested getting involved by volunteering and helping others. When someone else, admitting to sharing the difficulty of making friends, asked how to meet people, the answer was to join clubs of people with interests you share. But the author of the original query persisted, stating that he had no interests and did not feel passionate about anything. (Wait — wasn’t he already in possession of apatheia?) Some respondents expressed frustration, signing off with a brisk, “Good luck!” The overall message was Nike’s slogan, “Just Do It.” Just do anything. Change is good. This is America.

    Yet one commenter strayed from the others and tried to engage with the deeper questions. If Stoicism helps one stay resolved and centered on one’s values and main concerns, how does someone find out what to be resolved about in the first place? And how does one become passionately resolved about something? (An anti-Stoic question, but never mind.) This respondent suggested a very different task, the contemplation of what his ideal self would be: a meaningful and passionate life. Admittedly a rather contentless formula, and easily exploitable by false doctrines, but still a fine beginning. Marcus Aurelius admonished himself to “hasten then to the goal, lay idle hopes aside, and come to your own help, if you care at all for yourself, while still you may,” though of course he knew what his goal was.

    What is wrong with distilling what is useful from a philosophical school and using it to guide our lives? Hasn’t that been the very purpose of philosophy since Socrates? The problem with the New Stoicism is that it takes place far, far away from philosophy. The truth is that it flourishes in the familiar terrain of psychology, of our stubbornly therapeutic culture. The subtitle of one of Donald Robertson’s books is Stoic Philosophy as Rational and Cognitive Psychotherapy; and it is true that the Stoics spoke explicitly of their thought as therapy and used medical conceptions to describe ethical values. But many of the Reddit answers were just more of the same bromides of the positive psychology that has been plaguing Americans for over a century now, pushing a vision of success and happiness measured in numbers of friends, dollars, and accomplishments.

    The problem with fixating as an entire culture on what is therapeutic for the individual is that, strictly speaking, it has never actually been therapeutic. Reducing the full range of our institutions, endeavors, and relations to a technique of self-affirmation and self-acceptance falls short of providing the weightier ballast we need to navigate both our suffering and our search for meaning. The fuller assumptions underlying Stoicism in the self-scrutiny and sustained engagements of Seneca and Marcus get lost in some of the New Stoicism’s trite prescriptions for “healing” and “personal growth.” The New Stoicism is often a Stoicism without a critical edge, careening between a Stoicism of chilling out and a Stoicism of pure grit, and either way a Stoicism bent to distinctly un-Stoic ends, whether self-satisfaction or extreme versions of warrior training and cutthroat management styles.

    The most important assumption underlying Stoicism was that the overarching goal of life was goodness. Whether their philosophical system provided ways to pursue it successfully is an interesting question, but at least it aimed high and deep, and it was designed to discomfit, and it was witheringly critical. The Stoics cut quite a figure — when lit from behind. The backlighting, which leaves in shadows some of their less attractive blemishes and contortions, was Platonism. The underlying assumption, the spirit infusing the entire edifice of their philosophy, was that a good life, as in moral goodness, matters. As does love.

    And there was something more in the Platonic background that the Stoics wholeheartedly accepted. One of the reasons that Robertson is such an engaging proponent of Stoicism’s comeback is that he recognizes the way in which the Meditations is invaluable when read as a purely spiritual work, as did the great scholar Pierre Hadot, who identified in Stoicism’s guidance on how to think, feel, and act an intricate set of spiritual exercises and a concept of the human being as sacred. There is a cosmological, or cosmic, dimension to Stoicism, as there is in Platonism. In the conclusion of How to Think Like a Roman Emperor, Robertson takes creative liberties, and I am glad he does. It is jarring, if one is listening in audiobook form, to hear Robertson speaking as if he is Marcus, but the ancient Stoic emperor did mean to be useful, and slowly the scene that Robertson is creating is set. Marcus is dying. As he moves in and out of consciousness, he reflects on his life and death. It is a powerful vision. It is also a Platonist one. “What I spent my life learning I now see everywhere — as I turn my attention from one thing to another, all sides grant me the same vision,” Robertson ruminates, as Marcus. “The universe is a single living being, with a single body and a single consciousness. Every individual mind a tiny particle of one great mind.” This, of course, is no longer an ethics, it is a metaphysics. Such a vision of cosmic unity recalls Plato and Plotinus as much as Marcus Aurelius, though Robertson is right: it is there in Marcus as well. What can a purely psychological therapy do with any of this?

    In the second game of the NBA finals in 2021, Giannis Antetokounmpo of the Milwaukee Bucks charged the defending forward Torrey Craig of the Phoenix Suns, who injured his right knee and had to be helped off the court. Nothing unusual there. Injuries abounded in a season compressed by covid as teams played more games in less time. Just a week before, in game 4 of the Eastern Conference finals against the Atlanta Hawks, Antetokounmpo had been the one crashing to the floor in agony after experiencing a “hyperextension” of the knee (the knee bending backwards) at a gruesomely oblique angle under the full force of his body landing from a jump. He expected to be out not just for the rest of the season but for a year of recovery. It was a miracle that he was back a few days later for the finals.

    The scene around Craig stood out for one small detail. (The replay tape is easily viewable on YouTube.) As Craig lay prone, visible among the uniforms of his own teammates surrounding him was the jersey of a player from the other team, from Milwaukee. It was Antetokounmpo. His look of grave concern and his gentle tap on the shoulder as Craig managed to come to his feet to limp off the floor, buttressed by the shoulders of his teammates, lasted only a couple of seconds, but there was something a little sublime about it. It was a spiritual accomplishment.

    Such acts, so fleeting and minute as to be nearly imperceptible amid the more noticeable big-stage events upon which all eyes are riveted, bear testimony to all that is going on in the inner life of just one person. It is here, in moments that reveal our innermost composition, which we have woven from interior fiber and exterior circumstance, that the Stoics wish to train our thoughts, even — especially — if we must ignore the roar of the crowd to do so. To find the strength for such feats, large and small, we need more than techniques of resolve, more than methods of willing, more than therapy, more than calm. “Keep in memory the universal substance of which you are a tiny part,” says Marcus. We need a vision of the universe, a whole view of all of existence, which can inspire the resolve that we seek. We need the love that is wisdom that is beauty that is goodness. More than a new Stoicism, we need a new Platonism.

    Education and the Economic Menace

    In the halcyon days of the British welfare state, even the poor had the opportunity to go to university. Anyone who had been offered a “place” could apply to the local Education Authority for support — not to cover the fees (there were none), but to meet the expenses of living. But when I took the form to my parents for the necessary signature, they hesitated. Having left school at twelve and fourteen, respectively, they wondered why I wanted “more study.” With my background in mathematics, I was already qualified for “a good job,” one beyond anything they could have imagined for themselves. I might become an accountant, even an actuary.

    That sort of thinking is still too much with us. It dominates the decisions of powerful people who, unlike my parents, have had ample opportunity to appreciate its shortcomings. The assault on public goods, carried out on both sides of the Atlantic, has bequeathed to nations, in varying degrees, an economic conception of education. Politicians as different as Margaret Thatcher and Barack Obama have played variations on the same theme. We need programs to prepare the young to do their bit in maintaining financial health. Perhaps we are not yet at the stage where embryonic citizens are viewed as raw materials, to be hammered into shape, fitting the available slots in the powerful economic machine that the nation will send to do battle in the Global Demolition Derby, but that vision is on the horizon.

    From Plato on, most people who have reflected seriously on education and its aims have not adopted the economic conception. Their attitude towards it is the one adopted by Rick Blaine toward the petty crook Signor Ugarte: they would despise it if they thought about it. To be sure, many of them are concerned only with the elite, with people who do not have to worry about earning a living. Yet even Adam Smith, very much occupied with eliminating waste in the classroom or lecture-hall — sweeping away “the cobweb science of ontology,” for example — worries about the intellectual and moral decline of the minimally educated worker, confined by the division of labor to a lifetime of repetitive tasks. W. E. B. Du Bois eloquently strips the common disdain for market-oriented education of its limitation to class: “The object of education is not to make men carpenters but to make carpenters men.”

    Today, in many affluent nations, the Economic Menace slouches towards the schools and colleges and universities. The funds for public education need to be cut. When school budgets are tight, administrators decide to cut back on “the frills.” The arts and the humanities are the prime targets. Classes in music and the visual arts disappear. Instruction in the “less important” languages, including the dead ones, is abandoned. In the remaining humanities classes, teachers are urged to concentrate on the basics. Less time for literature, more emphasis on functional literacy. Such attitudes are also echoed at higher levels of educational policy. Governments decide to slash the already meager budgets for the humanities and the arts. In the United States, scientists often bewail the reduction of funds to support research, but American humanists can only envy American scientists: although the National Science Foundation and the National Institutes of Health may be pinched by policymakers apparently bereft of any sense of the value of basic research, the scorn directed at the National Endowment for the Humanities and the National Endowment for the Arts has been far more intense. Those agencies are permanently threatened by the barbarians within the gates. (It was almost miraculous to read that the Biden administration has proposed to increase these agencies’ funding.)

    Increasingly, education means training and training means vocational training. Although it is entirely reasonable to identify a capacity for self-maintenance as one of the goals of education, two others ought not to be discarded. Young people should learn how to be citizens and how to be individuals. They should be taught how to be a member of a community, a society, a state, and they should also find their own distinctive place, their own distinctive voice, their own way of contributing to the lives of others, their own projected path to fulfillment. Perhaps tens of thousands of years ago, the small bands of hunter-gatherers of the later Paleolithic reasonably socialized the young by concentrating on the skills required to survive in a precarious world. During recorded history, the vast majority of human beings have been treated as disposable material for rulers bent on maintaining the wealth and power of their domain. Occupations have been routinely assigned by caste or class or gender or race, and their occupants have been efficiently honed to suit the appointed role. Thanks to political and economic developments that are rightly celebrated, some twentieth-century societies demonstrated the possibility of doing more than that. They showed, for a time, how all three major aims of education — creating people who are simultaneously workers, citizens, and individuals — were broadly achievable. Why, then, the extraordinary regression, the return to a world in which training substitutes for education? What justifies the penny-pinching and the celebration of economic efficiency above all else?

    Let us suspend, for a moment, the assumption that what young people need — that all they need — is preparation for “a good job.”

    In one of the greatest essays on liberty ever written, John Stuart Mill explains what he takes to be fundamental. “The only freedom deserving of the name,” he instructs, “is that of pursuing our own good in our own way, so long as we do not deprive others of theirs, or impede their efforts to obtain it.” Mill’s vision of fulfillment, of the worthwhile life, centers on autonomy. Each of us should choose the pattern of our existence, setting the goals we view as most important, and, if we have a tolerable amount of success in attaining them, we shall have lived well. But is that enough? Some projects for a life are harmless — they do not interfere with the plans made by others — but also trivial. A person who retreats into solitude, spending the days counting and recounting the blades of grass in a particular area, has not discovered a promising new “experiment of living.”

    Mill’s tendency to focus on an abstract individual, disconnected from society, interferes with a genuine insight. Autonomy, as he recognizes, is important. The life you live should be your own. Still, it is worth asking just what this means. As a banal matter of fact, human beings do not spring into the world equipped with preferences and ideals that can guide their selection of life plans. The horizon bounding their inclinations is set by the possibilities made available by their social milieux. Although they may seek to expand the boundaries, they will inevitably do so by beginning from the options they come initially to understand. Moreover, that understanding emerges from their socialization. More concretely, the direction of their inclinations is set by the education they receive.

    Where, in all this, does the liberal’s cherished autonomy fit? What is the intended contrast with the regimes that have dictated, for the vast majority of human beings, the patterns of their lives? A tempting answer is that the choices made by the developed individual should accord with “who she is.” Genuine education is a “leading out” of a self. It is a process in which the embryonic form is preserved and allowed to grow undistorted. True education, Nietzsche claims, is a kind of freeing: the seeds of the mature individual are permitted to grow, protected from the weeds, rubbish, and pests that would interfere with their development. There is surely something right about this picture. It points to a genuine danger in socialization, to the possibility of distortion, mutilation, regimentation, alienation. Yet it is hard to make sense of the threat. For the idea of a complete character already present in the infant is absurd.

    To solve the difficulty, I suggest, we should not insist on a match between the full-grown person and a pre-formed inherited self, but attend instead to the nature of the process out of which the individual emerges. Think of education in its most expansive form as a dialogue between the growing organism and the ambient society. The educational dialogue goes badly when society fails to listen, insisting on leading in a particular direction whatever signals the developing organism might send. By contrast, education goes properly when the representatives of society, the parents, the caregivers, the teachers, as well as the institutions within which they work, are attentive to the features of the individual, responding to the directions in which they point.

    Long before a child can be credited with a clear sense of self, sympathetic and sensitive observation of distinctive patterns of behavior allows the identification of activities that seem likely to prove interesting. Opportunities are offered, not on the basis of what society supposes appropriate for children of that age and sex, but rather because they seem attuned to the ways that individual child is developing. Nothing is forced. Educators listen closely to the particular voice, trying to follow its individual music, and to help it express its own song.

    Autonomy is thus a matter of degree. It is not a property accruing to individuals who are launched into some space outside of (or above) society. Rather, the extent of someone’s autonomy is the result of developmental processes in which society is profoundly implicated. The choices that we see as autonomous are made possible because our prior socialization has provided opportunities that attentive interlocutors have recognized as attuned to individual features, observed so far in a growing person’s behavior. Those interlocutors have drawn out embryonic traits and instilled capacities for reflective choice. Although they have shaped the self that chooses, they have nonetheless cultivated a degree of autonomy. And in doing so, they have made possible the selection of a life plan that will continue the individual’s dialogue with the ambient society.

    For each of us, in our time, plays both parts. As we have received from the fostering of others, so too we grow into beings capable of providing opportunities in our turn. How that is done depends on our particular choices, on the projects that we select and the activities and values that we mark out as central. Once we recognize the social embedding of our lives, however, it is easy to understand why the project of the solitary grass-counter appears pathological. For that imaginary recluse contributes nothing to the lives of others. A fulfilling life must surely meet an extra condition: its plan must involve a positive impact on other lives, the lives of sentient beings. At the close of his brief for humanistic religion, John Dewey makes the point with uncharacteristic eloquence:

    We who now live are parts of a humanity that extends into the remote past, a humanity that has interacted with nature. The things in civilization we most prize are not of ourselves. They exist by grace of the doings and sufferings of the continuous human community in which we are a link. Ours is the responsibility of conserving, transmitting, rectifying and expanding the heritage of values we have received that those who come after us may receive it more solid and secure, more widely accessible and more generously shared than we have received it.

    Personal fulfillment has an other-directed, a collective, dimension. For this reason, it becomes easy to harmonize two of the three major aims of education. In fostering tendencies to engage with others and to contribute to their lives, we would also induce the capacities for good citizenship. Dewey’s famous linkage of democracy and education is neither arbitrary nor a matter of teaching what he called (disdainfully) “the civics.” Democracy, he says, is a way of life, a condition in which citizens seek to understand, and to learn from, the perspectives and the life experiences of their fellows. At small scales, in deliberations within the family or in a local community, examples of the delicate dialogue are familiar, even if they are not as common as they might be. The educational challenge, common to promoting fulfillment and to forming citizens, is to find ways of improving this aspect of life — increasing the frequency with which mutual engagement occurs at small scales, and fashioning institutions for introducing it at larger ones.

    And at this point we can even reintroduce the third aim, the goal of preparing young people to sustain themselves. What jobs will remain when the labor market becomes fully global, and when production becomes ever more automated? The most plausible answer: those requiring human contact. We are likely to continue to need human firefighters, doctors, nurses, and gardeners — and especially people to care for the elderly and to teach the children. In a future world in which the stigma of “service employment” has been thoroughly scoured away, and in which these occupations receive the level of compensation that they deserve — wages and salaries that allow those who serve to lead full and fulfilling lives — the skills needed in the labor market will be those needed for attentive interactions with others. Instead of lamenting the closing of the pit or of the factory, we might view automation as an opportunity. Those who have learned from the delicate dialogue may become fluent participants in it, finding satisfaction in the work they do with and for one another.

    Today education should provide, more than anything else, the propensity to engage with one another. How might that be accomplished?

    It will certainly not be accomplished by narrowing education to the skills required by the current vision of the labor market. Instead we should start by fostering the three major capacities required by the aims of education. Children must learn to understand themselves, to enter into the perspectives of others, and to recognize the range of opportunities from which they can select to find their own distinctive place. This requires a broad education — what used to be called a “liberal education.”

    If young people are to find their own distinctive pattern for a fulfilling life, education must enhance their self-understanding. To that end, they should gain some understanding of the diversity of human lives. A view of the different circumstances in which people live and the (often narrow) range of opportunities available to them promotes an awareness of where and how an individual might contribute to the lives of others. That form of awareness is promoted, from all sorts of angles, by a bundle of disciplines that Anglo-Saxons struggle to categorize. German does better: the Geisteswissenschaften comprise all those areas of research and teaching that bear on the human condition, from psychology at one end of the spectrum to the purest of the humanities at the other. Some degree of immersion in these disciplines, especially when they are taught in relation to one another, is essential for people to fashion their ideals of themselves, guiding them to find worthwhile ends to pursue, instructing them in how their own lives can bear positively on the lives of others.

    All too often the delicate dialogue between the growing individual and the ambient society goes awry. Social pressures can diminish autonomy by closing off attractive paths. Barriers can be constructed and obstacles introduced in two different ways. Societies can impose outright bans: women cannot have access to higher education — or even to any education at all; members of a particular caste (or ethnic group) may only pursue a limited number of occupations. Or, as is far more common in the contemporary world, they can couple de jure possibility to a de facto constraint: “Of course any African-American child can become a lawyer” — but, without jobs for the sole parent, without a place in which to live, and with access only to schools awash in drugs and crime, staffed by a rotating corps of disillusioned teachers, any real prospects for a legal career are suppressed from the start. Complacent political leaders preen themselves on the existence of “opportunities for all,” overlooking the myriad ways in which the obstacles to taking advantage of the alleged “opportunities” crush the hopes and aspirations of the young.

    Education may fail the ideal of autonomy by closing off potential lines of development and by failing to provide the support that it needs to grow in attractive directions. The latter failure is most evident in the material conditions of existence. Homeless people, or people mired in poverty or trapped in neighborhoods overrun by violence, need food, clean water, clean air, medical care, and safe shelter. Beyond these necessities, their children also require schools that they can attend without fear — schools that will offer them genuine chances of finding out where their talents lie and what kinds of life might prove fulfilling. In delivering those chances, the humanities and social sciences have an important role to play.

    Formal education is central to the cultivation of positive freedom, of freedom to rather than freedom from. Children should learn who they are and what kinds of people they might realistically become. Without resources to identify their distinctive talents and interests, or to understand what kinds of lives are available to them, their autonomy is diminished. The dialogue between the growing individual and society is unhealthy, not because the voices of authority issue decrees forbidding a large range of options, but because those voices are virtually silent when it comes to positive suggestions. They supply no conceptual nourishment on which the developing imagination can fasten.

    How, then, can the humanities supply what is lacking? By illustrating the diverse possibilities for a human life and making them vivid. Literature, drama, film, and the visual arts can all stimulate the young imagination, revealing ways in which human beings might live, how they might prosper and how their lives might prove miserable and empty. History ranges across time to add detailed portraits of parts of the actual past. Geography and ethnography are comparably expansive across space. These humanist studies, at all levels, nourish a sense of possibilities, of horizons greater than the inherited ones.

    In the earliest years of schooling, children are rightly expected to acquire basic skills. They must learn how to read and write, to add, subtract, and so forth. These capacities will be called on as they explore the rich possibilities of human existence. Even at the start, however, they can be captivated by stories, by colorful tales of the past, by accounts of life in strange — and even exotic — places. Long before they are able to probe the details of forms of life, to appreciate nuances of success and failure, of virtue and moral error, they can thrill to simple pictures, entranced by great heroes and heroines, appalled by villains — and recognize that the patterns of their familiar life are not the only possibilities.

    Once reading skills have advanced sufficiently far, a more systematic evaluation of alternative modes of existence becomes possible. Geography and history classes can contain segments in which children learn the details of life in different contemporary places, at different times in the past of their own society, and in circumstances remote both in time and space. Some parts of their study should include arranging conditions enabling them to feel such alien forms of existence. By this, I do not intend the pallid exercises in which children bring to the classroom a motley of dishes allegedly representing those consumed by their forebears — and then sit down to a more-or-less palatable common meal. Rather, they should learn how to make the implements required to grow the crops, they should plant and tend them, and prepare the food — and, if all goes well, share in a convivial final feast. Along the way, they should be asked to solve, individually and collectively, the problems that would have arisen for the community whose way of life they are exploring. Work of this type also allows for practice in applying mathematical skills and coping with the kinds of situations that experimental scientists face daily.

    All this paves the way for deeper engagement with a small number of (diverse) forms of life during the last few years in secondary school. A possible strategy for organizing this is to require each student to choose a particular place and period, to spend a significant part of the school year (perhaps all of it?) researching how people lived there and then, culminating in a detailed historical and ethnographic report and a presentation to fellow students. At each step of the way, students would have opportunities to consult with knowledgeable mentors; it is important that many adults bring their varied perspectives to the classroom. They would be expected to read widely, and to treat some aspects of the alien culture in great detail.

    The three phases envisaged in such a program are intended to help students discover two things. Early training prepares them to enter deeply into lives very different from their own. As they choose to focus on groups of people, located at particular places and times, they discover respect and tolerance and interest across differences, as well as what aspects of cultural life fascinate them most; and they are asked to understand the demands and constraints experienced by people very different from themselves. One goal is to bring them to recognize their own proclivities, thus helping them to chart a satisfying direction for their own lives and careers. Another is to widen their consciousness of human needs, their ability to see and feel from other perspectives.

    As they grow, they should discover how to read a particular style of history, not the kind that is focused on the large affairs of states, on rulers and conquests and sweeping social reforms. Rather, they should be asked to attend to the details of everyday life. Reading not only historical analyses but also primary documents, they may be expected to enter into the lives of others. Along the way, depending on their focal society, they may need to absorb, and apply some parts of, economics and sociology. To the extent that their education succeeds, the skills they acquire are likely to prove valuable throughout their lives, as their pursuit of their favored projects leads them to actions with consequences for distant strangers. They will be prepared not to view those whom they affect remotely as faceless masses, and (with luck) they will be inspired to inquire and to reflect before implementing their plans.

    Such an interdisciplinary program, whose contributory domains are primarily geography and history (along with those parts of anthropology flowing naturally from the students’ explorations), might also draw on other areas of social science — economics, sociology, political science, and psychology. As students proceed with the individual studies of their later school years, the program should also be in dialogue with central areas of the humanities. None of the special inquiries would be complete without entering the culture of those whom the students have chosen to study. The understanding of others’ perspectives requires engaging with the myths and stories, the poems and songs, the paintings and sculptures, through which a society’s personality is most deeply expressed.

    To attain that understanding, students must first be taught how to read, how to listen, and how to see. While the primary materials with which they engage are works of art, their ability to engage profitably with them will depend also on historians and critics, specialists in understanding and analyzing literature or painting or music. The later studies of the secondary school (as I have envisaged them) can only succeed if there are people trained in the humanities who can enable their pupils to find in works of art what those works have to offer. Historians of art and music, like literary critics, are mediators, “liaison officers,” revealing precious insights in works that would be misunderstood, or dismissed as incomprehensible, without their guidance. The art historian invites students to attend to the relation between two areas of a canvas. The architectural historian points to the curve of a cornice and its echo in the shape of windows. The musicologist plays a short figure on the piano, before accompanying a recording of an orchestral work with exaggerated gestures, inviting hearers to recognize the recurrence of that figure. The literary critic points out the overtones and ambiguities of particular words in a poem.

    In the beginning, none of us knows how to do these things for ourselves. We need instruction from people who have acquired the pertinent skills and who know how to convey them to others. Among the school staff there should be teachers who have so deep an understanding of particular genres as to make works of the pertinent types come alive for their students. Humanists, specifically historians and critics of literature, art, music, film, and drama, are needed at two levels. Researchers advance the interpretation of the nineteenth-century novel, or of science fiction, or of conceptual art, or of Renaissance sculpture, or of film noir, or of the Romantic song cycle, or of Greek tragedy, or of television docudramas; their achievements are registered in their own teaching (typically at universities) and in their critical writings; and as they frame the work of interpreting their focal genres, they enable their students and their readers to see how to go on. Armed with the skills acquired by steeping themselves in the research level, these students and readers form a cadre at a second level, one that makes direct contact with pupils in secondary schools (and, perhaps, at earlier educational stages). Humanists at this second level are, and should be, less specialized. They ought to range more broadly, over the visual arts of a number of periods, for example, or through the grand sweep of poetry in their native language. The task for them is to instill in those they teach precursor capacities to those which their own education has given them. From the first stages of adolescence (if not before), their pupils should have the chance to learn to read and see and hear, not in the elementary ways of the very young but with attention to nuance and subtlety. The childhood passion inspired by Jo March may endure, but it is set within an understanding of the strengths and foibles of Elizabeth Bennet and Hester Prynne — perhaps even of Dorothea Brooke, Isabel Archer, and Mrs. Dalloway.

    Schools do not just need some second-level humanists. A significant number of people with diverse tastes and interests should move through the secondary school classroom. Not all of them need to be trained teachers. The principal teachers of literature, music, and art should not only offer their own wide repertory of genres, but also supplement and coordinate the visits of others who can introduce an even broader array of works of art. Using their own interpretive skills, they should seek ways of helping their students come alive to the features that excite the visitors who bring their distinctive enthusiasms. Advising pupils who are stirred to similar enthusiasms, they oversee their further development as interpreters, guiding them to suitable mentors and translating what those mentors convey into formulations apt for inculcating enhanced skills. Humanist teachers thus become liaison officers in two different senses: they mediate directly between some works of art and their students, and indirectly between those students and other potential liaison officers, whose fluency is inadequate to convey the messages they hope to offer.

    Data on the performance of American students in mathematics and science are far better known than the gloomy statistics about the reading of American adults. According to a study conducted in January and February of 2019, twenty-seven percent of American adults had not read any book (or even a part of any book) within the prior year. In the “league table” of adult readers, the United States tied for twenty-second place with Germany. Other figures show a decline in the amount of time spent per day in leisure reading, from the first decade of the twenty-first century to the second. How much of whatever reading occurs might contribute to self-understanding is anybody’s guess.

    Overall, then, it seems fair to conclude that American schools do not generate a swarm of avid readers, eager to garner the benefits that I have attributed to the engagement with literature; higher education appears to do somewhat better. Laments about the quenching of scientific curiosity are extremely common: why do so few Americans pursue post-graduate studies in the sciences? Yet surely educators ought to worry equally about the quashing of interest in literature. Why do so many abandon reading once their formal education is completed? There are obvious potential explanations. Many of those who graduate from secondary schools may always have found reading difficult. They may have struggled through the materials assigned to them, and felt relief when they no longer had to exercise basic skills they knew to be incomplete and inadequate. Perhaps the “golden age of television” has inspired a turn to watching series (documentaries or dramas) offering occasions for reflection and promoting greater self-understanding; but no available data support so hopeful a conclusion. For the past decade, the most popular television has been sports events, a decline from previous decades, which showed clear preferences for comedy and drama series, some of them surely rewarding to watch and providing food for thought. I have a hunch that the people who were gripped throughout all the seasons of The Wire and Homeland are also people who continue to read books.

    The goal of making Americans great readers again — and great observers and great listeners — is educationally urgent. Achieving it demands a serious campaign to help all children become fluent readers by the time of their adolescence. But it also requires more: literacy should be understood as a matter of degree. All of us, including the most adept readers, are imperfectly literate: there will always be some kinds of writing in our native language that we find difficult, even impenetrable. Formal education, through the secondary school, ought to provide each pupil with a profile in reading ability sufficient to make accessible — indeed, pleasurably accessible — a range of written texts capable of yielding the principal fruits of literacy. In particular, it should prepare them to read, with curiosity, interest, and joy, works able to prompt reflection and enhanced self-understanding. To achieve that goal (or some reasonable approximation to it), I suggest strengthening the connections between the two levels of humanists. Teachers of literature in secondary schools should have a passion for the genres they introduce to their pupils, combined with strong interpretive skills. They should be given the opportunities to refine those skills, and to extend them to other styles through periodic leaves that enable them to work with scholars engaged in humanistic research. Their schools should contain enough colleagues to form a community, regularly interacting with one another. The community would constantly be renewed by its contacts with research, in which the individual members can continue to develop and grow. Such a program has already been pioneered by some progressive schools.

    Parallel points apply to the visual arts and to music. Indeed, the status quo in these areas is almost certainly more dismal. Since education in the non-literary arts is so frequently curtailed, lopped off in times of budgetary shortage, visual and auditory literacy attain only more primitive levels. In these areas, the liaison officers are fewer, the time assigned to their mediatory work is minimal, and the range of material they are expected to introduce is far too extensive. Like their colleagues in literature, they need to enhance their own capacities for sensitive seeing and listening, to recognize their fundamental task as one in which similar skills are germinated and developed in their students, and to find vital ways of conveying what is required. Teachers of art and music should have the chance to learn from scholars, critics, and historians of art and music, who engage in research and sometimes change the framework within which works of art are understood.

    Very few young people are fortunate enough (as I once was) to listen to a critic make an impassioned defense of the importance of literature. In my last undergraduate year in Cambridge, the university extended its support for the annual collegiate drama festival by recruiting eminent members of the faculty to give lunchtime talks on the plays to be performed. My own college dramatic society was rehearsing The Imaginary Invalid, and we invited the distinguished critic George Steiner to speak to us. His presentation was brilliant — and forthright. One claim he made has remained with me ever since: “Molière and Stendhal have taught us more about what it is to be human than all the psychologists there have ever been, combined.”

    Steiner’s dictum is an exaggeration — even if, in times when the humanities are suspect and the social sciences are viewed as poor relatives of their natural cousins, it is a necessary pushback against the evangelists of scientism. Psychology, pursued from a diversity of perspectives, has enlightened us about many aspects of human mental life and of human behavior, sometimes through understanding the lives of other animals, often by controlled observations of human behavior. Laboratory studies have enlarged the understanding of perception, memory, emotional responses, decision-making, perspective-sharing, and other significant mental processes. These achievements do not displace the contributions of playwrights and novelists, nor are they superseded by them. Learning can, and should, come from all directions. Understanding human life, of the individual and of the group, is best advanced through the interaction of disciplines. Marking off particular areas of inquiry as barren — “nothing there for me!” — is a foolish dogmatism, whether it serves as the proud credo of the scientistically devout or as the reprimand of a great literary critic.

    All young people deserve, however, to be instructed by teachers who are imbued with Steiner’s dictum, who recognize the importance of literature, music, and art for the growth of self-understanding, whose passion for a range of works is accompanied by mature interpretive skills, and who have techniques for instilling precursor capacities in their students. This, to be sure, is a lot to ask for. Yet it represents an important goal towards which education ought to strive.

    Literature, art, and music provide access to possibilities for developing individuals, enabling them to try on unrealized alternatives for size. As my passing remark about identification with Jo March indicates, this is relatively straightforward in the case of literature. The impulse to emulate a fictional character, to be as free and strong as Jo or as brave and resourceful as Jim Hawkins, is only the simplest form of literary impact. More complex novels and plays and poems inspire diagnostic reflections on one’s own conduct. They lead some readers of Bleak House to shudder as Dickens’ characters mouth unfeeling words echoing those readers’ own utterances. The more ambiguous figures delineated in “novels for adults” can provoke extended self-reflection: in the contrast between Dorothea Brooke’s impulsive warmth and generosity of spirit and the caution of her more conventional sister, an attentive reader may find material for exploring either the narrowness of his own altruistic sentiments or his own lack of prudence. Reading Ulysses, or the later parts of the Harry Potter series, can reorient our understanding of virtues and vices, change our conception of the heroic, and provoke new ways of working through the successes and failures of our lives.

    So far I have turned the gaze inward, considering how the conversations with ourselves, provoked by our well-instructed readings, seeings, and hearings, might redirect a purely personal and private search for fulfillment. But we must not ignore the changes wrought in the ways we look outward. Can refined capacities for reading and viewing and listening also broaden and deepen our sensitivities to the lives of others? Do they also sometimes lead us to recognize possibilities to make our own distinctive contributions to a larger human world?

    They can and they do. Education in the humanities not only makes us more self-aware, it also promotes our fulfillment through attuning us more closely to people whose welfare and whose aspirations we might affect. Even that formulation, however, remains in the grip of the inward gaze. Surely we should not clamor for humanistic education on the grounds that, through its opening our eyes to possibilities for aiding others, we may become fulfilled! Those whom we help, aiding in their fulfillment, should not be thought of as means to our own fulfillment. Whatever private good our enhanced understanding of human existence may bring, it is a secondary matter. Rather, we should commend well-schooled sensibilities for their power to make us better citizens and more highly developed moral agents. So we must turn from the first broad aim of education, the individualistic one, to a second cluster of reasons, which have to do with preparation for citizenship and good conduct.

    Democracy depends on the ability of citizens to deliberate together, with a commitment to finding an outcome with which all potentially affected parties can live. Tocqueville was correct to see the New England town meeting as the core of America’s experiment with democratic government — and Dewey built on the insight when he declared democracy to be “more than a form of government,” seeing it as “primarily a mode of associated living, of conjoint communicated experience.” But how do we cultivate mutual engagement, the capacity to listen carefully to others, to enter into their lives, committing ourselves to finding arrangements that all can accept?

    The answer is, from prolonged encounters with people representing a wide range of perspectives. Through living and working with those who are different, citizens come to understand alternative points of view. Perspectives once alien, dismissed as the products of ignorance and selfishness, appear in a softer light. They become comprehensible reactions to a difficult situation. Although disagreement, even strong disagreement, may survive the encounter, the once-despised perspectives acquire human faces. Joint deliberations may then begin in a different place and in a different spirit. Reaching behind the opposed attitudes, people who disagree may attend to the underlying predicaments, identifying factors responsible for the conflict and seeking responses to those factors that might combine to resolve it.

    This style of direct broadening can begin with joint planning in diverse groups, starting early in education, and gradually increasing the range of outlooks and the complexities of the situations. There is nothing “relativistic” about this, just as there is nothing “relativistic” about tolerance and compromise and patience. The expansion of one’s understanding is not a betrayal of one’s principles, unless of course they cannot withstand the challenge. Humanistic education is an ally in reinforcing capacities to listen and expanding habits of sympathetic understanding. Learning to interpret works of fiction (as well as to respond to visual art and to music) can induce skills for recognizing the standpoints and predicaments of others. Thus the way is prepared for deeper mutual engagement when citizens need to deliberate together.

    Of course, casual recommendations to read cannot be expected to work magic. There is no evidence as yet to support the thesis that people who read fiction become more empathetic than their non-reading peers; and there is a great deal of historical experience that weakens the correlation between culture and decency. Still, there are also historical cases that suggest the power of particular kinds of fiction to generate greater understanding of, and sympathy for, people whose perspectives previously seemed alien, incomprehensible, or abhorrent. On that basis, a carefully chosen set of readings, skillfully interpreted in the classroom, can provide students with the experience of entering the lives of others. Children who are initially inclined to condemn particular kinds of people or specific forms of behavior can be offered stories whose protagonists are relevantly similar, and whose conditions of existence are explored in sympathetic detail. Their teacher can pose imaginative questions about how they might have behaved in the circumstances of the story, or whether they would have sided with other characters who refused to listen or provide aid. The aim of class discussion will not be to impose a final verdict, but rather to lead each child to a fuller sense of the range of human motivation. Whatever the concluding judgment may be, children should come to see themselves as making it with a deeper engagement with the situation and with the thoughts and emotions of those whom they had been inclined to dismiss as beyond the pale. Exercises of this kind can be developed in many different ways. Testing will reveal how best to do for schoolchildren what Dickens and Stowe did for their nineteenth-century publics.

    Nor, quite evidently, are the humanistic disciplines on which I have so far concentrated the only areas relevant to preparing good citizens. History is an obvious source. So, too, are geography and anthropology. All these subjects can expand the sense of human possibilities, presenting the challenges people have faced at different times and in different places and recording their diverse efforts to meet them. Students can find ample material to perplex and to repel them — thus provoking questions about how people could have been led to accept such puzzling and apparently repugnant ways of life.

    Although historical education has a place for dates, for the glorious deeds of national heroes, and for the pageantry imaginatively constructed to celebrate a society’s past, these are not, in my view, central to its pedagogical importance. Particularly important for the development of citizenship and moral agency is study of the principal episodes of moral and social change — for good and for ill. Attention should be given to the construction of oppressive hierarchies, in the ancient world and in more recent times. Similarly, students should know how cruel practices were challenged and overthrown. Understanding the forms of slavery in Greece and Rome, in the Renaissance, and in the New World, is a clear and obvious example, profoundly relevant for future American citizens. Since the resonances of the institution of chattel slavery continue to sound in the society to whose future they will contribute, American children need to understand why slaves were originally brought to the American colonies, why the practice of keeping slaves persisted and how it was defended, how it came to be opposed, the character of the debate over abolitionism, how social conditions for former slaves and their descendants evolved, and how the legacy of slavery affects differential life prospects in the United States today. (Obviously there may be differences among historians about such matters. Nothing should be presented as gospel.) Charting that social history is more valuable for the formation of democratic character than depicting the campaigns and the battles of the American Civil War.

    Success in enlarging national history to cover all parts of the human world (however the local and global are balanced) must be based on geographical understanding. If, as suggested, history is far more than dates, geography is not simply maps. Naturally, just as a student’s chronology should be approximately accurate, so too her sense of the spatial relations among nations and societies ought to be roughly right. That sense is the basis for knowledge of the climatic and environmental conditions of people’s ways of life. In turn, appreciating the specific opportunities and challenges posed by living in a particular place underlies consideration of the ways in which local conventions and institutions have developed. Geography, as it figures in the education of citizens, must borrow first from the earth and environmental sciences, from demography, and ultimately extend its reach into economics and anthropology.

    All very well, you may say. How nice it would be if our society could afford to provide for all so enriching an education. But the world is a harsh and competitive place, and we must be “globally competitive,” and so young people must do their bit for the Post-Industrial Labor Army. It is hard to find fulfillment, or to be a good citizen, if you cannot support yourself.

    True enough. (And there is dignity in labor, as we used to say.) The economic menace looms over us, threatening dire consequences if we fail to heed its gloomy warnings. Why, though, should we concede its points about the Global Demolition Derby? Who decreed that our present economic arrangements, and our habits of consumption, are the optimal ones, or the best we can do? Isn’t it worth asking whether the drive to produce, produce, and produce can be resisted? Whether human lives can be richer and happier when enough is recognized as enough?

    Mill posed those questions, and he answered them. During a period when almost all of his fellow economists quaked at the thought of declining production, dreading the “Stationary State,” he saw such a state as allowing the development of happy and fulfilling forms of life. Repelled by the cutthroat competition already visible in the early stages of industrial capitalism, by the “struggling to get on,” by the neglect of important values in the pursuit of profit, he supposed that, once a nation had advanced far enough economically, the relentless treading on one another could cease. In recommending the “stationary state,” he warned against the danger of too much movement, of the cruel endless churning of the profit-making machinery. The stationary state, he claimed, “would be, on the whole, a very considerable improvement on our present condition.”

    To be sure, we are not stationary people, or a stationary society: we have been wildly accelerated by our economy and our technology. But that is hardly the last word. A society that provides economic security, and honors the educational requirements for it, need not be a society that tolerates grotesque levels of economic inequality. Moreover, the uncontrolled excesses of our contemporary acceleration can be resisted and rejected. Indeed, it is the technology itself that can make resistance plausible. In a world in which it has developed beyond Mill’s imagination, in which routine jobs can be automated, human labor can be directed toward fulfilling occupations, in which people work with and for one another. Prominent among them is the care and nurturing of the young. Education, broadly conceived, can become what Emerson thought it should be: the main enterprise of the world.

    The Selfless Self of Self

    On the Portrait of Two Beautiful Young People
    A Brother and Sister
    By Gerard Manley Hopkins

    O I admire and sorrow! The heart’s eye grieves
    Discovering you, dark tramplers, tyrant years.
    A juice rides rich through bluebells, in vine leaves,
    And beauty’s dearest veriest vein is tears.

    Happy the father, mother of these! Too fast:
    Not that, but thus far, all with frailty, blest
    In one fair fall; but, for time’s aftercast,
    Creatures all heft, hope, hazard, interest.

    And are they thus? The fine, the fingering beams
    Their young delightful hour do feature down
    That fleeted else like day-dissolvèd dreams
    Or ringlet-race on burling Barrow brown.

    She leans on him with such contentment fond
    As well with sister sits, would well the wife;
    His looks, the soul’s own letters, see beyond,
    Gaze on, and fall directly forth on life.

    But ah, bright forelock, cluster that you are
    Of favoured make and mind and health and youth,
    Where lies your landmark, seamark, or soul’s star?
    There’s none but truth can stead you. Christ is truth.

    There’s none but good can be good, both for you
    And what sways with you, maybe this sweet maid;
    None good, but God — a warning wavèd to
    One once that was found wanting when Good weighed.

    Man lives that list, that leaning in the will
    No wisdom can forecast by gauge or guess,
    The selfless self of self, most strange, most still,
    Fast furled and all foredrawn to No or Yes.

    Your feast of; that most in you earnest eye
    May but call on your banes to more carouse.
    Worst will the best. What worm was here, we cry,
    To have havoc-pocked so, see, the hung-heavenward boughs?

    Enough: corruption was the world’s first woe.
    What need I strain my heart beyond my ken?
    O but I bear my burning witness though
    Against the wild and wanton work of men.

    In the modern scholarly edition of the poems of Gerard Manley Hopkins, which appeared in 1990, the editor notes that in the preceding edition of 1948 a number of poems — eighty-two, in fact — were offered, after the groups called “Early Poems” and “Poems,” as “Unfinished Poems, Fragments, Light Verse.” Those poems were sidelined, and so they were rarely anthologized, taught, or even read. Many of these apparently “lesser” pieces seemed to my young self — and still seem now — just as good as the poems honored in the front of the book. Although the new edition repaired this unfortunate editorial segregation by printing all the poems in chronological order, I still regret the relative obscurity of the “unfinished” poems. A late and ambitious one of them, called “On the Portrait of Two Beautiful Young People,” is worth pondering, not least to draw readers’ attention to its existence. It embodies the agony of Hopkins’ last years, in which he repeatedly staged a debate between his own theory of unbidden creativity and the religious theory of free will.

    At the age of twenty-two, Gerard Hopkins, an ardent young English poet and a recent graduate of Oxford (where he had shone as a brilliant student of the classics), prays that he may willingly advance beyond the legitimate pleasures of the senses in favor of the better joys of ascetic devotion. For the delight of the ear in song and speech, he will substitute contemplative silence and muteness in self-expression; instead of the distractions of worldly life, his eye will find “the uncreated light.” In gentle and unstrained “perfect” quatrains, each rhythmically serene line rhyming exactly with another, he enjoins the five natural senses to fix on spiritual pleasures. He begins the poem, which he calls “The Habit of Perfection,” with the ear and the eye:

    Elected Silence, sing to me
    And beat upon my whorlèd ear,
    Pipe me to pastures still and be
    The music that I care to hear.

    Shape nothing, lips; be lovely-dumb:
    It is the shut, the curfew sent
    From there where all surrenders come
    Which only makes you eloquent.

    Be shellèd eyes, with double dark
    And find the uncreated light:
    This ruck and reel which you remark
    Coils, keeps, and teases simple sight.

    He burned his own copies of his youthful poems (leaving some copies, however, with family and friends and letting others remain in drafts embedded in his diaries). Then, converted from his family’s Anglican Protestantism to Roman Catholicism, Gerard Hopkins became a Jesuit priest.

    He had supposed, before his ordination, that his ascetic desire to disregard sense-pleasure would ensure access to higher spiritual delights, and that he would willingly suspend his writing of poems in favor of consecrating all his time to his priestly duties. He did just that for seven years. Then, with the implied permission (or so he felt) of his Jesuit superior, who had said that someone should commemorate the five German Roman Catholic nuns, legally expelled from Germany for their religion, who had drowned when their ship, the Deutschland, was wrecked off the coast of Kent, Hopkins resumed writing poetry with a fiercely original long poem called “The Wreck of the Deutschland.” Hopkins’ return to the practice of verse continued until his premature death, of typhoid fever, when he was only forty-four.

    After his untroubled and open-hearted early election of silence, now writing as a priest, Hopkins became troubled by an increasingly anxious scrupulosity, suspecting that art must be blighted in its essence, infected by that original sin inherited by all human beings. If he, vowed to God’s service, chooses to write poetry, does he sin by that worldly choice? In an extraordinarily agile swerve, Hopkins rebukes, in the draft of a poem, the idea that composing a work of art — whether architectural or musical — is rightly characterized as an act of free will subject to moral judgment, arguing that the artist is driven to create by a compelling force subject to no external law, divine or secular. (The untitled draft, which we used to know by its first line “How all’s to one thing wrought!”, has had its stanzas rearranged, and the first line now reads “Who shaped these walls has shewn.” The rearrangement seems to me unconvincing.) It is impossible, he believes, to deduce the moral state of the artist or the composer from the work of art, since art is not created on a plane to which the moral law applies. The concept of obedience to a pre-existing moral or spiritual law and the concept of the freedom of the will are simply not relevant to the surprising and unpremeditated surge of inspiration which arises spontaneously, unbidden by the will.

    Drawing on his own experience, Hopkins claims that the artist is moved by a strange controlling power:

    Not free in this because
    His powers seemed free to play:

    He swept what scope he was
    To sweep and must obey.

    Nor angel insight can
    Learn how the heart is hence:

    Since all the make of man
    Is law’s indifference

    Therefore this masterhood,
    This piece of perfect song,

    This fault-not-found-with good
    Is neither right nor wrong.

    Stung by the frequent condemnation of artworks (and, by implication, of artists) on moral grounds, Hopkins dismisses such judgments by giving to his unphilosophical reader — in absolutely plain, even childish, language — examples of three arts that cannot possibly be evaluated by moral standards: painting, melody, and the “architectural” constructions (such as a honeycomb) of animal instinct. These are “neither right nor wrong,”

    No more than red and blue,
    No more than Re and Mi,
    Or sweet the golden glue
    That’s built for by the bee.

    Clever as this is, it has not come to grips with the semantic content of the poet’s own poems: must they, because their medium is language, exist on the human plane of right and wrong, conforming themselves to a binary ethic of Yes or No? Or can his poems be left to exist solely on the aesthetic plane from which they originated, where the criterion of success is the approach to formal perfection? In judging art, must we abandon the aesthetic of the working artist, where inspiration drives what Keats called “the innumerable compositions and decompositions” intrinsic to the task of writing poetry? May the artist not conceive of his works as belonging among the pure “wild and wide” goods of Nature, all divinely created, all of innumerable shadings?

    For good grows wild and wide,
    Has shades, is nowhere none;

    Or must he instead value his poetry according to its (cruelly restricted) choice of a single presiding ethical “chieftain,” God or Satan?

    But right must seek a side
    And choose for chieftain one.

    Is the purpose of art to achieve a compelled artifact aspiring to perfection of form? Or is it rather to present a substantial moral argument? At this moment in his intellectual evolution, Hopkins offers an enigmatic two-part answer:

    What makes the man and what
    The man within that makes:
    Ask whom he serves or not
    Serves and what side he takes.

    The poet’s riddle asks first “What makes the man” (to which the answer is presumably a natural process, since the poet is not asking “Who makes the man”). Even more strangely, the riddle in its next two lines asks about the verse-product, about “what/ The man within that makes”. What is the context, the “that” which the man is “within”? By its neutral reference, “that” must once again stand for a natural process. In his four-line epigram Hopkins sums up the apparently impersonal natural force impelling the human creator, no less a creature of biological instinct than the bee. Only after asserting the idiosyncratic force that both makes and is made by the artist can the poet raise the question of the artist’s ultimate moral allegiance. When Hopkins stations against each other the natural, passionate, and independent urgency of creation and the strict moral urgency of intellectually ethical decision, the riddling relation between the two urgencies is an uneasy one.

    Seven years pass. It is 1886 and Hopkins is forty-two. Everything has changed in his circumstances: in 1884 he had been dispatched from England to Ireland to serve as Professor of Classics in the Catholic University in Dublin, newly re-founded by the Jesuits. Remembering, painfully, Jesus’ announcement, “I came not to send peace, but a sword” (Matthew 10:34), the poet finds himself isolated “at three removes,” not only from his native England but also from his Protestant family and his former company of English Jesuits. In a post-famine era when Ireland was implacably hostile to England, he was the only English Jesuit in all of Ireland. He realizes, desolately, that his university appointment in Dublin is likely to be permanent:

    To seem the stranger lies my lot, my life
    Among strangers. Father and mother dear,
    Brothers and sisters are in Christ not near,
    And he my peace/ my parting, sword and strife.
    ****
    I am in Ireland now; now I am at a third
    Remove.

    His depression is so intense that he refers to himself in the past tense, as though he had died: he is not a “beginner” but “a lonely began.”

    The religious, intellectual, scholarly, psychological, and physical trials that Hopkins had undergone in Ireland brought him, in 1886, the year of his poem on the portrait of two beautiful children, to a crisis of distress in which his commitment to accuracy obliged him to graph, in a meticulous and experimental style, his mind’s unbearable nervous oscillations among countless possible thoughts. His aesthetic drive is increasingly conditioned by a merciless self-scrutiny, the “helpless self-loathing” that he reveals in his later Dublin retreat notes. His familiar but ever-changing stand-off between the aesthetic and the moral now reappears in a tense poem that he names as if it were a commentary: “On the Portrait of Two Beautiful Young People.” When he sees, presumably in a wealthy family’s house, a watercolor portrait of two children, he is on his Christmas holiday. He has found it almost impossible to write poetry while teaching, but with his incessant anxiety somewhat reduced by distance from the university, he has the freedom to admire the art of the portraitist as well as the appealing beauty of the children. His title renders them merely as ungendered young “people,” but his subtitle (“a Brother and Sister”) represents them as pre-adolescent siblings, their innocence still undamaged. Although the poet’s initial response is an admiring one, ratifying the double beauty of both the children and the tender portrait, his anxiety soon awakens reflection on the children, their parents, and their probable future, and his mind leaps to an immediate moral sorrow inseparable from his initial aesthetic admiration. “O I admire” he says while attaching — without taking a breath — his second verb and its exclamation point, “and sorrow!” In that first line, he establishes the rapid — the instantaneous — fluctuations of response that the poem will mirror.
His mixed emotions sometimes arrive in balanced opposition, but more often they emerge as restless questioning, philosophical perplexity, or psychological dread. Compared to the tranquility of “elected silence” in “The Habit of Perfection,” the turbulent stirrings besetting him before the portrait allow him only temporary moments of repose, each one rapidly disturbed into further disquiet. The tide of admiration that had in the past prompted “How all’s to one thing wrought!” ebbs, undermined by the looming sorrow prompted by the portrait.

    Hopkins’ increasing depression will bring him, by the end of the poem, to a desperate conclusion — that destiny may destroy these children. In that respect, the “beautiful” but static watercolor is mendacious. The artist has framed the children within a wreath of unspoiled flowers, fruit on tree-branches, and flourishing grapevines (generating the “bluebells” and “juice” of the first stanza), but does this unblemished context truthfully reflect the variable and endangered course of human fate that must be undergone by the children? Remembering the tree of the knowledge of good and evil from which Eve, tempted by the serpent, plucked the forbidden fruit, the poet arrives, in his penultimate stanza, at a dire image of the children’s likely future, crying out, as he approaches his conclusion, against the fall of man that brought disease and death into the world:

                 What worm was here, we cry,
    To have havoc-pocked so, see, the hung-heavenward
    boughs?

    As Hopkins mentioned in a letter, he adopted as his verse-form the sedate, if sorrowful, perfectly rhymed quatrains of Gray’s famous “Elegy Written in a Country Churchyard,” because he thought that he could “make something.” He, too, is writing an elegy grieving the children’s potential end, likely to be a blighted one. Six years earlier, responding to the tears of the child “Margaret” mourning the fall of leaves, he had expressed a horrible certainty with the words “born for”: “It is the blight man was born for, / It is Margaret you mourn for.” At least Gray could predict the certain future of his dead villagers: they will be recalled, if at all, in “the short and simple annals of the poor.” Their humble life, while it may exclude the prospect of political or literary fame, also keeps them from the hideous criminal acts of the powerful: they will not “wade through slaughter to a throne, / And shut the gates of mercy on mankind.” Hopkins, by contrast, can say nothing so reassuring about the children: his fear disrupts his lines, hurling them far from Gray’s even pace and calm punctuation. Both the poet’s quandary before the mendacious (and therefore sentimental) portrait — representing life falsely as a paradisal scene — and his vexing ignorance of the children’s moral future obliterate tranquility as soon as the poem opens.

    Hopkins’ emotional announcement — “O I admire and sorrow!” — had burst forth, forcibly breaking its single line in two by the word “sorrow” and the full stop at the exclamation point. The second stanza — Hopkins’ unsettling transcription of his almost simultaneous mental self-contradictions — presents, as its mimicry of agitated thought, an unstable and incoherent syntax and an over-supply of punctuation (nine commas, an exclamation point, a colon, a semi-colon, a period). We could paraphrase the stanza and iron out our first impressions of incoherence by following Hopkins’ line of reasoning: we could say that Hopkins first exclaims how happy the parents are in having such beautiful children, then feels that he spoke too quickly. No, he stammers, the parents cannot be said to be permanently happy (that particular luck cannot be ensured); but at least the parents are happy thus far, but we must recall (he reluctantly admits) the frailty of fortune; and after the present fleeting phase of childhood there will be an “aftercast” of time’s baleful dice — with what result? Following that gamble, the children’s fate and its effect on their parents can only be speculated upon. These young creatures are to their parents, thinks the poet in summary, a “heft” (a burden, from “heave”), a hope, a hazard (all alliterating with, and thereby reinforcing, the initial “Happy”). Yet in the final noun, they become merely an “interest.” This, the sole “neutral” noun of the series, detaches the spectator Hopkins from his initial emotional participation in the “happy” parents’ good fortune.

    If we consider, now, our first glimpse of this second stanza, we see Hopkins — hardly yet into his subject — becoming almost unintelligible as he daringly mimics on the page the confusing and reckless speed of his turbulent thought:

    Happy the father, mother of these! Too fast:
    Not that, but thus far, all with frailty, blest
    In one fair fall; but, for time’s aftercast,
    Creatures all heft, hope, hazard, interest.

    Hopkins will continue to represent the very melody of his mind’s oscillations in distress: these sometimes appear in balanced oppositions but more often become percussive in restless questioning, philosophical perplexity, or further dread. His music culminates in a broken syntax and ill-coupled words as he addresses one of the endangered and innocent siblings, the brother, approaching (the poet fears) that almost certain ruined end:

    Your feast of; that most in you earnest eye
    May but call on your banes to more carouse.
    Worst will the best. What worm was here, we cry,
    To have havoc-pocked so, see, the hung-heavenward
    boughs?

    As the poem reaches its deepest image, which had been lurking in the poem from the opening, the words careen from zenith to pit, from pit to zenith. First Hopkins announces the benign “feast,” reflecting the boy’s earnest and innocent eye, but the feast immediately generates (in forward linear time) the gluttonous “carouse” of agents of ruin. By contrast, the next two examples are grimly retrospective: the outcome (“worst”) is prophesied of its beautiful past counter-superlative “best,” and behind the disastrous “havoc-pocked” branches we glimpse the originally healthy “hung-heavenward” boughs. The passionate indignation behind such retrospective truth declares itself in this disequilibrium of time, as it goes forward and backward in depicting hope succeeded by despair. (Hopkins must have noticed, as a schoolboy, the originality of Shakespeare’s pre-positioned disasters in lines where time has already dealt a death-blow before we are allowed to see the charm of youth, and time is already digging the grave of a fair countenance: “Time doth transfix the flourish set on youth, / And delves the parallels in beauty’s brow.”)

    Against the buffets of threatening speculation, Hopkins allows occasional points of repose. The first is his initial admiring gaze on the tableau of the quiet and lovely portrait, where the artist has wreathed the beautiful siblings in a cluster of unspoiled flowers and grapevine, generating the bluebells, vine-leaves, and “juice” of the first stanza. But that initial pleasure is destroyed as soon as “sorrow” transfixes “admire,” “tyrant years” obliterate “youth,” and “tears” mar “beauty.” In the second instance in which anxiety is momentarily stilled, the poet creates in fantasy a serene future portrait where a conjectured young wife would replace the sister now by the boy’s side, while the boy himself, with the passage of time a husband, is poised (unlike his conditionally inserted bride) in a present-tense future, looking with steadfast confidence to his coming life:

    She leans on him with such contentment fond
    As well the sister sits, would well the wife;
    His looks, the soul’s own letters, see beyond,
    Gaze on, and fall directly forth on life.

    Yet even in the poised moment of that imagined marriage portrait, the poet’s intellectual skepticism speaks out, warning the promising boy that he must embark on a moral and spiritual search for a reliable principle of stability:

    But ah, bright forelock, cluster that you are
    Of favoured make and mind and health and youth,
    Where lies your landmark, seamark, or soul’s star?
    There’s none but truth can stead you. Christ is truth.

    In this direct address, echo-words create deliberate shadow-appearances of earlier poems. The first two echo-words, “mark,” and “star,” recall Shakespeare’s saying of love,

              It is an ever-fixèd mark
    That looks on tempests and is never shaken;
    It is the star to every wandering bark,

    Shakespeare’s ever-fixèd mark is (to Hopkins’ mind) a landmark, a lighthouse, and his star is the North Star by which sailors can navigate. The third echo-word here is “stead,” creating a shadow-appearance of Keats’s praise of the North Star in the sonnet “Bright star, would I were steadfast as thou art!” That line supplies, with its adjective “steadfast,” Hopkins’ “archaic” (OED) verb, “to stead” (to support, to help) for his desired outcome — that something will “stead” the boy in consistent uprightness as he advances in time.

    Having asked the question of where lies that which will “stead” the boy, Hopkins offers, in a startling turn, two different and independent answers: “There’s none but truth can stead you. Christ is truth.” Each of the two answers in the line occupies a free-standing sentence uttering a single thought; each exhibits an unequivocal closing period; and each exists on its own distinct plane. The planes do not intersect. The first sentence replicates the poet’s instant answer to his own question on the plane of moral philosophy: “There’s none but truth can stead you.” The second sentence, “Christ is truth,” astonishingly inserts itself into the very same line-unit as the first, even though it exists on an entirely different level, that of Christian scripture. The first sentence sets before the boy’s eyes a daily intellectual and moral value — truth — indispensable to Hopkins himself, as his letters fiercely show. Yet the second, a scriptural sentence, “Christ is truth,” echoes Jesus’ saying: “I am the way, the truth, and the life” (John 14: 6). Such a credal assertion is less universally applicable than the common recommendation of moral truth as a reliable bulwark of the virtuous life.

    To be true to himself, Hopkins has had to separate his two truths — one natural, one theological — and give to each its independence. Yet to present himself as he is, the poet has had to twin the two independent answers within a single line, as permanent absolute values equally held by him, inseparable. The poem balances precariously over the abyss between the two absolute standards, philosophical and theological. As he stands immobilized between the two distinct truths, Hopkins’ anxious mind, worrying for the young boy, hopes to bridge the abyss with a moral parable from a scriptural source: Jesus’ parable of the rich young man. There, invoking a value — “the Good”—that is apparently more flexible than “truth” and of a wider application, Hopkins creates another temporary point of mental repose, but blights it by recounting the Biblical parable’s unhappy conclusion.

    In the brief parable, a rich young man, addressing Jesus as “Good Master,” asks what he must do to be saved. After rebuking him for the flattering title (“There is none good but one, that is, God”), Jesus tells the young man to observe the commandments. When the young man, professing that he has obeyed the commandments from his youth, asks what he must further do to be saved, Jesus offers a direct two-part answer: “Go and sell that thou hast, and give to the poor. . . and come and follow me.” The evangelist recounts the failed outcome: the young man “went away sorrowful, for he had great possessions” (Matthew 19:16-22). In Hopkins’ poem, the young man “was found wanting when Good weighed.” Knowing that the young people in the portrait are rich, Hopkins hopes to “stead” them by recalling to their minds Jesus’ warning that there is none good but God; he also reminds them that the young man was found wanting when he could not imagine depriving himself of riches. (A discarded stanza tells us that the poet forecasts a similar potential sin in the children, saying to them of the young man, “Rise he would not; fell/ Rather; he wore that millstone you wear, wealth.”)

    So far, the poet’s forensic move in recommending truth has been to direct the boy to three different sources of truth — philosophical, credal, and parabolic. Unsatisfied with all three, Hopkins takes refuge in another source entirely, one existing on a plane unknown to his initial three. In one of the moves that made readers of the first edition of Hopkins’ poems in 1918 find him excitingly “modern,” the poet’s mind leaps to a psychological plane, a trustworthy source to “stead” the boy’s future. This resource, very obliquely described, has already been hinted at in one stanza of “How all’s to one thing wrought!” Although the artist there, said Hopkins, “changed in choice,” he did so not freely but compelled by “his being’s bent”: that “bent” “was an instrument / Which overvaulted voice.” The voice must follow, in every moment of choice, the “being’s bent,” the innate inclination of the self-being of the artist. In this way, the poet of “On the Portrait” leaps, in the climax of his poem, to the plane of the unfathomable, surpassing the three earlier recommendations — moral, credal, and scriptural. All three, after all, are conventional and well known to Christian homiletics. But now, on the analogy of his lingering Anglican belief in predestination (the doctrine that God has decided from all eternity who will be saved and who will be damned), Hopkins has conceived a psychological plane of predestined idiosyncratic selfhood.

    Somewhere inside the person, inside the very self, there is a pre-planted banner of the inaccessible source within the self. The banner remains in an inscrutable furled form which hides its emblem of the “selfless self of self” — an abstraction unnamable as self or soul because it precedes both, resembling the Aristotelian “form” that determines a distinctive species-nature within inert earthly matter. For Hopkins, it was a matter of introspective fact that not only did each individual species differ from every other species, but every individual human being differed entirely from every other individual being, each bound, when engaged aesthetically, to express itself as itself. In the sonnet “As kingfishers catch fire,” Hopkins writes the poem of that conviction:

    Each mortal thing does one thing and the same:
    Deals out that being indoors each one dwells;
    Selves — goes its self; myself it speaks and spells,
    Crying What I do is me: for that I came.
    I say more: the just man justices.

    As he says in a notebook, he finds his self-being both distinctive and incommunicable: “My selfbeing, . . . that taste of myself, of I and me above and in all things . . . is more distinctive than the taste of ale or alum, more distinctive than the smell of walnutleaf or camphor, and is incommunicable by any means to another man.” At its deepest, his self-being was inaccessible even to the poet himself; otherwise he would have been able to find words to communicate it to others. Instead, he knows it only as the region from which the bolt of inspiration arrives, not at his summoning but as a gift. (Before Freud speculated on the incessant activity of the unconscious mind, insights were felt to be bestowed, not self-generated.)

    From the outset of “On the Portrait of Two Beautiful Young People,” the poet’s linguistic patterns have attempted to arrange themselves under the broad concepts of “authentic” and “inauthentic” (on the aesthetic plane) and “good” and “evil” (on the moral plane). The mounting mass of words of uncertainty in the poem, amid its repeated stops and starts of syntax and punctuation, undoes the overt emotional contrast of “Admiration” and “Sorrow” with which the poet had begun. In short, the aesthetic aim of “On the Portrait of Two Beautiful Young People” is to mimic, credibly, the poet’s unpredictable vicissitudes of response as he perceives, from the angle of experience, the precarious future of innocence — by no means an unknown theme in previous verse, but rarely found, as here, in a single extended panicked struggle. The mind here, as it thinks one thing, instantly thinks its opposite, or else it cannot find clear opposites, or it is sunk in prophetic conflict, or it is about to relinquish altogether any attempt at coherence. In earlier verse, experience, modeled on religious conviction, tended to judge firmly: think of Blake’s Auguries of Innocence, with their instant condemnations and approvals: “A robin redbreast in a cage / Puts all heaven in a rage”; “If the sun and moon should doubt, / They’d immediately go out.” Hopkins’s intellectual subtlety instantly raises not one judgment but all possible ones, yet his love of truth makes him abhor his own skepticism.

    As Hopkins suffers through his meditation on the portrait, he has seen the future of the children threatened by agents that could blight their branches of bright leaves and fruit. The worm — at once the serpent of the Fall of Man and Shakespeare’s canker-worm in the rose — devours the optimism of the poet’s initial hope, and deprives him, by the time his lyric ends, of all notes but tragedy and fury:

    Enough: corruption was the world’s first woe;
    What need I strain my thought beyond my ken?
    O but I bear my burning witness though
    Against the wild and wanton work of men.

    In the fair copy that Hopkins sent to his friend Robert Bridges, the poem “On the Portrait” ends there, but Hopkins showed that he hoped to add more by appending two lines of asterisks before mailing it off for Bridges’ scrutiny. Nowhere does he explain the function of “the selfless self of self,” and how it fits, or does not, with the triple recommendations of truth preceding it.

    “How all’s to one thing wrought!” had been spoken from the point of view of an audience admiring a successful creation in architecture or music. But now, gazing at the portrait to which he himself is audience, Hopkins is prompted to turn his gaze toward the absent painter’s own view: how does the artist himself explain the arrival of inspiration? Hopkins asserts, drawing on Wordsworth, that our inner aesthetic sense — physiological fellow to our other five — is as helpless not to feel joy in responding to perfection of form as the conventional senses are helpless not to see or hear what presents itself to eye or ear. The aesthetic sense, he discovers, is indistinguishable in nature from the other impressionable senses, of which Wordsworth said, in “Expostulation and Reply,”

    The eye — it cannot choose but see,
    We cannot bid the ear be still;
    Our bodies feel, where’er they be,
    Against, or with our will.

    The aesthetic sense produces responses in us strictly comparable to those that we experience as our natural (and spontaneous) biological responses to seeing, hearing, and tasting. The senses are all natural goods, shaded and variegated, of infinite potential, divinely created, knowing no modification by the will:

    For good grows wild and wide,
    Has shades, is nowhere none.

    When Hopkins eventually permitted moral questioning to arise in “How all’s to one thing wrought!” he could not avoid the unequivocal question of salvation, of the “right” and the “wrong,” so different from the aesthetic judgment of “perfect” or “imperfect.”

    But right must seek a side
    And choose for chieftain one.

    What makes the man and what
    The man within that makes:
    Ask whom he serves or not
    Serves and what side he takes.

    As Jesus warned of ethical choice, “No man can serve two masters” (Matthew 6:24).

    This is Hopkins’ inventive argument: just as we cannot choose whether we see or hear or taste, so the aesthetic sense, equally of biological origin in the body, is exempt from serving one master or another. The crucial religious question, “Whom do you serve?” does not apply, Hopkins believes, to poetic inspiration, which arrives not as a decision but as an unexpected event that cannot be willed into being. What any masterpiece offers us is a sense of the mind that invented it, but the mind’s temporal potential is infinitely diverse, and the artist-as-bee, though creating honey from the flowers all about him, cannot exhaust the meadow’s nectar in a single retrieval. The mind, says the poet, is always awaiting the next perfection arriving unheralded from its indescribable “selfless self of self.” Of the artist’s flowers and honey he securely says,

    The brightest blooms lie there unblown
    His sweetest nectar hides behind.

    Masterpieces are not the weak records of a life that is no more, as the sentimental watercolor will be when the children are dead. On the contrary, Hopkins declares, masterpieces arrive at a quintessence of mind and medium stronger than architecture, sweeter than music, and of a permanence exceeding that of human life.

    The blessed potential of poetic invention had seemed ever-fresh at the time that Hopkins, released from the ascetic silence of “The Habit of Perfection,” exclaimed “How all’s to one thing wrought!” But by the time he composes “On the Portrait of Two Beautiful Young People,” he can no longer imagine the poem existing in an independent aesthetic space, aspiring to formal perfection; he has come to understand, through his own suffering, Wordsworth’s observations in “Tintern Abbey” that “a deep distress” had “humanized” his soul, that poetry must absorb and reflect not only nature but also “the still, sad music of humanity.” Hopkins’s “music of humanity” was only occasionally “still” and “sad”; rather, his physiological intensity — in which every feeling, from the exalted to the bitter, was heightened to acute nervous pitch — led him, in moral terms, to a witnessing aflame with pity and indignation. Although the masterpiece never cedes the aesthetic plane guaranteeing its existence into futurity, it is nonetheless now obliged, when it treats of human life, to take on the weight of moral judgment.

    During the Christmas holiday of Hopkins’ composing “On the Portrait,” he was completing the last revisions of a poem he had been working on for two years, called “Spelt from Sibyl’s Leaves.” In his youth, delighting in finding symbols in the world of his own mixed nature as priest and poet, he had written in praise of “pied beauty”: “Glory be to God for dappled things.” But now he was obliged, morally, to separate all the things of the world, both natural and human, into “two flocks, two folds — black, white; | right, wrong.” To the lover of the dappled world, this last judgment, arising when “earth her being has unbound, her dapple is at an end” is an agonizing reduction of the inextinguishable fertility of the creative mind. But what Hopkins has come to understand, in contemplating the scattered leaves of his sibylline book, is, at last, his “selfless self of self.” He had given it that name so as to connect it to the self while representing it as alien to the same self: it is “most strange,” and above all it is voiceless (“most still”). From it came his moments of inspiration (always already “foredrawn to No or Yes”), which seemed to arrive independently only to be quenched by a force outside himself. Inspiration was, he said in a sonnet dedicated to Robert Bridges and written only six weeks before his death, a delight that “breathes once and, quenchèd faster than it came, / leaves yet the mind a mother of immortal song.” Hopkins had thought of his inspiration as coming from God, a suitably unknowable source, but “Spelt from Sibyl’s Leaves” articulates his shocked understanding that inspiration came entirely from his own self, and that its cause was the constant struggle of opposing forces on the battlefield of the self, under the unfurled banner of the integral being.
It was, he now knows, his own suffering that, over time, kindled the later inspiration that generated his “burning witness.” The “selfless self of self ” was (as we would say) the unconscious mind sequestering his suffering until it could find its necessary formal attributes and emerge as a shaping inspiration.

    Having understood that he was alone with himself in a material world of natural causes, the poet, in his apocalyptic sonnet of unrelieved darkness in which no God appears and heaven is populated only by the natural stars, describes his state as that of a man bound on a rack where he is both the torturer and the tortured. His thoughts, like upper and nether millstones, grind pitilessly, their product his wordless “groans,” poetry’s last expiring utterance, pain without words. He has necessarily become, now that his being has been unbound, the selfless self of self naked, stretching itself on an instrument of torture, a self unprotected, unhoused, beset by abrading contradictions, strung

                on a rack
    Where, selfwrung, selfstrung, sheathe- and shelterless, |
    thoughts against thoughts in groans grind.

    It is this pitiless and accurate self-portrait that has replaced the equivocal portrait of the beautiful children.

    The Last Night of the World

    Unfinished dares avoiding rules override Avernus.
    Were you wrong about everything to say anything?
    Plausible laws insufficient to our lives are blowback
    For the myth of liberty. Depeopled palace trumpets
    Silent where they were once wild for slaughter calls.
    Sons and daughters commandeer our mercy for grief
    Discerning: Who would harm a stranger harms me.
    Outstare the world’s visionary company parched so
    Thin on desire and conscience for things as things are
    And not as they appear, which appears contradictive
    To one who saw dynasties of patience vanish whole:
    The golden apple of discord is rotten at the core.
    Silence captures how our deep shadow work in sin
    Poises eternity for love to comfort us — or anyone.

    Animal Magnetism

    Phantom intelligence of the soul knows touch echoes
    Trace gestures prone to outcompose their originals —
    Springs and veils marvel at their sudden plainness,
    Nincompoops with tongs abandon the whole shebang.
    My stupefaction remains wholly blunt and untamed,
    That once familiar summer will ripen and destroy all
    Backward glances aimed at divining an elusive theme,
    Leaving room to fall in a series of parallel inclusions.
    A scholar of a hallucination taking altar in my brain
    Renews a languishing more terrible than perdition.
    The storm once more came on fast with annotations
    I meant to communicate but haven’t written yet —
    The dead hang out their songs to cure in the forest.
    The gates of heaven and hell are hard to tell apart.