The Prophetic Environmentalism of Rabindranath Tagore

    The great British historian E. P. Thompson once remarked that “India is not an important country, but perhaps the most important country for the future of the world. Here is a country that merits no one’s condescension. All the convergent influences of the world run through this society: Hindu, Moslem, Christian, secular; Stalinist, liberal, Maoist, democratic socialist, Gandhian. There is not a thought that is being thought in the West or East which is not active in some Indian mind.” Some may cavil at his assertion that India is (or ever was) the most important country for the future of the world. But I want rather to endorse Thompson’s other claim —
    namely, that there has been an astonishing diversity of intellectual opinion in India. This is a product of the country’s size, its cultural heterogeneity, and its daring (if admittedly imperfect) attempt to construct a democratic political system in a deeply hierarchical society. Indeed, among the countries of the so-called Global South, India is notable for the vigor, sophistication, and self-confidence of its intellectual traditions. In this respect it stands out even compared to its larger neighbor China, where the scholarly legacy of the past has been brutally crushed by a totalitarian state. 

    For too long a significant strand of the Indian intellectual tradition has been neglected: its rich speculations about the past, present, and possible future of human relations with the natural world. The burgeoning contemporary literature
    on the history of environmentalism is also guilty of this omission, owing to its narrow geographical focus. The challenge to American intellectual hegemony in this field first came from Europe, and what took place there was noticed in America. The traffic of ideas across the Atlantic was intense. Yet the conversation has been conducted as if environmental movements and environmental thinkers could not exist outside Europe and North America. 

    I hasten to add that this bias did not originate in any sort of colonialist condescension or feeling of racial superiority. Rather, it most likely had its roots in conventional social science wisdom, which stubbornly held that environmentalism was a “full stomach” phenomenon, possible only in societies where a certain level of material prosperity had been reached. By the canons of orthodox social science, countries such as India are not supposed to have an environmental consciousness. They are, as it were, too poor to be green. As the economist Lester Thurow notoriously remarked in 1980,
    “If you look at the countries that are interested in environmentalism, or at the individuals who support environmentalism within each country, one is struck by the extent to which environmentalism is an interest of the upper middle class. Poor countries and poor individuals simply aren’t interested.”

    This haughty dismissal of any possibility of poor countries being interested in the fate of the natural environment was, at least with regard to India, years out of date. In the spring of 1973, a popular peasant movement in the Himalaya, known as Chipko, threatened to hug the hill forests to stop them from being felled by commercial loggers. Many of the participants
were unlettered, but the movement's leaders, though themselves from peasant backgrounds, were informed and articulate about the wider issues. They wrote essays and tracts (usually in Hindi) tracing the direct link between industrial forestry, soil erosion, landslides, and floods. These showed that what at one level was an economic conflict — between the subsistence demands of peasants for fuel, fodder, and so on, and the commercial motivations of paper and plywood companies — had deeper ecological implications as well. 

    Still, the incomprehension persisted. Consider these remarks, from 1994, by Eric Hobsbawm: "It is no accident that the main support for ecological policies comes from the rich countries and from the comfortable rich and middle classes (except for businessmen, who hope to make money by polluting activity). The poor, multiplying and under-employed, wanted more 'development', not less." Unlike Thurow, Hobsbawm was a historian — a great historian, and therefore more attentive to the messiness of social life, more interested in exploring hidden details than in postulating grand generalizations. And unlike Thurow, whose life was lived largely within the North American academy, Hobsbawm had a keen interest in Latin America, a continent he traveled widely in. He kept himself abreast of current events in Africa and Asia. One would have thought a scholar as learned and well-informed as Hobsbawm would have heard of the Chipko movement in the Himalaya or the Green Belt movement in Kenya. Or at least, given his strong connections with Latin America, of the movement of Chico Mendes and the rubber tappers in Brazil — another example of the environmentalism of the poor. Perhaps Hobsbawm's Marxist faith did not allow him to see environmentalism as anything other than a bourgeois deviation from the class struggle.

    Chipko was followed by a series of other grassroots initiatives around community access to forests, pasture, and water. They likewise posited subsistence versus commerce, the village versus the city, the peasant versus the state, the subaltern versus the elite. Studying and reflecting on these conflicts, some scholars argued that they showed the way to reconfiguring India’s development path. Given the country’s population densities, and the fragility of tropical ecologies, India had erred in following the energy-intensive, capital-
    intensive, resource-intensive model of economic development pioneered by the West. When the country got its freedom from British rule in 1947, it should have instead adopted a more bottom-up, community-oriented, and environmentally prudent pattern of development. And yet, the argument further proceeded, it was not too late to make amends. 

    The environmental debate in India was at its most vigorous in the 1980s. Scientists, social scientists, journalists, and activists all contributed to it. The debate operated at many levels: philosophical, political, social, technological. It touched on the moral and cultural aspects of humanity’s relations with nature; on the changes required in the distribution of power to promote environmental sustainability; on the design of appropriate technologies that could simultaneously meet economic as well as ecological objectives. The debate embraced all resource sectors — forests, water, soil, transport, energy, biodiversity, pollution, and industrial safety.

    The post-Chipko environmental upsurge led to some institutional changes within India. New laws seeking to conserve forests, protect wildlife, and control pollution were enacted. In 1980 the government of India started a new Department of Environment, upgraded later into a full-fledged Ministry of Environment and Forests. New centers of ecological research were set up in Indian universities. Terms such as "ecological history," "environmental sociology," and "ecological economics" began entering the teaching curricula and research agendas of the academy. A new breed of "environmental journalists" came into existence, and their reports on forests, pollution, biodiversity, and grassroots struggles featured in newspapers and magazines.

    In those years, thinkers and activists in India played a profound role in shaping global conversations about humanity’s relationship with nature as well. Indian scholars proposed the idea of “livelihood environmentalism” in contrast to the “full-stomach environmentalism” of the affluent world. Some of the most pungent criticisms of excessive consumption in the West came from Indian writers. Scientists such as Madhav Gadgil and A. K. N. Reddy, journalists such as Anil Agarwal and Sunita Narain, and activists such as Medha Patkar and Ashish Kothari acquired international reputations. Ideas first developed in India were discussed and debated in other countries and continents.

    And then there was the early and original example of Rabindranath Tagore — the poet, novelist, playwright, and essayist; the man who transformed the Bengali language through his prose; the composer who wrote hundreds of songs and set them to music, many of which are still sung decades after his death, among them the national anthems of Bangladesh and of India; the first Asian to win a Nobel Prize and the founder of a major university; the friend of Mahatma Gandhi and the mentor of Jawaharlal Nehru; the painter who took up his brush in his late sixties; the restless traveler who made three trips to Japan and five trips to the United States, and spent time in Europe, Latin America, China, Indonesia and Iran, winning friends, admirers and the occasional critic in all these places. But he was also one other thing. He was a precocious environmentalist — an unacknowledged founder of the modern environmental movement. 

    It is past time to recover Tagore’s thoughts on how nature shapes human life and how humans shape nature. His writings on the use and abuse of nature were more important to his worldview and literary achievement than has generally been recognized. (There are a few notable exceptions: in the 1960s, Niharranjan Ray and G. D. Khanolkar wrote insightfully on the subject.) The rehabilitation of Tagore’s environmental thought is not merely of academic interest; his words and warnings speak directly to the environmental challenges that confront India and the world today. 

    Rabindranath Tagore was born in 1861, into a family of wealth and privilege. In his memoirs, he says of the Bengali children's primer which gave him the first elements of an education that the "two literary delights that still linger" in his memory were both images of nature: "the rain patters, the leaf quivers" and "the rain falls pit-a-pat, the tide comes up the river."

     

    Tagore grew up in the family home in the north Calcutta locality of Jorasanko. This was a three-storied mass of buildings built around several courtyards, in which lived several generations of family members as well as their maids, cooks, and bearers. A room that Rabindranath frequented as a little boy had a window, from which he saw “a tank with a flight of masonry steps leading down into the water; on the west bank, along the garden wall, an immense banyan tree; to the south a fringe of cocoa-nut plants. Ringed round as I was near this window, I would spend the whole day peering through the drawn venetian shutters, gazing and gazing on this scene as on a picture-book.”

    In this pool adjacent to the Tagores’ house, their poorer neighbors came to bathe. The child watched them with fascination, noting their idiosyncrasies — one man “who would never step into the water himself but be content with only squeezing his wet towel repeatedly over his head,” another “who jumped in from the top steps without any preliminaries at all,” a third who “would walk slowly in, step by step, muttering his morning prayers the while.” As the morning progressed the line of bathers grew thinner, until the “bathing-place would be deserted and become silent. Only the ducks remained, paddling about after water snails, or busy preening their feathers…” 

    Once the humans had departed, the boy’s attention wandered to the birds, and then, to a large tree that lay at the tank’s edge. He was fascinated by the “dark complication of coils at its base.” It was of this tree that many years later the poet wrote:

    With tangled roots hanging down from your branches, O ancient banyan tree,

    You stand still day and night, like an ascetic at his penances,

    Do you ever remember the child whose fancy played with your shadows?

    Of his childhood encounters with nature Tagore was to remark: “How intimately did the life of the world throb for us in those days! Earth, water, foliage and sky, they all spoke to us and would not be disregarded.”

    Tagore’s memoirs were written when he was approaching the age of fifty. Towards the end of the book, he reflected on what nature had meant to him over the course of his life:

    From my earliest years I enjoyed a simple and intimate communion with Nature. Each one of the cocoa-nut trees in our garden had for me a distinct personality. When, on coming home from [school], I saw behind the sky-line of our roof-terrace blue-gray, water-laden clouds thickly banked up, the immense depth of gladness which filled me, all in a moment, I can recall clearly even now. On opening my eyes every morning, the blithely awakening world used to call me to join it like a playmate; the perfervid noonday sky, during the long silent watches of the siesta hours, would spirit me away from the workaday world into the recesses of its hermit cell; and the darkness of night would open the door to its phantom paths, and take me over all the seven seas and thirteen rivers, past all possibilities and impossibilities, right into its wonderland.

    Tagore's family owned vast estates in the eastern part of the Bengal Presidency. When he was in his twenties, his father commanded him to oversee the management of their holdings. He went reluctantly, loath to leave his family and the city, but once there he fell in love with the landscape of deltaic Bengal. His letters to his family are redolent with natural imagery, as he delicately describes the interplay of land, water, plants and animals. Here is a typical example, from a letter written by Tagore to his niece Indira Debi sometime in the 1890s, from an unnamed place in eastern Bengal:

    Our boat is moored off a lonely grass-covered island in the river. The world is at rest. What a glorious day it is, today! Such loveliness all around! After many days I am really meeting Mother Earth again, and it is as if she says, "Here he is," and I reply, "Here is she." We sit side by side without stir or speech. The water gurgles, the sunlight sparkles, the sand crunches. Tiny wild shrubs crane their heads to watch. A stray bird gets up calling chik-chik. It's all like a dream, and I feel like writing on and on, just about that and nothing else — the gurgle of the water, the glitter and shimmer of the sunshine, and all the dreaminess of this island. I want to wander day after day along these sandy banks, and write about nothing but this — oh, how badly I want to!

    And this, from another letter by Tagore to his niece Indira, written in 1895 from the family estate in Shelidah in eastern Bengal: 

    We can draw a deep and secret joy from nature only because we feel a profound kinship with it. These green, fresh, ever-renewing trees, creepers, grasses and lichens, these flowing streams, these winds, the ceaseless play of light and shade, the cycle of the seasons, the stream of heavenly bodies filling the limitless sky, the countless orders of life — we are related to all this through the blood-beat in our pulse — we are bound by the same rhythm as the entire universe.

    These letters, written in Bengali, bear comparison to the writings of American naturalists in the nineteenth century, exploring new landscapes with wonder and excitement, capturing their diversities of species and habitats in their prose. The “profound kinship” that Tagore felt with nature makes him akin to John Muir. We know Muir as a pioneering American environmentalist, but this label has thus far been denied the Indian poet, perhaps because his attention to nature, though impressive enough, was merely one part of his extraordinarily various and multi-faceted achievement. 

     

    When he was about thirteen, Rabindranath was taken by his father for a long holiday in the Himalayas. Above the town of Dalhousie (in present-day Himachal Pradesh) the family rented a bungalow. As they climbed up from the plains — father and son being carried on a palanquin by bearers — the terraced hillsides “were all aflame with the beauty of the flowering spring crops.” The boy’s “eyes had no rest the livelong day, so great was my fear lest anything should escape them. Wherever, at a turn of the road into a gorge, the great forest trees were found clustering closer, and from underneath their shade a waterfall trickling out, like a little daughter of the hermitage playing at the feet of hoary sages rapt in meditation, babbling its way over the black moss-covered rocks, there the jhampan bearers would put down their burden, and take a rest. Why, oh why, had we to leave such spots behind, cried my thirsting heart, why could we not stay on there for ever?”

    When he was sixteen, the young man was sent to England for a spell. He first lived in London, where he tried, unsuccessfully, to learn Latin from a tutor. He went for an excursion to the Devon countryside, and was charmed by what he saw. "I cannot tell you how happy I was," he wrote later, "with the hills there, the sea, the flower-covered meadows, the shade of the pine woods…" One day he walked down to the coast, where he found a "flat bit of overhanging rock reaching out as with a perpetual eagerness over the waters; rocked on the foam-flecked waves of the liquid blue in front, the sunny sky slept smilingly to its lullaby; behind, the shade of the pines lay spread like the slipped-off garment of some languorous wood-nymph." His sensitivity to the natural world was preternatural.

    Tagore had been traveling in Europe since he was a young boy, but it was in 1916, when he was in his early fifties, that he visited Japan for the first time. It was a long and leisurely journey by ship, from Calcutta to Rangoon, the capital of British Burma, and from there to the town of Penang in British Malaya, then on to Hong Kong, a British colony, and finally to his destination, the island nation of Japan, a country that had never been ruled by Europeans (which only added to Tagore's fascination with it). En route the poet kept a diary, where he recorded his impressions of the ever-changing human and natural world that he and his traveling companions encountered. 

    Tagore’s first impressions of the premier port of Malaya were altogether pleasant. 

    Our ship reached the port of Penang just as the sun was setting. Seeing the water and land clasp each other in a bond of love, I had a deep sense of the earth’s beauty. The earth, stretching both its arms, was embracing the sea. The faint rays of light that pierced the clouds and fell on the bluish mountains were like the thin vein of gold that covers the face of a bride without completely hiding her features. Water, land and sky played together a divine tune from the gates of heaven as the evening approached. 

    But as the great steamship he was on prepared to dock, Tagore’s mood grew darker. 

    As our ship slowly drew near the wharves, the full horror of the great effort of man to overcome nature became conspicuous: the machine was cutting with its sharp, angular claws into the soft curves of nature. What ugliness the enemies of man within him can create! On every beach, in every port, the greed of man is making grotesque faces at the sky — and thereby banishing itself from the kingdom of heaven.

    In 1927, a decade after his journey to Japan via various British-ruled ports, Tagore chose to visit the Dutch East Indies. This time the poet does not appear to have kept a detailed diary on board. Fortunately, we have a letter he wrote to his niece Pratima describing his first impressions of the little and predominantly Hindu island of Bali. 

    When we crossed over to Bali we saw the Earth in all the freshness of its eternal youth. The old centuries here have their ever new incarnation. The habitations of its people nestle in the lap of shaded woodlands, lulled in a limpid leisure — a leisure decorated with preparations for frequent festivity. In this secluded little island there are no railroads. The railway train is the vehicle of modernity. The modern age is miserly, and reluctant to make provision for any kind of surplus; time is money, says the modern man and, in order to avoid any waste of it, the panting locomotive perspires smokily as it thunders on from country to country. But in this island of Bali the modern time has spread itself over the past centuries and become one with them. It has no need to shorten time, for everything belongs here to all time, as much to the past as to the present. Just as its seasons flow along, opening out flowers of many a color, ripening fruits of many a flavor, so also do its people live on from generation to generation, sustaining the superfluity of their traditional ceremonials, rich in form and color, song and dance.

    Tagore’s evocation of the rural-ecological idyll of Bali continues:

    But if railways are not, there is the modern globe-trotter, and for him there are motor cars. What if this child of a constricted age has come into the land of unbounded leisure — he must all the same get through his sight-
    seeing and his enjoyment within the minimum time. For myself, as I was being whirled along by hills and woods and villages, raising clouds of dust, I felt all the while that this was above all the place where one should walk. There is not much of a loss if one’s eyes are raced over rows of buildings lining a street, but where, on either side of the road, feasts of beauty offer their regalement, this steed of emergency should be kept interned in its garage.

    In the 1920s, the motor car was widely admired as a sign of progress and human achievement, though from our own perspective a hundred years later one cannot but see it as having contributed rather substantially to global warming. Tagore was not prophetic enough to recognize this, of course. Yet it would be a mistake to portray him, as some of his contemporaries did, as an anti-modern Luddite. Rather, he was working his way towards a vision where technological innovation would serve humans and harmonize with nature, rather than dominate them both utterly. He appreciated Bali because its inhabitants did not seek to conquer time or space in the manner of the residents of New York or London. In this little island, the fields, the houses, the modes of transport, all reflected a way of life that sought to blend and merge culture with nature.

    Tagore prized technological innovations on the human scale, where man was a partner rather than a servant of the machine. This passage, written as his ship was entering Penang, is suggestive:

    In the harbor we saw many small boats. There are few things created by man as beautiful as these small sailing boats that skim over the surface of the water to the rhythm of the wind. Indeed, when men have to move in tune with nature their creations cannot be anything but beautiful. The boat has to make friends with the winds and the waves, and so it comes to partake of their beauty; whereas a machine, pretending to look down upon nature from the pinnacle of its power, only displays by this vanity its own ugliness. A steamship has many advantages over a sailing vessel, but the beauty has been lost.

    Decent English translations of Tagore’s early nature poetry are hard to come by. Among the exceptions is a sonnet by Tagore entitled “Sabhyatar Prati” (“To Civilization”) and originally published in 1896 in his collection Chaitali. The poem sharply contrasts the soulless, denaturalized, and concretized city of Calcutta with the verdant beauties of the rural landscape of eastern Bengal. As translated by the Bangladeshi scholar Fakrul Alam, the sonnet reads:

    Give back the wilderness; take back the city —
    Embrace if you will your steel, brick and stone walls
    O newfangled civilization! Cruel all-consuming one,
    Return all sylvan, secluded, shaded and sacred spots
    And traditions of innocence. Come back evenings
    When herds returned suffused in evening light,
    Serene hymns were sung, paddy accepted as alms
    And bark-clothes worn. Rapt in devotion,
    One meditated on eternal truths then single-mindedly.
    No more stone-hearted security or food fit for kings —
    We’d rather breathe freely and discourse openly!
    We’d rather get back the strength that we had,
    Burst through all barriers that hem us in and feel
    This boundless universe’s pulsating heartbeat.

    Tagore's most famous poem is the verse sequence Gitanjali, which won him the Nobel Prize in 1913. The previous year he was in London, where, at the home of the painter William Rothenstein, he read some of his early poems in his own English translations. In attendance was his friend C. F. Andrews, who later provided a report of the soirée for readers back in India. Andrews remarked: "At every verse the Bengal scenery — the Monsoon storm clouds, the surging seas, the pure white mountains, the flowers and fields, the lotus on the lake, the village children at play, the market throng, the pilgrim shrine — came before the eyes, molded into melodies of exquisite sweetness." And twenty years after the publication of Gitanjali, Tagore composed a volume of poems called Banabani (The Voice of the Forest), which approached trees and the forest in a mystical and religious spirit. The opening poem of the collection, "Vrikshavandana" ("A Prayer to the Tree"), reads, in translation:

    O Tree, you are the adi-prana [first or original breath], you were the first to hear the call of the sun and to liberate life from the prison-house of the rock. You represent the first awakening of consciousness. You brought to the earth beauty and peace. Before you the earth was speechless; you filled her breath with music.

    This was published when Tagore was in his seventies. His poems thus display a lifelong engagement with nature: with what plants, trees, birds, and animals, as well as land, sky, and water, meant to him and to the human world to which he belonged; and with the human responsibility not to damage nature's blessings and wonders.

    In 1914, Tagore published his first extended piece of writing in English. It provided an outline of his thinking on morality, aesthetics, and faith. The tract, called Sadhana, began by speaking of the role of forests in Indian culture. It “was in the forests that our civilization had its birth,” he declared, “and it took a distinct character from this origin and environment. It was surrounded by the vast life of nature, was fed and clothed by her, and had the closest and most constant intercourse with her varying aspects.” 

    “To realize this great harmony between man’s spirit and the spirit of the world,” continued Tagore, 

    was the endeavour of the forest-dwelling sages of ancient India. Even after the forests had given way to cultivated fields, and cities and kingdoms had emerged, Indians continued to look back with adoration upon the early ideal of strenuous self-realization, and the dignity of the simple life of the forest hermitage, and drew their best inspiration from the wisdom stored there.

    Tagore contrasted this attitude with that of the West, which "seems to take a pride in thinking that it is subduing nature; as if we are living in a hostile world where we have to wrest everything we want from an unwilling and alien arrangement of things." Tagore argued that "in the west the prevalent feeling is that nature belonged exclusively to inanimate things and to beasts, that there is a sudden unaccountable break" between humans and nature. He firmly rejected such a view. "The Indian mind," he claimed, "never has any hesitation in acknowledging its kinship with nature, its unbroken relation with all."

    In another essay, published five years later, Tagore called the forests “the one great inheritance” of India and Indians. He offered an intriguing contrast between how forests shaped Indian history and how the sea had shaped the history of northern Europe. “In the sea,” he wrote, “Nature presented itself to these [European] men in her aspect of a danger, of a barrier, which seemed to be at constant war with the land and its children. The sea was the challenge of untamed Nature to the indomitable human soul. And man did not flinch; he fought and won…” Tagore contrasted the European conquest of the sea with the level tracts of peninsular India, where “men found no barrier between their lives and the Grand Life that permeates the Universe. The forest gave them shelter and shade, fruit and flower, fodder and fuel; it entered into a close living relation with their work and leisure and necessity, and in this way made it easy for them to know their own lives as associated with the larger life.” 

    This essay of 1919 was called "The Message of the Forest," a title later echoed in his volume of poems Banabani. Three years later Tagore published a sequel, which he titled "The Religion of the Forest." Here he spoke of how in ancient India, "the forest entered into a close living relationship" with the work and leisure of humans. They did not therefore think of their natural surroundings as "separate or inimical." So "the view of truth, which these men found, did not manifest the difference, but the unity of all things." 

    This view of a primordial attachment of Indians to forests was perhaps somewhat rose-tinted. The texts and the scriptures of ancient India by no means speak in one voice on this matter. While the Upanishads do talk of the unity of all creation, and Sanskrit drama does contain moving evocations of nature, one must not overlook the episode in the Mahabharata where the burning of the Khandava forests and the killing of its animals is celebrated as proof of the advance of civilization, the necessary and even mandatory conquest of primitive hunters and gatherers by a sophisticated agrarian civilization. And surely Tagore saw that many, perhaps most, Indians of his day treated forests in severely utilitarian terms, as a source of raw materials rather than of pleasure or spiritual upliftment. Perhaps the writer wanted to believe that his own love for nature, and for forests in particular, was not an idiosyncratic individual taste but a deep and enduring civilizational inheritance.

    Tagore was a builder of institutions. Most significantly, he founded Santiniketan, which began as a school for boys in 1901 and grew into a full-fledged university, and Sriniketan, an accompanying experiment in the renewal of village life, which was started in the 1920s. Both were based in rural Bengal, in what is now the district of Birbhum, about three hours by train from Calcutta. They were located originally on a property owned by the Tagore family, with more lands being acquired over the years as the institutions grew in size and numbers.

    Krishna Kripalani, a scholar who worked closely with Tagore, summarized his educational ideals in terms of ten maxims, of which the first is: “The child should be brought up in such environments as would provide him with opportunities of direct and close contact with Nature. Civilized existence in society imposes, in any case, such severe restraints on the first, fresh and vital impulses of life that human nature tends to be perverted unless its impulses are renewed and revitalized with constant reference to Nature.” Other maxims include learning through the mother tongue, an equal emphasis on individual initiative and group action, and an appreciation of cultural heritage. Nature makes a reappearance in the sixth maxim, which Kripalani glosses as: “When the child’s senses have been trained to a proper awareness of his surroundings and he has learnt to observe and love Nature, his experiences should then be made intelligible to him, at a later stage, in terms of scientific categories.”

    In July 1927, by which time the school he started, Patha Bhavan, had been in existence for a quarter of a century, Tagore found himself speaking to the Indian Association in Singapore, to an audience of parents whose own children were educated in a resolutely metropolitan environment. He told them of his own school, where “boys are taught amidst natural surroundings. They grow up in the midst of the sights and sounds of Nature, among trees, birds, in the open air. This school seeks to enable my boys to realize their bond of unity with Nature.” In another speech in Singapore, to a gathering of children and teachers in the city’s Victoria Theatre, Tagore expanded on his method of learning in and with nature. He told his audience about how and where the children in his school had their lessons:

    We have a mango grove. It is full of shade, and in summer, full of the beautiful perfume of the mango blossoms and there are innumerable birds and moths and all kinds of insects living on them. This you may think might distract their attention. But that is not so. I allow them sometimes to have their lessons and to look more closely at some of the things which attract their eyes. Very often they call my attention to some strange birds that have come and perched on the bough — “Sir look at the bird? What bird is that?” — right in the middle of their lesson. And then I talk to them about that bird… They should observe that bird. It would have been wrong were their minds absolutely dull to these impressions, and I would much rather be interrupted in my lessons than force them to keep their minds only on what has been placed before them. Often, again, they would speak to me of their admiration for something unusual — such as an especially fine bunch of mango leaves. I find that helps them, and that this constant movement of their mind is necessary for them. It is the method which nature has adopted in her own school for the young.

    In 1921, Santiniketan (the Abode of Peace) became home to a new and more advanced educational experiment, a university which carried the name Visva-Bharati, indicating its ambition to bring the world (visva) to India (bharat), as well as to take India to the world. The university went on to have departments dedicated to the study of Japan and China, to the study of classical and contemporary Indian languages, as well as a celebrated art school.

    The land acquired for the university’s construction was dry and bare. To make the place more appealing to the eye as well as more conducive to the sort of learning he desired to impart, Tagore inaugurated in 1928 what was to become an annual festival. The Briksharopan (tree-planting) ceremony was held in July, shortly after the onset of the monsoon. In a play staged on the occasion, the five basic elements of nature — earth, water, sunlight, air and sky — were represented by five students playing these roles. Saplings of carefully chosen (and mostly indigenous) species were planted by boys and girls with loving care, the ceremony accompanied by music and poetry. Over the decades, these saplings, now full-grown, helped transform a barren landscape into one dotted with trees and groves.

    In a lecture to the Santiniketan community, Tagore explained his idea behind Briksharopan:

    Man’s greed grew as he received Mother Earth’s bounty. … Men cut down trees to meet their endless needs and stripped the Earth of shade. As a result, the air became increasingly hotter, while the fertility of the soil increasingly diminished. That is how northern India, deprived of its shelter of forests, now lies scorched by the harsh rays of the sun. With all this in our minds, we initiated a tree-planting ceremony to teach the children to replenish the plundered stores of Mother Earth.

    The tree-planting ceremony was one of several festivals – Spring Festival, Welcoming the Monsoon, Autumn Festival, Ploughing Ceremony, Harvest Festival — begun in Santiniketan by Tagore, with a view to nurturing among students an affectionate and caring relationship with nature, so that they could seek to harmonize their own lives with its rhythms and variations. 

    Tagore took great care in choosing the shrubs and trees that surrounded his homes in Santiniketan, making sure that there were flowering plants throughout the year. On the campus as a whole, there were groves dedicated to specific species: one for the stately sal trees; another for the trees bearing the most delicious of all fruits, the mango; and so on. When he was away from Santiniketan, Tagore’s letters home often asked about the plants and trees he had left behind or hoped would flourish in his absence. In the summer of 1933, he wrote to his daughter Mira: “Ask them [the staff] to plant neem, shirish, and other trees on the street that leads to my room this monsoon. It’s not a bad idea to plant a few jackfruit trees either.” 

    In these efforts to plant up Santiniketan with trees and flowers, Tagore was surely inspired by the verdant landscape of eastern Bengal in which he had spent so much time in his youth. With an arid, sandy soil, and with far less water available, the place where the university was located could never remotely parallel the natural beauty of the Padma river and its surroundings, but it could still be made green and pleasant and welcomingly habitable. And so, under the poet’s guidance and instruction, it became.

    Tagore grew up in the city, but became increasingly disenchanted with urban lifestyles. As the writer Aseem Shrivastava observes, Tagore believed that “the ecological alienation of metropolitan life profoundly cripples our sensibility, leaving humanity in a self-destructive state of spiritual destitution.” The poet was thus encouraged to locate his educational experiment, Santiniketan, deep in the countryside rather than anywhere near the city of Calcutta. For Tagore, “open skies, planted fields, and swaying palms [were] more essential to untrammeled learning and the formation of the mind than the hectic cultural exchanges a modern metropolis affords (and a village denies).”

    A century before Tagore, English writers had responded in a similar fashion to the radical alterations in the natural landscape that the expansion of cities such as London represented. In his classic book on the subject, Raymond Williams explains why, for poets in particular, the country conveyed a more appealing ecological aesthetic than the city: “The means of agricultural production – the fields, the woods, the growing crops, the animals — are attractive to the observer and in many ways and in the good seasons, to the men working in and among them. They can then be effectively contrasted with the exchanges and counting-houses of mercantilism, or with the mines, quarries, mills and manufactories of industrial production.” 

    Like his English forebears, Tagore saw modern cities as being parasitic on the natural resources of the countryside. At the same time, he was not unduly romantic about the village life that he had witnessed at first hand. His family owned large tracts of agricultural land in eastern Bengal. He and his brothers were sent in turn to manage them. Rabindranath was assigned this responsibility in the early 1890s, by which time he was an established poet, admired and much feted in Calcutta. In a lecture given many years later, Tagore wrote of how in this first extended experience of the countryside, “gradually the sorrow and poverty of the villagers became clear to me, and I began to grow restless to do something about it. It seemed to me a very shameful thing that I should spend my days as a landlord, concerned only with money-making and engrossed with my own profit and loss.” Over the next decade, as Tagore spent more time in his estates, these feelings of guilt intensified. He wished to ameliorate the poverty of the peasants through constructive social work. In 1906, he sent his son, his son-in-law, and a friend’s son to the University of Illinois at Urbana-Champaign to study modern methods of agriculture and dairying, with a view to implementing them in India.

    Tagore’s philosophy was anti-industrial but not anti-modern. He wished to renew village life with the principles and techniques of modern science. That is why he sent his son to study agricultural technology in Illinois. But the son did not prove entirely worthy of his father, so Tagore went looking for someone else who could scientifically supervise programmes of rural uplift in the villages around Santiniketan. He found him in the person of an idealistic young Englishman named Leonard Elmhirst, whom he met in New York in 1920.

    Born in 1893, the son of a Yorkshire curate, Elmhirst had studied history at Cambridge before enlisting in the Army during the First World War. He fell sick in Mesopotamia, and came to India to recuperate. There he became interested in agriculture, through meeting the British missionary Sam Higginbottom, who ran an experimental farm outside the northern Indian city of Allahabad. This encouraged Elmhirst to go to Cornell University in upstate New York to study agricultural science. In November 1920, when Tagore was in New York, he heard of the young Englishman and arranged to meet him. This is how, years later, Elmhirst recalled Tagore’s words to him at that meeting:

    I have started an educational enterprise in India which is almost wholly academic. It is situated well out in the countryside of West Bengal at Santiniketan. We are surrounded by villages, Hindu, Muslim, Santali. Except that we employ a number of these village folk for various menial tasks in my school, we have no intimate contact with them at all outside their own communities. For some reason these villages appear to be in a state of steady decline. … Some years ago I bought from the Sinha family a farm just outside the village of Surul, a little over a mile from my school. I hear that you might be interested in going to live and work on such a farm in order to find out more clearly the causes of this decay.

    Elmhirst came out to Santiniketan in November 1921, a year after meeting Tagore in New York. The experiment in Surul originally was called the Institute for Rural Reconstruction, before Tagore came up with the crisper and more elegant “Sriniketan.” Tagore asked Elmhirst to find better methods for villagers to grow their crops and vegetables, to help them gain access to credit and get a fair price for their produce. He also hoped to augment their farm income with cottage industries such as rice milling and umbrella making.

    In Sriniketan, Elmhirst began taking Bengali lessons. In January 1922, Tagore told him that ten Santiniketan students had come to him and were keen to do work in villages after graduating. Since they knew both Bengali and English, they could assist the foreign-born expert in his activities. Tagore now instructed Elmhirst: 

    Stop your Bengali lessons. If you learn too much Bengali yourself you’ll want to go on your own to the village to ask questions. You will then make the great mistake of trying to become indispensable to this enterprise like any foreign missionary. I want you never to go alone to any village but always to take with you either a student or a member of your staff to act as interpreter. Only in this way will they learn what kind of questions you ask and just how the farmers and villagers frame their answers. These answers they will then have to interpret back to you. In this way they will never forget the experience.

    After Elmhirst had been in Sriniketan for a couple of years, Tagore told him that it was time to move on, so as “to give the Indian staff of the young Institute a chance to find their own feet.” Elmhirst traveled with Tagore to China and Japan in 1924, and from there to Latin America. The following year, with the poet’s blessings, Elmhirst married the American heiress Dorothy Straight, and the couple now set up home in rural England in a medieval manor called Dartington Hall, which they refurbished and made the center of an experimental farm. Straight helped to support Sriniketan financially, while her husband remained in close touch with Tagore.

    Tagore was a keen observer of the natural world, yet what we might call his “nature aesthetic” was merely one element of a wider ecological consciousness. He was sharply critical of the environmental devastation caused by unbridled industrialization. When the poet was growing up, the Hooghly river had homes and farms all along it, but then the banks became dotted with factories. Writing in 1916, he observed that he was fortunate in having been born before the iron flood of ugliness began clinging to the river banks near Calcutta. 

    At that time the embankments of the Ganges, like the arms of the villages on the banks, embraced their people and kept them close to their bosom. In the evenings people would go for boat-rides on the river. The current of the people’s hearts and the flow of the river — there was no hard and ugly demarcation between them. The beauty of the Bengali countryside could be seen even in the immediate vicinity of Calcutta. As commercial civilization began to spread, however, the beauty of the countryside was slowly and steadily obscured, to the point where now Calcutta has segregated all of Bengal from the lands surrounding it. The vernal beauty of the country has succumbed to the hideous form of Time, showing its iron teeth, belching smoke and fire.

    The following year, in a lecture in America, Tagore warned thus against the dehumanizing and destructive cult of the machine: “Take man from his natural surroundings, from the fulness of his communal life, with all its living associations of beauty and love and social obligations, and you will be able to turn him into so many fragments of a machine for the production of wealth on a gigantic scale. Turn a tree into a log and it will burn for you, but it will never bear living flowers and fruit.”

    During the course of Tagore’s life, the city of his birth became a bustling industrial powerhouse. In the second half of the nineteenth century, jute mills proliferated in and around Calcutta, processing the raw fiber grown in eastern Bengal into packing material sent all around the world. The first jute mill was established on the Hooghly in 1855, six years before Tagore was born. In 1869, when he was a little boy, there were a mere five mills with nine hundred and fifty looms operating. By 1910, when Tagore was approaching the age of fifty, there were a staggering thirty thousand looms in operation, exporting more than a billion yards of cloth. The waterways around Tagore’s native city, whose banks once featured little hamlets and fishing boats, were now lined with the large chimneys of the ever-proliferating jute factories, emitting tons of smoke. This transformation repelled him. 

    In 1880, when he was in his late teens, Tagore went for a boat ride up the Hooghly, from Calcutta to the French enclave of Chandannagar, where his brother Jyotindra had a riverside home. Tagore had just returned from a spell in England, and this re-immersion in the Bengal countryside was for him a joyous experience. Of the boat journey and the stay on the river’s banks in Chandannagar, he wrote:

    The Ganges again! Again those ineffable days and nights, languid with joy, sad with longing, attuned to the plaintive babbling of the river along the cool shade of the wooded banks. This Bengal sky full of light, this south breeze, this flow of the river, this right royal laziness, this broad leisure stretching from horizon to horizon and from green earth to blue sky, all these were to me as food and drink to the hungry and thirsty. Here it felt indeed like home, and in these I recognised the ministrations of a Mother.

    That was back in 1880. By the time Tagore came to pen his reminiscences thirty years later, much had changed. 

    That was not so very long ago, and yet time has wrought many changes. Our little river-side nests, clustering under their surrounding greenery, have been replaced by mills which now, dragon-like, everywhere rear their hissing heads, belching forth black smoke. In the midday glare of modern life even our hours of mental siesta have been narrowed down to the lowest limit, and hydra-headed unrest has invaded every department of life. Maybe this is for the better, but I, for one, cannot account it wholly to the good.

    Such passages inevitably recall the great British poets of the eighteenth and nineteenth centuries, who were likewise repelled by the outrage done to nature by the expansion of cities and factories, and who wrote so movingly (and despairingly) about it. Blake (“dark Satanic mills”), Wordsworth, and John Clare come immediately to mind, though perhaps most akin to Tagore’s thinking was William Morris, who, while by no means in the same league as the other three as a poet, had, like Tagore, many interests in life outside his poetry, being an activist and a builder of institutions. Consider this passage from Morris’s long narrative poem “The Earthly Paradise,” from 1868–1870, which begins by asking the reader to —

    Forget six counties overhung with smoke,
    Forget the snorting steam and piston stroke,
    Forget the spreading of the hideous town;
    Think rather of the pack-horse on the down,
    And dream of London, small, and white, and clean,
    The clear Thames bordered by its garden green…

    Morris wished for a harmonious relationship between the city and the countryside, and between humanity and nature. And so did Tagore. The parallels between these writers arise from a shared experience, continents apart: both lived through a similar historical process — the radical transformation of landscapes and social relations that modern industrialization brought with it. 

    Unlike those British poets, whose thoughts and experiences were confined to their island nation or at most to a few culturally (and ecologically) akin countries of the Continent, Tagore had a global vision. He had travelled all over the world, encountering many different landscapes, cultures, religions, and ways of life other than those of his native Bengal. As an Indian living under British rule, moreover, he had an understanding of what Britain had wrought in its colonies, something denied to those (otherwise so gifted and acutely sensitive writers) who lived in Britain itself.

    This wider understanding of the modern world is strikingly manifest in some passages of Tagore’s famous tract, Nationalism, from 1917, which includes a profound ecological message that has escaped most commentators. Since the book was originally written in English, it has been far more widely read than his poems, plays, and stories, which first appeared in Bengali. Its readers have focused on its warnings against xenophobia and nationalist hubris while ignoring its powerful environmentalist critique of industrialism and imperialism. Here, for example, is the poet-turned-prophet analyzing the environmental consequences of European imperialism, while speaking of the devastation caused by the rampant greed and new technologies of the new industrial age:

    The political civilization which has sprung up from the soil of Europe [and] is overrunning the whole world, like some prolific weed, is based on exclusiveness. It is always watchful to keep at bay the aliens or to exterminate them. It is carnivorous and cannibalistic in its tendencies, it feeds upon the resources of other peoples and tries to swallow their whole future. It is always afraid of other races achieving eminence, naming it as a peril, and tries to thwart all symptoms of greatness outside its own boundaries, forcing down races of men who are weaker, to be eternally fixed in their weakness. Before this political civilisation came to its power and opened its hungry jaws wide enough to gulp down great continents of the earth, we had wars, pillages, changes of monarchy and consequent miseries, but never such a sight of fearful and hopeless voracity, such wholesale feeding of nation upon nation, such huge machines for turning great portions of the earth into mince-meat, never such terrible jealousies with all their ugly teeth and claws ready for tearing into each other’s vitals.

    Those words were spoken at a public event in Japan in 1916. That Asian nation was far more advanced, economically and industrially, than Tagore’s native India, yet he nonetheless hoped that Japan would restrain itself from going all the way down the route mapped by Europe. He reminded his hosts that too eagerly embracing the urban-industrial way of life would be a denial, even a repudiation, of their own culture, of “the spiritual bond of love she [Japan] has established with the hills of her country, with the sea and the streams, with the forests in all their flowery moods and varied physiognomy of branches…” Tagore urged Japan to offer the world a vision of humanity’s relations with nature rather different from that being envisioned and put into practice in modern Europe. The visiting poet reminded the Japanese that “the ideal of maitri [friendship] is at the bottom of your culture — maitri with men and maitri with Nature.” That ideal had to be renewed and reaffirmed, even if it might seem like “an anachronism, when the sound that drowns all voices is the noise of the market-place.” He defiantly stated his own belief “that the sky and the earth and the lyrics of the dawn and the dayfall are with the poets and the idealists, and not with the marketmen robustly contemptuous of all sentiment — that, after all the forgetfulness of his divinity, man will remember again that heaven is always in touch with his world, which can never be abandoned for good to the hounding wolves of the modern era, scenting human blood and howling to the skies.”

    Six years later, in 1922, Leonard Elmhirst, the newly appointed Director of the Institute of Rural Reconstruction in Sriniketan, gave a lecture on the renewal of village life. Elmhirst’s talk was prefaced by some introductory remarks by his mentor and employer, Rabindranath Tagore. Tagore offered a parable of environmental destruction, imagining that on the moon a new race of beings was born, “that began greedily to devour its own surroundings.” 

    Through machinery of tremendous power this race made such an addition to their natural capacity that their career of plunder entirely outstripped nature’s power for recuperation. Their profit makers dug big holes in the stored capital of the planet. They created wants which were unnatural and provision for these wants was forcibly extracted from nature. When they had reduced the limited store of material in their immediate surroundings they proceeded to wage furious wars among their different sections, each wanting his own special allotment of the lion’s share. In their scramble for the right of self-indulgence they laughed at moral law and took it as a sign of superiority to be ruthless in the satisfaction each of his own desire. They exhausted the water, cut down the trees, reduced the surface of the planet to a desert, riddled with enormous pits, and made its interior a rifled pocket, emptied of its valuables.

    This parable of what might happen on the moon resonated with what Tagore was witnessing on earth in his own time, where the age of industrialism and colonialism had led to an unprecedented assault on the earth and its resources. This imaginary race of rapacious beings on the moon, he continued, “behaved exactly in the way human beings of today are behaving upon this earth, fast exhausting their store of sustenance, not because they must begin their normal life, but because they wish to live at a pitch of monstrous excess. Mother Earth had enough for the healthy appetite of her children and something extra for rare cases of abnormality. But she has not nearly sufficient for the sudden growth of a whole world of spoiled and pampered children.”

    I have chosen to cite Tagore’s own words at length not only to give the reader an experience of their prodigious beauty, but also to establish the depth and the prescience of his environmentalist thinking. He grasped in all its enormity the devastating environmental consequences of industrialism and imperialism, and anticipated by many decades the now influential idea of the ecological footprint — the impact of the vast and unsustainable demands that the production and consumption patterns of a particular nation or social class make upon the earth. Even though the term “environmentalist” had not acquired its present meaning in his lifetime, Tagore was indeed a pioneering environmentalist. In his lecture at Sriniketan, Tagore offered in passing an aphorism that can serve, a century and more later, as a maxim of environmental responsibility for our times. “When our wants are moderate, the rations we each claim do not exhaust the common store of nature and the pace of their restoration does not fall hopelessly behind that of our consumption.” 

    Blackbirds

    “She is brown,”

    I said to you, 

    less in annoyance

    than wonder 

    when she flew 

    past us with a certain flamboyance

    not over but under

    our gate

    to settle down

    into the tree beside her mate.

    “But he is black,”

    you replied,

    “and the name is his.”

    “As it always is,”

    I poked.

    “I was your bride

    and took your name,

    yet we are not the same.”

    You’d have joked

    back

    but couldn’t deny it.

    We grew quiet

    when we heard the blackbirds

    sharing words

    between them.

    Whose song 

    it was we would 

    never know, not having seen them

    sing. But it would be wrong

    to say, even if we could.

    Needlefish

    In that instant, 

    dear daughter,

    when they flashed

    like cupid’s arrow

    through the current 

    of saltwater

    where you splashed,

    more narrow

    and more terse

    than any gleam,

    I thought I felt 

    within my gut

    love’s old curse

    entering the dream —

    that through no fault

    of yours but

    beauty, fresh 

    as it is fierce, 

    you should become

    as bait

    to any fish

    whose point would pierce,

    as if from

    nowhere, while I wait.

    Ladybirds

    A ladybird, or ladybug (call it 

    what you will) has crept 

    onto my pillowcase —

    this one so small it 

    can hardly be seen. Except 

    I do see it; it is marking the place 

    where I slept

    like a bloodstain. 

    You shrug, tell me it’s good luck, 

    give our duvet a perfunctory sweep.

    But I cannot possibly sleep

    here: on the windowpane

    a new brood crawls and keeps

    watch — we are sitting ducks.

    It happens almost every night:

    cloaked in red,

    one marauder, or two, takes flight,

    infiltrates our bed

    with dishevelled wings — something thin and black 

    always trailing sideways from its back,

    which it eventually pulls tight

    as if tucking its own covers 

    1. Semi-annual, this infestation

    nevertheless surprises us, like the changing of the clocks.

    The first few we discover

    have the charm of snowflakes

    (no two the same!), but soon a whole nation

    of scarlet flocks

    to the house: a British 

    invasion. Once adored

    but now persona-non-grata:

    isn’t it always the way? Two on the headboard

    are making me skittish —

    one in bloodred, poured

    with black, the other flecked terracotta.

    La Farfalla / The Moth

    after Petrarch 

    In August, out on the veranda, it

    is not uncommon for a moth to fly

    into the light and singe its wings to dust.

    The lantern is so beautiful — it must.

    I used to watch them burn and wonder why,

    before I came to understand the bit

    about desire, how there’s no gentle landing,

    not when it comes to fire. Your eyes demanding

    mine, I’d fly into them every time

    despite my certain harm and your regret.

    You look away, and still, I seek them out.

    Hellbent on serving my affliction, I’m 

    less bothered by my own pain than your doubt.

    Although I burn, I haven’t perished yet.

    The Saddest Poem

    When I was eleven years old, reading Robert K. Massie’s Nicholas and Alexandra, I encountered the longest word I had ever seen: counterrevolutionaries. It muscled out most other words in the line, squatting nigh unpronounceably, like an undigestible bolus or sedimentary rock, all prefix and suffix. Even though it had no sensuality or visual aspect to it at all — its physicality is coded into the etymology of volvere, to roll — it repudiated its own abstraction by taking up space, reflexively enacting a trochaic reversal of our customary iambics, and vibrating with the engine of its r’s. I was eleven and I was fascinated. 

    It strikes me as odd, this random memory of a single word, but children have a primal feeling for language that manifests in their perennial games — tongue-twisters, nursery rhymes, punning jokes — and in their penchant for naming. When he was nine or ten, my son told me that his favorite word was overwhelming. Then he decided it was lawless. It is a truism that kids pick up foreign languages more easily, and that adolescents drive new slang — that language lava. It is hardly unknown for adults to revel in wordplay, to create diminutives and invent limericks, but it isn’t endemic to American English. Not right now, at least. We are more likely to proscribe words, or swap them out for endless euphemism. (Some petty functionary, somewhere, really thinks “unhoused” is a meaningful improvement on “homeless.”) English is the lingua franca of commerce, with its bureaucratese and finance jargon: also not a good sign. And this impoverishment at the general level extends to the literary culture, where too often decisions about what kinds of writing get rewarded are determined by the grownups’ ideological commitment to plainness as a democratic value. Restriction and instrumentalism reign, not freedom and play. 

    I remember the wealth of adverbs that I absorbed from children’s books, how each had a different emotional shade as vivid as a box of Crayolas. “‘Okay,’ she said ruefully.” Or wistfully. Or sardonically. Or wryly. One could be convivial,
    or congenial, or amiable, or amicable: these radiated subtle differences. Joy, mirth, bliss. Desuetude, ennui, acedia. Looking back, it seemed I amassed these words like a toy army to deploy in those little booklets I made, folding drawing paper in half and stapling them to shelter my stories and drawings (inseparable concepts at that age, as Alice in Wonderland attests). Words were physical; they created physical sensations mimetically. What they provided, also, was an education in emotional nuance. 

    I told my students that we were going to read the saddest poem in twentieth-century American poetry. I thought it would perk them up, pique their curiosity, or even put them on their guard. Sadness, especially “sad girl” sadness, is a staple of pop culture, a commodity. People go looking for extraneous sadness in songs and in poetry. Why? Why break your own heart on a normal day? Is it a release from boredom and numbness? Is there a secret pleasure in private pain? Of course I have done this too, but I recognize its perversion. 

    Yet even as I told my students that this is for me the saddest American poem of the twentieth century — leaning on the hyperbole — I knew that when they finally read the poem they would be baffled. And indeed one student did show up at my office hours to confess that this one defeated her. The poem was Amy Clampitt’s “The Kingfisher.” If you polled the MFAs of America, this would probably not be the most obvious contender for the title. 

    THE KINGFISHER

    In a year the nightingales were said to be so loud
    they drowned out slumber, and peafowl strolled screaming
    beside the ruined nunnery, through the long evening
    of a dazzled pub crawl, the halcyon color, portholed
    by those eye-spots’ stunning tapestry, unsettled
    the pastoral nightfall with amazements opening.

    Months later, intermission in a pub on Fifty-fifth Street
    found one of them still breathless, the other quizzical,
    acting the philistine, puncturing Stravinsky — “Tell
    me, what was that racket in the orchestra about?”—
    hauling down the Firebird, harum-scarum, like a kite,
    a burnished, breathing wreck that didn’t hurt at all.

    Among the Bronx Zoo’s exiled jungle fowl, they heard
    through headphones of a separating panic, the bellbird
    reiterate its single chong, a scream nobody answered.
    When he mourned, “The poetry is gone,” she quailed,
    seeing how his hands shook, sobered into feeling old.
    By midnight, yet another fifth would have been killed.

    A Sunday morning, the November of their cataclysm
    (Dylan Thomas brought in in extremis to St. Vincent’s,
    that same week, a symptomatic datum) found them
    wandering a downtown churchyard. Among its headstones,
    while from unruined choirs the noise of Christendom
    poured over Wall Street, a benison in vestments,

    a late thrush paused, in transit from some grizzled
    spruce bog to the humid equatorial fireside: berry-eyed,
    bark-brown above, with dark hints of trauma
    in the stigmata of its underparts — or so, too bruised
    just then to have invented anything so fancy,
    later, re-embroidering a retrospect, she had supposed.

    In gray England, years of muted recrimination (then
    dead silence) later, she could not have said how many
    spoiled takeoffs, how many entanglements gone sodden,
    how many gaudy evenings made frantic by just one
    insomniac nightingale, how many liaisons gone down
    screaming in a stroll beside the ruined nunnery;

    a kingfisher’s burnished plunge, the color
    of felicity afire, came glancing like an arrow
    through landscapes of untended memory: ardor
    illuminating with its terrifying currency
    now no mere glimpse, no porthole vista
    but, down on down, the uninhabitable sorrow.

    When Willard Spiegelman’s biography of Amy Clampitt, Nothing Stays Put, came out last year, the reception was muted. No review in the New York Times; a mere mention in the New Yorker’s “Briefly Reviewed,” despite Clampitt having been one of the magazine’s stars in the 1980s. (Howard Moss was the first to publish her.) But even when she was at the height of her fame, she was out of step. (Poetry seems perpetually on the outs with the chattering classes.) There are fascinating parallels to be drawn between Amy Clampitt and her near-contemporary, the novelist Shirley Hazzard, whose biography by Brigitte Olubas, published within a year of Clampitt’s, has received much more attention. Both women were passionate devotees of literary tradition; both practiced the art of discretion in an age of confessionalism; both were outsiders (Clampitt from Iowa, Hazzard from Australia) who gravitated to New York for a life of culture. Hazzard’s brocaded romances may appeal more broadly than Clampitt’s ornamented poems, but readers of the one really ought to be readers of the other. Hazzard’s prose is infused with poetry; Clampitt’s poems are condensed novels. And so it is with “The Kingfisher.” That’s what I told my class, who vaguely intuited that the poem told the story of a love affair, though the term “narrative poem” seemed to elude them. They were bamboozled by Clampitt’s trick of turning background into foreground and vice versa. 

    I told my students — encouragingly, I hoped — that I read way above my level when I was their age. What did I understand, then, of Proust or Tolstoy, not to mention Donne or Hart Crane? It had seemed only natural at the time: I had no concept of “relatability,” and anyway I would have shunned anything “relatable” to my mediocre suburban milieu. So it irked me when Spiegelman couched his first reading of “The Kingfisher” in the American Lunkhead register: “Poetry readers — I remember, because I was one of them — scratched their heads and asked, ‘Where did this come from?’” Considering that this was the century of “The Waste Land,” and the decade in which Seamus Heaney, Derek Walcott, and Joseph Brodsky became international stars, Spiegelman’s “poetry readers” sound awfully disingenuous. 

    When I asked my class to describe the poem to me, they got it roughly right: not hexameter, but a six-beat line; accentual, not accentual-syllabic. Six lines per stanza; seven stanzas in all. Lovers at sixes and sevens. Not all of them saw, at first, that the stanzas rhymed; I enjoyed the epiphanic expressions on their faces as I pointed out what now looked obvious. They were slant rhymes, to be sure, and not in a regular pattern; but within each stanza each end-word was paired. “Isn’t that a telling choice for a doomed love affair?” I asked them. “Rhymes, but not full and not regular? This is an imperfect coupling.”

    Alerted to the possibilities of one-sentence poems in a couple of Clampitt’s shorter works, they noticed that at first each stanza is one long perfect sentence ending in a period. The etymology of “stanza,” I reminded them, is “room.” These are rooms for lovers. Then, halfway through, the sentence-lengths start varying (less breathless) and sentences straddle the stanzas like “communicating” rooms: rooms that open into other rooms. 

    So much for the architecture of the poem, with its twisting, turning sentences inside the scaffolding of seven sestets. One student had called it a “rigid structure,” but by the end of our analysis she could see that regular stanzas alone do not a rigidity make. I could have pointed out the variety in the caesuras that pace each line — and the few lines that fly to their enjambments without a caesura at all — but I didn’t want to get too bogged down in technical details. I wanted to know if they had done their research on kingfishers. 

    In the eleventh book of Ovid’s Metamorphoses, the tale of Ceyx and Alcyone is watery; its elements are wind and water, dreams and tears. Here are, respectively, the happy king and queen of Trachin, in Thessaly. After a series of troubling events — including the metamorphosis of Ceyx’s brother into a hawk from grief over his own daughter’s demise — Ceyx decides to sail to an oracle in Ionia. The implication is that he is taking action against his own melancholia. 

    Alcyone is shaken: her father is Aeolus, king of the winds, and she is terrified of what the sea and wind can wreak. She begs to go with him, but he lovingly refuses. “Lifting up her watrey eyes” (in Arthur Golding’s 1567 translation), she watches him depart, riveted to the spot, until she can no longer see his ship. Ceyx does indeed perish, in Ovid’s lengthy description of a maelstrom. Knowing nothing of this, Alcyone heaps offerings on Juno’s altar until the goddess commands Iris, the messenger goddess, to go to the Cimmerian caves, where the god Sleep resides, and tell him to send a dream giving Alcyone the bad news. 

    Mad with grief, Alcyone goes to the selfsame spot where the couple said their last goodbyes, and there she sees a corpse floating towards her — her husband’s. Running now along the jetty toward it, she — “a wonder sure” — sprouts wings and starts to fly. When a wave bears up Ceyx’s corpse to receive Alcyone’s beaky kisses, the gods take pity and metamorphose him as well. They live happily ever after in this form, as pelagic birds, making their nest on the water. Aeolus withholds his winds every year for seven days to assist their brood; these are called the “halcyon days,” after Alcyone. These birds we call kingfishers. 

    Wind and water: water figured in the sea and in tears, wind figured in maelstrom and passion, but also dreams and phantasms. Alcyone has a premonition of the fate that will befall her husband; in her speech to him, she foretells vividly what the elements may do, and in the next few pages Ovid echoes her with his description of the real thing. When she pesters the goddess of marriage with her prayers and offerings, Juno sets in motion a chain of mediations between heaven and earth: Iris (the rainbow) goes to the cave of Sleep, which is thronging with fantastical eidolons — passim varias imitantia formas. Sleep then dispatches one of his sons, Morpheus, a feigner of shapes, to invade Alcyone’s bedroom in the form of her husband, so that news of his death seems to come from his own lips. She “did stirre her armes, and thrust them forth his body to embrace,” but the eidolon eludes her, and she awakens with awful recognition. 

    This relay of spirits and phantoms is not as unlikely as it seems — what are words if not airs, what is a message but signals in the ether, what is lyric inspiration but a breeze? Bodies reduced to love, even in death, partake of this insubstantiality, thus Alcyone’s lament: nunc absens perii, iactor quoque fluctibus absens,/et sine me me pontus habet. Stanley Lombardo translates it arrestingly: “But now I have died far away from myself;/Far from myself I am tossed on the waves,/And the sea holds me without my being there.” To my ears, this echoes at least two other moments in Ovid’s Metamorphoses where selves are cleaved: Marsyas crying, as he is flayed, Quid me mihi detrahis? “Why do you separate me from myself?,” and Tiresias telling Narcissus’s mother Liriope that her son will only live long si se non noverit, “if he doesn’t know himself.” Alcyone’s lost self is most heartrending because it is anchored to a beloved, and the beloved is not-her. The beloved is also an absence with a difference: he is not only dead but lost at sea; he can only ever have an “empty tomb,” as she herself says in her speech to him. The self only seems insubstantial until sorrow gives it weight. 

    The meandering way that Alcyone is told of her husband’s death mimics Ovid’s own telling of the myth: it comes to us as a story within a story within a story, as Book Eleven meanders between Orpheus’s death — he is dismembered by the Maenads and his head is sent downriver to the sea, singing — and the metamorphosis of Priam’s son Aesacus, bringing us to the brink of the Trojan War. Aesacus also turns into a seabird, also for love, but not happily; his is just a brief coda to the kingfisher myth. But before that, Ceyx’s brother is turned into a hawk; Peleus rapes the nereid Thetis and makes her pregnant with Achilles; Midas, Bacchus, and Pan have a comic interlude, a parodic version of the contests between god and man whose tragic versions are embodied by Marsyas and Arachne. 

    In Book Eleven tragedy is interleaved with comedy, happy endings with sad endings. It is worth noting that Orpheus, after all his travails, is reunited with Eurydice in the Underworld, where they are never parted again, and Ceyx and Alcyone enjoy wedded bliss, and even progeny, in their new form as kingfishers. Book Eleven throngs with eidolons, phantasms, mirrors, and echoes. It is a book of streams and seas, dreams and music and submergence. The first word of the book is carmine, or songs, and the last line says of Aesacus, aequora amat nomenque tenet, quia mergitur illo, “he loves the sea and shares its name because he is immersed in it.” 

    This quality of submergence, of immersion, runs through Clampitt’s “The Kingfisher,” not least in the way that the biographical details are submerged. It is nothing short of astonishing that in our day and age — in which we feel we must have access to all information — we know as much about Clampitt’s lover in the early 1950s as we do about the beloved in Shakespeare’s sonnets. “All ye know, and all ye need to know,” to quote her beacon Keats, is what is contained in her poem. (We have a guess — a British co-worker at Oxford University Press named Charles Johnson — but it can’t be confirmed, and no love letters survive.) 

    Clampitt even puts the story in the third person — but then, she is sixty-two when the poem is published, so it may well seem as though this thirty-year-old affair happened to someone else. 

    In a year the nightingales were said to be so loud
    they drowned out slumber, and peafowl strolled screaming
    beside the ruined nunnery, through the long evening
    of a dazzled pub crawl, the halcyon color, portholed
    by those eye-spots’ stunning tapestry, unsettled
    the pastoral nightfall with amazements opening. 

    In that first stanza, not only are we plunged into one long winding sentence, we are immersed in a hectic realm of birds—three of them, straight from Ovid, and eerily interchangeable with each other, like those eidolons in Sleep’s cave (“they drowned out slumber,” she says right in the second line). Nightingales derive from the dark myth of Philomela, whose metamorphosis into the sweetest songbird made up for her amputated tongue. Peacocks derive from the dark myth of the watchman Argos’s slaughter by Hermes — Juno retrieved his hundred eyes and put them in her bird’s tail. And yet the “halcyon color,” “portholed” in the peacock’s feathers, is also the kingfisher’s. 

    These emblematic three link up with other birds deployed metonymically throughout the poem — Stravinsky’s Firebird (“like a kite,” eliding the identities of those birds in just the way the peacock and kingfisher are elided), Bronx Zoo “jungle fowl,” bellbird, quail, thrush (Hardy’s bird, as the nightingale was also Keats’s; to allude to birds is to allude to predecessor poets, the buried reference in the “berry-eyed”). These creatures of air sail over “entanglements gone sodden”: the liquid element in this poem being in part the sea (those portholes in the peacock’s feathers ferry back and forth between England and New York), and in part alcohol. Pub, pub crawl, “another fifth” — a musical interval as well as a bottle of spirits — and most telling of all, the allusion to Dylan Thomas’s death by alcohol poisoning in St. Vincent’s Hospital, whose other claim to poetic fame lies in its appropriation by Edna St. Vincent Millay, who was named for it. (“A favorite of mine when I was fifteen,” Clampitt noted in her Paris Review interview.) It seems indisputable that this is an account of a love affair doomed by two kinds of water, two kinds of long distance: the wide sea, and drink. Her lover’s “hands shook.” In Golding’s translation of the Alcyone and Ceyx myth, the wife cries: “By shipwrecke he is perrisht: I have seene him: and I knew His handes.” This isn’t quite in the Latin, though Ovid does say that Morpheus did an excellent impersonation of Ceyx’s voice and hand gestures. 

    What is the submerged meaning of the “ruined nunnery” that appears twice in the poem? Is Clampitt playing Ophelia here, with rosemary and rue? I think of it (at least in part) as a reference to the Cloisters, the museum of medieval art in upper Manhattan, where she loved to go and ruminate, particularly among the unicorn tapestries. As she wrote in an illuminating letter to her brother Philip in 1956: 

    … I wandered in and out, visually speaking, among the little wild strawberries, the bluebells and daisies and periwinkles and dozens of other flowers (so faithfully rendered that nearly all have been botanically identified) which are woven into the background of each of the scenes of the hunt, for the very reason that it was a composite work rather than that of a single individual — and not only composite but anonymous; not only the weavers, but the designer and even the place of origin are unknown, and even for whom it was commissioned is a matter of conjecture — I found it more satisfactory than painting. 

    Here, surely, is as close a description of what she was trying to achieve in “The Kingfisher” as any we are likely to find. She was trying to make it anonymous; she was trying to make it “composite,” with references to Ovid, Keats, Stravinsky, Millay, Hardy, and Dylan Thomas — some “veiled” (hello, ruined nunnery!), some not. The “faithfully rendered” is a principle that weds ethics and aesthetics: “love as the search for a spiritual reality” was a theme of the fiction she wrote before she turned full-time to poetry. 

    In seven stanzas, the halcyon days of a love affair crumble to “down on down, the uninhabitable sorrow.” The pun on “down,” the ramifications of “uninhabitable” — blighted domesticity, nowhere to nest — make that much more painful the appearance of the kingfisher: “the color/of felicity afire.” Its “glancing like an arrow” is Eros’s arrow, appearing at first like happiness, and then — “terrifying” — revealing itself to be sorrow without end. I told my students that this is the saddest poem in the world, a bit of exaggeration to make them think about sadness. For surely lost love is not the worst tragedy. Surely that is what Clampitt is conveying, in part: that this is an ordinary sorrow, not distinctive enough to even give names to these protagonists. The flash of otherworldly splendor in eros: it will cost you. But it is still a gift: ought we to mourn a gift? 

    Poems shouldn’t tell you what or how to feel. Yet “measure” is a synonym for poetry — those metrical feet! — and what it might do is give you the measure of a situation, the measure of a person. Proportion or the lack of it should be clear to the reader. Clampitt gives us the measure of the doomed affair, the effect of the intervening years, and the proleptic reappearance of ardor in memory, that kingfisher, piercing her with regret. Clampitt gives us the privacy of it: she intimates what happened. Not a false confiding tone, but a true readerly intimacy that makes us lean toward it, as toward an almost inaudible confession. Is this saudade? Melancholy? Wistfulness? Or is it unappeasable heartbreak? The various shades of sorrow present themselves to us, in a variety of locutions and between the lines. “Sad” is such a tired word. We need a bigger vocabulary. 

    This is one thing I tend to repeat when I teach writing: we need a bigger, and more precise, vocabulary. But the other thing is this: art is emotional, yes; poetry is emotional, yes; but to be at the mercy of one’s own emotions — let alone other people’s emotions — is not a virtue, nor do good poems arise from it. The massive liberatory forces in the mid-twentieth century that freed poetry from meter and traditional form also freed it from other constraints: “Why not say what happened?” as Robert Lowell famously put it. It must have seemed exhilarating if you grew up under a regime of silence and repression. But what if you grew up in a regime where emotions could turn on a dime, become unhinged? What if tempers were unpredictable, flagrant, disproportionate; what if words were hurtful — and then reading books offered you another view: quiet, reason, proportion, precision and order? To be battered by other people’s moods, and to be bullied by them because feelings are supposedly ungovernable — this is what I have dreaded ever since I was a small, trapped child. Ovid’s characters are pitched into metamorphosis at the moment that emotion becomes most unbearable; metamorphosis is the only escape. 

    The happy ending of Ceyx and Alcyone can only cast Clampitt’s “The Kingfisher” in a dramatically ironic light. In fact, the ending of the poem really recalls the unhappy Aesacus. He is so outraged at being turned by Tethys into a seabird as he dives toward his death that in his bird incarnation he continues to hurl himself at the sea (Clampitt’s “down on down”) in a vain attempt to die. And yet his story is not as wretched as the tale that precedes Ceyx and Alcyone, wherein Ceyx’s brother Daedalion tries to kill himself when his daughter Chione is struck down by the goddess Diana. (Typical for a teenage beauty: she scoffed at Diana’s looks and received an arrow through her tongue.) In a foreshadowing both of Alcyone, who starts flying as she runs toward her husband’s corpse, and of Aesacus as he dives, Daedalion runs off a cliff and starts to grow wings. Apollo turns him into a raptor because grief made him vengeful:

    And bycause him selfe ere this
    Did feele the force of sorrowes sting within his wounded hart,
    Hee maketh others oftentymes to sorrow and to smart. 

    A father’s grief surely surpasses a lover’s. Clampitt’s discretion in “The Kingfisher” is a function of the knowledge that there are worse griefs. And yet it is her grief, altering her life: and this is her poem, her stifled cry. 

    Ovid’s Alcyone rehearses what all bereaved lovers, and writers of love poems, do: she revisits the places that hold the memory of the beloved. Here is Lombardo’s version: 

    Morning had broken. She left her house
    And went to the seashore to find the spot
    From which she had watched him set sail.
    She lingered there a while, musing,
    “Here he loosened the cable, he kissed me
    On the beach here as he was leaving.” 

    What is “The Kingfisher” but Clampitt’s own return to memory’s grave? And even as she can lay no claim to this lover, to whom she was not married, she may say, along with Alcyone, “If not an urn,/Our epitaph will unite us. Our bones/Will not touch, but I will touch your name with mine.” All love poems strive for this — perhaps especially those loves that were never legitimate, where there can be no proper nostos. Borrowing from Aesacus, we can say of Amy, whose name derives from the French for “beloved,” that “she loves love and shares its name because she is immersed in it.” 

    How Lincoln Created Democracy

    Democratic theory, like democracy itself, comes in various shapes and sizes. There are today two leading theories of democratic legitimacy. Realist theorists such as Joseph Schumpeter and Robert Dahl focus on actual political institutions such as voting, competitive elections, and party elites. These are regarded as not only the necessary but also the sufficient conditions for democratic societies. Idealist theories of democracy typically draw inspiration from Jean-Jacques Rousseau, who focused on ways of enhancing citizen participation in deliberation and decision-making. A fully engaged citizenry, Rousseau believed, was a requirement for any theory of democratic legitimacy. 

    Each of these theories captures only a part of the truth. Realist theories of democracy correctly draw on the actual practices of democratic societies, but they are often indifferent to the ways that these societies are maintained and preserved. Institutions are not self-sustaining. Democratic institutions require democratic citizens who are trained and educated in democratic practices if those practices are to be sustainable. Without the proper inculcation of democratic values — tolerance, fair play, a willingness to compromise, open-mindedness, a concern for truth — democracies are prone to degeneration and decay over time. Moreover, we have seen in places such as Turkey and Hungary how “illiberal democracies” maintain the progressive patina of embracing competitive elections and political parties while in fact tilting the playing field in such a way that the outcomes are generally predetermined.

    Idealist theories are prone to the opposite deficiency. Drawing on experiments in preference aggregation, such theories display a naive faith in the “wisdom of the crowd” as a means of arriving at the common good. They treat politics as if it were a debate society or a college seminar in which “the unforced force of the better argument” — in Jürgen Habermas’s phrase — is supposed to win the day, but they fail to recognize that what is deemed the better argument by some may be a cause of further contestation and debate by others. Deliberative democrats also ignore the darker and irrational passions such as envy, resentment, and hatred, and the other negative dispositions that motivate human behavior. Such theories are often indifferent to the need for leadership and authority. To be sure, democracy is about deliberation and debate, but it is also about decisiveness, authority, and command.

    Neither of these views pays sufficient attention to the actual founding of democratic societies. How regimes are founded will determine at least in part whether they will survive. There have been many foundings in history, but relatively few have stood the test of time. This is because successful foundings require far-seeing statesmen possessed of the qualities of vision, strength, wisdom, courage, and magnanimity. These qualities define the greatest statesmen — the fathers of the Constitution, as it were — that help to establish the permanent framework within which the right handling of changing situations can take place. 

    Just as important as institution building, the statesman’s art consists in the creation of the language by which a people understands itself. If Shelley was right when he said that poets are the unacknowledged legislators of mankind, statesmen are the acknowledged legislators of particular peoples. This requires the gift of rhetoric, an art that is largely neglected by modern political science and remains a singular cause for the debasement of contemporary political discourse. The statesman must be above all a “great communicator” capable of educating the public mind through the selective use of images and stories. In his book Leadership, Henry Kissinger commented on the importance of “deep literacy” as a background condition for successful statecraft. This only comes with the experience of careful reading. (Statesmen as readers — is this already asking too much?) 

    Nowhere was this example of deep reading given more profound expression than in Abraham Lincoln’s Gettysburg Address. Here Lincoln made the Declaration of Independence, and in particular its principle of equality, the centerpiece of his speech, proving himself to be a master reader of the hallowed text. At a time when appeal to the Declaration and its principle of equality was being dismissed as a “glittering generality” by one Senator and a “self-evident lie” by another, Lincoln restored the idea of equality to its place of importance. In her great book American Scripture, Pauline Maier showed how Lincoln took this fading relic of the Revolutionary War and turned it into a national mission statement, a canonical formulation of how we regard ourselves as a nation, what we are, what we stand for, and what we look up to. It is owing to the Gettysburg Address that we now read the Declaration of Independence much as Lincoln read it. 

    The Gettysburg Address is the most famous funeral oration ever given, at least the equal of, if not superior to, Pericles’s great funeral oration reported in Thucydides. Both of these speeches use the occasion of a solemn state funeral to provide a tribute to their respective democracies. Lincoln understood the power of words more deeply than any other American president; Saul Bellow once called him our greatest prose stylist. When Lincoln declared that “the world will little note nor long remember what we say here but it can never forget what they did here,” he was clearly engaged in a piece of willful deception. We remember Gettysburg today precisely because of Lincoln’s address. 

    We often say that deeds speak louder than words, but Lincoln knew very well that without the power of words, deeds are quickly forgotten. Herodotus — sometimes called the father of history — remarked that he was setting forth his account so that “time may not draw the color from what man has brought into being.” Like Herodotus, Lincoln understood that language alone has the power to confer immortality. It is a people’s collective memory that makes a people and a nation. There is scarcely a word about the battle of Gettysburg in the Gettysburg Address. It is a speech about the meaning of equality and the purpose for which the war was being waged.  In just two hundred and seventy-two words, in a speech that was an action, Lincoln accomplished nothing less than the creation of American democracy. 

    It was not until almost four months after the battle that Lincoln delivered his speech on the occasion of the dedication of the new national cemetery. The construction of the cemetery was overseen by David Wills, a local banker and civic leader who had been appointed by Andrew Curtin, the governor of Pennsylvania. The main speaker of the day was to be Edward Everett; Lincoln’s role was only that of performing the “last solemn act” of dedication. The opening of the cemetery was originally scheduled for October 23, 1863, but this did not allow Everett sufficient time to prepare his eulogy, so the date was moved to November 19. 

    The differences between Lincoln and Everett could not have been more striking. Everett was a Boston Brahmin, a classicist who had been both a professor of ancient literature and president of Harvard. He had served in the House of Representatives, as a Senator from Massachusetts, and as Secretary of State in the Fillmore administration. A protégé of Daniel Webster, Everett had studied abroad in Germany and had met many of the leading figures of the Romantic age, including Byron and Goethe. He was one of the most distinguished scholars and sought-after speakers of his time.

    In his book Lincoln at Gettysburg, Garry Wills showed how Everett was one of the chief architects of the “Greek Revival” movement that was part of a new-found American passion for the golden age of Athenian democracy. Everett’s speech was self-consciously modeled on Pericles’ Funeral Oration. It was intended as an act of memorialization. In fact, battlefield oratory was something of a specialty of Everett’s. He had spoken at leading sites of the American Revolution, including Concord, Lexington, and Bunker Hill. He clearly regarded his speech as establishing Gettysburg as another famous site in the history of democracy that would rival Marathon and Thermopylae in the ancient world.

    Every revolution drapes itself in the history of its ancestors. The seventeenth-century Puritans regarded themselves as the new Israelites creating a New Jerusalem. In The Revolution of the Saints, Michael Walzer explained how these Puritans — “God’s Englishmen,” they have been called — formed the basis of Oliver Cromwell’s New Model Army. They saw themselves on a sacred mission to redeem the world, by violence if necessary, and to eliminate all who could not live up to their standard of saintliness. Their American brethren likewise saw themselves on an “errand in the wilderness” akin to Moses leading the chosen people out of Egypt and into the Promised Land. Karl Marx caught the spirit of the age in The Eighteenth Brumaire of Louis Bonaparte when he compared the French Revolution to the English Revolution:

    Camille Desmoulins, Danton, Robespierre, Saint-Just, Napoleon, the heroes as well as the parties and the masses of the old French Revolution, performed the task of their time in Roman costume and with Roman phrases. . . . Similarly, at another stage of development, a century earlier, Cromwell and the English people had borrowed speech, passions, and illusions from the Old Testament for their bourgeois revolution. When the real aim had been achieved, when the bourgeois transformation of English society had been accomplished, Locke supplanted Habakkuk.

    The eighteenth century was a Roman or neo-Roman age. In England, it was referred to as the Augustan Age after the Roman emperor who established peace after a century of turmoil and civil war. Americans similarly draped themselves in Roman garb. In his book Fame and the Founding Fathers, the historian Douglass Adair has shown how Plutarch’s Lives was a continual source of inspiration for the founding generation. George Washington was depicted as the American Cincinnatus, after the Roman general who saved the republic and surrendered power in order to return to his farm. The authors of the Federalist Papers adopted the pseudonym Publius after one of the legendary founders of the Roman republic. This explains why in America we have a Senate and a Capitol rather than an Ecclesia and a Pnyx. (The Pnyx was a hill west of the Parthenon where popular assemblies were convened and individuals freely addressed the deliberating crowd from a stone platform that still survives.) 

    The rediscovery of democracy at the beginning of the nineteenth century went hand in hand with a turn away from Rome to Greece. Democracy and the related term, popular sovereignty, were still relatively new expressions in the American lexicon: they had only begun to make an appearance during the age of Andrew Jackson. Democracy was a Greek term, a compound of two words, demos or people and kratia or rule, which together mean rule of the people or, more succinctly, people power. Democracy was rehabilitated by philosophers, poets, and historians during the Romantic era’s new-found fascination with all things Greek. 

    In 1801, Lord Elgin secreted the marble sculptures from the Parthenon to entomb them in the British Museum. The poet Byron died on the battlefield of Missolonghi fighting for Greek independence against the Turks. The historian George Grote, an advocate of electoral reform, wrote a History of Greece in twelve volumes. When John Stuart Mill, in On Liberty, praised the virtues of “pagan self-assertion” as “one of the elements of human worth,” it was from the Periclean image of Athens that he was drawing. In America, the historian George Bancroft wrote a ten-volume History of the United States that presented America as the successor state to Greek democracy. Alexis de Tocqueville’s Democracy in America was similarly a product of this new egalitarian sensibility. It could not have been written, at least with that title, half a century earlier.

    Lincoln’s performance at Gettysburg is usually contrasted with that of Everett, who was to be the main attraction of the day. Everett had been invited in September and spent several weeks honing his speech and studying the battle, which he described at length. Lincoln, however, was invited at the last minute and was asked to deliver only “a few appropriate remarks” for the occasion. Everett’s speech lasted over two hours; Lincoln’s consisted of ten sentences that took under three minutes to deliver. For most of the fifteen thousand people assembled that day, Lincoln’s speech was over before it had begun. Only a single photograph of Lincoln delivering the speech exists. 

    Everett sent a congratulatory note to Lincoln several days after the commemoration, commenting on “the eloquent simplicity and appropriateness” of the president’s remarks, adding that “I should be glad if I could flatter myself that I came as near to the central idea of the occasion in two hours as you did in two minutes.” Lincoln replied in kind: “In our respective parts yesterday, you could not have been excused to make a short address, nor I a long one. I am pleased to know that in your judgment the little I did say was not entirely a failure.” 

    It is an urban legend that Lincoln’s speech was written on the back of an envelope on the train journey up to Gettysburg. The story is false, but in America, as John Ford insisted, when the legend becomes fact, print the legend. Lincoln did continue to refine the text during the trip, but we know that he took extraordinary care in composing his speeches. He rarely left the White House during the war. Given that the trip from Washington to Gettysburg was not an easy one and involved various transfers — it even required him to leave the day before — we can infer that this was a speech that he very much wanted to deliver. It was not going to be a casual occasion. He wished to make it a decisive moment. It was the speech that he had been working on, in one way or another, for his entire adult life.

    The Gettysburg Address begins with what is the most famous sentence in American oratory: “Four score and seven years ago our fathers brought forth on this continent a new nation, conceived in liberty and dedicated to the proposition that all men are created equal.” What was Lincoln doing in saying this?

    The outset of the speech concerns the moment of political founding. Do the math. Lincoln fixes the date at 1776, the year of the Declaration of Independence, and not at 1787, the year the Constitution was drafted. This is a point that Lincoln had made on several occasions, most memorably in a letter written for a celebration of Jefferson’s birthday in 1859:

    All honor to Jefferson — to the man who, in the concrete pressure of a struggle for national independence by a single people, had the coolness, forecast, and capacity to introduce into a merely revolutionary document, an abstract truth, applicable to all men and all times and so to embalm it there, that to-day, and in all coming days, it shall be a rebuke and a stumbling block to the very harbingers of re-appearing tyranny and oppression.

    At Gettysburg, Lincoln sought to endow Jefferson’s “merely revolutionary document” — as if the revolution was simply the occasion for Jefferson to announce a philosophical truth — with a sacred history. His language — “four score and seven years ago” — is deliberately biblical and archaic. It is drawn from Psalm 90: “The days of our years are three score years and ten.” 

    This attempt to endow the founding with a new beginning indicates Lincoln’s departure from Jefferson. The Declaration of Independence was Lockean and Whiggish in its doctrines of natural rights, government by consent, and the right of rebellion. It presupposed an Enlightenment psychology according to which peoples, like individuals, act rationally in pursuit of their goals, which are power, status, and economic goods. Governments are created to defend those rights and may be overthrown when they fail to do so. 

    But Lincoln’s language is less Whiggish and contractual than organicist and developmental. The biological images of birth, growth, and maturation suffuse the speech. The nation was “brought forth” and “conceived.” The image of conception is an obstetric metaphor suggesting not the language of the social contract, which requires reflection and choice, but the biological and Aristotelian ethos of something that grows and develops out of its own immanent principles. Lincoln does not call the country a “Union,” suggesting a social contract based on freely consenting agents; he uses instead the more organic term “nation” to suggest a collective enterprise. A nation is not something that can be created by an act of will, but something that needs to grow over time. His reference to “this continent” — the land and the soil — further suggests the organic quality of the nation.

    The same holds true for what follows. The Declaration had made its case for independence by appealing to “the Laws of Nature and Nature’s God.” There is something impersonal and deistic in this formulation. It has more in common with the God of Spinoza than the God of Abraham. References to God or the Almighty are used sparingly throughout the Declaration. In the second paragraph, Jefferson refers to the “Creator” as the author of our “unalienable rights” to life, liberty, and the pursuit of happiness. Not until the end of the text does he again call on “the protection of divine Providence” under whose guidance the signatories are said to pledge “to each other our Lives, our Fortunes, and our sacred Honor.” All of these, I think, are fairly boilerplate formulations.

    Lincoln’s language could not have been more different. He refers not to the impersonal “Laws of Nature and Nature’s God,” but to “our fathers,” reminding us of the biblical patriarchs who “brought forth” and “conceived” a new nation. The image of Moses leading the Israelites out of Egypt and into the Promised Land immediately comes to mind. The language of conception, birth, and new birth runs throughout the speech. It is to this new nation that we must be dedicated. The words “dedicate” or “dedicated” are used five times; the words “consecrate” or “consecrated” twice; and the word “devotion” twice. The speech makes no formal recognition of theology — it mentions “this nation, under God” only once — but Lincoln’s cadence is biblical, intended to endow the founding with a sacred beginning. 

    Perhaps the most striking formulation — or rather reformulation — in Lincoln’s speech is the claim that this new nation is “dedicated to the proposition that all men are created equal.” What makes this striking? In a text intended to honor and to recall the fathers, Lincoln shows his greatest departure from them. His paraphrase of the Declaration is revealing. Jefferson had written, “We hold these truths to be self-
    evident that all men are created equal,” proclaiming equality to be a “truth” which we know with certainty to be the case, but Lincoln put forward the idea of equality as a “proposition” to which we must remain “dedicated.” 

    Now, what is the difference between a truth held to be self-evident and a proposition to which we are dedicated? The language of self-evidence comes from mathematics, or more specifically from Euclid’s Elements. A self-evident truth — what philosophers call an analytic truth — is a statement that is judged to be true as a consequence of the definition of its own terms. 7+5=12 is the standard example of such a truth. It is a truth independent of social context, language, and history, deemed true in terms of reason and logic alone. Jefferson believed that human equality was a truth of this kind. To know what it means to be a human being is to know that human beings are equal in a way that does not hold between human beings and horses or dogs or any other species. We are equal by virtue of the terms involved.

    But then to call equality a “proposition” is something quite different. Propositions are statements that still require verification. A proposition suggests a certain indeterminacy, something that has yet to be established. While Jefferson called the idea of equality a possession — “we hold these truths to be self-evident” — Lincoln’s proposition is something to which we must be dedicated if it is to be either proved or disproved. No one needs to be dedicated to the Pythagorean theorem; it is true no matter what we may believe or what language we may speak. But this is not the case with a term like equality. It depends on our degree of belief in, or commitment to, its truth. The term “proposition” carries an existential quality that requires faith and belief. It must undergo a trial by fire before its truth can be recognized. We are getting close here to William James’s description of truth as what we will to believe — of truth as an event, something that happens or that we make happen. Lincoln offers no guarantee whether the proposition that all men are created equal will be proven true. Its verification will depend entirely on us — on our dedication to it — to make it so. 

    It is often argued that the centrality which Lincoln ascribed to the Declaration of Independence fundamentally altered the way that we think about the American founding. In particular, his focus on the idea of equality appears to set out a new course for the nation. What legal status did the Declaration — “a merely revolutionary document” — actually have? Was Lincoln’s insistence on the fundamental character of the idea of equality at odds with the Constitution’s concerns with limited government, the separation of powers, and the protection of individual rights? Was Lincoln charting a new direction for the nation and thereby imposing a new understanding of the Constitution?

    The claim that Lincoln rejected the original Constitution, with its federalist understanding of the Union and its limited place for executive power, was once the standard fare of the political right, from ex-Confederates such as Alexander Stephens to neo-Confederates such as Willmoore Kendall. Today the argument that Lincoln or, to speak more precisely, the post-Civil War amendments established a new constitutional order — as Noah Feldman argued in his book The Broken Constitution: Lincoln, Slavery, and the Refounding of America — is endorsed by the political left. The only difference is that while the right attacked Lincoln for his departure from the founding principles, the left has applauded him for rejecting them. Both agree that Lincoln undermined the Constitution. Both claims are false.

    Lincoln did not see equality and liberty as pulling in opposite directions, nor did he see emancipation as overturning the Constitution. Instead, and with extraordinary wisdom, he regarded equality and liberty as two mutually supportive aspects of a democratic nation. In a fragment on the Constitution written shortly after his election, he considered the relation of the Declaration of Independence to the Constitution. Commenting on Proverbs 25, “a word fitly spoken is like apples of gold in pictures of silver,” Lincoln likened the idea of equality in the Declaration to the golden apple and the Constitution to the silver filigreed frame that surrounds it. “The Union and the Constitution,” he wrote, “are the picture of silver, subsequently framed around it [the Declaration]. The picture was made, not to conceal or destroy the apple; but to adorn and preserve it. The picture was made for the apple — not the apple for the picture.” The Declaration and the Constitution, in his view, are one and inseparable.

    Lincoln saw the Declaration’s equality principle as setting out the telos for the nation and the Constitution as the means by which to achieve it. The two are mutually supportive and simultaneously necessary. But he saw equality as the grounding principle from which rights and liberty are derived. Without equality there can be no liberty, because a shared commitment to equality is the premise of the democratic experience to which we aspire. I will return to this theme later.

    And this brings us to the second sentence of the Address: “Now we are engaged in a great civil war, testing whether that nation, or any nation, so conceived and so dedicated, can long endure.” Lincoln presents the war as a test, as a great experiment. (The theme of democracy as an experiment runs through the American intellectual tradition from Hamilton to Holmes.) And like any test, we cannot know in advance what the result will be. But what is the war a test of? Among other things, it is a test of ourselves as a nation – not a Union – that transcends North and South, section or locality. It is a test of whether a free people — a sovereign people and not an aggregation of states — can preserve itself against the threat of disunion. It is a test that will prove transformative, suggesting qualities of endurance, sacrifice, and struggle.

    This description of the war as a test of ourselves marks Lincoln’s distance from the euphemism “the war between the states.” There cannot be a war between states, because the states have no standing outside a sovereign people. “This is essentially a People’s contest,” Lincoln wrote in his July Fourth Message to Congress in 1861. “It is a struggle for maintaining in the world that form and substance of government whose leading object is to elevate the condition of men — to lift artificial weights from all shoulders — to clear the paths of laudable pursuit for all — to afford all an unfettered start and a fair chance in the race of life.” 

    Lincoln’s use of the expression a “People’s contest” may have been intended as a reminder of the famous preamble of the Constitution, “We the People.” Without a sense of peoplehood, of a shared national identity, democratic government is not possible, although this is a problem that has too often been ignored. It has been common to think of Lincoln primarily as a theorist of natural rights and to regard consent as the foundation of government, as in the philosophy of John Locke. This seems consistent with the Declaration, which enumerates certain individual rights to life, liberty, and the pursuit of happiness, but says nothing about the rights of the citizen as a member of a sovereign people. There has even been a tendency to deny the importance of democracy to Lincoln’s thought because of its association with the Jacksonian doctrine of popular sovereignty. This is to ignore the central importance that Lincoln gave to peoplehood.

    It is often useful to ask what is the most important word in a text. I want to suggest that no word in the Gettysburg Address is more important than the first-person plural pronoun, which Lincoln uses again and again: “we are engaged,” “we are met on a great battlefield,” “we have come to dedicate,” “we cannot dedicate,” “we cannot consecrate,” “we cannot hallow,” “it is rather for us,” “the great task remaining before us,” “we here highly resolve.” It was at Gettysburg that Lincoln transformed the first-person singular into the first-person plural. It is not I but we who are engaged in the struggle to preserve democracy. We are part of a collective noun. From now on, Lincoln appears to be saying, it will no longer be possible to think of the government simply as a contract between the states or even between individuals; it is a nation composed of a people with a common sense of moral purpose without which democratic government would be impossible. Lincoln’s address is a momentous exercise in nation-building, rooted in a shared way of life and a common destiny. It was at this moment that America became not a collection of states or a mass of individuals but a people.

    The third paragraph of the speech contains the longest sentence of the Gettysburg Address — nearly one-third the length of the whole. It is also a remarkably dense and complex sentence: “It is rather for us to be here dedicated to the great task remaining before us — that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion — that we here highly resolve that these dead shall not have died in vain — that this nation, under God, shall have a new birth of freedom — and that government of the people, by the people, for the people, shall not perish from the earth.”

    Lincoln did not use the word “democracy” at Gettysburg, but his reference to government “of the people, by the people, for the people” gives the speech an undeniably democratic character. His goal was not simply to defend the Union and the Constitution, but to embrace the principles of democratic government. His three-fold repetition of the word “people” — of, by, and for — points to another important difference between the Gettysburg Address and the Declaration of Independence. The Declaration was actually silent as to the forms of government. Jefferson had said only that legitimate government must be based on the consent of the governed, but the question of what a people might legitimately consent to is not addressed. There is no mention of democracy or any other form of government. At a minimum, the Declaration does not require democracy as we understand the term today. 

    Here grammar matters. Not only is the Declaration silent as to the forms of government, it does not endorse a single government. The signatories identify themselves as “the Representatives of the united States of America.” The lower-case “u” in “united States” suggests that the colonies may have been united in their struggle against British rule but did not yet think of themselves as living under a common political authority. In Jefferson’s text, “united” is descriptive but not yet normative. Similarly, the Declaration goes on to affirm that “the United Colonies are, and of Right ought to be Free and Independent States.” The plural form of “Free and Independent States” suggests not only that they did not endorse democracy, but that they did not yet think of themselves as one people. Furthermore, the signatories claimed to pledge only “to each other” their lives, fortunes, and sacred honor. The pledge was not intended to include all Americans.

     

    We are often told that philosophers “do things with words.” The Gettysburg Address was a speech act. It was also an act of linguistic appropriation. At the core of the text is an attempt to recapture the word “people.” The language of democracy had previously been the property of the Democratic Party, of Andrew Jackson and Jackson’s heir apparent, Stephen A. Douglas. Douglas was the great defender of Popular Sovereignty in the territories, or “pop sov,” as Lincoln repeatedly derided it during their debates. For Douglas, democracy meant a simple form of majority rule. Whatever a majority of the people of a state or territory — or at least of its free white males — wanted was the source of legitimacy. This was the basis of his claim of “indifference” to the future of slavery. If a state or territory wanted to vote slavery in or out, either choice was equally legitimate in his view so long as it expressed the majority will. It was a debased form of Rousseau’s “general will.”

    The unique achievement of Lincoln’s speech was to rescue democracy from this kind of majoritarianism — more precisely, white majoritarianism — or what the historian George Fredrickson has called (echoing Nietzsche) Herrenvolk — or master race — democracy. Lincoln did what great writers have always done. He appropriated the term from his rivals and gave it a new meaning. Henceforth, a democracy worthy of the name was not simply whatever the majority agreed to, but a majority constrained by a prior commitment to the ideal of equality. “As I would not be a slave, so I would not be a master. This expresses my idea of democracy,” Lincoln wrote. “Whatever differs from this, to the extent of the difference, is no democracy.” Lincoln had a powerful sense that slavery and democracy could not continue to co-exist, that slavery bred attitudes of servility and resentment on the part of the enslaved and attitudes of domination and entitlement on the part of the enslavers. Both dispositions were incompatible with a free people. A democracy worthy of the name is one that recognizes the dignity of every individual.

    In words that were still a novelty, Lincoln yoked together the struggle against slavery with the struggle for democracy. This had not always been the case: the slave-holding democracies of ancient Greece and the master-race democracies of the American South were the rule, not the exception. Democracy and hierarchy had never before been regarded as incompatible. Even the American founders had disparaged democracy as inherently unstable and prone to violence. Madison rejected the very idea of collective deliberation on which Athenian democracy had been based. The genius of Lincoln was to give the old idea of democracy a new meaning. He was engaged in a piece of conceptual transformation as bold as anything ever attempted in the history of political thought.

     

    Lincoln saved the best for the very last. He began his address with an allusion to “this continent” but concluded with a reference to “the earth,” suggesting that the struggle for democracy is not just an American phenomenon but something that will affect the whole of mankind. The speech reveals an upward trajectory from the particular to the universal. This returns us to the “we” that is at the core of Lincoln’s text. Were his words addressed only to Americans, or was he speaking to the world?  Who was Lincoln’s audience? 

    Lincoln’s speech, I want to suggest, addressed not only fellow Americans but also a larger world-wide democratic movement. Well before the war, he kept abreast of European support for emancipation. As a reader of European journals and newspapers, he maintained a lively interest in the work of reform movements abroad. Yet Lincoln was viewed skeptically in Europe, where he was often compared to an aspiring tyrant in the mold of Napoleon III. “They saw the United States under Lincoln as another example of democracy degenerating into military despotism,” Helena Rosenblatt has written in her excellent The Lost History of Liberalism. Victorian liberals such as Walter Bagehot and William Gladstone regarded him as a dangerous demagogue using the issue of slavery as a pretext for acquiring absolute power. One notable exception was John Stuart Mill, whose articles “The Contest in America” and “The Slave Power” took issue with Southern apologists who regarded emancipation as a dangerous moralizing of politics.

    Yet it was not intellectuals but ordinary men and women who provided the greatest support for Lincoln’s policy of emancipation. On December 31, 1862 — the day before Lincoln issued the Emancipation Proclamation — a meeting at the Free Trade Hall in Manchester passed numerous resolutions in favor of Lincoln’s efforts. This show of support was of considerable importance, as it came just at the moment when British economic interests were being seriously affected by the northern blockade of southern cotton. Despite heavy pressure on the government to try to break the blockade, the workers of Manchester continued to support the embargo even though it was causing them immense economic hardship.

    Lincoln was deeply gratified by the Manchester resolutions and responded with a letter delivered by Charles Francis Adams, his ambassador to the Court of St James, which contains the following moving passage:

    I know and deeply deplore the sufferings which the workingmen at Manchester and in all Europe are called to endure in this crisis. It has been often and studiously represented that the attempt to overthrow this government, which was built upon the foundation of human rights, and to substitute for it one which should rest exclusively on the basis of human slavery, was likely to obtain favor in Europe. . . Under these circumstances, I cannot but regard your decisive utterances upon the question as an instance of sublime Christian heroism which has not been surpassed in any age or in any country.

    Today a statue of Lincoln dating from the World War I period stands in Lincoln Square in Manchester. 

    From the outset of the war, Lincoln was aware that the eyes of the world were upon him. Nothing less than the future of self-government — of democracy itself — was at stake. In his Fourth of July Special Message to Congress in 1861, he wrote: “This issue embraces more than the fate of these United States. It presents to the whole family of man, the question whether a constitutional republic or a democracy — a government of the people, by the same people — can, or cannot, maintain its territorial integrity, against its own domestic foes.” The phrase “the whole family of man” shows that Lincoln was clearly thinking of the war as a matter of global historical importance. His concept of democracy was universal. His reference to the “domestic foes” of democratic institutions has a pointed meaning for us today.

    Exactly a month to the day before he issued the Emancipation Proclamation, Lincoln presented the struggle over slavery in words that anticipated the promise at Gettysburg. In his Annual Message to Congress of December 1, 1862, he provided what was in effect a preamble to the Proclamation, declaring that it was no longer possible to return to the past. “Fellow-citizens,” he warned, “we cannot escape history”:

    The fiery trial through which we pass will light us down, in honor or dishonor, to the latest generation. . . In giving freedom to the slave, we assure freedom to the free — honorable alike in what we give, and what we preserve. We shall nobly save or meanly lose the last best hope of earth. Other means may succeed; this could not fail. The way is plain, peaceful, generous, just — a way which, if followed, the world will forever applaud, and God must forever bless.

    Lincoln’s repeated references to the earth and the world — “the last best hope of earth,” “the world will forever applaud” — anticipate his closing words at Gettysburg — “shall not perish from the earth.” Both texts address not just the present but also the future. He understood that the war that began as a purely sectional conflict between North and South had become part of a global struggle for democracy. How this struggle will be decided will depend on the degree of our dedication. Lincoln’s appeal to history, which brings to mind what Hegel called “world history,” was central to his teaching at the cemetery.

    Lincoln’s reading of the Declaration of Independence is an example of what Harold Bloom called “misprision,” or the kind of creative misreading that all great writers and poets do with the works of their predecessors. Lincoln was the embodiment of one of Bloom’s “strong poets”: he reinterpreted the Declaration both to show his respect for and to indicate his independence from his great predecessors. For the first time, he made explicit the connection between the Declaration’s equality principle and the need for democracy. His promise of “a new birth of freedom,” a startlingly ambitious phrase, almost a revolutionary one, announced his attempt to endow democracy with a new beginning purified from the stain of slavery and rooted in respect for the dignity of the human being. It was Lincoln who gave to democracy the almost sacred meaning that we attribute to it today. When we speak of democracy, we do so in the language that Lincoln created.

    Teaching Ellison

    In 1955, The American Scholar published a discussion among influential writers and editors titled “What’s Wrong with the American Novel.” In the symposium Ralph Ellison remarked that “I just feel that we are called upon to do a big job, not because someone is going to give us a star on the report card, but because this is America and our task is to explore it, create it by describing it.” His conviction came from the success of his novel Invisible Man, published three years earlier, in communicating a creative vision of self-determination. From experience, he knew that writing into a conversation about the nation’s identity could change the ways other people understood their lives and commitments. Widely read and acclaimed, the book draws together and intervenes in multiple cultural conversations. It is a virtuosic celebration of dynamism and fluidity. 

    When I started teaching English in a public school in 2016, just after finishing a dissertation on American literature, I was happy to see worn class sets of the novel in the book room. I have taught it most years since. I am lucky to have a choice. In New York City, as in other districts across the country, public elementary and middle schools, and some secondary schools, have given up on novels altogether in favor of canned, test-aligned English curricula with proprietary textbooks. I, by contrast, teach to external exams that require no knowledge of particular writers or books. This framework could allow for a lively, decentralized approach to literature instruction if other infrastructure supported such a thing. But with good reason, even teachers with some degree of autonomy give up on long, old novels. Excerpts are easier. Working through Invisible Man in this climate, I began to think about what it might look like to teach it as though I had learned something from Ellison about literature, and about America. 

    Ellison’s archives at the Library of Congress house materials from two courses that he taught during the summer of 1954, just after the Brown v Board of Education ruling. The first was at the Tuskegee Institute, his alma mater; the second was at the Salzburg Seminar, an international gathering of American faculty and European students. As is often the case with archival records, the files make it difficult to parse exactly what he said and to whom he said it. The folder on Tuskegee holds correspondence, syllabi, press releases, and his convocation speech. But the lectures on literature are all in a folder labeled “Salzburg,” and most are undated. Some are scrawled on a notepad in blue ink, some on lined paper in green, some are typed. Many pages have complete paragraphs and revisions; others are lists of loosely related phrases and fragments. Even when his notes are comprehensive, there is no guarantee that he was faithful to his own scripts. Ellison had planned to use the opportunity at Tuskegee to prepare for Salzburg. Sometimes he made explicit remarks about his intended audience, but discerning which fragments were intended for which location is often difficult. 

    On February 23, 1954, Ellison wrote to his wife Fanny from Tuskegee. Having delivered a lecture two days prior titled “Literature and the Crisis of Negro Sensibility,” he had agreed to return in late June to teach a week-long American literature seminar and to give a convocation address. Exasperated by some of the school’s administrators, he exclaimed to her: “Here during a time when integration is in the air, they have no courses in American lit! I asked them how did they expect students to become integrated when they know nothing of the society into which they were to be integrated? It hadn’t occurred to them that literature had anything to do with it.” For Ellison, teaching an American literature survey represented an opportunity to shake students out of complacency, to help them “recognize that there was no longer any need to think of themselves as less than human.” In the same letter, Ellison also mentioned that the series at Tuskegee would “help me to prepare my Salzburg material.” 

    In both cases, Ellison believed that students had become acculturated to far less humane systems of government than he envisioned possible. One lecture, scrawled in green ink on lined sheets, was called “Recapitulation of Necessity for Art and Literature,” and has two dates and places along the top margin: June 22, 1954, Tuskegee, and August 23, 1954, Salzburg. Presumably, he told these two groups of students some version of the same thing: “Laws are commands on how to act and stereotypes are commands on what to be. We must obey the law, perhaps, but we must refuse for our own salvation, and for the salvation of America, to become stereotypes.” Power to break free from stultifying conformity and become most fully human could only come from bucking expectation. 

    His plans to teach literature during the summer of 1954 reflected his hopes for, and his awareness of the limits of, federal policy in hastening full black participation in American life. On May 17, 1954, the Supreme Court unanimously ruled that racial segregation in public schools violated the Fourteenth Amendment. On May 1, while the course was still in its planning stages, he had written to a group of administrators: “If I am successful, this highly concentrated series should serve to establish some concrete awareness within the students’ minds of their relation to American literature — and thus, through grasping some of the symbolic implications of this literature, gain a consciousness of their vital involvement in those levels of the American experience that lie far deeper than the relatively shallow one of civil rights.” Policy was one avenue to opportunity. But regardless of the Supreme Court’s decision, reading and theorizing about stories could deepen students’ capacities for interpreting and reframing narratives about their lives. 

    One of the letter’s addressees was Morteza Sprague, Ellison’s own beloved college literature teacher, who had a major influence on his reading habits. But Sprague had stood out to Ellison as an exception: based on his other experiences on campus, and his sense of the current administration’s priorities, Ellison expected that Tuskegee students had limited exposure to what he called in that same letter “the specific character of American literature and its origins, and a description of its cultural function.” Ellison would introduce students to America’s literary traditions and defamiliarize the nation’s defining intellectual and aesthetic habits for the sake of full citizenship.

    This project of offering a week-long course on “The Basic Ethical Function of American Literature” was humane in terms of both substance and approach. Ellison explained to Saunders Walker, chair of the English department, that even in these preliminary sessions, “The idea is not to quiz [students], but to help them recognize some of the terrain over which I will attempt to guide them.” In turn, Walker told the instructors who would lead discussion sections in advance of the lectures, “Fundamental to our contribution would be, it seems to be, the effort we make toward stimulating the students to read further writings by the authors we shall discuss.” Guidance and stimulation defined the teachers’ roles, not dominance. 

    Ellison expanded on his teaching philosophy on June 23 in his convocation lecture, “The Role of the Negro Teacher in Preparing for a Non-Segregated Way of Life in the United States.” He told his audience of black teachers that “The Negro finds himself invariably in the role of a social critic, a gauge of the state of humanity under American democracy. And when we look for that section of our population whose experience most closely parallels that of those millions of Europeans who suffered under fascism…we find again that it is the Negro American.” Ellison argued that this tradition of subjugation had endowed black teachers with the capacity and the responsibility to promote self-governance in an integrated society — in his words, “a fully realized American democracy.” Defamiliarizing students’ surroundings, offering narratives that could reframe the stories that they told themselves, held the promise of bringing the country closer to what Ellison called in that lecture “the humanistic ideals upon which the American system was founded.” He expressed hope that in an integrated public school system, black teachers could make the nation’s educational system more humane.

    Ellison’s contribution to this task included mapping a national canon. He built on work by the most prominent white critics of the early and mid-twentieth century, including Lewis Mumford, Edmund Wilson, Richard Chase, Philip Young, D.H. Lawrence, Constance Rourke, and F.O. Matthiessen. The course touched on what he and others called the “sacred documents” — the Constitution, the Bill of Rights, the Declaration of Independence — in addition to essays and novels. Required readings included Melville’s Mardi and Benito Cereno, Thoreau’s “Civil Disobedience,” “Slavery in Massachusetts,” and “A Plea for Captain John Brown,” Twain’s Huckleberry Finn, Guthrie’s The Big Sky, Hemingway’s Green Hills of Africa, “The Battler,” For Whom the Bell Tolls, and Death in the Afternoon, and Faulkner’s “Delta Autumn,” “The Bear,” and The Sound and the Fury. He also asked students to read W.J. Cash’s The Mind of the South. His time at Tuskegee was short, and the syllabus consolidated a constellation of celebrated white writers of whom he thought students ought to be aware to participate most fully in American life. 

    Running through his notes on the value of studying a national canon is an insistence on what he called in one place “the unsettled character of society,” and in another, an “unknown country…diverse as to customs, manners, big and difficult to fit into novel forms.” Since American culture, in his view, was defined by its fluidity, fiction could reduce the flux and “equip us to live by perceiving that which is valid in the past and to point out the possibilities and necessities of the future.” Repeated references to the role of “mass media” in celebrating particular books demonstrate his awareness of, if not anxiety about, any given author’s outsized influence. He believed in the enduring value of the books that he taught, but he was careful to explain that their popularity was contingent, not fixed. He also noted, “much of the literature in the U.S. is still oral.” By 1963, he argued at an education conference that there was no such thing as a “culturally deprived” student: “That kid down in Alabama whose parents have no food, where the mill owner has dismantled the mills and moved out west and left them to forage in the garbage cans of Tuskegee, has nevertheless some awareness that he is part of a larger American scene, and he is being influenced by this scene.” For Ellison, American education ought to relate for the student “the cultural traditions and values of his parents to the diversity of cultural forces with which he must live in a pluralistic society.” Even as he taught preeminent books, he made clear to students that they were only part, with sometimes contrived degrees of influence, of the nation’s cultural makeup. 

    A few weeks after the series at Tuskegee ended, Ralph and Fanny departed for Europe, where he was to teach for six weeks as part of the Salzburg Seminar, a bi-annual gathering of scholars and writers promoting American democracy abroad. Dexter Perkins, president of the organization facilitating the courses, wrote to Ellison: “We do not go to Salzburg and jam America down the Europeans’ throats. Chauvinism does not exist in the Seminar’s framework. We tell what America stands for to a highly intelligent group of Europeans chosen because they are in a position to mold public opinion. The free democratic teaching at Salzburg is in the traditions of Harvard and we have carefully preserved the ideals handed to us when the administration changed from Harvard students to an independent board.” At Salzburg, Ellison assigned the same readings as he had at Tuskegee. His stated main lecture topic was “The Role of the Novel in Creating the American Experience,” and he promised the seminar would explore “rhetorical problems of the American novel” in a variety of frameworks:

    As a medium of communication
    As a definition of experience
    As a gauge of democratic health
    As projector of the American image
    As the search for identity
    As reflecting the racial situation and the problem of value

    But because the new course ran for weeks instead of days, and his audience was now composed of Europeans, his reading list there was more expansive, and included scholarship like Kenneth Burke’s Counter-Statement and The Philosophy of Literary Form, Gunnar Myrdal’s An American Dilemma, E. Franklin Frazier’s The Negro Family in the United States, and Brown, Davis, and Lee’s The Negro Caravan. He also added The Life and Times of Frederick Douglass, Saul Bellow’s The Victim and The Adventures of Augie March, Richard Wright’s Native Son, Black Boy, and The Outsider, James Baldwin’s Go Tell It on the Mountain, James Jones’s From Here to Eternity, Harriet Beecher Stowe’s Uncle Tom’s Cabin, and some readings on and from Abraham Lincoln, John DeForest, and Ambrose Bierce. He also taught a separate seminar on “The Background of American Negro Expression in Folklore, Writing, and Music.”

    The contrast between the two campuses was stark, but Ellison’s pedagogical aims appear to have been largely consistent. Of Schloss Leopoldskron, the Salzburg Seminar’s Austrian site, the faculty information brochure warned that “as in any castle, certain mechanical difficulties arise.” The charmed setting, the program’s funding from Rockefeller and Commonwealth, and its associations with elite universities gave it prestige. But as a student named Margaret Meek wrote to Ellison a month after the seminar ended, “European students can be a pretty cynical lot ten years after a war.” 

    On its face, the purpose of the seminar was to cultivate a sense of earnest investment in the American project, as defined by the scholars and writers who volunteered their time to the cause. Perkins had warned Ellison in December that students would “probably have…a distorted view of the Negro problem, of the nature of American capitalism, and of academic freedom on the United States. They will be quite open minded in discussions but will react against a strongly nationalistic attitude, or for the matter of that, against an apologetic one.” Students would not tolerate indoctrination. Neither would Ellison. In January, he responded to Perkins: “I shall adapt my lectures to the backgrounds of the students as I come to know them…The problem of their possible attitude toward America in no way bothers me. I feel neither the necessity to attack or defend. I am interested only in helping them discover the complex truth of American reality. To this end, I commit myself and will give as much of my time as the students are willing to take.” 

    He appears to have been a dedicated teacher. On September 26, he filed a letter of recommendation for a Salzburg student to continue his studies of American culture stateside: “He is a young man whose eager intelligence has revealed to him the multiplicity of possibilities which lie before us just before we make the final choice of the medium, profession, etc, through which we set about to fashion our own identity and through which we strive to make our mark upon the world. I speak to the functional ‘center’ around which our personalities take on sharp form.” Ellison was aware that institutions and the shared experiences that they facilitate could enforce a stultifying conformity. Still, reflecting his optimism that public education could unify without homogenizing, Ellison hoped that such institutions could also offer students opportunities to develop their individuality. 

    Calculating the ratio of Ellison’s hubristic Cold War Americanism to his deep personal engagement with democratic possibilities is impossible. He told students in Salzburg, “When you see the reports in American newspapers and the pictures of black and white children gracing the cover of such news organizations as Newsweek, you are witnessing not simply news, or propaganda, or simply a form of American self-congratulation — but the dramatization of a profound moral reawakening.” A Seminar brochure boasted, “At the Seminar, where the free atmosphere of an American educational institution and community is re-created, genuine communication and understanding take place. After a month in such an atmosphere, the participants return to their countries and disseminate their re-oriented views of the United States through their newspapers, labor unions, universities, and in their daily contacts.” Time magazine’s Salzburg coverage celebrated freewheeling seminars that held potential to promote international accord: “Opinions flow so freely at Salzburg that a Yugoslav seminarian once pulled a knife on an Italian. By contrast, a Norwegian fellow spotted a German at whom he had thrown a grenade during World War II, and they became intellectual buddies.” Reconciling nationalistic, even propagandistic, rhetoric with an intellectually honest project can be difficult. Yet the Establishment scholars and teachers who volunteered their time at Salzburg, such as Max Lerner, appeared to have had a sincere faith in the mission to promote democracy. 

    In both Tuskegee and Salzburg, Ellison professed optimism about the potential of educational institutions for reinforcing the best of the American political tradition through the transmission and contemplation of a distinct, integrated national culture. For Ellison, “integration” signaled an embrace of diversity and individualism — goals that may seem dissonant today but went naturally together in Ellison’s formulation of them. His definition of cultural literacy required awareness not only of nationally prominent art and expression, mostly by white male writers, but also of the obscure and local traditions and artifacts that make and preserve meaning on a smaller scale. 

    As Ellison was defining for himself and for his students the notion of a common cultural core, college administrators such as Robert Hutchins at Chicago and James Conant at Harvard were at work attempting to institutionalize what the Harvard president called America’s “liberal and humane tradition.” The Harvard Red Book, published in 1945, had argued that liberal education at scale could encourage a sense of connection to the American project, and help students think about that project in terms of ideas, rather than (as Ellison put it in one of his lecture notes) “an Anglo-Saxon rite.” But Ellison reached thinkers beyond the confines of elite college campuses. Drawing on Frederick Jackson Turner’s then voguish “frontier thesis,” he instructed his audience at Tuskegee that “all Americans under pioneer experience must take pioneers risk. We must prepare for those jobs which just now appear closed to us. We must learn how America operates. We go to the novel for the definition of what America is, what an American is.” He advocated for a kind of cohesion through division: exposure to common cultural experiences could bolster the experiment in multi-racial democracy by inviting conversation about different means to connect past and present through narrative. 

    [insignia] 

    Ellison’s own novel is a valuable teaching tool in part because it invites readers to think about how and why Americans interpret stories. On a quest for self-determination, Invisible Man’s nameless narrator navigates familial and cultural inheritance. His journey through rejection, injury, and disillusionment at the hands of organizations promising and failing to secure his dignity leads him to value his own mind above external expectations, and at the same time to renounce both sentimentalism and nihilism for the sake of self-preservation. 

    When the story begins, the narrator has faith in school. Over time, his stance on formal education becomes critical. By the last few pages of his narrative, he has a nightmare in which he “lay prisoner of a group” including his former school superintendent and college administrator. The institutions and their employees may have failed him, but he parses learning from schooling. As he explains to the reader in the book’s dazzling epilogue, “I sell you no phony forgiveness, I’m a desperate man — but too much of your life will be lost, its meaning lost, unless you approach it as much through love as through hate. So I approach it through division. So I denounce and I defend and I hate and I love.” His hope for a future in which others recognize his humanity is radical, and central to his vision for such a world are affirmations of deliberation, diversity, and nuance. In the world of Ellison’s novel, this project takes time: navigating contradiction requires hundreds of pages. 

    Reflections on the value of reading and writing in pursuit of these ideals permeate the book. It opens with a rejection of the notion that its narrator is uneducable, or that he stands outside of American literary culture. He declares: “I am an invisible man. No, I am not a spook like those who haunted Edgar Allan Poe.” A dense web of high literary and folkloric allusions calls to mind Eliot’s theory of canon in “Tradition and the Individual Talent”: “for order to persist after the supervention of novelty, the whole existing order must be, if ever so slightly, altered; and so the relations, proportions, values of each work of art toward the whole are readjusted.” Ellison challenges readers to reframe Shakespeare and Emerson within a cultural tradition that also includes Br’er Rabbit and, in the narrator’s words, “we who write no novels, histories or other books.” From the start, the narrator is devoted to reading books, at first for school, and, later, to learning from a broad range of voices for his own cultivation.

    Ellison’s narrator also recognizes that a written tradition can be a powerful mechanism for exclusion, even elimination, from public life. After the police murder his sometimes friend and sometimes rival Tod Clifton, he reflects that “I tried to step away and look at it from a distance of words read in books, half-remembered. For history records the patterns of men’s lives, they say…But not quite, for actually it is only the known, the seen, the heard and only those events that the recorder regards as important that are put down, those lies his keepers keep their power by.” The narrator grapples with a scarring pain that might result in cynicism. But ultimately he seeks freedom rather than destruction, and writing against his own nonexistence becomes a form of emotional release. He explains: “The very act of trying to put it all down has confused me and negated some of the anger and some of the bitterness.” Creating a narrative capacious enough to add dimension to history and sociology and to liquefy otherwise hardened norms affords him at least a taste of the agency that he imagines possible. 

    Invisible Man makes overt demands on its readers. In the prologue, the narrator asks, “Could this compulsion to put invisibility down in black and white be…an urge to make music of invisibility?” With this question, he makes reading the book a self-conscious act. And in the following paragraph, he warns readers who are about to embark on his life story that “Irresponsibility is part of my invisibility; any way you face it, it is a denial. But to whom can I be responsible, and why should I be, when you refuse to see me?” He continues that “Responsibility rests upon recognition, and recognition is a form of agreement.” By engaging with the book, and, in turn, by recognizing the narrator’s humanity, readers become complicit in the system of rights, burdens, and culpabilities from which he imagines himself disentangled. But asking for attentiveness, even vigilance, to the signs and symbols that inform his choices invites judgment about readers’ responsibilities when confronted with injustice within their own society.

    The novel also responded to scholarship attempting to consolidate an American literary culture. Perhaps the most obvious example is its ambivalent commentary on Lewis Mumford’s The Golden Day, the book of essays after which Ellison named the tavern and brothel full of traumatized black veterans on the outskirts of his college campus. Mumford, a significant social thinker, was concerned about the deleterious effects of industrialization, and argued that antebellum American life represented a cultural apex. He acknowledged that during that period “fierce debate found the Southerner frequently denying that the Negro was a human being,” and he lamented the consolidation of executive power, the decline of local life, and the growth of finance and cities following the Civil War. The tavern in Invisible Man is a symbol for Ellison’s refusal to participate in sentimental laments for a segregated era: behind a celebration of flowering Romanticism is a community bound by disenfranchisement. Yet Ellison both “denounces and defends”: Mumford also called for writers to gather “all the living sources of its day, all that is vital in the practical life, all that is intelligible in science, all that is relevant in the social heritage and, recasting these things into new forms and symbols, to react upon the blind drift of convention and habit and routine.” 

     

    For decades, Invisible Man has helped readers see themselves and their country with fresh eyes. Every few years, the novel’s champions publish paeans in legacy magazines and on must-read lists, attempting to reaffirm its place in American letters for a broad readership. Secondary school lesson plans abound, from Random House, PBS, the National Endowment for the Humanities, and enterprising teachers. I used to think of the book as an indisputable high school classic. I read it for the first time as a senior at a Brooklyn public school in 2002. My twelfth-grade English teacher assigned it in late spring, just as my friends and I had given up on caring about school. His clear enthusiasm for discussing it cut through our general apathy about assignments and grades. We did the reading. I could not have understood it then as I do now, even as I recognized that the writing was powerful and intricate. I have been grateful to have occasion to revisit it as an adult, and to share it with my own students. 

    But the book’s place in schools is now precarious. The problem began with the misguided priorities of recent educational policy. Bush, Clinton, and Obama all promised educational equity and progress through measurement and testing. Our current Secretary of Education, Miguel Cardona, makes another dimension of the quantitative rationale explicit: he likes to repeat that his job is to “align education with industry demands.” State standards enforce a desiccated skill set sold as close reading. Curriculum publishers and testing companies profit from courts’ agreement that, in the words of a Seventh Circuit ruling, “expression is a teacher’s stock in trade, the commodity she sells to her employer in exchange for a salary.” Distrust of teachers, the emphasis on the economic justification of education, the notion of education as an impoverished kind of job training, and the prevailing technocratic ethos of our society diminish space and motivation for slow, deliberate contemplation of democratic citizenship through literature. Some teachers have taken to social media to build “adhocracies” in their spare time — small communities committed to discussing the books that should be a central focus of our work. Philanthropies and colleges could furnish accessible summer literature seminars for teachers, but at present such opportunities are limited. 

    Meanwhile, the growing classical education movement brands itself as restoring great books in schools, especially in states with laws that forbid teaching “divisive concepts.” It emphasizes reverence for a narrow canon, determined by opaque authorities. The Classic Learning Test is billed as a literary and Catholic alternative to the SAT. Jeremy Wayne Tate, the company’s founder, turned to classical education after teaching in a New York City public school and getting fed up with drilling skills and what he perceived to be a progressive policy preference for top-down directives. Ron DeSantis embraced its use for admissions at public universities. The company’s author bank does not include Ralph Ellison. And his magnum opus appears on the reading lists of neither the arch-conservative Hillsdale nor the more moderate Great Hearts classical charter networks. It is unclear why it didn’t make the cut, especially at a time when it could offer intellectually mature illumination of issues that have been roiling our society. 

    In 2013, in response to a school board’s decision to remove the book from its curriculum, Ishmael Reed explained in The Wall Street Journal that Ellison intended for the book’s chapter on incest “to woo voyeuristic mainstream readers.” Reed concluded that the novel “remains a classic if only because it shows the ongoing conflict between the committed independent artist and institutions that desire to corral their creativity, including the state.” High school students are minors. Whether they can or should handle a book as provocative as Invisible Man is, or maybe was, a matter of local jurisdiction. The humanistic failures of our schools notwithstanding, we must hold space also for approaches to literature instruction beyond the classical model.

    Like many literary critics and high school English teachers, I am skeptical of an ossified canon, and of cultural experiences so widely and routinely shared that they flatten creative and intellectual possibilities. The National Book Award crowned Invisible Man as an Establishment darling immediately after its publication. Never mind the narrator’s off-putting misogyny. There is also a body of scholarship that warns against fetishizing the book object in the face of technological advance and new forms of media. Some might argue that Ellison’s work is outdated and that books are altogether obsolete. I would never try to force someone to teach Invisible Man. But I don’t believe that we should abandon it either, and make invisible the very book that introduces us to the ethics of visibility. 

    On an undated piece of notepaper interleaved into Ellison’s lecture notes, he scribbled the words “humanities not secondary but primary.” By balancing simultaneous critical distance and emotional engagement in our experience of art, Ellison championed the value of “those human qualities which we have developed through the discipline of the Negro experience and despite the obstacles and the meanness.” Invisible Man is a rejection of the attempts by institutions and authorities to deprive the narrator of a distinctive subjectivity. At the center of Ellison’s pedagogy, and of his theory of American literature, is a repudiation of the kind of despair that might have lulled his narrator into abandoning his quest for cultivating his interpretive capacities. 

    Ellison taught American literature so as to invite students into fuller lives, and to stimulate their critical assessment of America’s ability to realize its stated ideals. We appear to be at a national impasse about whether these goals are worth pursuing in tax-funded public schools. Teaching Invisible Man in such a setting is certainly not the only way to work towards them. But as standardization and testing destroy literature instruction at scale, and an ascendant right-wing faction seeks to rebuild it in its own image, learning from Ellison about American literature’s “cultural function” is as good a way as any to think about where we might go from here.

    Hate Lands

    Agnieszka Holland was six years old when she heard the word “Jew” for the first time. It was in Warsaw in 1954 — several decades before the first of the films she directed was nominated for an Oscar. She was playing with the local children, and one of the gang called her a “dirty Jew.” The children were gamboling, as they always did, in the ruins left behind by the war. She had not yet been taught the word for the hatred of Jewish people — the foul indulgence which facilitated so much of the horror to which the rubbled playground testified. The destroyed and dilapidated buildings were the physical manifestation of the post-war anxiety which was as ordinary to Agnieszka as her mother’s smile and her father’s booming voice. Upon returning home she asked her mother: “What is a ‘Jew’? Is it true? Am I a dirty Jew?”

    Agnieszka’s mother, Irena Rybczynska-Holland, must have spent some amount of the preceding six years preparing for such a question. Irena was born in 1925 in Lutsk, in what was then eastern Poland, to a father who was the first of his family to make it out of the impoverished countryside where his ancestors had dwelled for generations. He became a teacher and Irena attended elementary school in Warsaw. She was in high school during the Anschluss in 1938. Like all her friends, Irena was forced to finish her studies in secret over the course of the war. Unlike most of them, Irena was simultaneously serving in the Polish underground as a liaison officer, a medic, and a senior rifleman. And, equally dangerous, she hid Jews, protected them in hiding, and worked with her senior officer to secure them false papers. One, a man who had escaped from the local ghetto, lived with her for a time masquerading as her fiancé to deflect suspicion about his Jewishness. When he needed a medical operation, Irena organized a fundraiser to cover the cost. They remained close for the rest of their lives. 

    Irena and her best friend, another young Christian Pole, promised one another that, if they survived the war, they would marry Jews and give the world Jewish children. Irena made good on her promise. Her first husband, Henryk Holland, was a prominent Jewish Communist who had tried to convince his family to flee Poland with him when the war started. They refused, and all of them save one sister were killed. That sister, Agnieszka’s aunt, escaped the Warsaw ghetto by hiding inside her sister’s coffin when the corpse was being transported for burial. 

    After the war Henryk became the editor of a communist youth newspaper, which is how he met and aggressively courted Irena. She was very beautiful — blonde, lively, and warm. She made a name for herself and built an impressive career as an editor and journalist, but the most formative experience of her life was witnessing the Holocaust. It was the fierce empathy forged by that experience that, Agnieszka suspected, compelled Irena to marry Henryk. Agnieszka always assumed that her father exerted enormous emotional pressure to persuade the beautiful blonde Christian woman to succumb to his advances. That he was a Jewish orphan figured enormously in Irena’s decision. It was not a happy marriage. They had two daughters, Agnieszka and Magdalena, and divorced when Agnieszka was eight.

    Agnieszka was a sickly child and was often forced to stay home from school. She loved listening to her father hold forth with his friends. He was a charismatic man, a dramatic conversationalist who relished presiding over their small apartment, arguing with comrades about politics and philosophy. Agnieszka was enraptured, and silently tracked his ideological disillusionment from deep inside the Communist cell to his defection from it. When she was thirteen, Communist officers came to their apartment and accused her father of treason. Somehow, during the scuffle, he fell out of the window and died. Everyone suspected at the time that he had been pushed, but decades later, when the records were finally made public, Agnieszka determined that it was just as likely that he had jumped.

    Agnieszka was fascinated by the afterlife of the war, which was so present to her from so young an age. It was a very exciting landscape, and very dangerous — one boy from her neighborhood was killed playing in the same ruins she did. She used to listen to her mother’s stories about being in the resistance and play them out with her friends in make-believe games. The atmosphere was one of distended dread. Everyone in Agnieszka’s life expected another war to break out. This anxiety was a constant feature of her childhood. When there were crises — the Korean War, for example — people would run out in a panic and buy masses of flour and sugar. The war shrouded the recent past and loomed ahead as a possible future. One night while Agnieszka was lying awake in her childhood bedroom, she heard a plane overhead and assumed viscerally that they would be bombed. “Finally — it’s starting,” she thought. 

    The afternoon when she came home and asked her mother whether or not she was a dirty Jew was the same afternoon she learned for the first time about her father’s family. Irena told her six-year-old daughter about the grandparents who had died in the Warsaw Ghetto. Irena said there was no reason to be ashamed of her Judaism, that she never had to hide it. This, Agnieszka would soon learn, was an unusual orientation. Most of her parents’ friends didn’t talk about the war with their kids, even if the evidence of it was everywhere. If they were Jews, the fact was hidden from them. Antisemitism was part of Polish identity, even in the aftermath of Auschwitz. 

    It had been baked into Polish identity for centuries. By the 1700s, roughly eighty percent of the global Jewish population was living in Poland. When the country was partitioned among Russia, Prussia, and Austria at the end of the eighteenth century, most Polish Jews suddenly became Russian Jews, and the Russian regime imposed upon them professional and geographical limitations authorized by the Tsar himself. Jews in Russia and Poland moved from shtetls to cities such as Warsaw, Vilna, Krakow, and Lodz. In the 1880s and 1890s a series of pogroms incited mass emigration — two million Jews left Russia, most of them for America. Even so, when World War One was over, Poland remained home to one of the largest Jewish communities in the world. Somewhere between three million and three and a half million Jews were in Poland when Hitler invaded in 1939. 

    Polish antisemitism had metastasized as the population swelled. During the interwar period the Polish government supported Jewish emigration to Palestine, hoping that the establishment of a Jewish state would drain the country of the unwanted hordes. Foreign Minister Józef Beck pledged Polish support at the League of Nations for building up Jewish defense forces in Palestine and for facilitating Jewish emigration there. In 1937, he announced that the country could hold only five hundred thousand Jews, and hoped that over the next thirty years a hundred thousand would choose to leave annually. 

    They would not get the chance. By the time World War II was over, only 380,000 Polish Jews were left. Agnieszka Holland was one of them.

     

    Holland, as is well known, became a movie director. She made Europa, Europa in 1991. Its plot is complicated and exciting, but what the film is really about is the relentless and maddening complexity, the unimaginable sense of precarity in large things and small, which is characteristic of social breakdown, and is laid bare in wartime. Europa, Europa depicts a world of terrifying contingency. It is a demanding film. Holland is exacting. She deprives her audience of comfortable and familiar packaging: no character is two-dimensional. The victims are as human, as respectable, and as pathetic as the villains. We are forced to feel for both sorts — indeed, she seems to be insisting in every frame that everyone is both. This complexity is personified in the film’s protagonist: a Jewish Pole named Solomon Perel, who is able to disguise himself as a Nazi youth because of his fluency in three languages: German, Russian, and Polish. Well, disguise himself as effectively as any circumcised man can. 

    The film begins with Solomon’s circumcision. Time and again we are reminded of the significance of that flap of missing flesh. Solomon would have been able to assimilate totally into the Nazi world had it not been for that split second of ritualistic allegiance, for the rabbinical blade’s swift slice. Instead, through a series of absurd misunderstandings, Solomon is mistaken for a Nazi hero. Perel, who was a real figure and on whose autobiography the film is based, survives the war by feigning Aryan status. The cost of that strategy was a confused identity that drove him to the edge of madness. The weight of reconciling the love and care of his German friends with the white-hot hatred that they evince for people exactly like himself tears Perel apart. These men and women who insist that he should be honored to take part in a holy war against the Jewish race — they are the only family he has left. 

    While traveling with German soldiers across a battlefield in Russia, they come upon two dead children hanging by nooses in front of a farmstead. A sign has been strung around the older boy’s limp neck. It reads “so it will be for everyone who helps the Bolsheviks and the Partisans.” Solomon — or Jupp, as the Nazis call him — tries to look away, but his friend grabs his jaw and forces his face towards the bodies: “These swine killed your German parents! They took your house! See how they pay for it now! Look at these animals! You must learn to hate!” Overwhelmed, Solomon grabs the machine gun on his car and shoots maniacally into the house behind the corpses, setting it aflame. “That’s when I first felt how confused I was,” Solomon’s voice narrates as we watch the house burn. “Who was my friend and who was my enemy? How could they be so kind to me and at the same time kill so viciously? What set us apart? A simple foreskin?” The screen fades to black and the next scene opens with an unfamiliar voice saying “I feel German pride for spilling my blood for the Führer and homeland.” Solomon’s friend is helping another soldier write a love letter to his girlfriend back home. 

    The foreskin that keeps Perel from safe and secure integration also keeps him from succumbing to the hatred that would set him free. “You must learn to hate!” his friend hectors, understanding, intuiting, that the essential cohesive mechanism in all extremist politics is this capacity: Hate for me, my love. Hate them because they are unlike us. But Perel’s circumcision is a physical reminder that the disguise cannot last forever. He cannot give Leni, his Nazi girlfriend, Aryan babies; he cannot even seduce her, as she demands he should. Like all fascistic political parties, the Nazis glorify male domination, but Perel’s masculinity is the very thing that will betray him. “Limp dick!” Leni taunts him after he refuses to take her virginity. His emasculation saves him from devolving into the same monster his fellow soldiers are. He could do it, Holland intends for us to understand, he could kill mercilessly, if he were not so afraid for himself. The only thing worse than denying Leni would be not denying her. He would love to be the kind of perfect Aryan male for which she mistakes him. 

    Should we judge him for that? Does his capacity for evil strip him of a right to our mercy? Of course not. All people are both Solomon and Leni. All of us are caught in a mélange, a human mess, of idiocy, intelligence, brutality, and kindness. The machinery for all of it, for viciousness and kindness, is not just latent in us, it is always active, every day. The same minds that hang enemy children also write love letters.

     

    As a three-year-old watching her father argue with his friends about the ideology that would kill him, in the shadow of the extermination that he had narrowly escaped, Agnieszka was a student studying the mutilated, talking remains of history’s victims. That early instruction determined her great cause: the close study of the lived experience of horror that civilized human life facilitates by supporting, justifying, or ignoring it — sometimes all three at once. She has made several historical films treating this theme, some of them about the Holocaust and the Holodomor. But last year she made a film, Green Border, which begins in October 2021 and unfolds over the next year on the border of Belarus and Poland. It tells the story of a group of Syrian and Afghan men, women, and children who are heartlessly ping-ponged back and forth across the border in a macabre game overseen by Belarus’ dictator Aleksandr Lukashenko and Poland’s then-Prime Minister Mateusz Morawiecki. 

    The film is divided into three parts: The Family, The Border Guard, and The Activist. It begins with a family of Syrians on an airplane about to land in Belarus, where Lukashenko has promised them safe entry, knowing that he can lure refugees, many of them fleeing the Syrian carnage, and funnel them into Poland — the blessed EU, where they hope they will be safe. But the Poles, Lukashenko knows, are no less racist than his own people. They will regard the surge of refugees as an attack, and will force them back. Over the course of the film we watch as The Family is brutalized by The Border Guard — a sweet, loyal Polish man genuinely horrified by cruelty and also capable of it — and finally saved by The Activist. Well, some of them are. But even those lucky ones will have to work hard to resuscitate their assurance of their own humanity. 

    The black-and-white austerity of the film reminds audiences that this is a near-documentary: virtually everything depicted happened almost exactly as we see it unfold. The lethal policies that living and breathing autocrats instituted to demonize foreigners and galvanize the hatreds of their populations: all of this is familiar to anyone who knows Holland’s oeuvre. It is also familiar to anyone who has read international news on any day over the past decade. And as in her earlier movies, viewers recognize themselves in the victims, the heroes, and the villains: all of us are, can be, have been, and will be, some or all of these types. Luck plays a significant role in determining whether, in our future, we are the immigrant, the border guard, or the activist. Holland has said that if her producers had let her, she would have named the film Europa, Europa II. 

    This time, in this movie, the brutalities that it depicts are contemporary with the movie itself. They are taking place outside the movie theater. New ruins are being created for other children to play in. 

    The first part of the film ends as a group of immigrants are beaten and herded by border guards back over the border from Poland to Belarus. Then the screen dims to black as their shrieks fade to silence. White letters appear on the pitch-black screen:

    2.
    THE BORDER GUARD

    A classroom fills with men in uniforms sitting at rows of small tables. A brutish, corpulent officer stands in front of the blackboard, soliloquizing. We are witnessing a training session for border patrol. The guards we have just watched brutalize the immigrants — the “immigrants” who, not twenty minutes earlier, were just families on an airplane — must have sat through a similar session at the start of their service. Holland is drawing a straight line from the state to state-sanctioned violence. Make no mistake: that brutality was official business. The officer instructs, “You have to show professionalism. Give them an apple, and they’ll say it was poisoned. And then what? This isn’t propaganda, it’s a real threat! One mistake, and in six months we’ll have bombs on the Warsaw subway. Did you hear what Minister Kaminski and Minister Wasik said? They found terrorist, pedophile, and zoophile materials! Have you seen their phones?”

    The camera then zooms onto a man whose phone keeps buzzing. We see that his pregnant wife is texting him. His concern at her repeated pings and what they may portend about the pregnancy juxtaposes grotesquely with the contempt for the migrant children that we have just witnessed. We watch anxiety flicker over his face as the guard continues: “You couldn’t afford half of what they have. And I don’t want to hear about their kids! They hire or buy children and then blow smoke in their eyes to make them cry. I saw the footage myself. We’re saving them.” He pounds the table. “You have children of your own. What father would take his child down this road? Well? They want to play on our Polish compassion while Lukashenko rubs his hands. Remember this is classic hybrid warfare. They aren’t people, they are weapons of Putin and Lukashenko. They aren’t people… they are live bullets!” 

    He pauses and breathes deeply, then continues, “You’ve all had first-aid training at some point. Now listen up. There should be no dead bodies on your watch, is that clear? No one is to be found dead! If you see a corpse, make it disappear, understand? … I was sent here to build up your morale, but I don’t care about that. If you’re not ready to wear the uniform, you now have the chance to think it over and leave the room if you want.” The guard with the pregnant wife gets up and walks out to make a phone call, but his absence is interpreted as defection. “Anyone else?” the officer bellows.

    I had the honor of discussing Green Border with Agnieszka Holland during the movie’s opening in New York City last summer, at a screening cohosted by Film Forum and Liberties.

    I asked her if the movie felt intensely personal, and if it was difficult to state so publicly such a thorough condemnation of her own people. Was this an expression of loyalty, or a betrayal?

    This movie is about my country. That it was a personal story is the primary reason I was moved to tell it. But the subject has been very important to me since long before the movie is set. This film is linked to the historical subjects I made other movies about. It is just like Europa, Europa, though that movie concludes long before this one was ever conceived. They are both about the force of evil incubating and then being unleashed. And when I started to feel that the serpent’s egg, as Shakespeare calls it, is maturing again, and indeed maturing very close to where I live, and that this egg — this harbinger of evil poised to be let loose — will spread over Europe again just as it did before, I knew I had to make a movie about it. Europa, Europa or In Darkness [a movie that Holland made in 2011 about Jews who survived the war by hiding in the sewers beneath the Lvov Ghetto], those movies made me the person who is capable of telling Green Border, who knows how to fit it into the historical context.

    The immigration issue is the biggest and the most dangerous issue of our time. Since 2015 I have become obsessed with this issue, since the big Syrian crisis when I watched and understood how fragile and cowardly and cruel Europe is and how easy it is to use refugees and migrants as political tools. Because of how the world is changing — literally, because of climate change but also as democracy and liberalism become more and more scarce and weak. You call it the “immigration crisis,” which I don’t like because the implication is that the number of immigrants is the crisis and not the fact that none of our countries have the wherewithal to draft and institute the proper and humane policies for responsibly absorbing these human beings, these refugees. But even if that word didn’t connote blame for the refugees themselves, even if it was simply gesturing towards the horror which forces millions of people from their homes every year, “crisis” is still not a big enough word for what it is, in the same way that climate change is more than a crisis. These are both huge, perhaps insurmountable challenges, which demand that all of us change the way we live, change our expectations of life itself. 

    And of course climate change and immigration are related. They somehow are working together. And apparently we, collectively, on a global scale, do not want to deal with them in a rational and efficient way, because dealing with them responsibly would cost us our comfort. And our comfort, as our actions make clear, is the most important thing to us. We are willing to sacrifice the values that postwar Europe was built on simply because we do not want to be inconvenienced. That’s why I called it Europa, Europa: there are two Europes, the one we can rise to, and the one we slum to in spite of our best judgment. We are choosing this second Europe just as you are choosing the ugly America. And this reality means democracy and freedom and human rights and the right to statehood and solidarity — all these values, all these goods, we are really ready to sacrifice them to keep our comfort.

    The first reaction of the people who live on the border, and of the activists who went there to help, was that they had been transported eighty years back in time. Because it all seemed to echo the horrors of the 1930s, I chose to make the film in black and white as a way of invoking the footage from the Holocaust. I wanted to combine the rawness of a documentary style with a feeling of timelessness, a metaphorical timelessness of the reality that I’m showing so as to communicate that this horror recurs in human history, that it is endemic to who we are. The images, the faces, the situations at the border are so similar to what they already knew from the stories they were told by their grandparents or from movies about the Second World War. It shocked them to see it all happening again in life.

    The film does not depict much ghoulish brutality. It is stomach-churning because of the implied or overheard violence, but not because of blood and guts.

    Then I asked her about the most difficult scenes to watch — one in which we see the head of a child we have come to know as a sweet son and brother drop from sight into quicksand, and another in which we hear the shriek and the shredding of fabric and flesh as a pregnant migrant is thrown over (not quite over) the barbed-wire border fence. I asked Holland how she chose which horrors to depict.

    Normally, in my movies, it is my belief that my imagination can be more interesting and more conducive to film than reality, so I’m not ordinarily very faithful, even when I’m doing historical films, to the exact facts. I have intuition, I am an artist, it is my job to present it in my way. But I didn’t feel that way about this film. The government hated that I was making this movie. I knew that they would do whatever they could to make it seem as if I was lying. The powerful machinery of state propaganda would be used against me. I had to be sure that everything you see on that screen that could shock you not only happened but that I could bring at least two sources which would verify that the events occurred as I presented them. 

    I know that it is hard to watch, but you must understand that when I showed the film to activists who work on the border, they all said the same thing: the reality is much, much worse. They all ask me, “Why didn’t you show X?” or “How could you leave out Y?” Activists find the film too subtle! But, well, others don’t experience it that way.

    I laughed. The entire audience was still shellshocked from the previous two hours. Holland nodded and smiled.

    I know, I know. It’s tough material. But if you analyze exactly how much violence there is in the movie, it’s very little. You don’t see a lot of blood. There’s far more gore in the average thriller. The psychological violence is what makes this film so unbearable. It begins by forcing you to recognize the migrants as human beings with their hopes, desires, fears. You meet them first in a civilized context: they are taking a plane — you’ve taken a plane, you know what that’s like — and they believe that they are about to reach a normal world, that they are escaping a place of horror and blood. And slowly they descend from one circle of hell to the next. At some point while watching you realize that this is what life is like, that every victim of human cruelty expects their life to be normal before it is ripped apart. Refugees don’t have a higher tolerance for brutality than you do just because they happen to be victims of it and you aren’t. Not yet, anyway. Human beings don’t expect to be forced into hell.

    I asked her whether she felt that film is a useful medium for combatting these inhumane and discriminatory policies.

    Fascism starts with the dehumanization of a group of people. This is true everywhere, not just in Poland. But in this case, in the case I’m treating in this film, the Polish government made a conscious decision to dehumanize refugees in order to use them as a political tool. Trump does the same thing when he talks about his moronic wall. And Putin and Lukashenko were playing the same game when they started this barbaric charade and invited the refugees into Belarus, knowing that they would flee to Poland, and also knowing that Poland, like all of Europe, hates newcomers. It was a way of exposing and punishing European xenophobia. These two men understood that, just before an important election, they could use European racism as a tool to manipulate their respective electorates in their favor. Putin and Lukashenko wrote the script, but the Polish government, which was at the time a far-right nationalistic populist government, understood very quickly that this strategy had advantages for them as well. They started a propaganda campaign casting these refugees as pedophiles, zoophiles, and terrorists. This should sound familiar to Americans. 

    And when our deputy Prime Minister, Mr. Kaczynski, who is the real architect of all Polish foreign policy, decided to close the free zone and not let any media near the border, I understood how important it was to make a movie like this. Kaczynski was extremely forthright and cynical about his reasoning. He said, “Americans lost the war in Vietnam because they allowed the media to come. When Americans saw the images from Vietnam, it changed public opinion about the war.” He didn’t want images. So I created them. His prohibition was my marching order.

    What did she intend for this film to be, I asked — a warning? A howl? A sob?

    The film is a warning. I am trying to warn the audience — warn the whole world — that it’s very easy to cross the line into crimes against humanity. It’s only a few steps towards becoming the kind of people who treat human beings worse than animals. We are all doing it all the time. We, we the filmmakers and you the audience, we cannot turn our heads away. The way the German soldier twists Solomon Perel’s face in Europa, Europa — remember that scene? — and insists that he learn to hate. I am twisting the faces
    of the audiences towards this suffering and insisting that they
    learn to care.

    Some American citations to buttress Agnieszka Holland’s point:

    An extremely credible source has called my office and told me that @BarackObama’s birth certificate is a fraud.

    Donald Trump on Twitter, August 2012

    [Eisenhower] moved 1.5 million illegal immigrants out of this country, moved them just beyond the border. They came back. Moved them again beyond the border, they came back. Didn’t like it. Moved them way south. They never came back. … We have no choice.

    Donald Trump, November 2015, praising Eisenhower’s Operation Wetback deportation program. Operation Wetback was implemented in 1954 and used military-style tactics to deport Mexican immigrants, irrespective of their American citizenship status.

    Why are we having all these people from shithole countries come here? Why do we need more Haitians? Take them out.

    Donald Trump, January 2018.

    They’re poisoning the blood of our country. That’s what they’ve done. They poison — mental institutions and prisons all over the world. Not just in South America. Not just the three or four countries that we think about. But all over the world they’re coming into our country — from Africa, from Asia, all over the world.

    Donald Trump, December 2023

    A few days after September 11, 2001, I was standing beside a friend and her mother in the kosher food section of our local grocery store. Just beyond the aisle we were browsing in, a woman in a hijab was walking with her daughter. My friend’s mother widened her eyes, looked down at the two toddlers in her care, and muttered, “They all want to kill us.” I was five years old, one year younger than Agnieszka Holland had been the first time she had heard the word “Jew.” Like Holland, I didn’t know the word for the hatred of which that utterance was symptomatic — a hatred which had facilitated and would continue to facilitate the mass murder of millions of innocent people. But even at that age I sensed that those words were a kind of primal poison. I remember believing instinctively that a mind that would formulate such a thought is a mind that could do terrible things.

    I have no idea what the difference is between a child who hates and a child who doesn’t. The little girl standing next to me that day has grown, over the subsequent two and a half decades, into a woman who blesses the destruction in Gaza on her Instagram feed. I wonder what she thinks of Trump. I wonder what she thinks of Putin. 

    America, America — there are two versions of us, too. They have existed since the inception of the country. Both are real in the literal sense — and also real philosophically. Liberal America, the America founded on the radical affirmation that all men are created equal and with certain inalienable rights, is at odds first with the White Anglo-Saxon Protestant tribal tradition of America, with its apple pie and shotguns, spurs, saddles, stars and stripes, and then with the tribalisms that followed the celebration of multi-ethnicity in the 1960s and 1970s. Tribalism is the vice that liberalism was articulated to constrain. Unlike in Europe, the right to establish America and to govern the American people was based on the conviction that a just government is undergirded by the universal principles of liberalism. But tribal America balks at universalism and at those principles. Tribalism loves biological sameness, soil, exclusivity, and teaches, as the increasingly Americanized Nazi philosopher Carl Schmitt did, to divide the human world into friends and foes and to hate enemies. The one America is the antithesis of the other America. It is our great luck that, when America gives itself over to racist hatred, America betrays itself. 

    The great American tug of war is the vicious, bloody, and unending struggle between these two factions. It is the substance of our ongoing political and social crisis, and of the upcoming general election. Tribalism is an appetite. Liberalism is a virtue. And both of them live on American soil. The disenfranchised and the privileged alike are stamped by these competing visions. We understand both because we are creatures of both. 

    No animal hates like the human. Americans in particular, by virtue of our early, vital dependence on the liberal tradition, are called to bridle that hatred — to blot it out in ourselves and to oppose it everywhere else. This is not easy work. As Schmitt noted, and not glumly, it is not natural for people to be kind and welcoming to strangers. Every sophisticated human civilization develops complex mechanisms, etiquettes, and expectations for teaching hatred and for marking it in blood. Never in my lifetime has hatred been so much in vogue. Perhaps every generation gets its chance to learn that the cruelties we read about in history classes were committed by human beings like ourselves. The worst monsters do not live under beds or in our imaginations; they sit in paneled offices behind mahogany desks, signing bills into law, raising and razing cities with the same hand. Each new day freshly demands a strong stomach, and maybe mine should be stronger, but the bloodshed, and the appetite for it, that I have watched flow so freely over portions of the globe about which I care deeply — it makes me nauseous. I hope it always does. That nausea will save us.

    Three Republican Fallacies

    Believing crazy things is not the mark of
    whether somebody should be rejected.         

    J.D. Vance

    Where the elite meet.          

    Margo Channing

    I

    “You know, one of the things that you hear people say sometimes is that America is an idea.” So said J.D. Vance in his bootlicking acceptance speech at the Republican convention last summer. I sat up, because I suspected that an important admission was about to be made, and because I am one of the idea-of-America people whom he was about to assail. He continued: “And to be clear, America was indeed founded on brilliant ideas, like the rule of law and religious liberty. Things written into the fabric of our Constitution and our nation.” Of course American ideas are not what we mean by the idea of America. We mean an ideal that, for all its abstraction, is sufficiently true and just to serve as the basis of a permanent allegiance, a profound patriotism.

    Vance, by contrast, promoted an entirely different foundation for his love of country. “America is not just an idea,” proclaimed the opportunist with the temerity to lecture others about conviction. “It is a group of people with a shared history and a common future. It is, in short, a nation.” And a nation, he went on, is “not just an idea, my friends. That’s not just a set of principles. Even though the ideas and the principles are great, that is a homeland. That is our homeland. People will not fight for abstractions, but they will fight for their home.” 

    It did not occur to Vance’s shallow and self-congratulatory mind that a nation is not coterminous with a homeland, because history often forces nations out of homelands. The imagination of exile is too much to ask of the apostle of rootedness. He was preaching a doctrine of hillbilly Heimat, and disguising the most odious form of Western nationalism as the wisdom of Appalachia. Vance, who grew up in Ohio, gave as his proof the moist example of a graveyard in eastern Kentucky that is “near my family’s ancestral home.” 

    Like a lot of people, we came from the mountains of Appalachia into the factories of Ohio, Pennsylvania, Michigan and Wisconsin. Now in that cemetery, there are people who were born around the time of the Civil War. And if, as I hope, my wife and I are eventually laid to rest there, and our kids follow us, there will be seven generations just in that small mountain cemetery plot in eastern Kentucky. Seven generations of people who have fought for this country. Who have built this country. Who have made things in this country. And who would fight and die to protect this country if they were asked to.

    The account of this graveyard was designed to fill the hall with pathos. It filled me with alarm, and I want to explain why. I respect the man’s family history but I abhor his inference from it.

    For a start, people will most certainly fight for abstractions. Americans will, and Americans have — Americans, the notoriously practical people. The cemeteries of Normandy (pardon the shopworn illustration, but we are in the kingdom of first principles) are filled with Americans who died in the struggle to defend admirable ideals, admirable abstractions. I have no idea if Vance would have supported that war; it was quite inconsistent with the spirit of Milwaukee. Those Americans fought for the homes of other people, because their homes and our homes shared a philosophy about how human beings should live together and distribute power. Vance suffers from a coarse notion about the relationship of the abstract to the concrete. He believes that they are contradictions. Like the universal and the particular, however, the abstract and the concrete are partners, friends, lovers; they need each other, they long for each other. The abstract is instantiated in the concrete, and gives it its reasons; the concrete rescues the abstract from preciosity and irrelevance, and vindicates the abstract (or not, as the case may be) in the means by which it absorbs it into individual and social life. Call it dialectical, except that the interpenetration, the reciprocal impact, is so plain. Even hillbilly life is suffused with concepts!

    Vance was propounding one of the imbecilities of our age, which is the worship of the local. It is the pendant to another one of the imbecilities, which is the worship of the global. Historians have abundantly documented the inadequacy of these categories, demonstrating the simultaneous workings of the local, the regional, the national, and the global in human affairs — the blessings and the curses that the dimensions constantly visit upon each other. The local is never only local. It is never all there is or the sole source of significance. The local is itself always shaped by influences from outside, positively or negatively. Even local sanctities have alien beginnings: “Kentucky” is an Iroquoian or Algonquian word and Jesus never preached in the Bluegrass. Conservatives and reactionaries have always recoiled from the porousness of human boundaries, but it is precisely this unwanted seepage which educates us ethically and makes co-existence possible. What is an education if not an expansion? Tradition needs to be protected, but not in a way that will petrify it and render it unpalatable to the age into which we aim to transmit it. Tradition is not solely a form of time travel. It is almost never entirely local. And the death of tradition is too high a price to pay for a fanatical loyalty to an early version of it.

    A perfect insularity, which is a dangerous fairy tale, would bespeak not only a great fidelity, as the localists believe, but also a great insecurity; and if the insecurity is felt by the majority in a society, terrible actions against the imaginary threats of the minority may ensue. (Perhaps minorities have a more plausible claim upon the uses of insularity, in the way that some medieval Jews prudently demanded a walled quarter as a condition of their settlement, but for them, too, it was a futile strategy for self-preservation.) If only because of the power of human imagination, if only because of the power of human desire, we never live only where we live. We have migrant minds. In the manner, say, of a young striver in the woods who has heard about a faraway place called Yale. And now the rube has a golden resume.

    Consider that cemetery in Kentucky. It represents a touchingly intimate relation to American origins. Good for the Vances! But who cares? Every Americanness begins at some point in time, early or late. When, a century and more ago, only one or two generations of Vances rested in that ground, they were no less American than the older citizens whose prior monuments surrounded them. There is no chronological requirement in the American theory of belonging. To believe otherwise is to promote an inverted aristocratic ideal, a reverse-snobbish Plymouth Rock-like belief in the primacy of the antique and the inherited. In a cemetery in Queens my American family is so far represented by only two graves, which are the graves of my parents, who came to these once-welcoming shores in 1947 as refugees. They are surrounded by Jewish immigrants who came and who died generations before them, and who, like them, and like the Vances, built this country and made things in this country, and there they all rest in a naturalized peace, in an atemporal American equality. 

    The whole point of this society is that it should not matter who got here first. Previousness should confer no privilege, at least if we are serious about the moral implications of multi-ethnicity. (Those who think otherwise can find sanctuary in Newport, Rhode Island.) This is true, in my view, even about Native Americans, whose claim on our conscience is not that they preceded those who came later, but that they were brutally treated by the latecomers, whose sin was not their belatedness but their brutality. But we must all enjoy the same rights now. What matters is that we act justly now. There is moral magic in that now: democracy is based on the emboldening fiction of an eternal present, and equality is a triumphant overcoming of the pull of the past. Can a country have an ontological condition? If so, ours is what Irving Howe called “the American newness.” Nothing levels history like human dignity.

    Vance is peddling a right-wing version of indigeneity. A scholar of the Heimat movement in late nineteenth- and early twentieth-century Germany summed it up this way: “Heimat conveyed the feeling of always having been there.” From the standpoint of a liberal order, however, the indigeneity of the right is no more legitimate than the indigeneity of the left: both introduce a moral hierarchy where hierarchy itself is the moral problem. In a democracy we count cardinally, not ordinally. Indigeneity, moreover, is myopic, a severe narrowing of perspective, another variety of a blinkering originalism. Pity the individual who has not known such warmth, to be sure: it is easy to understand Vance’s devotion to the family burial ground. But his family is only his family and only he has his heart; and this is true for us all. If we are to live peaceably together, it is not because you agree to feel my emotions. 

    The mindless side of populism, a political phenomenon that should more properly be called emotionalism, is owed in part to that misguided expectation: when emotions are borrowed and passed along, when contagion usurps persuasion, when collective feeling preempts individual feeling, when conformity is confused with empathy, there is the unity of the mob. It is the duty of every citizen of a morally serious society to question the authority of subjective experience, to contest the solipsism that comes naturally to finite creatures. In recent years this has become known as the challenge of “intersubjectivity,” but intersubjectivity is perhaps not best accomplished at the graves of one’s forefathers. Vance is deaf to the nasty implications of his notion of homeland (or maybe not). For the more concrete homeland gets, the more specific it gets, and the more specific it gets, the more it runs the risk of disdaining the specificities of others. Every conception
    of authenticity is exclusive. Exclusion, the celebration of impassable boundaries, is its very point. And the bragging rights of authenticity often serve as the grounds of prejudice. 

    What distinguishes the abstract idea-of-America school of Americanness from the “homeland” school of Americanness is that membership in the latter is easily limited and easily closed, whereas the former is essentially hospitable. And it owes its hospitality precisely to its abstraction. The only condition for admittance into the dream is that you exist. Since America is an idea, anybody may dream of it. The idea of America, the vision of an American second act, has saved millions of oppressed people all around the world from despair, and it is nothing to be ashamed of. The form that the escape from despair takes is immigration. Who were the Vances who acquired that plot of Kentucky soil? When did they arrive, and from where? I do not detect even the faintest trace of Comanche in the Republican candidate for vice-president. (Some decades ago the most noisy nativist in Congress was a man named Tedeschi, which is the Italian word for German. Born on the Fourth of July!) 

    The Vances who came to America could not have regarded it as a homeland, because they did not yet live here; they can only have come here because they had an idea of it, because of its idea — the notional American promise that they sought to actualize for themselves. Their illustrious descendant’s Blut und Boden-notion of human racination would have discouraged them from the journey. Surely there was a time in early nineteenth-century Kentucky when the mamaws and papaws of Mamaw and Papaw were indigent arrivals, recent and unKentuckyan, greeners, inmigrantes, and potential scapegoats for locals with problems. Did Grandaddy Vance threaten anybody’s job?

    It is also worth noting that Vance, by stripping our homeland of its philosophical basis and its universal appeal, is shrinking it, mutilating it. The world is lousy with ethno-states, with countries that are hostile to their minorities, with polities that cling to the disreputable but time-honored distinction between the native and the foreigner, with societies (not least in the enlightened West) that view their own heterogeneity as the problem and not the solution. There are so many nation-states of which nobody will ever dream. What, after all, is the promise of Pakistan? Does the idea of Russia fill anybody with hope? Who, except for a few demented American authoritarians, seeks a second chance along the Danube? The American revolution consists also in this: that the American definition of the nation is a repudiation of the modern, that is, the European, definition of the nation. The American nation is not a political consolidation of similarity. Our national sameness, our sameness as Americans, is not contingent upon a prior sameness, or upon the renunciation of a prior difference. Our commonality is additive, not subtractive. Rights are magnificently indifferent to difference, even when we are not. 

    From Vance’s callous and primitive construction of American identity it follows pretty quickly that the “American spirit” — not to be mistaken for the effete over-educated bicoastal Kambalan “American idea” — is captured for him by the image of an old woman’s house filled with nineteen loaded guns. Suddenly I understand why Vance and his fellow woodsmen are unmoved by climate change: it will destroy the coasts! 

     

    II

    On January 13, 1864, Frederick Douglass made a speech in New York. It was written a year earlier and was delivered in cities across the country. Like many Americans in those cataclysmic years, he sometimes searched for God in history, and attributed contemporary events to the workings of a divine plan. In 1861, for example, he had declared that “when all our earthly helps and hopes break down, the soul goes up to the eternal and invisible for help.” Abolitionists in particular liked to invoke heavenly intentions and heavenly outcomes. But Douglass’ sympathy for the glance heavenward was limited. In the address in New York, called “The Mission of the War,” Douglass tired of religious explanation, of all the pious and tendentious interpretations of the war. He had begun to notice that the strident appeal to God can have the effect of diminishing the human sense of responsibility, and of making excuses for human errors and evils. And so he set out to toughen up his audiences. “Speaking in the name of Providence, some men tell me that slavery is already dead, that it expired with the first shot at Sumter. This may be so, but I do not share the confidence with which it is asserted. In a grand crisis like this, we should all prefer to look facts sternly in the face and to accept their verdict whether it bless or blast us. I look for no miraculous destruction of slavery.” He came to the conclusion to which all true emancipators and all true movements of liberation eventually come, that “it is cowardly to shuffle our responsibilities upon the shoulders of Providence.” Just as in 1861, in a similar moment of philosophical severity, he had followed up his remark about the reliance of the hopeless upon heaven with these disabusing words: “The day of miracles is past.” Americans, he said, would have to “answer their own prayers.”

    Here is another and more renowned instance of American providentialism and the resistance to it. On March 4, 1865, in his second inaugural address, Abraham Lincoln spoke what is probably the most spiritually mature reflection in the annals of American religion. It included a providentialist explanation of the war over slavery — except that the language of his American theodicy was so wrenchingly humble. “If we shall suppose that American slavery is one of those offenses which, in the providence of God, must needs come, but which, having continued through his appointed time, he now wills to remove, and that he gives to both North and South this terrible war, as the woe due to those by whom the offense came, shall we discern therein any departure from those divine attributes which the believers in a living God always ascribe to him?” It is startling to see the certainty of faith become uncertain, and described as a supposition. He followed this with a breathtaking “if”: “If God wills that it continue until all the wealth piled by the bondsman’s two hundred and fifty years of unrequited toil shall be sunk, and until every drop of blood drawn by the lash shall be paid by another drawn with the sword, as was said three thousand years ago, so still it must be said, ‘The judgments of the Lord are true and righteous altogether.’” 

    If! Where are the believers who say if? In my own adventures in the religious life, nothing has disaffected me more than the certainty with which believers speak about occult entities and forces. The clarity is arrogant. But there is not an echo of the intellectual vanity of the devout in Lincoln’s formulation of his faith. He is sure and he is unsure; which is to say, he is, in the most exemplary way, human. Nor does the mention of God fill him with good cheer, or with a complacent assurance of perfect understanding. There is too much that he does not know. He respects the opacity of God even as he affirms the activity of God. The war is theologically obscure to him: “Both read the same Bible, and pray to the same God; and each invokes his aid against the other. It may seem strange that any men should dare to ask a just God’s assistance in wringing their bread from the sweat of other men’s faces; but let us judge not, that we be not judged. The prayers of both could not be answered — that of neither has been answered fully.” And then, most exquisitely of all: “The Almighty has his
    own purposes.” 

    Providentialism runs throughout American history, from the sermons of its earliest divines to the “firm reliance on the Protection of Divine Providence” in the Declaration of Independence to — well, we’ll get to that. It has many sources and many uses. Psychologically, it has provided comfort — “as the clay in the potter’s hands, so are the nations of the earth in the hands of Him, the everlasting JEHOVAH,” instructed William Smith, the provost of the College of Philadelphia, in 1775 in his Sermon on the Present Situation of American Affairs — and courage — in 1774 John Hancock exhorted his countrymen “to play the man for our God.” It has also furnished a theory of Divine favor, of American chosenness: “It appears that He gave [the land] to us to inherit, in that Americans from our ancestors downward, ‘till very lately, have had the peculiar smiles of Heaven,” preached William Stearns, a minister in Massachusetts, in 1775, in A View of the Controversy Subsisting between Great Britain and the American Colonies. “This is the Lord’s doing, and it is marvelous in our eyes!” The war with Britain was portrayed by preachers in the colonies as a central stage in God’s special plan for America. And when the war with Britain was going badly, the Continental Congress legislated fast days. (I gratefully learned about many of these eighteenth-century sources from a marvelous book by Nicholas Guyatt called Providence and the Invention of the United States, 1607–1876.) 

    Philosophically, providentialism may be owed to nothing more than a desire to make sense of history, to find order in its inchoateness. Providence is the antithesis of chance, and chance is the enemy of meaning. But it is not only order that is sought: for many people, history does not seem coherent unless its order also is necessary. Necessity is a larger claim than coherence, but it is necessity that they crave, because coherence alone lacks the power to justify, and theories of history are usually designed as justifications for history. (The Greek historians are the exception.) Where there is necessity, there is inevitability, and nothing relieves the perplexed individual more than the discovery that his life as he has lived it is an inevitable outcome — that it has been foreordained. In our time the carapace of inevitability is provided by three sources of certainty: religion, science, and identity. 

    For monotheistic believers, providentialism is entailed by their conception of the deity, whose role as the creator of the world and its governor is foundational. Whether or not believers must banish God from all or some of the public square of an open society, as many liberals believe, they are certainly under no obligation to banish God from history. How could they? (There are no Deists anymore, which is a pity because Deists leave God out of today’s news and are therefore not inclined to culture wars.) There are many varieties of the idea of Divine interventionism, from lurid apocalyptic ones to suave naturalistic ones. The naturalistic ones pose less of a threat to the order of things and try not to avail themselves of the thrill of miracles, except insofar as natural law is itself regarded as a miracle. There is also the occasionalist view, according to which every incident on earth, no matter how small, is caused by a particular exertion of God’s will: miracles everywhere. In the Christian tradition, to the best of my knowledge, this model of God’s caring causality was established in Matthew 10:29–31: “Are not two sparrows sold for a farthing? and one of them shall not fall on the ground without your Father. But the very hairs of your head are all numbered. Fear ye not therefore, ye are of more value than many sparrows.” (In 1880, William James retorted, on behalf of immanent causality: “Not a sparrow falls to the ground but some of the remote conditions of his fall are to be found in the milky way, in our federal constitution, or in the early history of Europe.”) In Islamic philosophy, the classical example is the falling of a leaf, whose every twist and turn on its way to the ground is determined by a series of Divine decisions. In Jewish thought occasionalism appears only rarely, unless one regards petitionary prayer as an appeal to the occasionalist promise, in which case it appears everywhere. 

    The occasionalist explanation of everyday life brings one very close to superstition, and sharpens the question of what we used to call “the efficacy of prayer.” The more specific the expectation, the greater the likelihood of disappointment; or so my experience has led me to believe. My grandfather, a worldly man and a captain of industry, once traveled by train across Poland to consult with his wonder-working rebbe about a toothache. I assume that the offending tooth was pulled, but I ascribe the resolution of the matter not to the saint but to the dentist. The occasionalist outlook, and more generally the coarser varieties of providentialism, has also this problematic aspect: it makes science impossible, because it denies the regularities of nature, and leaves the natural order — and the human order, where we also depend for our understanding upon regularities, even if they are less apparent and verifiable than the natural ones — essentially unpredictable and forever vulnerable to disruption by the whim or the wisdom of a higher intriguing power. 

    This theological excursus is almost done, and we will soon get back to the American crisis. But first it needs to be added that providentialism is not only religious. There are secular doctrines of historical necessity, too. Some scholars regard any theory of historical necessity as religious, so that Marxism-Leninism, for example, is now commonly described as a religion. But a belief is not religious merely because it is intensely held, and we must respect, I think, the reality of the modern rupture. There are absolute beliefs that have nothing to do with God, and we misunderstand them if we picture them merely as the new version of the old. The terms that are most commonly used to capture a godless historical necessity are destiny and fate. 

    In the ancient world, individual fate was sometimes imputed to the control of the gods and sometimes not, the other determining causes being environmental and physiological and temperamental. Often the subject was taken up, especially by the Stoics, from the standpoint of ethics: the question of fate was the question of free will and determinism, or whether the gods can be blamed for the behavior of men. Cicero worried that the belief in fate would render us mentally passive and morally sluggish — that we would not go to a doctor, for example, but instead succumb to the seduction of the inevitable. Fate is drift without worry. The defense of the freedom of the mind was a challenge to the doctrine of necessity that fate seemed to represent. But mainly the idea of fate was embraced as an alternative to the unbearable idea of chance. 

    What is striking about these discussions is that chance is not rejected in the name of purpose. It was reported of Chrysippus, a prominent Stoic thinker, that he taught this: “Fate is a sempiternal and unchangeable series and chain of things, rolling and unravelling itself through eternal sequences of cause and effect, of which it is composed and compounded.” Nothing divine there, and nothing teleological. Here is a later summary of the fatalistic view: “All things which happen by fate occur in a certain order and sequence and have an element of logical consequence in them. They say that fate is a string of causes.”

    The problem is that a string of causes is not spiritually nourishing or morally obligating. A logic is not a purpose. A purposeless necessity may be as unbearable as chance. And so causal necessity was promoted into providence and universal meaning was infused into raw events. (I apologize for the pottedness of what follows.) This occurred in the Hebrew Bible, which included, or rather invented, a linear theory of history for which the sequences and the series had a climax and culminated in redemption. In Christianity, the chain of causes was transfigured into Heilsgeschichte by Augustine and Joachim of Fiore. And many centuries later, after the aforesaid rupture, the numinous was for various reasons expelled from historical thinking — it has been suggested that the theology of history was explicitly replaced by the philosophy of history for the first time by Voltaire — and what remained is now conventional intellectual wisdom: non-sacred teleological history, with a rich menu of teloi. (Bacon’s, Condorcet’s, Hegel’s, Marx’s, Darwin’s, Alfred Rosenberg’s, Frederick Jackson Turner’s, Fanon’s, Fukuyama’s, Kurzweil’s, and so on.) We usually call this a “sense of history.” For centuries our culture has been captive to historical and developmental and evolutionary styles of thought — among the great Western thinkers about history only Burckhardt found a path between teleology and meaninglessness — though we can strenuously undertake to be otherwise. 

    This brings us, naturally, to Donald Trump.

    Among the many sinister comedies of the age of Trump has been his elevation to the status of a world-historical, even a meta-historical, figure. A shocking number of Americans, particularly Christian Americans, venerate him and have incorporated him into the salvific saga of the Earthly City. This has taken place not only at the level of folk religion, where it was to be expected of the trivializing notion of a cosseting and constituent-service God, but also in some of the higher reaches of American Christianity. The attempted assassination of Trump last summer played right into the hands of MAGA’s political theology. The talk of martyrdom spread almost as swiftly as the shooter’s bullet, not least because Trump viscerally grasped the theatrics of martyrdom, the political utility of his encounter with his mortality, even before he stood back up at the podium on the bloodied stage. Even as he bled, he cosplayed. His talent is not for deals, but for spectacles. 

    Later that day I began to notice on the internet a picture of Trump in the company of Jesus. He sits at the Resolute Desk and behind him, in a vaporous glow, stands the Son of God, in white robes and a smart beard, looking like an extra from Ben Hur. Jesus puts his hands on Trump’s shoulders as Trump moves his right hand to touch His left hand. Behind them is the Stars and Stripes. The picture was the work of one Danny Hahlbom, born on Long Island in 1949, from whose website I was not surprised to learn that his work was “honored for openings in select theaters in Texas for Mel Gibson’s The Passion of the Christ,” and also that he never took any art lessons. He calls his masterwork of MAGA iconography You Are Not Alone. He makes Sallman look like Ingres. 

    The problem with his image is not that it is kitsch: I have always believed that religious kitsch is a sign of religious life, and that lived religion does not require masterpieces. (Secular spectators, who do not read religious art religiously and often cannot identify the figures in a Biblical scene, are the ones who insist on Rembrandt and Tintoretto.) The offense in Hahlbom’s image is not aesthetic, it is religious. Even to a stiff-necked Jew such as myself, it seems blasphemous. How can anyone who knows the first thing about Jesus and the first thing about Trump imagine such a union? The only error in the picture more insolent than the theological one is Trump’s red tie, which has been given blue stripes. 

    Ah, but it is precisely because Trump is a sinner that Jesus loves him! He is a more perfect messenger precisely for the magnitude of his imperfection. This is an old idea, and not only a Christian one. (In the thirteenth century a German rabbi was asked whether a deformed man is permitted to lead prayers, and he replied: “It is not permitted, it is mandatory, because the King of Kings prefers broken vessels.”) But the loveliness of the idea can be easily exploited by, well, scumbags. There are sinners for whom forgiveness is nothing more than a scam. Call this not moral hazard but metaphysical hazard. At a rally of pastors and ministers in West Palm Beach last summer, Ben Carson made the point explicitly, though I have no doubt that it is made all over the Christian evangelical world: “David was a slimy guy too, in the Bible. I mean, murder, and adultery, and deceit. And yet God said he’s a man after my own heart.” Never mind that David was guided by the Holy Spirit and heard
    the voice of God, and Trump is guided by the spirit of Roy Cohn and hears the voice of Steve Bannon. It is true that we are all imperfect, but Trump abuses the privilege. 

    The religious justification of Trump is premised on a complete abandonment of ethical standards for the presidency of the United States. It is unperturbed by the president’s — the candidate’s — crimes and convictions, about which it conflates prosecution with persecution. In this tacky partisan exegesis, Trump’s unworthiness is what makes him worthy — that, and the cold fact that he proposes to advance a particular agenda. What won’t these people believe in order to keep their bump stocks? Faith need not be as unintelligent as the church of Trump makes it seem. And so Trump has been inducted into the pantheon of the holy sinner, of redemption through sin. He is the redeemer and Hulk Hogan is his Elijah. 

    In America, moreover, providence has often been associated with wealth. God has been good to Trump, hasn’t He? Look at his house! This, too, is an American trope. In 1877, Junius Morgan, who was J.P. Morgan’s father, declared at a fancy dinner that “a kind of providence has been very bountiful to us, and under this guidance the future is in our own hands.” Note that this was a version of providence that did not interfere with the satisfactions of the entrepreneurial ego, that was not a deterministic insult to the plutocrat’s vanity. John D. Rockefeller also believed that his success was the Lord’s plan, and bluntly asserted that “God gave me money.” In truth, rich people are not the only ones who interpret success as providence. We all hope for luck, and there are some who take the trouble to make themselves deserving of it. But when luck comes our way, we recoil at calling it luck. Me, an accident? We want our accomplishment to signify more. Luck is indifferent and unflattering. We prefer that our trajectory not be random; that it be, if it is benevolent, our just desert; that it be validated by the highest of authorities. These pious escapes from historical contingency put me in mind of Heine’s joke about a French industrialist: “He is a self-made man, which absolves God of a terrible responsibility.” Providentialism is a reading of luck, a coaxing of necessity out of the demoralizing doctrine of caprice. (In Hebrew, the word for “fate” is the same as the word for “lottery” — goral, which nicely illustrates the tense proximity of the poles. In English there is a looser relation: lot, lottery.)

    Needless to say, there are people for whom the doctrine of cosmic caprice is not at all demoralizing, who are blithely unburdened by the stresses of locating meanings in patterns and experiences. Sometimes I envy them, though I cannot imagine how they live or why they vote. Nor do I intend to suggest that luck is all there is: human outcomes are significantly determined also by human actions, and the self-made man is not at all a complete fiction. (That was Obama’s infamous insult to capitalism.) One should have pride in the work of one’s hands — but not only pride. The factors are always mingled. Human agency and human freedom are compatible with both providence and luck. Yet the doctrine of providence can be a way of raising human achievements to a metaphysical prestige that they do not deserve. The Bible warns against the vainglory of asserting that “my power and the might of mine hand hath gotten me this wealth,” but the American theory of success generally does not come with a corollary about humility and complexity. Behold Trump.

    My favorite example of capitalist providentialism — no, not Max Weber — was produced in the Reagan euphoria. In 1981, the American Enterprise Institute published an imperishable pamphlet by Michael Novak called “Toward a Theology of the Corporation.” It included this passage: 

    In thinking about the corporation in history and its theological significance, I begin with a general theological principle. George Bernanos once said that grace is everywhere. Wherever we look in the world, there are signs of God’s presence: in the mountains, in a grain of sand, in a human person, in the poor and the hungry. The earth is charged with the grandeur of God. So is human history.

    If we look for signs of grace in the corporation, we may discern seven of them — a suitably sacramental number.

    The Creator locked great riches in nature, riches to be discovered only gradually by human effort. The agency through which inventions and discoveries are made productive for the human race is the corporation. Its creativity makes available to mass markets the riches long hidden in creation. Its creativity mirrors God’s.

    Novak went on to speculate about the spiritual message in the name of the W. R. Grace Corporation.

    Donald Trump would like us to regard him as a businessman, when in truth he came to American politics from the sewers of reality television, professional wrestling, beauty pageants, fixers and shell companies, and the thuggish ethos of twentieth-century Queens. His profanity extends to his business practices, as court records now show. On the other hand, Trump’s wealth, his very solvency, is indisputably a miracle. 

    A friend of mine was present at that Christian celebration of Trump in West Palm Beach. He sent me some notes. “What can I tell you? Trump was in full-on martyr mode and the televangelist-like speakers were talking about his ear as though it were some magical Christ-like thing. ‘You guys ready to worship?’ one speaker began. ‘Maybe God spared Trump,’ at the two-minute mark of Charlie Kirk speaking.” But then he noticed an interesting complication in the swooning crowd. They were not all eschatological idiots. “In my interviews,” he continued, “people openly admitted that they do not love Trump, calling him a necessary evil and hateful, and they wish they had someone else. He’s an imperfect vessel, but hell, what can they do…” Some of them suggested that it would be more historically appropriate — which is to say, less religiously embarrassing — to compare Trump to Cyrus, the founder of the Achaemenid Persian empire. The Cyrus comparison was often invoked by supportive Christians in the early years of Trump’s rise. It was a nice attempt at a Scriptural precedent for Trump, except that Cyrus was the splendid opposite of Trump: he was perhaps the most tolerant king in history, a legendary respecter of difference, an ancient pluralist. He was kind even to the Jews — as a child I was taught to revere him: he had fans in Flatbush — by bringing their Babylonian captivity to an end and supporting them in their return to Zion and in the rebuilding of their Temple in Jerusalem. I guess this means that, like Trump, Cyrus was in construction, but there is no record of any royal demand that the Temple bear his name.

    The Cyrus analogy is a way of retaining the claim for Trump’s gigantic status without wrapping him in holiness — a softer providentialism. There are many ways of being duped. In the aftermath of the attempt on Trump’s life in Pennsylvania, Ross Douthat provided one of them. “The scene on Saturday night in Pennsylvania,” he wrote breathlessly in the New York Times, “was the ultimate confirmation of his status as a man of destiny, a character out of Hegel or Thomas Carlyle or some other verbose 19th-century philosopher of history, a figure touched by the gods of fortune in a way that transcends the normal rules of politics.” (He really should not cast stones at verbosity.) He then proceeded to a little lesson about Hegel, linking to Hegel-by-Hypertext. “In Hegel’s work, the great man of history is understood as a figure ‘whose own particular aims involve those large issues which are the will of the World Spirit.’ Hegel’s paradigm was Napoleon, the Corsican adventurer whose quest for personal power and military glory spread the ideas of the French Revolution, shattered the old regimes of Europe and ushered in the modern age.” Douthat did not cite Carlyle, but here is a taste of what reminded him of Donald Trump: 

    I liken common languid Times, with their unbelief, distress, perplexity, with their languid doubting characters and embarrassed circumstances, impotently crumbling down into ever-worse distress toward final ruin; — all this I liken to dry dead fuel, waiting for the lightning out of Heaven that shall kindle it. The great man, with his free force direct out of God’s own hand, is the lightning. His word is the wise healing word which all can believe in.

    Trump to a tee, no?

    But Douthat’s epiphany was not over. “What if progress isn’t linear, and the World Spirit’s purposes are a bit more complicated than an optimistic form of liberal Protestantism expects? What if an era is decadent rather than vital? What if there is no obvious next political stage for a civilization’s development? What if stagnation and repetition rule the day? What does a man of destiny look like then? I think we have to say it looks like Donald Trump.” Go ahead and gasp. But it gets better, or rather worse. The reason for Trump’s anointment by destiny is that he, unlike the other authoritarians of our time, has no fixed beliefs. Unlike philosophical and political conviction, you see, “actual populist sentiment is more protean, more flexible and opportunistic, more certain of its enemies than its policy commitments.” This is another version of the unworthiness-as-a-source-of-worthiness argument. All hail the man who believes nothing, who has the genius to pick enemies over policies! 

    Trump’s greatness, says Douthat, is “vibes-based,” rather like Douthat’s analysis. But Trump’s greatness may be the biggest lie of all. Who can doubt that history will show him to have been the most puny of all American presidents? Then Douthat seems to pull back, sort of. He is wary of sounding downright religious about the felonious monster. There appears this slippery remark: “He’s a man graced, this past weekend especially but always, with incredible, preternatural good luck.” Well, which is it, grace or luck? I thought that grace is the antithesis of luck — that grace is luck doctrinally interpreted. Isn’t the characterization by a Christian of grace as luck a colossal mistake? Douthat continues, pretending to sober up: “That last quality is understood by some of Trump’s religious supporters as proof of divine favor and a reason to support him absolutely. But this is a presumptuous interpretation. (Some notably sinister historical figures have enjoyed miraculous-seeming escapes from assassination.)” So have a lot of ordinary people who were out walking in the evening on the wrong street in the wrong neighborhood. Note also that Trump is exempted from the company of “notably sinister historical figures.” 

    The providentialist standpoint, and hero worship generally, prefers to dwell on the successes and leaves the failures to the philosophers and the historians, who must answer to standards of objectivity and honesty about the entirety of the picture. Tucked away inside the commonplace notion of providence is an optimistic attitude to human affairs, which will eventually work out owing to the high quality of the supervision. A cruel providence, by contrast, a malevolent God, would put too much pressure on the belief system, which might smash on the rocks of theodicy. Come to think of it, theodicy would be a pretty good framework in which to set the ascendancy of Trump — at least the problem would be factually stated. So cling to your providentialism, if you will — but acknowledge, please, that Trump may be not our reward but our punishment. In his essay on fate, Emerson insisted that “the way of Providence is a little rude. Providence has a wild, rough, incalculable road to its end, and it is of no use to try to whitewash its huge, mixed instrumentalities, or to dress up that terrific benefactor in a clean shirt and white neckcloth of a student of divinity.” 

    Perhaps catching himself in the philosophical inflation of a crook, Douthat next asks: “Why talk about Trump in these sweeping terms, the anti-Trump reader might say, bringing in God and history and building him up to be something more than just a charlatan and demagogue?” An excellent question, but do not mistake it for a saving moment of skepticism. Douthat has an answer at the ready: “Because otherwise you’re just not dealing in reality.” There’s the tell. Sooner or later every defense or extenuation of Trump must turn the epistemological trick and assert that criticism of the titan is owed to another definition of reality. There is no way for the case for Trump to survive even the most perfunctory scrutiny except by pleading alternative facts and running for the hills. Douthat’s enthusiasm for reality is surprising. It is a betrayal of faith, I think, to allow reality the last word. Religion does not exist to be complicit with reality or to affirm it. Hegel and Carlyle were history’s flacks. But religion must be supremely counter-cultural, so as to rain critique upon history, upon reality. And what is this “reality” which we despisers of Trump fail to recognize but must honor? Is Trump not a charlatan and a demagogue? 

    This is what fellow-traveling with the fanatics looks like. But luck is not destiny and destiny is not luck. If Trump is lucky, then he is not destined. Luck is not intentional or legible. But Douthat’s hero-worship, his on-deadline divination, is based on a specious assumption about the legibility of history, which is transparent to Douthat in a way that it was not, say, to Lincoln. Rising to prophetic decibels, he warns that “in a struggle with a man of destiny, there is no normalcy to be restored.” Douthat is just another post-liberal pyromaniac. If “normalcy” includes truth, reason, decency, respect, patience, argument, evidence, care, compromise, science, and social peace, then its restoration is nothing less than a sacred civic obligation, along with the resounding rejection of the titillating abnormalcy of the man of destiny. 

    It was only a matter of days before the clean-shirted Douthat was himself forced to deal in reality. What happened was that Trump opened his mouth. He gave his acceptance speech at the Republican convention and it was a world-historical dud. All of the man’s meanness and narcissism, all of his intellectual and cognitive limitations, all of his cynicism and opportunism, were grindingly on display. The nimbus of Butler County was gone. This was not the performance of a great man of history. Hegel and Carlyle would have switched to TCM. But there was poor Douthat, the bringer of good tidings, trapped in the rubble of his own absurdities. The headline on his next piece was “How Trump Sabotaged His Own Apotheosis.” When I saw it I permitted myself a historiosophical chuckle. “After the assassination attempt,” he began, “I called Trump a ‘man of destiny’ and nothing about his rambling speech changes that assessment.” The apologetics here entered the realm of desperation, as apologetics often do. Distinctions would have to be swiftly fabricated. “But it’s crucial to understand,” he went on, “that the nature of his destiny, most likely, is not to fully rule. He cannot bestride American politics the way F.D.R. or even Ronald Reagan did, or for that matter in the way of contemporary nationalist leaders like Hungary’s Viktor Orban or India’s Narendra Modi. In part that’s because he doesn’t have their kind of disciplined agenda, but in part it’s just that he can never be other than himself.” An excess of authenticity is what hobbles the supreme American leader. 
    And here was Douthat’s final revision of his providentialist punditry: “The Republican nominee dominates our politics not by winning all the time or getting everything he wants (he may not even be sure of what he wants from day to day) but by making everything else exist perpetually in relationship to him — even when he’s losing, out of office or suffering policy defeats — and refusing to allow any order separate from himself to be established so long as he’s alive and kicking.” 

    Ponder what he is praising in those words and join me in a round of patriotic revulsion. Perhaps Congress should legislate another fast day. Or should it be left to the states? 

     

    III

    As I was crossing the road the other day, I kept bumping into elites. Or rather, they kept bumping into me. They were everywhere. They seemed to know each other, though they were too refined to wink. They kept getting in my way. Their sense of entitlement about getting to the other side of the road — and before the light changed! — was obnoxious. I reached the pavement in a fit of egalitarian pique. When I looked behind me, I observed the elites walking haughtily to their destinations, as if their destinations were better than mine. When one of the elites entered a dry-cleaning store, a guy next to me mumbled something about how they were too good for washers and driers. When a member of the Council on Foreign Relations entered the vegetarian market, the experience of status degradation was too much. I needed a drink. I headed straight for my bartender at the Four Seasons and ordered an herbed butter–infused Michter’s Bourbon with black pepper honey and orange bitters. The people’s drink. 

    Another of the sinister comedies of the age of Trump is its frenzied anti-elitism. Sometimes I think that anybody who uses the word “elites” is ipso facto a member of the elites. We are awash in gold-card populists. The campaign of 2024 is often depicted as a contest between right-wing populists and left-wing populists, but it could just as accurately be described as an internecine war between anti-elitist elitists. This is not new. The tribunes of the people — Marxist intellectuals and Knightsbridge socialists, not to mention the Joe Sixpack who lives in Palm Beach and Bedminster — have often emerged from more privileged classes; Marx himself, a bourgeois thinker consecrated to the proletariat, took great pains to explain how people such as himself were possible. The most sterling example of the crossover was certainly Franklin Delano Roosevelt, the traitor to his class. These defections and paradoxes are all edifying, because they defy the tyranny of origins. (Sometimes they are phony, as when Tom Joad — I mean Bruce Springsteen — returns from a concert about hot soup on a campfire under a bridge to his estate in New Jersey.) 

    They also refute the sociology of knowledge that is at the root of many aspects of our crisis — the idea that we are all merely spokespeople for our groups, that our outlooks are socially determined, that intellectual independence is impossible and even undesirable. On the left they call this doctrine “standpoint epistemology,” in the same spirit in which Richard Rorty used “ethnocentrism” as a term of praise. If you add to this a preference for one group over the others, so that the members of that group are (as Marx claimed about the proletariat and we claim for all our favorite victims) epistemologically privileged, wiser, deeper, sagaciously scarred, more acquainted with what is really real, then debate is destroyed by deference. So much of our culture consists in nervous exercises in deference, which is why the heat is high but the light is low, and why many views are not heard, and why epistemological treason is evidence of freedom of thought. Not everything that disadvantaged people say is right and not everything that advantaged people say is wrong. In ancient days, when Americans still cared about truth, we understood that the merit of an opinion is not established by the social position of the individual who espouses it. 

    Yet the Trump-Vance anti-elitism is guilty of worse sins than hypocrisy. Their manipulation of resentment is a betrayal of the hurting people whom they purport to save. There is a basis in reality, after all, for those resentments. The economic inequality in our country is incontrovertibly obscene. But what do we do when legitimate grievances turn poisonous? It is the central perplexity of our politics. And the Trump-Vance strategy is precisely to make them poisonous, and thereby distract the poor and the dispossessed from the search for solutions. (Daniel Bell once remarked that the opposite of anti-elitism is fairness.) They offer them policy nostrums that pander to the poor but benefit the rich, and an endorsement of malice, and a terror of the future. Gullibility is often the child of unhappiness. When Trump and Vance concur in their wounded electorate’s disinclination to change, they condemn them, the MAGA millions, to more of the circumstances that have been responsible for their wounds — to more coal. Economically speaking, they want their votes in return for nothing. For all their populist rhetoric, they lag behind the groupuscule of Republican thinkers and activists who seem to have sincerely discovered the actual American working class and befriended organized labor — I am tempted to call them the Democratic wing of the Republican Party, except that the Democratic Party is itself struggling to establish its own relationship with American workers, whose struggles are not adequately personified by George Clooney. 

    The trouble with anti-elitism is not that elites do not exist. It is that they exist everywhere, in every movement and party, on all sides. A mere revolt against elites makes a politics stupid. The important question is what the respective elites stand for. Talking truth to power makes no sense if power happens to be acting rightly. It similarly makes no sense if power happens to know what it is talking about. For this reason, populists should have more respect for technocracy, and for their indebtedness to technocrats. One of the most serious obstacles in American politics is the arcane character of some of the most urgent issues facing it. Who really grasps the particulars of health care and climate change and financial regulation? (Not me.) There are policy questions that are not just moral matters, which we are all more or less qualified to evaluate. And even some moral matters must await the clarifications of social science. This means that the vast majority of American voters are regularly asked to make decisions about things that they do not understand. They are democratically enfranchised, but they are democratically incompetent. They vote in ignorance, as if a vote is nothing more exigent than an expression of emotion and not of informed reasoning, as if the right to vote is all that matters about the vote. But along with the right to vote comes the duty to vote intelligently. Knowledge should not be an amenity in a society that governs itself by its opinions, nor should it be maligned as a badge of eliteness. When Trump says that he “loves the poorly educated,” a love that in his case is also a self-love, he slanders the democratic promise; and the collapse in recent years of the prestige of a college education bodes ill for the quality of our social order. 

    In order to vote intelligently, therefore, we often require the assistance of the archetypal villain of populist politics: the mustache-twirling expert. But do not be afraid. Expertise is simply an earned authority about a small piece of the world. It may be exasperatingly technical, but its specialization is its strength; and it is not a conspiracy or a mask for power, unless of course you disagree with it, or, more fundamentally, with its faith in the possibility of expanding objectivity. Expertise is the inexorable consequence of the complexity of life, and certainly of the administration of public policy. I am aware that there are experts on both sides of every issue, and I have no answer to the question of how to adjudicate between them, because if we could adjudicate between them we would have no need of them in the first place. (There are also experts on expertise.) Anti-elitism, insofar as it enshrines a hostility toward the intellect and proposes to acquit us of our citizenly responsibilities without sufficient knowledge, is precisely what it says elitism is: a technique for keeping people down. 

    And who are “the people”? They are a myth invented by the populists, who in their selective and self-regarding empathy always expel people from “the people.” “The people” are never the people. Hillbilly standpoint epistemology is as intolerant as woke standpoint epistemology. It, too, is in the business of choosing a community of the elect, of giving out halos. Cognitive equality — the ability to see all people equally — turns out to be as elusive as political equality. The populists’ “people” is the very contradiction of the Constitution’s “People.” Both are fictive, but the latter is at least a generous and bettering fiction, which is why its own principles could be effectively deployed against its racist stain. The former is a surly and inward-looking crowd, a wrathful mass, a mob waiting to form, or rather to fuse. It seeks recognition more ardently than it seeks relief. The psychology of unfairly treated people, of humiliated people, must be respected, but it plays into the hands of unscrupulous politicians to prize affirmations over solutions. Even if it is always awkward to counsel patience for people in pain, to preach to them about letting the system work when it has not worked before, still there is no wisdom in impatience if one is serious about change. The time that it takes to remedy an injustice is itself a part of the injustice. 

    “The people,” moreover, are always accompanied by a partner. He is “the leader.” The leader incarnates the people and charismatically intuits their needs and speaks for them more powerfully than they could speak for themselves. “The people” can almost be defined as that segment of the population who are in thrall to “the leader.” The compatibility of populism with authoritarianism is one of the saddest stories in political history. MAGA is the contemporary American chapter of that history.

    Lest Professor Moyn accuse me of a lack of sunniness, I should note that the gloomy view of “the people” was developed as a response to the long historical experience of “the people.” In the nineteenth century, in the early decades of mass politics, sunniness about the mobilization of citizens in the streets was plausible, since the causes for which the early crowds agitated were (mostly) emancipatory and political participation was a cherished value of the emerging liberal order. Jules Michelet could write Le Peuple, for example, a rapturous paean to the people that now reads like a liberal hallucination. By the end of the century, however, when mass politics had shown its ugly face (or faces), the enthusiasm for the streets was getting harder to justify, even if the Nietzschean neo-aristocratic contempt for the canaille, according to which the poor are animated not by need but by envy, was contemptible; and by the middle of the twentieth century anybody who had no misgivings about uprisings and insurrections was a fool. Hitler’s crowds, and Lenin’s, and Mao’s, were also “the people,” and their rebellions were more the result of remorseless top-down manipulation than sentimental bottom-up spontaneity. (“Never play with insurrection,” Lenin instructed, “you must go to the end.”) The rebellions of the 1960s were supposed to restore the good reputation of mass politics and let the sunshine back in, but they had the opposite effect, as the mobes became mobs. 

    Many of the epochal leaps in social progress in those decades were undeniably a result of that turbulence — but not of its progressive extremes and its populist excesses, which jeopardized the prospects of reform. Would Lyndon Johnson and his political elite have signed the Civil Rights Act and the Voting Rights Act if not for the pressures from below? We may sunnily conclude, then, that sometimes elites can be moved. And we may also conclude with a smile that elites are not omnipotent: they sometimes fail, as when billionaires sometimes fail to elect their bought-and-paid-for candidates.

    It is not all rigged. Or is it? Anti-elitism is a crusade of the American left as well as the American right, and recently it produced an impressive manifesto, called Elite Capture: How the Powerful Took Over Identity Politics (And Everything Else) by the philosopher Olúfémi O. Táíwò, who is the C. Wright Mills of our day. The book is smart and — always a good sign — ringingly critical about its author’s own congregation. (I owe the idea of the deleterious effect of “deference politics” to Táíwò, who introduces it in a brilliant criticism of woke commonplaces such as “listen to the most affected” and “center the most marginalized.”) Táíwò offers a definition of what he deplores: “Elite capture happens when the advantaged few steer resources and institutions that could serve the many toward their own narrower interests and aims.” Nothing controversial there, though he would do well to note the many counterexamples, most conspicuously the Ford Foundation. 

    Like many investigations into elites and elitism, Táíwò’s book is at bottom another attempt to find an explanation for the perennial failure of radical politics. Yet the explanation is not far to find. In fact it is right under his nose. The “everything else” in his title is the giveaway. He ought to consider the totalistic mentality that his own book exemplifies. For example: “It’s not just that wokeness is too white. It’s that everything is.” Beware those italics. You cannot properly understand the possibilities for change unless you recognize that there are exhilarating holes in “everything,” unconquered spaces, respites from domination, incomplete captures — or to put it differently, that there is no such thing as “everything.” Ordinary experience, poor and rich, black and white, female and male, does not substantiate such an airless and seamless picture of society, such a perfectly dystopian monolithic prison. The reliance of radicalism upon the systemic standpoint has always been its undoing, not least because “everything” is never going to change. 

    “Systemic” denotes more than pervasiveness and ubiquity, which is horrific enough when speaking of oppression; “systemic” is a sociological euphemism for a condition of doom, and also for a warrant for hopelessness. (Perhaps “systemic” is more kindly understood as an emotive term for the amount of distress that motivates a social analysis: misery often feels like everything.) I do not see how the social action that Táíwò prescribes is possible in the social world that he describes. He cites approvingly a bright idea called “pragmatic utopianism,” but that is a glib oxymoron, a bumper sticker, and in practice not at all an advance on the incremental liberal reform that radicals detest, which still remains the most reliable and least damaging pursuit of justice. A pragmatic utopian, who works with institutions of power and takes his victories where he can find them, in progress that is not epic but stubborn and steady, is a person who behaves like a liberal while dining out on his hatred of anti-perfectionist liberals. 

    Who’s afraid of Vilfredo Pareto? He was once a towering figure in Western intellectual life and his picture was on the cover of Saturday Review. In 1901, fifteen years before he produced his vast Treatise on General Sociology, Pareto published a paper in an Italian journal of sociology in which he contended that “the history of man is the history of the continuous replacement of elites: as one ascends, the other declines.” A replacement theory! Elites change. Later he called this “the law of the circulation of elites.” Those were the days when laws of history were still being codified. Pareto did not idealize elites: “The new elite which seeks to supersede the old one does not admit to such an intention frankly and openly. Instead it assumes the leadership of all the oppressed, declares that it will pursue not its own good but the good of the many; and it goes to battle not for the rights of a restricted class but for the rights of almost the entire citizenry. Of course, once victory is won, it subjugates its erstwhile allies, or, at best, offers them some formal concessions.” He regarded socialism as a means for the creation of “a working-class elite” — a wise irony. 

    Those were also the days when the problem of elites vexed many social thinkers. In 1896, Gaetano Mosca published Elements of Political Science, now known as The Ruling Class, in which he explored the mystery that in every society a majority is ruled by a minority. He, too, was a realist about the perdurability of hierarchy. “There can be no human organization without rankings and subordinations. Any sort of hierarchy necessarily requires that some should command and others obey. And since it is in the nature of the human being that many men should love to command and that almost all men can be brought to obey, an institution that gives those who are at the top a way of justifying their authority and at the same time helps to persuade those at the bottom to submit is likely to be a useful institution.” In 1911, Robert Michels, another Italian but born in Germany, and a close friend of Max Weber, published Political Parties: A Sociological Study of the Oligarchical Tendencies of Modern Democracy. Recognizing “the impossibility of a complete practical application of the principle of mass-sovereignty,” Michels famously coined “the iron law of oligarchy.” He declared that “society cannot exist without a ‘dominant’ or ‘political’ class, and the ruling class, while its elements are subject to a frequent partial renewal, nevertheless constitutes the only factor of sufficiently durable efficacy in the history of human development.” (Even Táíwò concedes that “we may not be able to entirely eliminate elite capture from the world.”)

    None of these thinkers were democrats. They all believed that the permanence of elites, and their indispensability for the functioning of a society, rendered democracy impracticable. This is Michels: “In all the affairs of management for whose decisions there is requisite specialized knowledge, and for whose performance a certain degree of authority is necessary, a measure of despotism must be allowed, and thereby a deviation from the principles of pure democracy.” He started out in the Italian socialist party and eventually joined the fascists, because — fellow countrymen, take note — he was struck by the rapport between the leader and the workers, and fell for the illusion that the fascist was therefore the likeliest instrument of justice. And this is Mosca, who was “opposed to pure democracy because I am a liberal” and bravely denounced Mussolini and thwarted his bills in the Italian senate: “The term ‘democratic’ seems more suitable for the tendency which aims to replenish the ruling class with elements deriving from the lower classes.” His democratic ideal was nothing more ambitious than a heterogeneous elite — diversity at the top. 

    What is strange about all these theories of the elites is that they contrast themselves only with “pure” democracy. Their unspoken subject is their disappointment in the Rousseauist reverie of a general will, of a society completely reconciled with itself. They write as if Madison never lived, and they seem to regard the permanence of social distinction and political conflict as a significant discovery. The American political tradition already knew this. It inculcates a certain unsurprisability about the roughness of living together — an unsurprisability that is nonetheless not immune to outrage. When Madison and his colleagues established representative democracy over direct democracy, they invented a political elite, and in perpetuity. A mediated relationship with the people and their passions seemed more conducive to calm and fair governing, but it introduced a vector of verticality that now haunts our politics and diverts its attention from what actually ails us. We are squandering ourselves in a prime-time soap opera of superiority complexes and inferiority complexes, which is perhaps unavoidable in a society in which an apartment sells for two hundred and fifty million dollars. But the problem before us is not that we have elites. The problem is that some of our elites may be wrong, and mendacious, and cruel. The choice between the Trump-Vance elite and the Harris-Walz elite is a genuine choice, even if they are both elites. They all seek power, but they will use power differently. The powerful can help either the powerful or the powerless, and sometimes both. But comparing ourselves incessantly to each other will teach us less than comparing ourselves to the future. 

     

    A Stupid Cartoon and the University Ideology

    Among the thousand currents of the university turmoil during these last several months, the tiny ripple that most securely caught my eye was a distinctly minor scandal at Harvard back in February, which caused not a single broken window or student riot or mass invasion by agents of the state. This was a scandal over a cartoon. The minor scandal had the virtue, however, of casting a retrospective light on an earlier scandal at Harvard, the original scandal, which was pretty much the founding moment of what eventually became the enormous tide of university protests and controversies. 

    The original scandal was a statement signed by more than thirty Harvard student groups in the first days after the October 7, 2023 massacre blaming Israel (“entirely responsible”) instead of Hamas (unmentioned) for the atrocities — after which came the clumsy dithering of Harvard’s president, Claudine Gay, who failed to speak up in a sufficiently articulate fashion about the massacre and the student statement, which led to her notorious failure in testimony to Congress to find anything condemnatory to say about students calling for genocide of the Jews (“depends on context”), which led to everything else. And this was not just in America. In Paris, Sciences Po, aka the Institut d’études politiques de Paris, which is more or less the Harvard of France, generated its own scandal, beginning in March. The Sciences Po students held a pro-Palestine meeting. A Jewish student got up the courage to enter the amphitheater. And the Jewish student was greeted in a manner that was sufficiently obnoxious to attract the attention of Emmanuel Macron himself, who thought it his duty to underline the “unspeakable and perfectly intolerable” behavior — which led, by late April, to a student occupation of a stairwell, the intervention of riot police, indignation over the menace to academic freedom, and generally the turmoil that any number of universities and arts organizations have come to know. In this fashion, the enormous and sometimes scandalous wave of protests against Israel and Zionism that got started at Harvard has turned out to be, well, maybe not universal. Problems and protests like these seem not to have occurred in the Latin American universities, which is curious. Nor in various other regions. But the wave has been very large. 
    The cartoon scandal — the mini-event at Harvard in February — was brought on by two student organizations, the Harvard Undergraduate Palestine Solidarity Committee and the African and African American Resistance Organization, with the unfortunate support of still another organization called Harvard Faculty and Staff for Justice in Palestine. The two student groups set out to show and acclaim the historical origins of African–American solidarity with the Palestinian cause. These reach back to 1967 and the rebellious young activists of the civil rights movement. The Harvard student groups wanted to explain that, in adopting the Palestinian cause, the young rebels of those long-ago times took a major step in advancing the larger struggle for black liberation. The students composed an infographic making those points, and the graphic within the infographic was a charcoal-line cartoon by an artist named Herman “Kofi” Bailey, which the students lifted from the young rebels’ newsletter from 1967. 

    The cartoon showed blacks and Arabs being jointly oppressed by their enemy, the Jews. A black man and an Arab man (who might have been Muhammad Ali, the boxer, and Gamal Abdel Nasser, the president of Egypt) gazed helplessly upward from the cartoon with nooses draped around their necks. At the top of the cartoon a white hand, bearing on its back a Star of David tattoo encasing a dollar sign, held the two nooses loosely in its fingers, ready to give the fatal yank. But salvation was in sight. This was a scrawny arm brandishing a machete, with the arm and machete labeled “Third World Liberation Movement,” ready to slice the ropes and liberate the doomed. The cartoon was, in short, a melodrama of victimhood (blacks, Arabs), victimizer (Jews), and savior (Third World Liberation). The Harvard student groups saw sufficient value in the cartoon to post it on their Instagram site. Someone at the Harvard Faculty and Staff for Justice in Palestine was sufficiently impressed to repost it, signaling approval (even if, in reality, the faculty-and-staff group had no idea what was being reposted). And the mini-scandal was at hand. 

    On this occasion, Harvard’s new interim president — Claudine Gay was gone by then — demonstrated that he had learned from Gay’s mistakes and was quick to condemn. And the dean of Harvard College, Rakesh Khurana, did the interim president one better. Dean Khurana called the Instagram post “unmistakably anti-Semitic and racist,” which was a sharp phrase, given that, at Harvard, the two student groups surely regarded themselves as racism’s boldest enemies. And the phrase was doubly sharp, given that Harvard Faculty and Staff for Justice in Palestine had made a point, in their founding statement, of disputing the claim that “critique of the Israeli state is anti-Semitic.” Their own critique of the Israeli state turned out, however, to be anti-Semitic. Said the dean: “It’s become clear that some members of our community are intent on testing the limits of how low discourse can go — and it now appears that we are hitting rock bottom.” 

    Everyone apologized. Harvard is civilized. And yet, no one likes to be insulted. And the people under accusation by their dean may have felt that, even if they had failed to examine their cartoon closely enough, the general opinion among students at Harvard, and among a good many faculty as well, was on their side. Harvard Faculty and Staff for Justice in Palestine accordingly lamented their participation in the affair with a fine panache of the passive voice: “It has come to our attention that a post featuring antiquated cartoons which used offensive anti-Semitic tropes was linked to our account.” 

    The student apologies ventured still further into the zones of passive aggression. The student groups expunged the disgraced cartoon from their Instagram post. But they replaced the cartoon with a photo of the leader of the young rebels whose newsletter, back in 1967, had originally published it. This was Stokely Carmichael, in later years known as Kwame Ture, a charismatic man whose most famous slogan was the stirring “Black Power,” but whose second most famous slogan (famous, at least, within the corner of the public that was singled out for death) was the off-hand snarl, “The only good Zionist is a dead Zionist.” 

    The aspect that catches my eye now is how ghostly the scandalous element turned out to be, as if haunted by the truly original scandal, not the one at Harvard immediately after October 7, but the original’s original, which was in 1967. The rebellious young people in 1967 were members of the Student Non-Violent Coordinating Committee, or SNCC, pronounced “snick.” In the period leading up to 1967, SNCC was — if I may put it this way — the most glorious student organization that has ever existed. Martin Luther King, Jr. and a solid bloc of experienced stalwarts were the commanders of the civil rights movement in its adult division, and the young people in SNCC, who were black and white alike, were the human-wave foot-soldiers, marching across the South to undergo arrests and beatings and ultimately achieve victory. Young John Lewis in Atlanta was SNCC’s chairman. In South Carolina, young James Clyburn was among the SNCC stalwarts. In New York, SNCC’s high school division mobilized the youth of the youth. 

    By 1965, though, Stokely Carmichael and his fellow-thinkers were beginning to take over the organization. They succeeded in expelling the whites, which tended to mean the Jews. By 1967, young Lewis had left the organization. Carmichael inherited the chairmanship. The Six Day War broke out in the Middle East. In the Arab countries, the shock at seeing so many Arab armies defeated so quickly and ignominiously by Israel set off a political earthquake, which meant radicalization, a major event across the region. The Palestinian terrorist campaigns got underway. And the war set off an additional earthquake in the American civil rights movement. The new team at SNCC rebelled against the civil rights old guard and its many alliances, above all the alliance with the American Jews. And the SNCC newsletter ran an article making the case against Zionism. 

    It was a ferocious case. Zionism in SNCC’s portrayal was ugliness itself. Zionism was racist even against darker-skinned Jews. It was exploitative of black Africa, hostile to African liberation, and Nazi-like against the Palestinians in Gaza. Zionism was a creature of British and American imperialism. Zionism’s purpose was to help white America exploit Arab oil. Zionism was a product, finally, of a Rothschild “conspiracy with the British” — the Rothschilds, who, in capital letters, “CONTROL MUCH OF AFRICA’S MINERAL WEALTH.” The noose cartoon faithfully illustrated the article. And the spirit of that article and its cartoon became a trend, visible in SNCC and in the brand new Black Panther Party, too, which went on to publish its own cartoons on similar themes. 

    Those were big developments, which perhaps could be presented, as the Harvard students and their faculty supporters did a few months ago in their infographic, as contributions to a “heightened awareness” within the black struggle for liberation. But it also could be argued that, all in all, the young people’s anti-Zionist rebellion in SNCC in 1967, together with the rise of the Black Panthers, pretty much blew up the national political coalition that King and Bayard Rustin and the old-school civil rights leaders had so brilliantly put together. The blow-up took place at a crucial moment, too, just when, under Rustin’s inspiration, King was taking early steps to bring about a basic transformation of the civil rights movement. 

    The historic movement was a campaign for legal rights. By 1967, though, the major specific demands of the historic movement had made their way into law, voted by Congress and signed by Lyndon Johnson. But Rustin had come up with a new idea, which was to convert the movement for legal equality into a campaign for economic equality. The idea was to expand the civil rights coalition into an immense multi-racial campaign for social democratic reform, under the command of the old civil rights leadership, which meant King himself and his circle. This was a proposal to move United States social policy significantly to the left, in the European style, with the collusion of the Johnson administration. Only, the entire universe conspired against Rustin and King and this very ambitious project. 

    It wasn’t just the young radicals in SNCC, together with their comrades in the Black Panther Party and everyone who admired SNCC and the Panthers, which was a lot of people. The social-democratic trade unions — the historically Jewish garment unions, that is, plus the Auto Workers — maintained their own youth auxiliary, which was the mostly white Students for a Democratic Society, or SDS, who took pride in being SNCC’s loyal allies. And the young white hotheads of SDS and their own friends among the hippies and “freaks” were already doing their own bit to blow up the old coalition, though not generally in the name of anti-Zionism or hating the Jews. In truth, not everybody did hate Zionism and the Jews. But everyone hated the older generation — everyone in the left-wing and activist youth movements, that is, except for a few. It was rough on the fifty-year-olds. Rustin’s vast social democratic project depended, in any case, on King and his charisma, and the assassination in Memphis, which was in April 1968, brought the project pretty much to an end. And Richard Nixon was elected president. And Stokely Carmichael departed for a new life in Africa. 

    SNCC’s turn toward anti-Zionism has always seemed a little puzzling, and that is because of Carmichael himself. Carmichael was born in the West Indies, but he came of age in the Bronx, New York, where Jews were not an exotic species. At the Bronx High School of Science, which he attended, the scions of the Jewish Bronx crowded the corridors, and none of those students were Rothschilds, and various of them came from backgrounds not lacking in enthusiasm for King and the civil rights movement. Todd Gitlin was one year behind Carmichael at Bronx Science — Gitlin, who went on to Harvard, where he became a national leader of Students for a Democratic Society. Harvard expressed an interest in Carmichael, too, and offered him a scholarship. Carmichael preferred to go to Howard University in Washington, the most distinguished of the black colleges. 

    And yet, at Howard one of his more significant friends appears to have been Tom Kahn, who was yet another socialist from a Jewish family, in this instance from Brooklyn. It was young Kahn who brought Carmichael into the circle around Rustin — Kahn, who went from Max Shachtman’s famously clever socialist faction to strategizing for Rustin, and from there to strategizing for the AFL-CIO. How, then, could someone like Carmichael, with any number of friends and comrades from the world of Jewish support for the black cause, have made his way to “rock bottom,” in the Harvard dean’s phrase, amid the ancient superstitions and the belief that African-America’s oppressor was international Jewry? The descent into that sort of thing can make people wonder if some terrible incident didn’t drive him to a crude response — a knavish landlord, a nasty math teacher, a catty high school clique, or who knows? 

    But those are silly speculations. Carmichael was a serious man, and his evolution was a matter of serious reflection — a matter of intellectual sophistication, in some degree, and not a lack of it. The classic civil-rights idea, in Rustin’s version, was itself a mighty sophistication. It was an internationalism, with inspirations drawn from India’s anticolonial rebellion and the non-violent philosophy of Mahatma Gandhi, mixed with support for the anticolonial campaigns of black Africa, which Rustin deftly blended with still more inspirations drawn from multiple currents of American Protestantism and the African-American tradition, and still more inspirations from the social-democratic wing of the labor movement and their Shachtmanite advisors, together with the particular corner of American liberal reformism that tended to be Jewish. That was a fabulous concoction. But those were ideas from the 1940s and 1950s. 

    Young Carmichael was a man of the 1960s. He drew his own inspiration from Frantz Fanon, the philosopher of decolonization, who was a psychiatrist from Martinique — and Fanon’s ideas seem to me key to this development. Fanon was angrier than Rustin, and bitter — which accounted for his appeal to a younger generation. And he was an ambitious thinker. His ideas unfolded in phases. Initially his project was to sharpen and affirm a black consciousness adapted to the mid-twentieth century — a trans-national black consciousness, suited to his own French Caribbean, to the blacks of France, to various regions of West Africa, and even to the blacks of the United States. He became active in the Algerian struggle against France, and he extended his purpose to speak on behalf of revolutionary Arabs as well, though I am not sure that his insights into Arab consciousness amounted to much. He broadened his purpose still again to animate and illuminate what he considered to be a worldwide program of anti-colonial revolution and post-colonial development — which mostly meant the black and Arab worlds, with side glances at other corners of the globe, sufficient to suggest the universality of his ambition.

    Ultimately his goal was to help the whole of humanity achieve a full and undeceived self-awareness. This is the self-awareness that is made possible, at last, by a human recognition from others. He wanted to promote a self-awareness of this sort among blacks internationally, and among broader populations of color, and then universally. He was, in sum, an unapologetic Hegelian, and, given his background in Martinique, his Hegelianism had an undeniable power, analytically and emotionally. Hegel was, after all, the philosopher who stipulated that slavery and the struggle against it are the starting point of all of history — which might sound like a philosophical metaphor to people in other parts of the world, but was actually the case in the Caribbean.

    C. L. R. James, the Trinidadian intellectual, was Fanon’s predecessor in thinking along those lines. James wrote a history of the Haitian slave revolution, The Black Jacobins, in 1938, which was also a contemplation of the African decolonization movement, and, in doing this, he, too, gazed on events through a lens of Caribbean Hegelianism. Only, James’s Hegelianism was Marxist. He converted Hegel’s abstract categories — the Master, the Slave — into concrete realities of class struggle, where the traits and interests of one class might intermingle with traits and interests of the other class. James’s anger at slavery was volcanic, but his Marxism allowed him, even so, to identify ways in which the Haitian slaves, who had every reason to hate the French, were able to borrow ideas and ideals from France. And the slaves were able to benefit, if only fitfully, from the solidarity of France’s revolutionaries, and were able even to offer a solidarity of their own to France’s revolution, quite as if the struggle, which was deadly, contained within it a negotiation. And the negotiation pointed to a possible better future — which made for an angry book that was also a subtle book.

    Fanon’s Hegelianism, though, was not a Marxism — not in his early book Black Skin, White Masks, and not in his more famous The Wretched of the Earth, in 1961 (even if the title is a line from the revolutionary anthem The Internationale). Fanon recognized the reality of economic conflicts and struggles. But his vision of the world emphasized, instead, conflicts that were psychological, or perhaps cultural. He recognized the existence of social and economic classes, but his vision of the world emphasized the clash of entire nations against one another, instead of social classes. These were the colonized nations against the colonizing nations, and their struggle was the global struggle of the Third World against the European empires (and the second Europe that is the United States). Sometimes he spoke of entire races, and not just of nations. In a number of rhapsodic passages here and there, he spoke of a higher synthesis emerging from the worldwide conflicts. But mostly he pictured a struggle that was going to lead to a victory for the colonized and a defeat for the colonizers, or the opposite, without any intermingling of traits that might contain within it a hidden negotiation, and without much prospect for a higher synthesis, except in the vaguest of ways.

    Fanon was not, on these points, a proper Hegelian, which he punctiliously acknowledged. His vision of the struggle was blunter than Hegel’s, and blunter than James’s, and the bluntness led to a strictly violent concept of the struggle. He considered that violence was unavoidable for the oppressed. And he considered that, in some respects, violence was positively good. In his view, power relations defined identity, such that the oppressed were defined by their oppression, and not by any cultural or religious wealth that might be their own. (That is why, in The Wretched of the Earth, the various colonized nations are indistinguishable, one from another, since all of them are victims of the same colonialist oppression.) And since the oppressed are defined by their oppression, the only way for them to assert a new and better identity and resolve their psychological problems is through an exercise of force, which means violence. Gandhi and the Gandhians and their American civil-rights emulators considered that non-violence was a tactic which was also a principle. In their eyes, non-violence conferred meaning. But Fanon looked on violence as a tactic which was also a principle. It was violence that conferred meaning. Violence was therapy for the colonized. Violence allowed oppressed people to become fully human, or “men.” 

    In his recent biography of Fanon, The Rebel’s Clinic, Adam Shatz offers a satisfyingly intelligent and thorough account of the man, and argues that Fanon has been subject to a lot of unfair criticism on the violence question. “The violence of the colonized,” in Fanon’s interpretation, as Shatz explains it, “was a counter-violence.” The imperialists were to blame, not the enemies of imperialism. This explanation may not survive a reading of The Wretched of the Earth. Something is alarming in Fanon’s odes to the violence of the oppressed: “At the individual level, violence is a cleansing force. It rids the colonized of their inferiority complex, of their passive and despairing attitude. It emboldens them, and restores their confidence,” and so forth. Violence makes Fanon sit upright in his chair. He is electrified. He was, in this respect (and in other respects), a true disciple of Sartre, who spent a lot of time sitting upright in his own chair, excited at the prospect of open conflict. Or Fanon ended up looking like Georges Sorel, the syndicalist — Sorel, the author of a once-famous book called Reflections on Violence, whose revolutionary doctrine rested on the direct-action anarchists of the 1890s, and found encouragement in the violence of the frightening lumpenproletariat, and hinted at the fascists of the years to come. 

    You could be excused for wondering if the nationalism-violence-and-lumpen combination in Fanon’s imagination likewise didn’t flirt with extreme-right possibilities — though Fanon was plainly more a man of moods than someone with an extremist vocation. And, in scattered passages of The Wretched of the Earth, his better judgment allowed him, as his biographer correctly observes, to grant that violence was not, in fact, an ideal, and could even be a big mistake, tactically speaking. And Fanon was eloquent, finally, on the meaning of freedom. But this meant only that, on a series of fundamental questions — violence, the nation — Fanon was ambiguous. 

    His emotional force, though, his power of condemnation, which was a power that came from being frank — this was not ambiguous. The anger in him and even the ambiguities seemed to speak for vast percentages of the human race — the vast percentages that were in the course of throwing off the European empires and trying to construct a new world system. The man himself was appealing, with his enthusiasm for ideas and his effort to get at the real psychology of people, and this made it easy to overlook his infelicities of one sort or another. If he contradicted himself, which he did almost systematically, this, too, was not without appeal. He was a man in a hurry because world events were in a hurry, and there was no time to straighten out every little contradiction. Besides, he was immensely self-confident, and self-confidence made him glamorous. 

    His glamour was rendered official, too, by an endorsement from Sartre himself — Sartre who, in the 1960s, floated on a sea of worldwide prestige. Sartre endorsed him by writing a wild preface to The Wretched of the Earth, more violent even than Fanon himself: “Murderous rampage is the collective unconscious of the colonized….” And on every continent, the hippest-of-the-hip in the 1960s, who were the young, understood intuitively that Fanon’s ideas and even his excesses were the spirit of what was, in fact, a revolutionary age. Wasn’t that Stokely Carmichael’s experience? I am sure that it was. I imagine Carmichael turning Fanon’s pages and saying to himself, “Yes, that is me he is talking about. And the world he describes is the world that actually exists.”

    I imagine this because, in a fashion that could hardly be more different, that was my own experience. My copy of The Wretched of the Earth — the copy on my table right now — is a $1.25 paperback, which I purchased in 1969. The faded yellow magic marker lines running through its pages remind me how earnestly I studied it. I did that at Columbia College in the spring semester of 1969, under the guidance of my professor, Edward Said, who himself was still in a stage of voraciously absorbing influences from Fanon and the French philosophers. I took from my reading that Fanon’s Wretched of the Earth offered a schema, which was neither liberal nor Communist, for analyzing every conceivable thing. The violent passages — there were many — alarmed me not at all. “For the colonized, life can only materialize from the rotting cadaver of the colonist,” wrote Fanon, and the rotting cadaver seemed to me, from my standpoint at the age of nineteen, creepy with energy, which made it marvelous. I, too, believed that Fanon spoke for vast portions of the previously silent or silenced human race.

    Only, I found myself wondering about the many populations that might not fit into a simple tabulation of the colonized and the colonizers. Not everybody does fit into those two categories, after all, or into any two categories. 

    The Jews, for instance — where did they fit? I wasn’t much concerned with Jewish issues, but, still, as I bent to the task of drawing yellow lines, I did wonder. And I wondered again as I faithfully attended a campus teach-in, at the urging of my professor, in order to learn about the secularist and progressive ideals of the Popular Front for the Liberation of Palestine, who were represented to me as the true exponents of Fanon’s philosophy, but whose secularist and progressive ideals left me uneasy — as if a little voice whispered in my ear that, fifty-four years later, the Popular Front for the Liberation of Palestine was going to participate, as it did, in the October 7 massacre. So I responded with excitement to Fanon, and also grew reserved.

    Now, Fanon himself, it must be said, did give some thought to Jewish questions. He ruminated over the psychological situation of the Jews, perhaps more than he ever did over the psychological situation of the Arabs — though mostly he did this in reference to his study of psychological circumstances among the blacks, which was his chief concern. His thoughts were sympathetic. In Black Skin, White Masks, he made clear that nothing in his sympathy for the Jews and their plight was begrudging: “Anti-Semitism hits me head-on: I am enraged, I am bled white by an appalling battle, I am deprived of the possibility of being a man, I cannot dissociate myself from the future that is proposed for my brother.” He understood that hatred of Jews and hatred of blacks tallied up, in the last analysis, to the same sum. “It was my philosophy professor, a native of the Antilles, who recalled the fact to me: ‘Whenever you hear anyone abuse the Jews, pay attention, because he is talking about you.’” Or, in other words: “an anti-Semite is invariably anti-Negro.” 

    He drew up comparative observations on the oppressive prejudices that descend variously upon Jews and blacks, and on Jewish and black psychological reactions. He was tolerant and charitable. He proposed a diagnosis of a Jewish psychiatric patient who, in “a fine example of a reactional phenomenon,” angrily and pathetically sided with the anti-Semites. “In order to react against anti-Semitism,” Fanon explained, “the Jew turns himself into an anti-Semite” (with the acuity of this diagnosis revealed by the fitness of his present tense). 

    But Black Skin, White Masks is not so widely read. In The Wretched of the Earth, he occupied himself with other matters. Even there, though, he paused to note, if only in passing, that Germany was paying reparations to Israel, which he seemed to approve. And his tone left no doubt that the approval extended to Israel itself, even if he did not spell this out explicitly. Does this seem surprising? I suppose that, in the atmosphere of our own moment, Fanon’s evident sympathy for the Zionist project might, in fact, seem surprising. 

    But it should be remembered that, in 1961, when The Wretched of the Earth came out, an approving view of Israel was entirely normal and natural among intellectuals of the traditional left. Israel was, after all, a refugee state, and everyone on the traditional left did understand this. Israel was filled with people who, in Fanon’s phrase, “were forced to leave” other countries, and who, in their new country, which was also their ancestral homeland, were trying to avoid getting massacred — which made the Israelis objects of sympathy as a matter of left-wing instinct. The concept that a nation of refugees ought to be regarded as an imperialist imposition, soon to be erased (“the world’s last settler-colonial state,” as Adam Shatz confidently puts it in his Fanon biography), had not yet taken hold. Fanon made clear that he expected Israel to endure: he speculated about a new collective unconscious emerging among the Jews, after a hundred years of Israeli existence. And then, at the age of thirty-six, he succumbed to leukemia, and there was no opportunity to work up further thoughts on Jewish or Zionist themes. 

    Fanon’s very early death was a tragedy in a dozen ways, but one of those ways, I think, touches precisely on those themes. A man with his acuities and philosophical breadth, and his recognition of Jewish suffering and its complexities, might have been able to explain Israel to the Arabs in a fashion that no one else has been able to do — or so I like to imagine. He could have made clear that Jews fleeing to Israel from places like Algeria were not the equivalent of people from France deciding to become settler-colonists in Algeria. He could at least have pointed out a few realities along those lines to the American professors who pride themselves on being experts on oppression. Perhaps he could have explained a few things to the Jews, too, in his role as kindly psychiatrist. 

    Then again, in connection to Zionism, his early death might have been a tragedy also for the blacks of Africa and maybe in other parts of the world. He wanted to affirm a lucid black consciousness, wanted to define a distinctly black perspective, which meant that he wanted to cast off the white insistence on imposing white definitions on everything touching on blacks. His finest pages explored those themes. And, in connection to controversies over Zionism, there was an obvious point to make — obvious, I would think, to someone like him, who, in drawing up his analysis, paid careful attention to an additional sophistication that was Sartre’s. This was a sophistication in regard to dishonesty. Sartre fixated on what he called “bad faith,” which was a great theme of Sartre’s, maybe his greatest theme of all — a grand theme, at least, in Being and Nothingness, which Fanon made a point of invoking. “Bad faith” meant the particular mendacity of someone who knows the truth, but does not like the truth, and therefore prefers to lie about it, and lies about having lied. And it is the mendacity of someone who may even convince himself that his lies are truths, and his lies about lying are likewise truths — even while knowing that lies are lies. Bad faith, in short, is a twisted consciousness.

    The black perspective, then, in regard to Zionism — what was it? 

    What should it have been? In recent decades, the black liberation struggle has acquired a worldwide prestige that Fanon could only have fantasized about. The black struggle has become the modern ideal of a righteous struggle for a better world. And in the context of this development, the anti-Zionist movement, beginning in a small way in the 1960s, and continuing in a large way in the years after 2000, has taken to arguing that, in the modern age, Zionism ought to be seen not as one more liberation struggle, but as the enemy of liberation struggles. Zionism ought to be seen as a participant in the white supremacist and colonialist movements that oppressed blacks in the past. Zionism ought to be seen not as an enemy of Nazism and its systematic exterminations, but as a counterpart to Nazism. And anti-Zionism, by contrast, ought to be seen as the heir and brother of the black struggle. Or better still, anti-Zionism ought to be seen as indistinguishable from the black struggle, given that Zionism is white supremacism itself. The success of this argument has been, of course, extraordinary, which is why on various continents the anti-Zionist cause has acquired the supreme moral prestige of our moment, not just in the universities. 

    But someone with an orientation like Fanon’s can only notice that, amid the worldwide din on behalf of the anti-Zionist cause, the actual black liberation struggle — the struggle by actual black people, that is — has once again, exactly as in the past, been drowned out by non-black voices. And everyone knows this to be true, and pretends not to know, in a classic display of Sartrean bad faith. The largest ethnic horror of the last several months has taken place, after all, within the Arab world, but not in the poor stricken corner of it that is Gaza. The ethnic horror has been the sustained assault on the Masalit people of Sudan, who are black, conducted by the predominantly Arab forces in Sudan’s renewed civil war, with disastrous consequences for the blacks, measured in deaths (Le Monde has put the figure at something like 75,000 in the last several months), and rapes, and a refugee crisis consisting of eight million people, and a dire food shortage for some eighteen million people. I say that everyone knows this because these events do get reported, not just in obscure human rights publications. 

    But the anti-Zionists have succeeded in commandeering the language of black liberation, and they have used the language to drown out the actual blacks who are suffering. To drown out the cries of victims in other parts of the world has been a main function of the anti-Zionist movement for many years now. This point was elegantly made as long ago as 2001 by Bernard-Henri Lévy in an essay called Les Damnés de la guerre, or The Wretched of the War, which invoked Fanon in its title. (Lévy’s Les Damnés de la guerre rhymes with Fanon’s Les Damnés de la terre — though Lévy’s rhyme disappeared in the English translation, and along with it the invocation of Fanon.) But who will make that point about anti-Zionism today? Anti-Zionism as an instance of what used to be called “false consciousness”? And who will point out that, by contrast, a function of Zionism itself — when Zionism is healthy — is to raise a cry on behalf of the tiny nations, instead of the enormous nations: the little populations that, like the Zionist nation itself, are surrounded by enormous hostile states and populations?

    So the voices of the Masalit people go unheard by everyone around the world (even though everyone does, in fact, hear), except the specialists in human rights and a handful of reporters. At Columbia University just now, as I am writing, the student uprising is led by the group called Columbia University Apartheid Divest, with reference to the white supremacist social system of not-so-long ago in South Africa — quite as if the uprising at Columbia amounted to an uprising in favor of oppressed blacks resisting racism in Africa. 

    But the Columbia uprising merely claims to have done so, with its invocation of apartheid — oh, perhaps with a perfunctory nod to Sudan now and then, in passing. A main student leader of Columbia University Apartheid Divest has become famous, instead, for saying, “Be grateful that I’m not just going out and murdering Zionists” — which is not, after all, a bizarre thing to say, since it merely reprises Stokely Carmichael. And it reprises the Hamas charter. But since everyone by now has read the charter, everyone ought to know, as well, even if they do not know, that in Article Thirty-Four and elsewhere the charter calls for slavery, too. Perhaps that deserves a comment, too? But no one is going to ruminate over Islamist fundamentalism and the history of Arab marauders attacking African blacks. 

    Mightn’t those be Fanon’s observations, if he were alive? — the observations that reflect a black bitterness, alert to the layers of falsity that bear the stamp of bad faith? But I do not pretend to know. I am not Fanon. And I am not an oppressed Sudanese. But I am definitely sorry that he is gone. 

    Fanon died in 1961, the same year as the publication of The Wretched of the Earth. His outpouring of acute moral observations and psychoanalytic complications and simple and too-simple angers and political analyses reached its end. And, under those circumstances, his readers were bound to succumb to the allure of his sharp division of world affairs into a conflict between the good nations and the bad nations. And his readers were bound to succumb to that idea without regard to what anyone’s ideas and intentions might be, on the assumption, which was his, that identity is conferred by power relations, and not by what people actually think and believe.

    The Algerian Revolution figured within the larger Arabist movement, which, in its struggle against the French and British empires, could only be seen, from Fanon’s simple standpoint, as the last word in progressivism. But the Arabists also nursed an absolute hostility to Zionism, which suggested that absolute hostility to Zionism must be, by definition, likewise a progressive sentiment. Everyone who thought along the lines of good nations versus bad nations was likely to reach that conclusion. And everyone was likely to set aside as irrelevant the ideas and intentions of the Arabist movement in regard to Zionism and the Jews. 

    But what if ideas and intentions do, in fact, matter? What if Fanon’s habit of excluding ideas and goals from his analyses was bound to produce a systematic blind spot? The grand French philosophers could never make up their minds on this question — on whether ideas matter. Or should every struggle around the world be judged simply on the basis of who appears to be down and who, up? Sartre was a model of confusion on these matters. His sympathy for the downtrodden led him to line up on the Algerian side against the French, and likewise to line up on the Palestinian side against the Zionists. He did so with a lot of vehemence, too, such that, having applauded random violence against the French in Algeria, he went on to applaud Palestinian violence against random Israelis, too — Israeli athletes, for example. He thought hostility was justified, and he did not worry about how the hostility was expressed, so long as it was, in fact, expressed, and the more ferociously, the better. He, too, thought that violence has meaning. This made him an exciting thinker, of course. He took ruthlessness to be the sign of honesty, and he was the philosopher of honesty. 

    Then again, it ought to be obvious that Sartre went a little crazy in these ways. Fanon in Black Skin, White Masks speculated about a syndrome that he called, drawing on the psychoanalytic literature, a “Manichaeanism delirium.” This meant a delirium based on the Manichaean idea that everything in the cosmos reflects a battle between Good and Evil, in eternal conflict. In Sartre’s case, surely it was a Manichaeanism delirium that led to his repeated impulse to applaud murderous violence against people whose guilt, if they were guilty, was merely a matter of extrapolation or second-order imputation. 

    And yet, Sartre did live through the Nazi era and the German occupation, and though his knowledge of Jewish life was minimal, he drew the requisite conclusions. He had no trouble recognizing that anti-Semitism’s victims were one more downtrodden population. And, though he would never have put it in these words, Zionism was the downtrodden’s obvious resort. So he roused himself from his delirium sufficiently, at least, to sympathize with Israel, after all. In 1967, when the Six Day War broke out and Israel’s survival appeared to be at risk, he put his prestige on Israel’s side. It was a choice. He was reluctant to make the choice. He wobbled. He had to be pushed. Still, he did it — which might seem impossible, given his preference for the Palestinians and his deliriums. But vacillation is conscience, sometimes. And conscience, too, is honesty. And Sartre did not mind looking foolish, so long as he was authentic, or, at least, looked authentic.

    Fanon’s widow was furious when she heard that Sartre had chosen to stand with Israel. But when I look back on Fanon’s Black Skin, White Masks and its impassioned pages about solidarity with the Jews and hostility to anti-Semitism, I find it easy to imagine that Fanon himself, if he had lived, would have thought hard about Sartre and his choice. I can even imagine that, gathering his courage, Fanon might have joined Sartre in his waverings, perhaps in recognition that he, too, had succumbed for long periods at a time to a Manichaeanism delirium, all too visibly in whole portions of The Wretched of the Earth. And he ought to pull himself together. Fanon’s biographer, Shatz, gives this possibility a thought and declares it “unlikely.” But I wonder if Shatz, for all his admiration of Fanon, hasn’t underserved him in certain ways, mostly by underplaying how seriously he took his engagement with Jewish themes and how closely his instincts meshed with Sartre’s.

    Sartre’s waverings, in any case, set a pattern. Michel Foucault followed that pattern a few years later. Foucault watched the Iranian masses overthrow the tyrannical Shah. That was in 1978 and 1979. Foucault watched the Islamist mullahs come to power. And he was ecstatic. He considered that Iran’s revolution was an outbreak of freedom, which led him to spend some time in Iran, relishing the joys. But the time he spent there led him to discover that Iran’s outbreak of freedom was actually a festival of ideology, and the ideology was anti-Semitic. Iran’s revolution against tyranny was an outbreak of tyrannous bigotry. Which Foucault found repulsive. And the ideas and intentions that people cultivate do matter, and what may appear at first glance to be progressive may turn out to be, at second glance, not so progressive. Such were, in any event, the repeated shaky conclusions of the wobbling French philosophers — not all of them, but perhaps the greatest of them, whose wobblings might be the very thing that rescued them from the temptation of philosophy, which is rigidity. A consistent philosopher is, after all, a madman.

    And in America? Stokely Carmichael, the sophisticated young champion of Black Power, took his own view of these matters. His instinct was to accept the national-identity vision of worldwide struggle, and not to engage in any wavering of his own. So he accepted Arab nationalism’s absolute hostility to Zionism, and he preferred not to fret over any contradictory aspects or complexities that might have crept into the hostility. This required, of course, a willful blindness on his part. The accusation against Zionism — the real-life accusation, and not just the philosophical ideal of an accusation — was a layered affair, and contradictory aspects and complexities did creep into some of those layers, and they did so from the start.

    Is it inappropriate for me to note what those layers were? On the surface, the anti-Zionist accusation was a local accusation about land, which was easy to understand. At a deeper layer, it was a more grandly scaled accusation about imperialist colonization, which could seem accurate, if you viewed it from one angle, or maliciously distorted, if you viewed it from another angle. There was a still deeper layer to the anti-Zionist accusation, the bottom-most substratum, which was theological. This is what you can see in the Hamas charter. It was an accusation against the Jews drawn from ancient Islamic texts, as interpreted by the grandees of the Muslim Brotherhood and the modern Islamist movement, who made clear that Judaism was a plot against the Prophet Muhammad and the whole of Islam. A cosmic crime. 

    The accusation drew on a German influence from the later 1930s and the early 1940s. This, too, you can see in the Hamas charter, with its scrupulous invocation of the mother-document of modern European insanity about the Jews, The Protocols of the Elders of Zion, respectfully cited as if The Protocols were one more revered and ancient Islamic text, which they are not. The Protocols are a compendium of nineteenth-century European fantasies about Jewish conspiracies, which were published as a hoax in Russia in 1903 and went on to enjoy a spectacular success on the extreme right.

    Adolf Hitler invoked The Protocols in his own charter, which was Mein Kampf. The German government during the Hitler era distributed The Protocols in Arabic and other languages in the Middle East, where they enjoyed still more success because they appeared to confirm and modernize the many imprecations against the Jews in the ancient Islamic texts. The accusation against Zionism, then, managed to compile in layers the reasonable and the absurd, the progressive  and the appalling, the Middle Eastern and the European, the ancient and the modern, all squeezed together into a sandwich of resentments, loyalties, exaltations, ideas, theologies, and superstitions. 

    But attention to complication was foreign to Carmichael’s image of himself. He read the wavering French philosophers (his reading of French philosophers figured in his own glamour), but he chose to be a radical instead of a philosopher, and he signaled his radicalism by choosing Fanon as his favorite philosopher. A radical is defined by his refusal to waver. Carmichael preferred, instead, to provoke. The TV interviewer David Frost famously asked him who among white men he most admired. And Carmichael boldly displayed his fidelity to the anti-Zionist cause by answering in the fashion that the leaders of the Muslim Brotherhood have always liked to do, which is slyly and provocatively, with the explanation that, although he felt no admiration, the greatest of white men was Hitler, “a genius.” Hitler — even if “what he did was wrong, was evil, etc.”

    That was in 1970. The interview shocked a great many people who admired Stokely Carmichael. This he must have enjoyed. All those kids he used to know at Bronx Science? Bayard Rustin? He stuck it to them! But no one should have been surprised. The cartoon in the SNCC newsletter in 1967, the one that reappeared at Harvard back in February of 2024, had already made obvious what sort of intellectual evolution was at work. 

    The cartoon was a small thing, artistically speaking. Ideologically speaking, though, it was capacious. It was the anti-Zionism of the Middle East in its grotesque sandwich version, minus the tasty fundamental ingredient of Islamic theology, which was not suitable for Western palates. This meant a joining together of the global revolutionary left and the extreme right, anti-imperialists and fascists alike — a cartoon whose iconography drew on the Cuban left-wing poster-art style of the 1960s (visible in the machete that is about to sever the nooses), and drew as well on Nazi graphic art of the 1930s and 1940s. Or perhaps the cartoon drew on the iconography of the anti-Semitic campaign during the Dreyfus affair in France in the 1890s, with its images of a hidden and sinister Jewish power, lurking fiendishly over a helpless world.

    This fateful and miserable cartoon, then — how did this minor revenant make its way into the Harvard University turmoil, five months after the October 7 massacre? Harvard has established a further commission on anti-Semitism, now that a first commission has fallen apart, and the members of the new commission, unless it, too, has fallen apart, are bound to pause over that cartoon and its ghostly reappearance. But I suspect that inquiries into university anti-Semitism are never going to get to the heart of this particular controversy, nor any of the related controversies across the academic world. 

    There is a problem even in the subject of the inquiry, which by now everyone has come to notice. The definition of anti-Semitism, after all — how is that going to be nailed down? If someone says that anti-Semitism nowadays consists of holding Israel to standards that apply to no other country, somebody else is bound to reply, “Well, I do think that Israel is the worst country on earth. And a white settler-colonialist state has no right to exist just because it happens to be Jewish. And how dare you drag in the Nazis! These are slanders demagogically deployed to prevent large numbers of us from expounding the well-supported human-rights conclusions of our scholarly research, which are endorsed by seventeen Jewish professors!,” and so forth — which will sink the inquiry into a muddle from which only bubbles will rise to the surface. 

    If I were a university president with the autocratic power to make professors do what I want, I would mobilize the more level-headed ones under my command to undertake a broader investigation. This would be an inquiry into a climate of opinion that hovers over the university humanities departments and maybe a few other places, and over the art world and the literary world, and seeps at times into the mainstream press. The climate of opinion is conventionally described as a leftism. But I think it is more usefully described as a politicized legacy of the avant-garde, which is why the arts and the humanities departments tend to be its principal center, instead of the social sciences and the economics department, where left-wing opinions normally ought to flourish, if they are going to flourish at all. This is the avant-garde that has oscillated for more than a century from extreme left to extreme right, and from the marvelous to the horrendous, and back again, always in pursuit of a single notion, roughly speaking.

    The single notion is the idea that deep truths lurk invisibly beneath the falsities of modern life, and, if only the truths were revealed, a new era would dawn. The new era might be described in different ways. It might be a new literary religion, according to the splendidly creative anarchist poets and their friends in France in the 1890s, who largely founded this strand of the modern avant-garde; or a return to the barbarian glories of authentic experience, according to the right-wing German philosophers in the 1920s and 1930s; or a social emancipation, according to the French postmodernists, whose genius consisted of drawing together the artistic flashes and playfulness of the left-wing poets and the profundities of the right-wing philosophers. And the deep truths might likewise be described in different ways. They might be a binary truth of language, based on a contrast of signs and differentiations. Or a binary truth of music, based on a contrast of sounds and silence. They tend to be, in any case, almost mathematical in their symmetries. They are elegant truths, pleasing to the religious imagination, or the Platonist imagination, or the poetic imagination — truths suited, in short, to the arts and to the vagaries of metaphysics.

    The version of this sort of thing that has lately condensed into a climate of opinion in the humanities departments and the world of the arts is less abstract, therefore less pleasing. But the simplicity has remained appealing. It is a social analysis, in which the deep truth is considered to be Fanon’s conflict between the colonized and the colonizer, or the oppressed and the oppressor. Everyone has noticed the more-than-political success of this analysis. You see it in the art reviews, where the critics are likely to detect in the biography of artists under review an aesthetic of oppressed-versus-oppressor, whose dialectic accounts for whatever it is the artists may have done. Or you see it in the museum labels of older works, where the artists of the past are routinely deplored for having contributed to dreadful oppressions of times gone by, instead of doing what artists are supposed to do, which is to advance the progressive cause. Or you see it in the art itself, which turns out to be a visual commentary on an imputed verbal text, which, by implication, recounts a story of oppression and resistance.

    This sort of thing may strike some people as very exciting for political reasons, or for moral reasons. It may seem uplifting, the way that bourgeois arts in the nineteenth century were supposed to be uplifting. The excitements may be philosophical and aesthetic. There is a satisfaction in supposing that art can be reduced to a dialectic of two elements. To see complexities and simplicities dissolve into one another is always stimulating. And if other people see a species of higher idiocy in the relentless art-world insistence on radical reductionism and moral sermonizing — well, better yet! Provocation is beauty, and beauty, provocation.

    But the primary victim right now of this sort of thinking has turned out, somehow or another, to be the Jews. I suppose the somehow-or-another has been inevitable, given the allure of the either/or habit of mind. Stokely Carmichael was a man of our own moment, in this respect. It ought to have been obvious, in connection to Israel and Palestine, that reductionist simplicities of the colonized/colonizer sort were never going to apply in any ordinary or realistic way. It is not just a matter of mistaking refugees for colonialists. Everybody does know, after all, or sort of knows, that a good half of the accused white colonialist settlers, perhaps a slight majority of them, fled to Israel from the Arab countries and the largely Muslim zones of Central Asia, not to mention a couple of Jewish demographic percentage points that fled to Israel from East Africa. 

    If the planet is to be divided into — still another Manichaean phrase — “the West and the Rest,” it ought to be obvious that Israel falls into the West and the Rest at the same time, ethnographically speaking, which ought not to be possible, Manichaeanly speaking. The war right now in Gaza may even hint at Israel’s bifurcated nature. Israel’s army and its commanders turn out to be extremely capable, disciplined, and conscientious in the style of a modern Western army. But the army and at least some of its commanders also appear to have worried about mass suffering only begrudgingly, and some of the better-known leaders of Israel’s disastrous government make a display of worrying about mass suffering not at all. Or they stand openly in favor of mass suffering, quite as if Israel, which appears on the map to be merely one more Middle Eastern country, may be, in fact, one more Middle Eastern country, militarily speaking. And just as Saudi Arabia’s anti-Islamist intervention in Yemen produced a humanitarian disaster, so has Israel’s anti-Islamist intervention in Gaza, even if not on the Saudi scale, in all-too-faithful conformity to the regional style.

    But everything about the prevailing climate of opinion in corners of the academy and in the world of the arts makes it difficult to look the various complexities and nuances in the face. So there are a great many people who gaze at Israel and prefer to see South Africa and its past. They do not see one more bloodbath in a history of even larger Middle Eastern bloodbaths. They prefer to see what the Islamists have always claimed to see, which is the crime against God, or the maximum crime of crimes, namely, an outright extermination of an entire people, such that “genocide,” the word, has become a catch-phrase. They see the Jews as Nazis, which has been a theme of the Islamist hysteria against Zionism for many decades. They decline to see anything at all about Hamas’s nature, doctrines, and practices, even if they do see those things. They see that resistance to what they imagine to be white settler-colonialism is righteous, and self-defense is monstrous. And the October 7 massacre seems to them — such is the logic, it is inescapable — a good thing, not just on balance. The October 7 massacre is a good thing absolutely. A good thing in the name of humanitarianism. And in the name of enlightenment, no less. It is a good thing, morally speaking, or psychologically speaking. An occasion for joy. Which some people express openly, even while denying that they want to kill the Jews; and other people merely infer, while denying they are inferring anything of the sort; and other people claim to oppose, but infer anyway. 

    The celebration of bad faith reaches its acme in the dreadful chants, “From the river to the sea” and “Globalize the intifada,” which mean, of course, the reduction of fifty per cent of the world’s Jewish population to statelessness (in the first instance) and a worldwide terrorist campaign against Jews (in the second instance) — but which, we are told, mean, instead, “human rights for Palestinians” and “spirited worldwide protest.” Except that everyone knows that, on the contrary, those slogans are ventures into transgression, which is why young people like to chant them. And no one wants to acknowledge what the transgression is. And no one wants to acknowledge how shocking it is that, in the United States and in France and perhaps in other places, a mass movement of students, led by the student elite, has arisen in favor of those unacknowledged transgressions. 

    What should the universities do? I would mobilize my imaginary committee to confront the broader climate of opinion as a whole. This would mean recognizing that the wave of virulent campus anti-Zionism, hidden and overt, together with the wave of virulent hatred in the art and literary worlds, amounts to something more than a failure of civility. It is an intellectual crisis. And the source of the crisis is not the students, and not a handful of radical organizations, either, even if the radical organizations are awful. Nor is the source merely the handful of professors who look and sound crazy. The source is a series of doctrines and assumptions that have degenerated from something authentically interesting into something grotesque, quietly presided over by professors  who look and sound not just reasonable but attractively up-to-date. It is a development similar to the intellectual degeneration many decades ago of the brilliant and fiery Stokely Carmichael, except on an enormous university scale. 

    I would mobilize my committee to inquire into the origin and evolution of the doctrines and assumptions, and the manner of their degeneration. My model for this would be Marx and Engels, who formed a two-person committee of their own to do something similar in their own day by composing a book called The German Ideology. This was a study of the German philosophers in their era and the climate of opinion they generated, with “ideology” understood in the Marxist sense, which is pejorative. I would mobilize my committee to produce something along similar lines, to be called The University Ideology. It would be a study of the delusions of the humanities departments and related fields in our own era, with “ideology” likewise meant pejoratively. An intellectual revolution would be my committee’s goal — a self-revolution in the universities, in the hope that the art and literary worlds might respond with similar self-revolutions. This would be wonderfully stimulating. 

    But it may be that self-revolutions are not every university’s first instinct. It may be that, in the university administrations, a good many people, having observed the coarsening of discussion and debate over the last few months, will prefer a different course of action. They will prefer to mount a scapegoat persecution, intent on singling out the more obstreperous students. They will blame the “outside agitators,” who plainly do exist. Or they will focus their attention on the more outrageous and embarrassing professors, who are not too numerous, in the hope that, if only the obstreperous, the outsiders, the outrageous, and the embarrassing were suspended, expelled, arrested, chastised, fired, or demoted, the universities could breathe in peace for a moment. And then, at last, the universities could move on to the main step. This will be a call for renewed civility, for academic freedom, for tolerance, and for reasoned debate. It will be, in short, a search for the perfect speech code.

    Am I right about this? If I am, the university response to the crisis of the last many months will end up as an institutional effort to avoid looking into what is fundamentally the problem, which is not an outbreak of incivility, but is, instead, a bad-faith bad turn, ultra-left and ultra-right at the same time, in the evolution of ideas, not just in the universities but in the art and literary worlds, not just in America, but also in Europe.

     

    The Heroic Illusion of Alexei Navalny

    Alexei Navalny was killed in the far north above the Arctic Circle, in the small town of Kharp, where the Ural Mountains are intersected by a railroad leading to the city of Labytnangi on the Ob River. This place of death, this scene of the crime, is not random. It puts a period to the argument with fate that Alexei Navalny carried on as a man and a politician — even, one could say, to his argument with Russia and its history. The man who came up with The Beautiful Russia of the Future as image and slogan died in the horrible Russia of the past.

    Approximately fifty kilometers southeast of Kharp, beyond the Ob, is the city of Salekhard. The sadly famous Road 501, the Dead Road, leads east from there. It is one of the last projects born of Stalin’s megalomania, a railroad branch to the Enisei River that would traverse uninhabited places unsuitable for construction across the permafrost and the swamps of western Siberia. All that remains of that pharaonic project are a few hundred kilometers of embankments, dilapidated camp barracks, and steam engines rusting in the tundra. And corpses. Corpses in nameless ravines and pits, without a cross or a marker, unknown, buried without funerals, the dead whose killers and torturers remain unpunished.

    This is the region of the Gulag, the wasteland of the murdered and the murderers. In these places, geography helps the work of the jailers, and the climate serves as a means of torture. Here, in this ideal geographic nothingness, a space beyond history, beyond evidence, the Soviet state cast out people doomed to annihilation. This is the place where Russia’s historical sin is preserved in material, sometimes even imperishable, form — permafrost, after all. Here lie Russia’s guilt and responsibility.

    Alexei Navalny’s political credo, which changed over the years and is not easily summarized, did have one constant premise, one characteristic feature. He denied — or rather refused to consider — the power of the totalitarian past. He would not recognize the genealogy and the continuity of state violence, and most importantly, its long-term social consequences.

    Yes, he would come to the Solovetsky Stone — a monument to the victims of communist crimes in Lubyanka Square in Moscow, consisting of a large boulder brought a great distance from the very first Soviet penal camp — every year on October 30, the day commemorating the victims of political oppression, and lay a bouquet at the monument: the proper gesture. But his image of the “real” Russia was always that of a tabula rasa, an ideal community over which the past had no power — the strange notion of a society that experiences the oppression of an authoritarian regime but somehow automatically aspires to democracy and is in a certain sense innocent, historically undetermined, without, so to speak, a medical record. His “beautiful Russia of the future” was already here, it already existed in the present, in his own generation, it only needed to be unblocked, unveiled, unpacked, affirmed in reality.

    Yet it is unlikely that he could explain how it came to be, how it was born. He announced it with the disproportionate confidence of a fakir with a grateful audience that also wished to believe that you can turn over a new leaf without acknowledging historical guilt or admitting historical responsibility, without recognizing the stubborn presence of the past, without punishing the criminals and thereby severing the umbilical cord of violence.

    Navalny told a fairy tale about a miracle. In classical myth, crimes and sins give birth to monsters, chthonic creatures, the embodiment of fate. He offered a postmodern reverse myth, the story that monsters are capable — simply by the force of history’s progressive course, or because you want to believe it — of giving birth to beautiful, ideal children. In other words, this was a rather spectacular case of a denial of trauma. It was premised on a population without memory and without unhealed scars. But history cannot be fooled. Monsters, if not completely killed, give birth only to monsters.

    The Chechen war raised and solidified Putin’s ratings, turning him into a national leader. That base and ruthless war turned the Russian Army into a punitive tool, because it not only fought the Chechen forces but also “pacified” the population. It was a war with tens of thousands of victims; a criminal war from start to finish. It certainly was a crime that on the scales of justice — and in common sense — significantly outweighed any number of stolen billions and any amount of cheating at the polls. It is strange to judge a murderer for the theft of office supplies, or to accuse a serial killer of forging lottery tickets. The right to life is the highest value. Vladimir Putin — like Boris Yeltsin before him — took away that right from tens of thousands of Chechens.

    Before 2014, before the annexation of Crimea and the war in Ukraine, this was Putin’s greatest crime. Without acknowledging the guilt and punishing the perpetrators in the two wars against Chechnya, which set Russia back on its old imperial and colonial path, and unleashed the spiral of state violence, and turned Chechnya into a “black hole,” a zone of lawlessness from where the lawless practices spread throughout Russia — without confronting all this, no bright and real “Russia of the future” would be possible. Without an answer to the cardinal question of the right to secede, without a recognition of the centuries of repressive policies toward ethnic minorities, the Russia of the future will always be the Russia of the past.

    Alexei Navalny was silent about the main crimes of the Putin regime and of Vladimir Putin personally. If you think about it, it seems inexplicable. Or, perhaps, explicable but not justifiable — but the explanation destroys the very concept of the beautiful Russia of the future that needs only to be released from Putin’s regime to emerge. Navalny was silent either because he did not consider the Chechen war significant or because he understood all too well that even the liberal part of Russian society did not care about dead Chechens, about crimes far away in the Caucasus committed in the name of Russia. The discouraging truth is that Russian society had grown accustomed to war; it no longer reacted to pricks of conscience, and it became alert only when it came to matters of personal interest — for example, the reforms of social benefits, or the crushing of hopes connected to the allegedly more liberal rule of Dmitri Medvedev (during whose administration Russia attacked Georgia in 2008), or the news that Putin would go for a third term.

    Then came 2014 and the invasion of Ukraine by Russian troops. The number of military and civilian dead was in the thousands, but Russia’s main opposition figure stubbornly continued to expose the economic crimes of Putin and his henchmen. As if no blood had been shed and international law was not being cynically and odiously violated. Whereas it could be said, in explanation of Navalny’s earlier behavior, that Russia’s war against Chechnya took place before he became a famous opposition politician, no such extenuation can be made of his diffidence toward the war against Ukraine, which occurred when he was already the informal leader of the opposition and a brand name.

    That extraordinary status, one would have thought, demanded only one strategy: to speak out against the war clearly and consistently, and to create a broad antiwar coalition. As we know, Navalny cannot be accused of cowardice. It was not fear of repression by the government that kept him from taking this path. But I am certain that in this regard he felt fear — a fear of a different kind, the fear of every populist politician. He was afraid of losing support.

    Again, this is just my supposition, but I think Navalny sensed that a radical antiwar position would not increase the number of his supporters but would in fact decrease it. In 2014–2022, almost all of Russia accepted Putin’s formula of pretend war, a limited conflict in which Russia was not even involved. Of course, everyone understood that Russia was deeply involved; I doubt that anyone was fooled by the clumsy camouflage, all those “volunteers” and “national republics.” The pro-war radicals demanded that the cards be shown without shame and organized in support of war. What did the antiwar people do? It would require a work of literature, a novel in the spirit of Musil’s The Man Without Qualities, to capture their delinquency — the mix of semi-apathy, semi-activity, intentions without intentions and protest without protest, that the liberal part of society used to delineate its Fronde: refusing to confront the issue with an either-or answer, continuing to cooperate with state institutions, seeking positive aspects in the capital’s urbanistic changes — in other words, living an ordinary life.

    Navalny, wittingly or not, played into the hands of that mass pretending to be a mobilized protest by lowering the drama and the ethical intensity of the situation, with his dominating anti-corruption agenda.

    In August 2020, after an obvious falsification of the results of the presidential election, the people of Belarus went out onto the streets. It was truly a mass protest, not like the Russian ones. For a few days it seemed that the situation hung by a thread: President Lukashenko could flee to Moscow, as President Yanukovych did in the Ukrainian revolution in 2014, to add to Moscow’s collection of retired dictators — or Putin could invoke the status of Belarus as a Union state and dispatch the troops to complete the annexation of Belarus.

    It was during those days that Alexei Navalny was poisoned by Novichok, the poison of choice of the Russian security agencies, which has been used in several attacks. There is much speculation on whether the intention was to kill him. My own view is that the more important fact is that Novichok was used, because it is the calling card of Russian state violence. It was a clear signal to all Russian oppositionists, and the poisoned Navalny transmitted the signal.

    It is very possible that Navalny had no intention of following the radical example of the Belarussians. But from the point of view of the Kremlin, he was the only person capable of stirring up a serious wave of protest in Russia, and that was why he was put into a coma, so that any Russian echo of the Belarussian turbulence would die before it was born. But those days left something like a legend of the peaceful protest that almost won — as opposed to the brutally violent repression of the Ukrainian protest, with the burning tires of Maidan.

    Nothing contributed to the demobilization of the pro-democratic community in Russia more than the temporary loss of its leader and the persistence of the narrative, presented as perfectly obvious, of peaceful protest, as if the liberal opposition had only to wait for the right moment (which would definitely come) for everyone to go out onto the streets. Then came the brilliant investigation by Bellingcat, which proved beyond a reasonable doubt that Russian special services had attempted to assassinate Navalny, and Navalny’s extraordinary phone call to one of his unsuccessful killers, when Navalny, playing a state official, literally forced him to confess.

    So the proof of the regime’s culpability was there. But again Navalny preferred bravado, laughter, the merry mocking of the stupidity of the agents. This, instead of a serious conversation about the system, about the institution of political murder that had reappeared under Yeltsin, about the dozens of people who were poisoned, shot, beaten to death: Politkovskaya, Shchekochikhin, Yushenkov, Starovoitova, Kholodov, Litvinenko, the Skripals, Nemtsov, Estemirova, and many, many others. (Vladimir Kara-Murza, for example, survived two poisonings and is now in a Russian prison.)

    When he recovered and was literally back from the other world like a mystical hero, Navalny could and should have presented Putin with the bill: to speak out about first principles and on behalf of everyone whose life Putin had taken. But Navalny did not submit the bill in full. Even though he had had one foot in the grave, he preferred to play (it was not an act, it was his nature) the apostle of the beautiful Russia of the future that does not demand unsettling revelations about the past.

    Some might say that this insouciance was the highest level of heroism, the highest bravery — literally to be resurrected and behave as if death had no business in your body and to troll the hapless executioners. I agree that there is courage there, and nerve. But sometimes it is more useful to be scared, to comprehend and proclaim the historical continuity of murders and murderers, to speak in the name of all who had been killed secretly, who were led in the 1930s to execution pits by the same Cheka agents with the same headquarters on Lubyanka. But that was not for him — too old-fashioned, perhaps. I can’t find a better word.

    It would have put him among the ranks of denouncers and prophets such as Valeria Novodvorskaya, whom the liberal public liked to put down with the humiliating tag demschizo — democratic schizophrenics who were rabidly against the regime. Navalny did not want to be a demschizo. He did not want to be a harsh and bitter prophet. He wanted to be the less distressing harbinger of hope.

    In the declassified archive of the Lithuanian KGB, I have seen documents concerning an attempted political assassination. In 1980, local officers wrote a letter to Moscow, to General Filipp Bobkov, head of the infamous Fifth (ideological) Directorate. It was about “special measures” that were planned for a Catholic priest. Of course the letter did not contain the word “assassination.” The Lithuanian KGB asked Moscow to approve the mission and to send two “technical specialists.” Later these “specialists,” disguised as traffic police officers, stopped the priest’s car on a night road under the pretext of checking documents.

    The priest was not surprised. He was used to highway patrols stopping him more frequently and combing through his papers. It was more of the usual harassment. But this time, while one Moscow visitor opened the hood and checked the engine number with the priest, another sprayed some kind of substance on the driver’s seat. A few hours later, the priest was brought to the hospital with a diagnosis of “radiation burns.” He survived. But a few years later he died in a very strange road accident.

    Nothing was said about it in his operational development file. And General Filipp Bobkov, who should have been in prison, after 1991 became a member of media mogul Vladimir Gusinsky’s Mediamost Group, which owned, among other things, the independent television channel NTV. Bobkov was head of security. Amazingly, the many decent people, the bold journalists who worked for Gusinsky, never asked any questions. How could their security be guaranteed by a monster? But Bobkov could scare off the pettier monsters who multiplied in the freewheeling early 1990s.

    The KGB’s murderous reputation and its connection to the political liquidations in the Soviet era were well known, even if not always proven. But no one quit, or even complained. Everyone chose the required cohabitation with evil. As did the citizens who later accepted as politicians and governors the generals who had fought against Chechnya, or accepted Andrei Lugovoi, the assassin of Alexander Litvinenko, as a member of parliament. It was a kind of covert social agreement that even the democratic community accepted: not forgetting the past completely, but also not exaggerating, not going to extremes, not shouting “killer” at killers. Or not shouting it loudly.

    This house of cards half collapsed on February 24, 2022, when the Russian army openly invaded Ukraine. Hundreds, even thousands, of smartphone cameras filmed how the Russians fought and showed it to the world in its full barbarity. Russians are often rebuked for protesting too weakly against the invasion. They defend themselves by pointing to the number of anti-war protestors who were arrested. I admit that I can understand people who would not risk protesting against the regime with the radical methods of Maidan. Not everyone can be a hero. The problem, rather, is that we did not notice, we were not aware of, how we ended up at that point of impotence.

    In truth, we got there in part because of all the lulling speeches about how Russia was actually different, how we were an undeniable force, how there were so many of us — all the naïve speeches that we are the power here and Putin fears us. One of the most frequent reactions of liberal Russians to the atrocities of Russian troops in the early weeks and months of the invasion was shock. Could Russian soldiers really behave this way? Some tried to shift the blame to the units from the Russian Federation’s national republics: it was the fault of those savages. But those who remembered the actions of Russian troops in Chechnya were not surprised at all. It was the familiar pattern of violence: mass reprisals against civilians, the wanton destruction of civil infrastructure, executions on the spot.

    Our civic memory seems to have had its long-term function disabled. Every political generation now starts from scratch, zeroing out the account of responsibility and denying (or being utterly shocked by) the continuity of violence in both the state and the society. Alexei Navalny was a brilliant tactician, but when it came to larger questions of morality and strategy he was a perfect avatar of this terrible tendency. 

    Russia’s open war against Ukraine revealed yet another fatal flaw in the Russian opposition: a systemic inability to think in decolonizing terms, an unwillingness to admit that Russia itself consists of subjugated and partially “digested” nations that have undergone, in the words of the Ukrainian dissident Ivan Dzyuba, a process of forced denationalization. Without the voices of these nations, without their equal representation in the opposition, no conversation about the future of Russia has the right to take place, or will lead to a just result.

    Navalnyism always bypassed or ignored the issue of national rights. When Navalny, who began his political career among Russian nationalists and made chauvinistic comments in the early period of his activism, emerged as a recognized leader, he turned out to be a kind of supranational democrat. He did not divide his supporters by nationality, or recognize their specific national demands; instead he addressed them as conventional people of goodwill who are conscious (or modern) enough to also rise above national feelings and unite for the sake of the beautiful Russia of the future.

    This point of view is dominant among the bearers of Great Russian Culture. It is a mixture of a sense of superiority, neglect, chauvinism, colonial-educational fervor — and a subconscious fear of finding out one day that these Others do not really want to be part of Russia at all.

    And here the approach of the irreconcilable opponents, Putin and Navalny, surprisingly converged. In Putin’s case, this perspective is clear. But it is sad to admit that his most talented and certainly his most relentless opponent turned out to be a hostage of the same imperial paradigm. Navalny had a chance to change history — but for this he had first to accept it himself, to hear voices in other languages presenting a historical account. And Navalny was too Russian for that.

    It is noteworthy that there is something here that Navalny had in common with Soviet Russian dissidents of the past. They (there were some exceptions who proved the rule) often treated with a cold lack of understanding the ideas and the agendas of dissidents of the national republics who spoke about the colonial role of Russia, about language rights, about the right to self-determination. To Russian dissidents, it probably all seemed too archaic, a dead-end; and without noticing it they regarded these aspirations for emancipation through the optics of high Russian culture, in which are embedded a hierarchy of cultures and the idea of the national as backward and obsolete. The texts of the dissidents of the republics of the USSR contain bitter philippics addressed to the proverbial “Muscovites” who talk about human rights, but the “human” in their human rights does not seem to have a nationality.

    Soviet Russian dissidents, whose personal qualities were certainly extraordinary, failed to become a tangible and independent political force. Navalny, by contrast, became such a force. But the dissident project contained at least half of the needed reckoning with the past: the memory of the victims of Soviet terror. Navalny’s project, directed into the future, did not offer even that half. Therein lay his power: people wanted to forget, to seal a compact of silence, as in Spain. And therein lay his weakness: because it was in that field of silence, in that agreement to forget about spilled blood, that Vladimir Putin’s multiple tyrannical ambitions flourished and eventually destroyed Alexei Navalny himself.

    Rationally speaking, Vladimir Putin had no need to fear Alexei Navalny. Navalny would not have been serious competition for Putin even in honest elections. He would have gotten a maximum of ten to fifteen percent of the votes of the liberal electorate — a lot, to be sure, but not enough even for a second round. Stories that Navalny would have beaten Putin are the electoral fantasies of his supporters, a fairy tale with a happy ending, with no convincing sociological or political basis.

    So Putin’s worries about Navalny, his fears of Navalny, were completely irrational. To understand them we need to take an excursion into the dictator’s head — a voyage that a writer, rather than a scholar, is perhaps best qualified to take.

    Putin was and is an officer of the secret police, counter-intelligence, a trained paranoid whose picture of the world is irreversibly deformed by ideological indoctrination and professional “education.” I have read enough internal KGB documents to say this confidently.

    The key word, the key concept, of this worldview is “object.” The idea is to depersonalize people, to cleanse them of subjectivity, of selfhood. Object of surveillance. Object of influence. Object of operative interest. Object of development. Yours or someone else’s. In the world of objects, no one acts on his own. There are always hidden reasons, there are always puppet masters. But Navalny’s personality, his charisma, his preternaturally unflappable spirit, was a challenging anomaly for Putin, who was certain that all people were objects; and this exception to the rule, this man who somehow could not be made into an object, created an almost superstitious feeling in him. It is known that Putin never called Navalny by his name until after he died in the camps.

    To better understand the genesis of Putin’s attitude toward Navalny, we must go back to Dresden in late 1989, where Putin was serving at the time. The local Stasi office and its corresponding KGB office were on the outskirts of the city, in an idyllic area of two-story villas near the Elbe. Right there, about two hundred meters away, stood a typical urban five-story building, which housed families of Soviet officers. A walk along those streets today reveals that it is all one tidy whole, a cozy corner where apartment and work are close together. The neighborhood is so sentimental, so gemütlich, so safe.

    But in late 1989 the coziness ended suddenly, when demonstrators surrounded the Stasizentrale building and blocked the KGB officers inside their neighboring villa. Descriptions of those turbulent days share an important feature: the protestors acted wisely and in an organized way, while both the Stasi and the KGB were in disarray.

    And biographies (and hagiographies) of Putin mention the moment when he allegedly came to the gates and calmed down the crowd that was ready to storm the KGB villa, behaving like the tamer of wild elements, a man with a cool head. Journalists recorded the event: Moscow was silent, Moscow gave no orders, and Putin acted independently, at his own peril.

    That was the moment, I think, of his deepest and most destructive fear. Those East Germans, the obedient sheep, the objects that he and those like him were used to bossing around — they rose up, they acted with a firm and free collective will, they invaded a space that he was used to considering private and inviolable. Besides the threat to him and his family, he must have felt a deep-rooted fear, which is always absolute in a state security officer, of people who turned in a flash into subjects of their own fate and history. It was no accident that on his first visit to Germany as president, just a bit more than a decade after the event, he travelled to Dresden. They had thrown him out of there, but he came back — and as master of the situation. Putin would carry that fear — call it the Dresden fear — throughout his life: the fear of “color revolutions,” of Ukrainian Maidans, of any street protests where he can imagine the sudden doubling of a crowd’s energy and the invented foreign power behind it, the eternal conspiracy of Western influence.

    There is another event in Putin’s life that is important in this context. In 1996, he was deputy mayor of St. Petersburg, a man little known at the federal level and unknown to the general public. Yet only three years later, in 1999, he was prime minister and Yeltsin’s heir. That promotion cannot be described in terms of a conventional career. There were no such careers. It is a Fata Morgana, a postmodern composition in which one of many (and a career counterintelligence agent to boot), accustomed to conspiracies and to manipulations behind the scenes, is suddenly elevated and made heir to the throne.

    He would spend the rest of his life arguing that it was not an accidental choice, that he in fact was a leader, a historical figure, a messiah; part of the historical pattern, not just Yeltsin’s whim. Hence his obsession with history, his search for the ideological genealogy of his power. He is like a commoner determined to create an aristocratic background for himself. Trained to be no one, anonymous, a gray man in a gray coat, he is possessed by a megalomania stemming from a deep fear of imposture. His perception is schizophrenically distorted: he is simultaneously sure of his right to rule the kingdom and waiting in dread to be finally exposed, in the fatal end of the play in which he was once assigned a role.

    The two fears converged in the figure of Alexei Navalny. In contrast to Putin, Navalny created himself. In contrast to Putin, Navalny was a genius of the masses, a born leader of the protest minority. Putin’s passion for history is profoundly pathological because it is only in the external world, the world of acts of power, acts of aggression, that he can confirm over and over that he really is the ruler. Taught to rule people through fear and submission, knowing neither love nor trust nor solidarity, he is prey to fears. Navalny terrified him.

    As declassified KGB archives in Ukraine and Lithuania show, the work of the secret police did not end when a target was arrested and sentenced, or when a political prisoner was sent to the camps. His file was sent with him, so that they could continue their persecution there. 

    They could try to compromise him, to ruin his reputation in the eyes of his comrades. To force him to change his views. 

    To incline him to self-denial, to compel his repentance, his denunciation of his previous activity. A combination of carrot and stick. Play, tempt, press, force.

    Judging from reports by Navalny’s lawyers and comrades, they did not play with him. They simply tried to destroy him. To kill him a second time. But Navalny had a lot of life in him. Actually, he, with his body, his character, and his strength, symbolized life. Life against death. His mistakes, misunderstandings, and failures were the qualities of a living person. And so he spoke out — albeit belatedly — about the criminality of the war against Ukraine. He spoke from solitary confinement.

    His surname came from the verb navalivatsya, to pile on, and it was the surname of a fighter. I do not feel it is necessary to discuss why he returned to Russia. He made his choice. The Christian connotations ascribed with almost religious fervor to that return, so as to make of him a redemptive sacrificial victim, are outrageously inaccurate. What he certainly never intended to be was a sacrificial victim. Not in a psychological, or legal, or sacral sense.

    Basically, this is the main lesson of his life, his main gift, his main legacy: you can live and act freely in Russia, you can live without feeling doomed, without acknowledging the right of the regime to punish or pardon, without a bent spine. That is how we will remember him: the harbinger of an unfulfilled hope.

    Real politics in Russia will emerge only when the subject of liberal-democratic thought becomes the question of what to do with the so-called Federation, in which the “subjects” of this Federation have no political subjectivity. What to do with a country that is a conglomeration of forcibly annexed nations, whose national identity has been and is being erased, whose culture is being Russified? What to do with the last empire, afflicted with residual imperial megalomania, and with a nuclear arsenal?

    Everything else is not politics, but a way of avoiding this urgent and extremely painful question. For this reason, the distance between Vladimir Putin’s United Russia and Alexei Navalny’s Beautiful Russia of the Future is not as great as it may seem. Both of these supposedly visionary concepts are just screens, a way to hide the real poverty of the political toolkit.

    Alexei Navalny’s utopia was futuristic, modernist, it functioned like a time machine, which is to say, he imagined that the future could simply be summoned rather than earned. The future drew its magic power from time as such: one day the future would come, and the future would put everything in its place, canceling the past. It is necessary only to live, to wait, as one waits for the change of seasons. Vladimir Putin’s utopia, by contrast, is retrospective: what makes us strong is our connection to the past, to the figures of our archaic ancestors, the victors in World War II. The West, which rushed into the future, is afflicted with moral corrosion, while we are becoming morally stronger because our future is the past.

    In relation to the real, historically conditioned Russian Federation, which began to unravel back in the 1990s, a process that was reversed by the ostentatious massacre of Chechnya and the establishment of an authoritarian regime, both political projects described above are mirages. Real democracy in the Russian Federation, which would give representation and political power to national minorities, will always (potentially) raise the question of political architecture, subjectivity and, in the end, independence.

    Vladimir Putin, the self-appointed tsar, will never understand or recognize this. Alexei Navalny could probably have understood. He could have learned, which is a capacity only of the living. He had come a long way, from flirting politically with street nationalism to fighting against tyranny. In a bitter irony, in the days after his death flowers for him were left at monuments to the victims of Soviet repression, an unwitting recognition of the continuity of Russian violence, which he tried to deny with his life.