The Job Poet and the Order of Things

    The writer responsible for Job is the greatest of all biblical poets and one of the most remarkable poets who flourished in any language in the ancient Mediterranean world. He is a technical virtuoso, deftly marshaling sound and rhythm for expressive effects, at times deploying brilliant word-play — as when he writes, “My days are swifter than a weaver’s shuttle, / they snap off without any hope,” the word for “hope,” tiqwah, punning on a homonym that means “thread” — and utilizing a vocabulary that is the most extensive of any biblical poet, with borrowings from Aramaic, an enlisting of rare words, and even an introduction of words that seem to be his own invention. His range of metaphors is inventive and often dazzling, drawing on cheese-making, weaving, horticulture, and much more. Had there been bicycles in ancient Israel, I suspect we would find a bicycle simile somewhere in his poem. He exhibits an interest in nature quite untypical of biblical poets. And no other poet of his time and place possessed his ability to link together different passages with recurrent terms and images, even over long stretches of text.

    We know nothing about this anonymous heterodox genius except that he probably lived in the fifth century B.C.E., and even that has been disputed. In the fluidity of forms that characterized the Late Biblical Period, it would certainly have been possible for him to frame his argument as prose, but poetry was an inevitable choice for him. The power of poetic expression gave him the means to articulate the full measure of Job’s anguish and of his outrage at having been severely mistreated by God, as well as to convey the dizzying span of God’s vision of the created world in the Voice from the Whirlwind at the end. And one should also say that he surely knew he had a mastery of the poetic medium and relished its deployment in the great work he produced.

    The outlook of the Job poet is a radical dissent from the mainstream biblical consensus, and in this regard, too, poetry was a powerful vehicle for him to express his dissent. In what follows here, I will be examining two rather long passages, the first a complete poem, in order to show how the resources of poetry enabled him to say what he wanted to say.

    The Job poet strategically frames his poetic argument by beginning with a harrowing death-wish poem that communicates 
Job’s acute sense that his existence has become so unbearable — all his children dead, his property in flocks destroyed, 
his body afflicted with an excruciating burning rash — that he wishes he had never been born. Here is the poem that takes up all of chapter three. (This and all the excerpts that follow are my translation.)

    Annul the day that I was born
    and the night that said, “A man is conceived.”
    That day, let it be darkness.
    Let God above not seek it out,
    nor brightness shine upon it.
    Let a cloud-mass rest upon it,
    let day-gloom dismay it.
    That night, let murk overtake it.
    Let it not join in the days of the year,
    let it not enter the number of months.
    Oh, let that night be barren,
    let it have no song of joy.
    Let the day-cursers hex it,
    those ready to rouse Leviathan.
    Let its twilight stars go dark.
    Let it hope for day in vain,
    and let it not see the eyelids of dawn.
    For it did not shut the belly’s doors
    to hide wretchedness from my eyes.
    Why did I not die from the womb,
    from the belly come out, breathe my last?
    Why did knees welcome me,
    and why breasts, that I should suck?
    For now I would lie and be still,
    and would sleep and know repose
    with kings and the councilors of earth,
    who build ruins for themselves,
    or with princes, possessors of gold,
    who fill their houses with silver.
    Or like a buried stillborn I’d be,
    like babes who never saw light.
    There the wicked cease their troubling,
    and there the weary repose.
    All together the prisoners are tranquil,
    they hear not the taskmaster’s voice.
    The small and the great are there,
    and the slave is free of his master.
    Why give light to the wretched
    and life to the deeply embittered,
    who wait for death in vain,
    dig for it more than treasure,
    who rejoice at the tomb,
    are glad when they find the grave?
    To a man whose way is hidden,
    and God has hedged him about.
    For before my bread my moaning comes,
    and my roar pours out like water.
    For I feared a thing — it befell me,
    what I dreaded came upon me.
    I was not quiet, I was not still,
    I had no repose, and trouble came.

    The first word of the poem, yo’vad, literally means “perish,” but unfortunately “perish the day” is no longer a viable English equivalent because the locution in our era has slid into prissiness (“perish the thought”). The effort of a couple of modern translators to give the expression punch in English (“damn the day”) inserts an inappropriate tone or implication because there is nothing about damning, in either the invective or theological sense, in the Hebrew. The transitive verb “annul” has the justification that the poem is all about expunging the day from the calendar. The two versets of this line exhibit an altogether original use of the dynamic of intensification from the first verset to the second. Job wishes not merely never to have been born but, moving back nine months, never to have been conceived. Thus the conventional poetic word-pair, “day” and then “night,” is given a startling new force.

    The poet then picks up “night” from the second half of this line and launches on a rich orchestration of synonyms for darkness. After the primary term “darkness,” he enlists “cloud-mass,” “day-gloom,” “murk.” (“Day-gloom,” kimrirei yom, seems to be his coinage, probably derived from an Aramaic root that indicates darkness, with the expression here possibly referring to a solar eclipse.) The poet’s tapping of the Hebrew lexicon for synonyms is evident throughout the book. (There are five different terms for “lion” in biblical Hebrew, and at one point he uses all five in two consecutive lines.) With elegant appropriateness, Job in this poem wants the night of his conception to have been “barren,” of course the opposite of conception. 

    In the process of intensification as the poem continues, he then moves up to a mythological register: “Let the day-cursers hex it, / those ready to rouse Leviathan.” A minor emendation to the Masoretic text yields “Yamm,” the primordial sea-god who is also Leviathan, instead of yom, “day.” (At this point, the King James Version commits one of its most lamentable errors, rendering the Hebrew for Leviathan, lewayatan, as “their mourning,” a rendering that is grammatically wrong and mistakes the noun for lewayah, a term for “funeral” in rabbinic Hebrew that is not biblical.) The second line in this verse makes the wish for darkness cosmic, still another aspect of intensification — no stars, a night without dawn, without morning stars. The concluding metaphor of this line demonstrates how utterly original the Job poet is in his deployment of figurative language: “let it not see the eyelids of dawn.”

    This is a daring, and beautiful, metaphor: the first crack of light on the eastern horizon is likened to the opening eyelids of the sleeper looking out to the east. Modern translators, deluded in thinking that readers can no longer understand metaphors, substitute for the metaphor what may be its referent, as in the Jewish Publication Society’s “the glimmerings of the dawn.” The Job poet knew that this was a remarkable metaphor, for he did not hesitate to use it again. This occurs much later, in a radically different context, in representing the fierce appearance of the daunting Leviathan, thus locating beauty at the heart of terror: “His sneezes shoot out light, / and his eyes are like the eyelids of the dawn” (41:10).

    The initial movement of the death-wish poem heads toward a conclusion by Job’s saying of the day he was born that it “did not shut the belly’s doors / to hide wretchedness from my eyes.” The prominent noun in the last phrase is a strong instance of poetic efficiency. We might expect here “life” or “the light,” but for this sufferer life itself has become nothing but wretchedness. In the next few lines, following the characteristic movement of biblical poetry from the general to the specific or concrete, we get an evocation of the physicality of birth: womb, belly, knees (presumably parted in birthing), and breasts giving suck.

    Job’s wish never to have been born joins with a panorama of human life, and it is a bleak panorama. Kings “build ruins for themselves,” imposing structures that inevitably crumble to dust (one thinks of Shelley’s “Ozymandias”), and princes store up gold and silver, futilely, for they will part from it in death. Verse 18 shows the poet’s firm sense of integrated structure, of which we will see a more spectacular instance in the Voice from the Whirlwind, for “babes who never saw light” takes us back to the early lines expressing the wish to be a stillborn and blot out the light. Everyone, in this despairing vision, finds repose only in death, the great equalizer. And what has life been for humankind? The wicked have troubled others, all are weary, there are prisoners and slaves and taskmasters. Existence is so universally miserable that everyone longs for death. In this way, Job invites us to see his wretchedness not as a special case but merely as a particular instance of the fate of misery shared by all. The large resonance of Job’s inveighing against God as he proceeds in his poetic argument derives from his seeing unwarranted suffering not as his alone but as the common plight of humankind.

    A single word toward the end of this poem exemplifies how this writer creates connective links in the overall structure of his work. In the frame-story, the Adversary says to God, “Have you not hedged him about and his household and all that he has all around?” (1:10). The verb here obviously has the sense of “protected.” But the same word in Job’s mouth here, “and God has hedged him about,” means the opposite: Job is complaining that God has blocked him on all sides and left him no way out of his terrible plight. This sharp play on two opposed meanings of the same word might suggest that the Job poet did not transcribe verbatim the old folktale that had come down to him, probably orally, but felt free, at least at this one point and perhaps at others, to modify its wording. Alternatively, the folktale may have included “hedge,” which the poet then played against. That same verb will occur twice more in the Voice from the Whirlwind, with still a third meaning, but I will postpone commenting on its use there until we have a long look at the God speech that concludes the poetic body of the book.

    The final lines of the death-wish poem aptly round off its argument. The third line before the end exhibits the expected biblical intensification from verset to verset: “For before my bread my moaning comes, / and my roar pours out like water,” from moaning to roaring. Job goes on to assert that he had lived in a state of anxiety, perhaps intimated in his offering sacrifices for his children because he was afraid that they had committed some offense. Finally, having longed for the quiet of the grave, he admits the realization of those fears. The last word of the Hebrew text is rogez, etymologically a state of disturbance or unrest, as this same verbal stem is used elsewhere to describe the shaking of earthquakes.

    The devastating extremity of Job’s wish for non-being raises a general issue. It serves the obvious purpose of introducing all that he will say by making clear how utterly unbearable his suffering is. He does not yet indict God, though that will follow, for he never ceases to believe in an all-powerful God, which means that God must be responsible for the chain of disasters inflicted upon him. But what good can poetry do in the face of intolerable and unwarranted suffering? The Job poet is clearly not alone in confronting this dilemma. In English literature, perhaps the most memorable instance is Lear driven out on the moor in the fierce storm, stripped of his possessions, cruelly rejected by two of his three daughters: “For I am bound upon a wheel of fire, / that mine own tears do scald like molten lead.” The Job poet would have appreciated this figurative
language, especially the way the wheel of fire is extended in the simile of the scalding tears like molten lead.

    In our own time, the signal instance of poetry conveying unbearable suffering may well be Paul Celan’s celebrated Todesfuge, or “Deathfugue.” It is a formally deft and beautifully crafted poem, but it embodies “a terrible beauty,” in Yeats’ phrase. The first two words, repeated as a kind of refrain, are a shock: “black milk.” The violent transformation of the nurturing substance of life into blackness will then work in tandem with another refrain-phrase, “Death is a master from Deutschland.” Celan exploits the resources of poetry to express the unspeakable outrage of six million of his people murdered in the industrial death-machine of Deutschland, the cherished homeland figured in the poem by the fair Margarete. What, in confronting such appalling realities, can poetry possibly do? The reflexive response to outrage against our moral instincts is, I suppose, a scream. Poetry of this order of greatness, from Job to Celan, transforms the scream into articulated, eloquent expression. I do not believe that it is really cathartic, but it communicates a feeling that suffering has been endowed with a sharp focus in language, the primary human medium — that outrage has been given a voice, a voice that gets across the full awfulness of what has occurred and therefore is, strangely, at once horrifying and satisfying. We somehow cling to our humanity in the face of horror.

    As the great poem of Job unfolds, it emerges that the work incorporates three different orders of poetry. Most prominent, until a new poetic voice appears at the end, is Job’s poetry. The poetry of the three comforters in the debate is by and large inferior to his. It is inferior because they are working with the complacent clichés of traditional wisdom, and so the poetry that they speak in their argument invokes many shopworn formulas. From time to time, to be sure, there are brief passages of strong poetry, because this writer was such a fine poet that he could scarcely refrain from intermittently giving the comforters a few good lines. Finally, when we come to the conclusion of the poetic body of the work, the poet takes a certain risk, at which he succeeds splendidly. 

    God is now, at last, speaking, and because He is God, He must be given poetry that transcends the stunning poetry of anguish spoken by Job. It is a challenge that the Job poet undertakes because he must have been confident in his own mastery. If I may propose a seemingly irreverent analogy, let me cite Molly Bloom’s soliloquy that is the concluding episode of James Joyce’s Ulysses. She has been a looming presence in the novel, especially in her husband’s thoughts, as we are taken through a rich variety of remarkable prose-poetry, some of it stream of consciousness and some of it in other forms. Then, at the end, we enter the living current of her unspoken words as she lies in bed, mulling over the day, her husband, her loves, her life, and this soliloquy proves to be prose-poetry arguably even greater than all that has come before. Like the Voice from the Whirlwind, it is poetic language that embodies a grand epiphany, the resonant affirmation that the work as a whole is meant to pronounce.

    God’s poem begins with a challenge to Job: “Who is this who darkens counsel / in words without knowledge?” 
The initial phrase demonstrates how aptly this poet chooses his words. “Darken counsel” is not an idiom that appears elsewhere in the Bible, and so it evidently has been coined 
for the present purpose. Job’s first poem, as we saw, begins with a whole sequence of images that express a longing to blot out the light — daylight, the rising sun, the stars, all things to be engulfed by darkness. In response, God signals in the very phrasing that this is profoundly misguided. After a line in which He tells Job to gird his loins like a man, 
God continues:

    Where were you when I founded earth?
    Tell, if you know understanding.
    In what were its sockets sunk,
    or who laid its cornerstone,
    when the morning stars sang together,
    and all the sons of God shouted for joy?
    Who hedged the sea with double doors,
    when it gushed forth from the womb,
    when I made clouds its clothing,
    and thick mists its swaddling bands?
    I made breakers upon it My limit,
    and set a bolt with double doors.
    And I said, “Thus far come, no farther,
    here halt the surge of your waves.”
    Have you ever commanded the morning,
    appointed dawn to its place,
    to seize the earth’s corners,
    that the wicked be shaken from it?
    It turns like sealing clay,
    takes color like a garment,
    and their light is withdrawn from the wicked,
    and the upraised arm is broken.
    Have you ever come into the springs of the sea,
    in the bottommost deep walked about?
    Have the gates of death been laid bare to you,
    and the gates of death’s shadow have you seen?
    Did you take in the breadth of the earth?
    Tell, if you know it all.
    Where is the way that light dwells,
    and darkness, where is its place,
    that you might take it to its home
    and understand the paths to its house?
    Have you come into the storehouse of snow,
    the storehouse of hail have you seen,
    which I keep for a time of strife,
    for a day of battle and war?
    By what way does the west wind fan out,
    the east wind whip over the earth?
    Who split a channel for the torrent,
    and a way for the thunderstorm,
    to rain on a land without man,
    wilderness bare of humankind,
    to sate the desolate dunes,
    and make the grass sprout there?
    Does the rain have a father,
    or who begot the drops of dew?
    From whose belly did the ice come forth,
    to the frost of the heavens who gave birth?
    Water congeals like stone,
    and the face of the deep locks hard.

    The arresting image of the morning stars singing together in joy singles out these points of light in the night sky, heralding a dawn that Job had wished would never come, and there are more references to light further on. This is an imagining of creation not hinted at in Genesis, though we do not know whether it is the poet’s original invention or reflects a tradition about creation that did not make it into the text of Genesis on which he drew. In the next line, we get the sea “hedged” with double doors. That verb, we recall, occurred in two different senses in the frame story and in the death-wish poem. Now, the idea that it conveys in this moment of cosmogony is a blocking of the surging waters of the sea from flooding the land. That notion is inherited from Canaanite poetry, where it is accompanied by the mythological motif of the conquest and imprisonment of a monstrous sea-god, variously called Yamm, Leviathan, Rahab, Tanin (the last also refers to lesser sea-beasts), effected by the land-god (or weather-god) Baal. This old story was so familiar in the culture that Job could invoke it without explanation to signify God’s holding him captive under relentless surveillance: “Am I Yamm, or the Sea-Beast [tanin], / that You should put a watch upon me?” (7:12).

    The explicitly mythological figure is excluded from the representation here. Instead, we witness the waters gushing forth from the “womb” of the primordial sea, an image for it invented by the Job poet. This strategically chosen metaphor thus represents creation as birth, the very thing Job wanted to cancel in his death-wish. Birth is confirmed in the following line: “when I made clouds its clothing, / and thick mists its swaddling bands.” This is a metaphorical coinage of the same level of originality as Shakespeare’s scalding tears of molten lead. In the ancient world, infants were wrapped snugly in swaddling bands, strips of white linen. These are like what one might see looking out to the west at bands of mist over the water. The strikingly visual image is completely unusual even as it continues the figuration of creation as birth.

    Verses 17–20 further develop the rejoinder to Job’s initial bleak poem: “Have the gates of death been laid bare to you, / and the gates of death’s shadow have you seen?” Job had fervently wished for death but knew nothing of its looming reality. God alone is the master of death, His all-seeing eyes taking in the full measure of its dark realm. The next two verses underscore the antithesis to chapter three: “Where is the way that light dwells, / and darkness, where is its place, / that you might take it to its home / and understand the paths to its house?” Light appears here in poetic parallelism with darkness, as it does elsewhere in the Bible, but that appearance is diametrically opposed to Job’s desire for light to be swallowed up by darkness. Instead, there is a diurnal rhythm of alternation between light and darkness — and just possibly, by implication, between hope and despair. Light and darkness are part of the harmonious ongoing process of the created world, something the devastated Job has chosen to turn away from.

    What follows in the poem is a manifestation of the powerful, at times even violent, energy that pulses through creation, a theme that will continue through this poem all the way down to Behemoth and Leviathan at the end. This may be the relevance, at this point, of the mythological reference to God’s setting aside the harsh elements as weapons “for a time of strife, for a day of battle and war.” The face of nature itself is limned with violent action — “Who split a channel for the torrent, / and a way for the thunderstorm.”

    At this juncture, the poet introduces a crucial, and radical, idea: God brings “rain on a land without man, / wilderness bare of humankind, // to sate the desolate dunes / and make the grass sprout there.” The version of cosmogony in Genesis is emphatically anthropocentric. Man is the culmination of creation, enjoined to rule over all things, everything set out for his benefit. Here, by contrast, God causes the rain to fall “on a land without man,” a rainfall that will “sate the desolate dunes.” (The poet’s mastery of sound as well as metaphor is evident in these words: “desolate dunes” represents the Hebrew sho’ah umesho’ah, an alliteration merely approximated in my English phrase.) This dissenting notion that the natural world extends far beyond man and is perhaps indifferent to him surely resonated with Melville in Moby-Dick, perhaps as much as Job’s Leviathan, which the novelist chose to construe as a great whale.

    Finally, in the next lines, the poem returns to the birth imagery prominent at the beginning:

    Does the rain have a father,
    or who begot the drops of dew?
    From whose belly did the ice come forth,
    to the frost of the heavens who gave birth?

    The second line here demonstrates both the originality and the boldness of this poet in his handling of metaphor. Having chosen to figure the origins of ice and dew as a birth, he represents that birth in a way that is almost shocking when he invites us to contemplate chunks of ice emerging from the womb. Poetry can be a means of reorienting or radically shifting perception, especially in conjoining through figurative language totally disparate and surprising ideas or realms. This is clearly what God wants to do with Job — to shake him up, to compel him to see the world in ways that would never have occurred to him.

    After leading Job’s vision to the sky in the next few lines, the Voice from the Whirlwind moves on at the end of the chapter to the animal kingdom, with particular attention to beasts of prey — the lion and the raven. (One should remember that chapter divisions in the Bible were a medieval editorial intervention, and the lines to which I am referring were meant to initiate the tour of zoology that will continue to the end of the poem.) This zoological section, running to the end of chapter 39, then picking up after a few lines of address by God to Job with the climactic Behemoth and Leviathan in chapters 
40 and 41, is too long for a reading here, but we can look at two passages. The first is in 39:1–12:

    Do you know the mountain goats’ birth time,
    do you mark the calving of the gazelles?
    Do you number the months till they come to term
    and know their birthing time?
    They crouch, burst forth with their babes,
    their young they push out to the world.
    Their offspring batten, grow big in the wild,
    they go out and do not return.
    Who set the wild ass free,
    and the onager’s reins who loosed,
    whose home I made in the steppes,
    his dwelling-place flats of salt?
    He scoffs at the bustling city,
    the driver’s shouts he does not hear.
    He roams mountains for his forage,
    and every green thing he seeks.
    Will the wild ox want to serve you,
    pass the night at your feeding trough?
    Bind the wild ox with cord for the furrow,
    will he harrow the valleys behind you?
    Can you rely on him with his great power
    and leave your labor to him?
    Can you trust him to bring back seed,
    gather grain on your threshing floor?

    We are now returned to the theme of birth announced in the figurative language at the beginning of the poem. Birth is universal among animate creatures, and it goes on, far from human observation or human ken, in the mountains and the forests, beyond the grasp of the man who wished for his own birth never to have occurred. The animals of the wild “burst forth” with their little ones. Birth, too, is imagined as a violent process. This is a unique application of this verb to birthing — the general meaning of the verbal stem is “to split apart” — an indication that the poet is imagining procreation in a different way. The lines that follow take up a theme that will become more salient in the representation of Behemoth and Leviathan. The wild ass and the onager, out on the salt-flats and the steppes, live remote from any human control, scoffing at the crowded habitations of men and women, free from the whips and the commands of the driver. 

    As one sees elsewhere in biblical poetry, intensification within the single line is projected forward through a sequence of lines. In the four lines devoted to the wild ox, the poem’s audience is challenged with the question of whether they can ever domesticate the wild ox and subject him to servitude, hitch him to the plow, train him (fantastically) to bring back seed or gather grain from the threshing floor. This theme of the resistance of the beast to human mastery will be elevated to a new level of intensity in Behemoth and Leviathan. All this is a strong expression of the rejection of anthropocentrism: contrary to the assurances of Genesis 1, humankind will never be able to rule over the animal kingdom; there are forms of life simply too powerful for man.

    A few lines down, the poem goes on:

    Do you give might to the horse,
    do you clothe his neck with a mane?
    Do you make him roar like locusts —
    his splendid snort is terror.
    He churns up the valley exulting,
    in power goes out to the clash of arms.
    He scoffs at fear and is undaunted,
    turns not back before the sword.
    Over him rattles the quiver,
    the blade, the javelin, and the spear.
    With clamor and clatter he swallows the ground,
    and ignores the trumpet’s sound.
    At the trumpet he says, “Aha,”
    and from afar he scents the fray,
    the thunder of captains, the shouts.
    Did the hawk soar by your wisdom,
    spread his wings to fly away south?
    By your word does the eagle mount,
    and set his nest on high?
    On the crag he dwells and beds down,
    on the crest of the crag his stronghold.
    From there he seeks out food,
    from afar his eyes look down.
    His chicks lap up blood,
    where the slain are, there he is.

    The passage just quoted begins with the poet’s famous description of the war horse (39:19–25), placed before the hawk and the eagle and before Behemoth and Leviathan.

    Some readers may wonder what the war horse is doing here. One plausible explanation, though it does not immediately justify the passage’s inclusion in the Voice from the Whirlwind, is that the poet put it here as a demonstration of literary skill: he was drawn to the subject and knew that he could evoke this bellicose equine presence with remarkable vividness. He gets the sound of weaponry around the horse just right, enriching the depiction with an expressive alliteration — “clamor and clatter” in my version emulates the Hebrew ra‘ash werogez, with the accent on the first syllable of each alliterated noun. Sound plays an energizing role in the effect of the passage — the rattle of the quiver and the weapons, the clamor of the pounding hoofbeats, the blast of the martial trumpet. With all this, the fierce battle charger prepares the way for the two daunting beasts yet to come, Behemoth and Leviathan. Like them, he is at once glorious and frightening: “his splendid snort is terror.” Also like them, he embodies power and fearsome beauty that are beyond humanity, that do not relate to humankind: “Do you give might to the horse, / do you clothe his neck with a mane?” The war horse, in contrast to the wild ass and the onager, is surely saddled with a rider holding reins, at the command of the mounted warrior. The main point is that this fearless creature galloping into the midst of battle is imagined — in this one respect, unrealistically — as though he were virtually autonomous, charging into the fray out of the sheer love of armed combat.

    The poet needs this divergence from verisimilitude in order to set the stage for those two creatures, Behemoth and Leviathan, who are impervious to any human action or resistance. They will make their climactic appearance, beginning in verse 15 of the next chapter, after an intervening exchange between God and Job (40:1–14). In brief words God challenges Job to answer all that He has said, to which Job responds with a profession of his own worthlessness. God then resumes His speech, beginning as before with “Gird up your loins as a man.” But before that, in the last six verses of chapter 39, we move from the battlefield to the sky in the depiction of the hawk and the eagle. It is an appropriate place for the naturalistic phase of the zoological parade to end because it is a realm no human being can ever reach. Even the nests of these creatures of the sky are unreachable, placed in the crags of high mountains.

    But what is it that the poet focuses on in the life-cycle of the eagle? That the eagle nurtures his young, an instinct that impels all creatures. The nurturing of the fledglings, however, necessitates killing: “His chicks lap up blood, / where the slain are, there he is.” I have been contending that this writer, virtually unique in biblical poetry, is a poet keenly interested in nature, but that interest is resolutely unsentimental. There is no anthropomorphizing in his vision of the natural world, no pathetic fallacy, no gentle rhapsodizing over the beauties of creation. He understands that nature is red in tooth and claw — the eagle’s chicks “lap up blood.” That is the harsh order of things. He lucidly sees that violence, even lethal violence, is an intrinsic element of the life-cycle in the animal kingdom. This does not really answer Job’s complaint about unjust suffering, but it does suggest that the world around us does not conform to our comforting assumptions about good and evil and that we have to live with a reality that resists our conventional moral calculus.

    I will not consider Behemoth and Leviathan directly, because our scrutiny of the fierce creatures from the lion to the war horse to the eagle has anticipated much of what needs to be said about them. The difference between these two and the preceding creatures in the catalog of animals is that they straddle the border between zoology and mythology, thus culminating the poetic process of intensification. Presumably, they are based, respectively, on the hippopotamus and the crocodile, creatures of the Nile conveniently removed from the direct observation of the poet and his audience, mainly reported to them through travelers’ yarns. There are realistic touches in the depiction of both: Behemoth in the shallows of the river shaded by lotus and willow, “hedged” — again that strategic verb — by the lotus, and Leviathan with his crocodile’s plate of armor, “His back is rows of shields, / locked close with the tightest shield,” and his fearsome teeth, “All around his teeth is terror.” But such naturalistic depiction seamlessly slips into the supernatural. “Could one take him with one’s eyes,” it is said of Behemoth, “with barbs pierce his nose?” (In fact the Egyptians did hunt hippopotami.) And before long, the naturalistic crocodile morphs into a dragon, his mouth shooting firebrands, his nostrils emitting smoke. He is impregnable to all man’s weapons (a note Melville would pick up in associating Leviathan with the Great White Whale): “When he rears up, the gods are frightened, / when he crashes down they cringe.” At this point, Leviathan has merged with the ferocious Canaanite sea-god from whom he takes his name.

    In the logic of the poem, the poet needs these mythologized beasts for his conclusion because they crown his argument that there are things in nature beyond human ken and absolutely beyond any hope of human domination. The Psalmist, in a splendid celebration of man’s supreme place in a cosmic hierarchy, wrote: “You make him rule over the work of Your hands. / All things You set under his feet.” (Psalm 8:7) Tellingly, Job the sufferer quotes another line from this same psalm — “What is man that You should note him?” — but bitterly reverses its meaning to say, what is miserable man that You constantly scrutinize him to persecute him? Here, in the climax of God’s speech to Job, the idea that man exerts dominion over all things is powerfully opposed.

    The writer responsible for this extraordinary book was not only a very bold poet, among other things coining imagery, as we have seen, that would not have occurred to any other poet in ancient Israel, but also a very bold thinker, not hesitating to challenge some of the essential ideas long cherished in the Hebrew tradition. The boldness of the poetry is a necessary vehicle for the boldness of the thought. Poetry of the first order of originality is a way of enabling us to see the world with fresh eyes. It is worth going back to a notion promulgated by the Russian Formalists a century ago, that what literature in general does is to shake us out of what had become complacent, unseeing perception through what they called “defamiliarization,” thereby bringing us back to the realities we had ceased to experience, making us feel anew the stoniness of the stone. 

    The poetry of the Voice from the Whirlwind does this on a philosophical level, serving, I would say, as the Bible’s ultimate defamiliarizer. As countless readers have complained, it does not really provide an answer for the dilemma of unwarranted suffering under a supposedly just God. But that dilemma has no real answer. There is no way of explaining why an innocent child should die of cancer or a benevolent woman perish in a fire with all her family. What the poetry does manage to do is carry us away in its sweep, in the brilliance of its riveting and sometimes startling imagery, move us to see the world freshly, prod us to let go of our habitual notions of man as the master of nature and the measure of all things, and realize that contradiction and anomaly and even violence are at the heart of reality — in sum, to accept the limitations of human imagination. We need to take in the power of the poetry in order to have a full sense of the originality of the thought.

    The Wages of Cultural Secularization

    I take my title from the critic and literary scholar Simon During, who coined the phrase “cultural secularization” as a way of understanding the sharp decline in prestige — since the beginning of the twenty-first century and especially in the last decade — of the “high humanities.” The concept will strike many as evasively abstract, and certainly it is as open to skepticism and revision as its predecessor and model, the social-scientific and philosophical account of religious secularization that extends from Nietzsche to Weber to Charles Taylor. But the core of the religious secularization narrative rests on a basically unimpeachable empirical claim — that we have “moved from a condition in 1500 in which it was hard not to believe in God” to a modernity in which unbelief “has become quite easy for many,” as Taylor puts it. During’s parallel claim — “Faith has been lost across two different zones: first, religion; then, high culture . . . The humanities have become merely a (rather eccentric) option for a small fraction of the population” — cannot yet command the same ready assent. But in our universities, where tenure tracks in the humanities are swiftly disappearing, where majors and enrollments in fields such as English and art history are plummeting, some such notion as “cultural secularization” seems necessary — even, once you get past a first recoil at its conceptual hubris, obvious.

    Cultural secularization, During writes, is a “second secularization,” meaning both that it came about after religious secularization and that, to a degree, it is a variant of it. That is because the ascent of the high humanities was understood, and to an extent engineered, by thinkers for whom, as During writes, “culture was consecrated in religion’s place.” The most important expression of this compensatory substitution in the nineteenth century is by Matthew Arnold, for whom “poetry” might preserve what is true in religion from the depredations of scientific positivism: “Our religion has materialized itself in the fact, in the supposed fact; it has attached its emotion to the fact, and now the fact is failing it.” But “poetry attaches its emotion to the idea; the idea is the fact. The strongest part of our religion today is unconscious poetry.” By the 1930s, I. A. Richards could insist that “the fact” was failing or had failed much more than just religion. As he writes in Science and Poetry, “Countless pseudo-statements — about God, about the universe, about human nature, about the soul, its rank and destiny — pseudo-statements which are pivotal points in the organization of the mind, vital to its well-being, have suddenly become, for sincere, honest, and informed minds, impossible to believe as for centuries they have been believed.” Yet there is a “remedy,” Richards declared: “to cut our pseudo-statements free from that kind of belief which is appropriate to verified statements,” from any dependence on facts. 
“This is not a desperate remedy,” he insisted, “for as poetry conclusively shows, even the most important among our attitudes can be aroused and maintained without any believing of a factual or verifiable order entering in at all.” Not just religious emotions, but also the whole complex of scientifically invalid but existentially inescapable intimations of significance, are what poetry, or more broadly the sublimated religion of the high humanities, might save. 

    This sublimation was institutionalized in the twentieth-century university’s commitment to the study of art and literature, where the humanities secured a repository of post-Christian meaning precisely for the educated classes that had fallen furthest away from faith. As During observes, an important distinction between cultural secularization and religious secularization is that “unlike religion, the humanities have always been classed. In their formalized modes especially, they have belonged mainly to a fraction of the elite.” That historical reality has led, too hastily, to a prevailing diagnosis of the crisis of the humanities as essentially one of status-signaling. One commonly hears humanities professors lament, in a sociological vein, that classes in literature or art history are under-enrolled now because knowledge about those topics no longer confers “cultural capital” — no longer impresses, or even interests, one’s bourgeois dinner party guests. This is true, but question-begging. The loss of cultural prestige follows upon a more primary loss of felt significance.

    Artistic modernism bears a special relationship to this history, because its emergence coincides with, is indeed an effect of, the decisive acceleration of secularization between the mid-nineteenth and the early-twentieth centuries. W. B. Yeats’ “The Second Coming” might stand as an emblem of what Helen Vendler called, in these pages, this “single historical fact: the exhaustion of Christian cultural authority after its ‘twenty centuries’ of rule.” As the scholar of modernism Matthew Mutter, from whom I borrowed the conjunction of Arnold and Richards, puts it, modernism “is the first sustained moment in the long secularization of Western intellectual culture where writers begin to imagine a comprehensively post-Christian future, and where secularism becomes, out of necessity, an object of reflection.” This does not mean that there were no religious or spiritualist modernists, or that modernists were the first to imagine a religion of art as a compensation for religion as such. They inherited that substitution from the Romantics, but with a difference: they have “become critically aware,” Mutter writes, of earlier “methods of naturalizing and adapting religious concepts,” and with this awareness came critical scrutiny. When T. E. Hulme accused Romanticism of being “spilt religion,” he meant to suggest that it didn’t quite know what it was doing. The art of his own time, he implied, ought to be more self-transparent about its operations. Modernism lives by a spirit of critical demystification supposedly more reflexive, more thoroughgoing, than its predecessors. This is what Mutter calls “modernist secularism” or, alternatively, “restless secularism.”

    The result is that secularism itself, and especially its compensations, became an object of acute critical concern. Romanticism came to look too tethered to the religious models it was, only half-wittingly, adapting; that kind of self-delusion would no longer do. The technophilic futurism of F. T. Marinetti and company was one response, the machine being the appropriate god for a secular age. T. S. Eliot’s neo-orthodoxy offered another solution, D. H. Lawrence’s erotic neo-paganism yet another. What all of these strategies have in common is a distressed sense of the vulnerability of the model of culture as sublimated religion — perhaps an adequate compromise when things weren’t quite so far gone but hopelessly naïve after the aeroplane, Freud, and the Great War. 

    Modernism was therefore constituted by its impassioned attempts at resolving contradictions that remain with us. For a while, and with the collaboration of a system of higher education that promoted the study of interpreted aesthetic traditions perceived to climax in modernism, its particular modes of high seriousness held the field. (In some cultural arenas, they still do: the tendency of the Nobel Prize in literature, for instance, has up to the present been to reward writers recognizably working in the wake of a modernism become traditional, an acknowledged source of the major aesthetic possibilities.) But in others — American publishing by and large, as well as American university curricula — modernism’s rigors have passed into a kind of obsolescence. Its demands made sense only so long as art was felt to matter in something like the same way religion once had. With the faltering of that faith, the study of modernism first, and then of the “high humanities” in general, is contracting almost out of existence.

    Ivy Compton-Burnett was born in England in 1884. Her first mature novel (she had disowned an earlier work of juvenilia), Pastors and Masters, was published in 1925; her second, Brothers and Sisters, in 1929. By that time, as Compton-Burnett’s biographer Hilary Spurling puts it, there was “no doubt that she represented the last word in modernity.” In what did that modernity consist — and what has happened to it? Today, her nineteen novels are almost entirely out of print (although New York Review Classics offers handsome paperback editions of two of them, A House and Its Head and Manservant and Maidservant, originally published as Bullivant and the Lambs). One would be hard-pressed to discover an undergraduate or even a graduate seminar in which her works are taught.

    After a series of losses in her early years, Compton-Burnett led an almost eventless life, save for the event of writing and publishing. Her adolescence and young adulthood are a record of family deaths, one sibling after another — some to the routine devastations of a pre-antibiotic age (a beloved brother died of pneumonia), others to psychopathological tangles specific to her rather odd family (two sisters died by poison, in a suicide pact). Another brother, Noel, died in the Great War, after which his widow tried to kill herself. Ivy nursed her back to health.

    Noel, with whom Ivy was close, went to King’s College, Cambridge. Her biographer tells us that Ivy absorbed through him “not only the general skepticism prevalent at King’s but even the mannerisms of Cambridge conversation.” Compton-Burnett evolved a stylish atheism ballasted by knowledge of the tragic waste of the war. Here is some representative dialogue from Pastors and Masters:

    “I think I have found myself at last,” said Herrick. “I think that, God willing, I shall have done my little bit for my generation, done what every man ought to do before he dies.” . . .

    “Assuming God, you wouldn’t do much if he wasn’t willing,” said Masson. 

    The influence of Cambridge was decisive; the most distinctive feature of all of Compton-Burnett’s novels is that her characters talk to one another about their melodramatic problems (involving wills, inheritances, bigamy, false parentage — things like that) with a high degree of linguistic self-consciousness (“Words pass from mouth to mouth. It is the only way you can become conversant with things,” as one character says, describing the circulation of a rumor.) Told largely in dialogue, the novels read as if Samuel Beckett were writing Freudian soap operas parodying the way Henry James’ people talk. Oedipal rage is routine in Compton-Burnett’s books — she rings grim changes on all the ways that children can hate their parents and that parents can destroy their children. Incest is a motif. Murder and theft are not unheard of. These sordid plots play out in dialogue that is arch, hyper-formalized, elliptical, and extremely precise. For all their clear affiliation with the comedies of Oscar Wilde, there is something cold and forbidding, something deliberately, nastily airless about Compton-Burnett’s novels, even when they are quite funny, as they often are. They are “glacially witty,” in the apt phrase a reviewer applied to them in The Atlantic in 1951. The subjection of extreme emotional intensities, drawn from highly melodramatic plots, to disciplined grids of language and syntax is one face of Compton-Burnett’s modernity.

    In the milieux of decaying gentry that Compton-Burnett depicted in novel after novel, men of the cloth are not unknown, but religion as such is almost never Compton-Burnett’s real topic. Religiosity, rather, is a signal of a personal deficit. As Spurling observes, “an active faith is generally a sign of mental or moral obtuseness.” Compton-Burnett’s vicars “are usually odious, fools and toadies or worse.” This cool contempt for religion and the religious, so different from the painfully achieved unbelief of her parents’ generation — “Ivy and her brothers seem to have reached their position of humorous incredulity easily and early . . . apparently without any of the torments suffered by an older generation of conscientious Victorians,” as Spurling puts it — is another face of Compton-Burnett’s modernity.

    But it is not, for the most part, thematized or presented as an artistic problem. Compton-Burnett’s unanguished atheism may be modern, but it is rarely modernist, in the sense that it does not submit the secularization process that it reflects to self-conscious scrutiny. Secularism is its condition but not its theme. “Religion, or the lack of it,” as Spurling says, “plays no great part” in Compton-Burnett’s books, “except as a convenient indication of social and intellectual standing.”

    That is almost true. There is one major exception: Elders and Betters, from 1944, perhaps the greatest of Compton-Burnett’s mid-career novels, the plot of which can be summarized in a few sentences. The Donnes — the widower Benjamin and his children, Anna, Bernard, Esmond, and Reuben — have recently moved into the neighborhood of Benjamin’s sister, Jessica Calderon, her husband Thomas, and their children Terence, Tullia, Theodora, and Julius. Also living with the Calderons is Benjamin and Jessica’s invalid sister Sukey, possessor of a failing heart and a large fortune. Sukey soon dies and Anna, thirty and therefore approaching old maidhood, contrives by deception to inherit Sukey’s fortune. Jessica, aghast not so much at her failure to inherit as at the discovery, so she thinks, that her sister did not love her, commits suicide. Anna uses her fortune to marry Terence, love of whom seems to have been her motive.

    This is the only novel in Compton-Burnett’s oeuvre in which a central family, the Donnes, is described in ethno-religious terms: “The family had a faintly Jewish look, and biblical names had a way of recurring amongst them, but they neither claimed nor admitted any strain of Jewish blood. The truth was that there had been none in the last generations, and that they had no earlier record of their history.” In Compton-Burnett’s world, this is an unprecedented, and unrepeated, instance of genealogical specificity. And both family names, the Donnes and the Calderons, carry associations with Catholicism (John Donne was born to a recusant family; Pedro Calderón, the great seventeenth-century Spanish dramatist, became a Catholic priest). Calderon, moreover, is a not uncommon name among Jews of Iberian origin. It might be true that Jessica “held the accepted faith” — English Anglicanism — “and lived according to it,” but through their Jewish looks and their Catholic names, the Donnes and Calderons might be open to other admixtures.

    That possibility is realized in the private religion of the youngest characters in the book, Julius and Theodora (“Gift of God”), who worship in secret an Asiatic deity called Chung, whom they address in the cadences of the King James Old Testament: “‘O great and good and powerful god Chung,’ said Theodora Calderon, on her knees before a rock in the garden, ‘protect us, we beseech thee, in the new life that is upon us. For strangers threaten our peace, and the hordes of the alien draw nigh. Keep us in thy sight, and save us from the dangers that beset our path. For Sung Li’s sake, amen.’” 

    Theodora and Julius’ charming syncretic faith represents the only sustained treatment of religion in all of Compton-Burnett’s novels — almost the only treatment at all. Touchingly precocious, the two children work through the vast problems of post-Christian belief as they offer a sort of anthropological commentary on their own invention. “‘Sung Li is a good name,’ said Julius, as they rose from their knees. ‘Enough like Son and yet not too much like it. It would not do to have them the same.’” His sister responds, “Blasphemy is no help in establishing a deity,” in a tone of supporting him. They go on to speculate about whether the power of Chung will persist into adulthood, or whether he is merely a children’s god. And they stumble upon hard questions about their own moral character:

    “After the age of fourteen his influence fades,” said Julius, in a tone of suggestion.
    “Then people have to turn to the accepted faith. Their time of choice is past. But the power of the young gods is real for those who are innocent. That would be the test.”
    “But we are not innocent,” said Julius.
    “Yes, I think we are. Children’s sins are light in the eyes of the gods.”
    “We steal things that are not ours, Dora.”
    “Yes, but not jewels or money or anything recognised as theft.”
    “A sixpence would be thought to be money.”
    “But it is not gold or notes or anything that counts to a god.”
    But the steps of the pair faltered, and they turned with one accord back to the rock.
    “O great and good and powerful god, Chung,” said Dora, as they fell on their knees, “forgive us any sins that go beyond the weakness of youth. Pardon any faults that are grievous in thy sight, for temptations lie in wait. For Sung Li’s sake, amen.”
    “Temptation does beset us,” said Julius, gaining his feet.
    “It is a pity that so much of the pleasure of life depends on sin,” said his sister. “We could not be expected to live quite without joy. No god of childhood would wish it.”

    Later they wonder whether the exotic names that they have given Chung and Sung Li, “purloined from a book,” are “fitting,” and decide that they are. Dora points out that “a name with a Chinese sound is more reverent than an English one.” “We could not call a god John or Thomas,” Julius says. “Or Judas,” says Theodora.

    For Julius and Theodora, religion might be a fiction, but the sin it polices is a fact. They share this conviction with other exemplary moderns, such as Hulme, who writes of the “sane classical dogma of original sin,” and, most exemplarily, the anthropological Freud of Totem and Taboo, for whom the primordial Oedipal murder is original sin’s original scene. It is their recognition that they are not immune to — that indeed they are already corrupted by — the same cruelty and aggression that infect their elders and “betters” that makes the children in Elders and Betters so poignant and interesting. “‘O powerful god, Chung,’ said Julius, in a rapid gabble, turning and inclining his knee, ‘be merciful to any weakness that approaches real transgression.’” Julius and Theodora have no illusions about the innocence of childhood, their own included.

    Compton-Burnett is nowhere more modernist than in suffering a lively sense of the reality of sin, a sense which should be incompatible with her atheism. This tension, or contradiction, was typical. It was partially for what he took to be their implausibly anodyne vision of human benevolence that Hulme railed against the Romantics; and in this he was sometimes joined by Yeats, for whom, as he put it in one of his autobiographies, Romantics such as Emerson and Whitman “have begun to seem superficial because they lack the Vision of Evil.” Elsewhere, Yeats averred that “the strength and weight of Shakespeare, of Villon, of Dante, even of Cervantes, come from their preoccupation with evil,” whereas in “Shelley, in Ruskin, in Wordsworth . . . there is a constant resolution to dwell upon good only.” What Mutter calls Yeats’ “sense of recalcitrant evil” compelled Yeats to embrace what he himself called “original sin,” a concept that he thought was compatible with heathenism and paganism. That was one solution to the problem.

    T. S. Eliot’s conversion to Anglicanism, in which his reluctance to dispense with the idea of original sin figured prominently, offered another solution, albeit an incomprehensible one to more resolutely secular modernists such as Ezra Pound and Virginia Woolf. “I have had a most shameful and distressing interview with poor dear Tom Eliot,” Woolf wrote to her sister, Vanessa Bell, in 1928, “who may be called dead to us all from this day forward. He has become an Anglo-Catholic, believes in God and immortality, and goes to church. I was really shocked.” Nine years earlier Pound had been similarly struck by Eliot’s assertion, during a trip to the south of France, that he thought he might believe in life after death. In language he could not have known mirrored Woolf’s to Bell, Pound in Canto XXIX, published in 1930, had Eliot express a certain satisfaction at Pound’s surprise: “‘I am afraid of the life after death.’ / and after a pause: / ’Now, at last, I have shocked him.’” Writing in his own voice, Pound was capable of moralizing anti-religiosity: “All religions are evil.” 

    Pound’s own solution to the felt thinness of secularity, at least before he was seduced into grandiose fascistic politico-economic visions, was the old nineteenth-century one, the effete “religion of culture” that Matthew Arnold’s critics accused him of desiring. (“It is said to be a religion proposing parmaceti, or some scented salve or other, as a cure for human miseries,” as Arnold paraphrased the charge.) Eliot felt its attractions, too, but decided, in what seemed like deliberate perversity to many of his peers, that the answer was instead to merge an avant-garde poetics with a resuscitated religious orthodoxy, even as Pound was merging an avant-garde poetics with a pseudo-scientific theory of political economy. For Eliot, the critic Matthew Hollis writes, “the weakness of the condition of literature was an effect of the weakness in the condition of religion.”

    For four months in 1934, Pound and Eliot argued about these issues in the pages of the New English Weekly. Their dispute began with Pound’s aggressively negative review of Eliot’s After Strange Gods, the text of a lecture given at the University of Virginia arguing, among other things, for the importance of “orthodox” religion to culture in general and literary culture specifically. Pound took a programmatic secular line: “‘Religion’ [has] long since resigned,” he wrote. In the old days, “religion was real,” but today, for most people, it is “either a left-over or an irrelevance.” (In later decades Eliot himself suppressed the text of this lecture, which had become notorious for its anti-Semitism — “reasons of race and religion combine to make any large number of free-thinking Jews undesirable.” It is one of the more tangled ironies of the history of modernism that Pound, who would become a systematic anti-Semite and a traitorous fascist during the Second World War, should have objected to After Strange Gods, not indeed out of any love for the Jews but out of distaste for the Christian parochialism of Eliot’s anti-Semitism.) 

    Like Woolf, Pound finds Eliot’s profession of faith essentially incomprehensible — so much so that he claims not even to know what Eliot means by “religion” in the first place. This, despite the fact that there is nothing really obscure in what Eliot intended. As the literary critic Christina C. Stough summarizes the Pound–Eliot controversy, religion for Eliot comprised “a full sense of Heaven and Hell — a religion of sacraments, rituals, orthodoxy, and above all, an acceptance of original sin.” For Woolf, the notion that Eliot could believe in all this was absurd: “A corpse would seem to me more believable than he is. I mean, there’s something obscene in a living person sitting by the fire and believing in God.” For Pound, such belief was perhaps even pathological; he refers, in a letter of 1936, to “Eliot’s crazed and pseudo-religious brain.” 

    Compton-Burnett’s personal convictions on the matter were surely closer to Woolf’s and Pound’s than to Eliot’s. Yet she shared with Eliot, as with Yeats and Hulme, a vivid sense of human sinfulness, and so she was compelled, in Elders and Betters, to take religion seriously, as a poetic system with moral effects. Or at least half-seriously: religion in this novel is after all a children’s game. But is it merely that? 

    Throughout Elders and Betters, the adults use a secularized and ironized religious language to refer to their own actions and emotions. When Anna, whose secret bullying dishonesty triggered Jessica’s suicide, tells her father that Jessica “raised the devil within me,” she is speaking accurately, although dissembling in context. When, later, Anna rebukes her cousin Terence for what she suggests is his hyperbolically gloomy portrayal of human nature — “Oh, we are not such sinks of iniquity. We are most of us well-intentioned, everyday sort of creatures” — her irony of course conceals the truth about herself. Closer to Compton-Burnett’s own vision is Terence’s rejoinder: “The part of us” — where “us” means not just the Calderons and the Donnes but all people — “that we have in common would shock anyone.” Terence’s cousin Bernard deflates Terence’s pessimistic articulation of something like original sin by reducing it to mere shallow moralizing: “You sound as if you would make a resolution on New Year’s Day.” 

    But for Dora and Julius, the problem of “the part of us that we have in common” cannot be so easily contained or dismissed. Small lies trouble them, like their having told their governess that their mother had given them a holiday from instruction. “‘O great and good and powerful god, Chung,’ prayed Dora, ‘forgive us, we beseech thee, the lie that has passed our lips. For we have uttered to thy handmaid, our governess, the thing that is false, yea and even to our mother. And this we did to gain respite from our daily task.’” Their sensitivity even leads them to a sort of scrupulosity, a constant condition of moral auditing; they are always able to discover further subtleties to a misdeed. For instance, after the above prayer for forgiveness:

    “I should think it is especially wicked to take advantage of [Mother’s] being absent-minded, when it is a sort of illness,” said Dora.
    The pair met each other’s eyes and in a moment were back at the rock.
    “O great god, Chung, pardon any wickedness we showed in putting our mother’s weakness to our wrongful purposes. For Sung Li’s sake, amen.”

    The heightened language of Julius and Theodora’s worship, laced with the rhythms of the King James Bible and the secondhand exoticism of Orientalist adventure books, is one of the funniest and most touching inventions in all of Compton-Burnett’s work — touching because, for the children themselves, its magic is absolutely real. The deployment of ritual language really can render atonement for the “wrongful purposes” one is always discovering within oneself. This despite the fact that they know perfectly well that their sacred idiom is merely their own invention, that it lacks any priestly or scriptural warrant. In their games with Chung, Dora and Julius are very serious ironists. Irony is one way of handling the problem of sin in a secular age. 

    Serious irony is modernism’s master mode. It is a way of approaching not just the sacred in a secular age but also the authority of the past in a technological age. One of its basic forms is the quotational — the deployment of a rhetorically self-conscious language larded with quotations or near-quotations or shadowed by the penumbra of quotation. This is what Louis Menand, writing of T. S. Eliot’s verse, calls “the literary quotation marks of imitation and allusion.” (Most infamously, there is Walter Benjamin’s fantasy of producing a book that consists of nothing but quotations.) Eliot’s facility with a kind of virtuosic imitation became clear in his second book of poems, Ara Vos Prec, modeled stylistically and metrically on the French symbolists. That book was a disappointment to many reviewers, who sensed in Eliot’s ironic recourse to an earlier generation of ironists the last redoubt of a minor satirist. But in “The Waste Land,” Eliot performed a kind of alchemy on an allusive poetic idiom that had otherwise come to seem insincere. Eliot understood that what Menand calls “the aura of insincerity” associated with literary quotation was in fact where its modernizing potential most clearly lay. Irony wedded to a sufficiently capacious grasp of tradition could save one from insincerity by estranging the source material — thereby smuggling intensities of emotion through the cracks between the juxtaposed inheritances, the shored fragments, of the poetic past. Irony and collage enable an extreme and sincere intensity. That was “The Waste Land”’s achievement, its instruction to the culture. 

    In their own way, Dora and Julius are similarly masters of a sincerity that issues, paradoxically, magically, from cobbled-together pieces of past language, a King James cadence made newly powerful in the mouths of these precocious, self-aware, vulnerable children. Where Eliot wrung a renewed sense of sacred meaning, however fractured and tenuous, from sources drawn from anthropological texts such as The Golden Bough, so Dora and Julius are attracted to pagan rites of sacrifice (just flowers, no animals) and even to witchcraft. (The two children, atoning for a scuffle over a wishbone, drop some of Dora’s hair into the fireplace. Dora: “Say an incantation over the witch’s cauldron.” Julius: “We ought to have the finger of a dead child, not the hair of a live one.”) Where Eliot’s interest in non-Western religion, in particular the Vedas, supplied a much-needed mystical charge to his exhausted Unitarianism (and supplied some of “The Waste Land”’s most famous and estranging lines, “Datta. Dayadhvam. Damyata. / Shantih shantih shantih”), so the fictional Asiatic divinities “Chung” and “Sung Li” caress the children’s Biblical idiom with an exotic wind from the East, amplifying its authority by suggesting that its truths are transcultural and transhistorical, with rhymes across time and place in the great book of comparative religion. And where Eliot’s own conviction, confessed to Pound before “The Waste Land” was even begun, that he believed — needed to believe — in life after death must have been, in 1919 at least, a component of a generalized mysticism rather than an item of orthodoxy in an Anglican creed he had yet to embrace, so Dora and Julius’ own anxious envisioning of an afterlife flows less from Anglican doctrine than from the multifarious speculations of their uncannily sophisticated childish imaginations. “‘Of course Mother can look down [from Heaven] and see,’ said Dora. ‘It almost seems a pity that people can do that. 
It might prevent them from having perfect bliss.’” 

    When Julius and Dora get going, their rituals take on a life of their own — they become possessed, transported by their own serious game. Here, in one of the novel’s most striking scenes, Julius encourages Dora, who is reluctant at first, into anathematizing their father, Thomas, for what they rightly perceive as his condescending failure to grasp the emotional complexity of their attitude toward their mother’s death. After scolding them for fighting, Thomas encourages them to mourn in what strikes Julius as a false and sentimental fashion. “If people can’t talk about their dead in a natural way,” Julius says, “they had better be silent.” 

    “Of course we did fight,” said Dora.
    “Well, and why not?” said her brother, with increasing violence. “Are we children or are we not? Are we likely to have the ways of a man and woman, or are we not? Had we been through an impossible day through no fault of our own, or had we not? Is it our fault that Mother is dead? I should like to hear Father answer those questions.”
    “You did not ask them,” said Dora.
    “The time was not ripe. The moment is not yet. But I hold them in store. And then let Father rue the day.”
    “I don’t suppose you would dare to ask them. And it wouldn’t be any good to make him hate you.”
    “There is such a thing as wholesome respect,” said Julius.
    “We are in his power,” said Dora. “I suppose he could starve us if he liked.”
    “Whatever base and dastardly thing he contemplates,” said Julius, striking an attitude, and losing sight as readily as his sister of Thomas’s having no inhuman tendencies, “whatever dark meditations have a place in his heart, there is no easy way for him towards them; there is no royal road. So let him keep the truth in his heart and ponder it.”
    “He gives us food and clothes and has us taught,” said Dora, in a dubious tone, uncertain if mere fulfillment of duty should operate in her father’s favour.
    “The minimum that a man could do,” said Julius. “The least amount of expense and thought, that would save him from the contempt of all mankind. Would you have him cast us forth, as if no tie bound us?”
    “As if we were not his kith and kin,” said Dora, falling into her brother’s tone. “As if we were penniless orphans, driven to seek a moment’s shelter within his doors. As if no sacred tie of blood bound us, hand and heart to heart.”
    “Let him take thought for the dark retribution that is gathering,” said Julius, with a deep frown. “Let him take counsel with himself. That is all I have to say.”
    “The bread he has cast upon the waters, will return after many days,” said Dora.
    “Then he will repent the grudging spirit that stayed his hand.” 

    Unlike all the other scenes of Dora and Julius’ ritual language use, this one does not take place at the temple of Chung, and it is not addressed to that deity. Instead, as Julius seeks to vent his anger at his father and to encourage Dora to join him in that feeling, his speech becomes stained by the ritual idiom — at first, it seems, almost involuntarily, and then with mounting deliberateness. Dora, hesitant at first (“He gives us food and clothes and has us taught”), is at length compelled by the rhythm and the sacred diction of Julius’ lines to join him: “‘As if we were not his kith and kin,’ said Dora, falling into her brother’s tone.” The prosody of the sacred here offers the children a sort of therapeutic pressure valve for their pent-up feelings — and affords them one of religion’s less savory uses, the cursing of an enemy.

    The scene ends with our learning that the children’s older brother, Terence, has been listening in on their strange conversation. “Terence rose and left the room, disturbed by the activities of his brother and sister, whom he believed to be acting some kind of play, a view in which he was right.” “Some kind of play” — rooted in an ever-expanding aura of allusion and quotation, in a pseudo-Biblical speech that riffs but does not mock — is how, for Julius and Dora, the highest devotions are enacted, and the sternest moral judgments passed. Julius and Dora’s private ritual language enacts modernism’s serious irony, in a childish key. 

    The ritual, even magical force of language was a prominent aspect of the modernists’ interest in religion. If modernist literature is the literature of the first historical period in which secularity itself was an object of reflection, its difficult style was the new language in which that reflection could unfold. But its newness — in Joyce, in Pound, in Eliot, in Yeats, in Stevens, and indeed in Compton-Burnett — is compounded of older strains, including, saliently, idioms drawn from the religious traditions felt to have been superseded. In other words, the “restlessness” of what Mutter calls “secular modernism” inheres in its uneasy — alternately skeptical and re-enchanted — repurposing of ritual language. Nor is this restricted to the Anglophone world. As Pericles Lewis observes, both Kafka and Proust became the transnational influences they did in part because, “fascinated by the limits of secularization,” each relied on “the frequent use of religious language.” Their very modernity is a consequence of their post-religious navigation of the magical formulas of religion.

    Matthew Arnold wrote in 1880 that “most of what now passes with us for religion and philosophy will be replaced by poetry.” If that was true in 1880, it was even truer in 1909, in 1922, in 1944. It doesn’t seem to be true anymore. One way of understanding cultural secularization, then, is as the process whereby the ritual emanations of literary language no longer arouse a response among a large enough body of readers to sustain literary culture as a satisfying substitution for religious feeling. Or, more accurately, to sustain it at the scale at which it had become entrenched over the course of the twentieth century. Literary culture persists, of course, but interest in any literature not of the immediate present is increasingly the mark of the antiquarian. 

    The analogy between religious secularization and cultural secularization should not be pressed too far. The most important difference is that the formal humanities have always been the province of the elite. But religious secularization, too, was an elite, rather than a popular, phenomenon in the period encompassing Arnold, Eliot, and Compton-Burnett. (In fact, Arnold’s era saw fairly high rates of church attendance, and although attendance declined at the end of the nineteenth century, it was fairly stable in the first several decades of the twentieth century.) The shock that Woolf and Pound felt at Eliot’s conversion was a distinctly rarefied response. The average Englishman would have seen nothing surprising.

    In the West, irreligion as measured by church attendance as well as by reported self-identification is much more widespread today than it has ever been — even in the United States, long thought to be an outlier in this regard. If “cultural secularization” is a useful concept (as obviously I think it is), its relationship to the uneven distribution and recent acceleration of religious secularization will need to be worked out. It seems fair to suspect that the weakening claims of religion, the increasing unfamiliarity of its ways of thinking and seeing — of its poiesis — have also vitiated the study of literature specifically and the humanities in general. Will a completely secular society create and preserve and transmit high literature? Must supporters of the humanities hope for an incomplete secularization? In the absence of religious forms, how will we honor and express religious feelings? Are the humanities, at least in many of their forms, at bottom the study of the same eternal themes that preoccupied religion — the same questions with different answers? 

    When, in a conversation with her father, Dora lets some of her heightened ritual idiom slip into her speech, she is rebuked. “Suppose we stop quoting other people, and say the things that come into our own little head,” he says. Then, “there was a silence, while Julius and Dora exchanged a glance, and with it a resolution to submit to fate.” That fate increasingly appears to be one of an obsolescent tradition. But the contours of cultural obsolescence were first fully comprehended 
by the modernists themselves. In the coming years, the most vital cultural institutions, whether in the academy or outside of it, will need to reckon with the terrible responsibility of preserving kinds of value and forms of meaning to which the wider culture is inhospitable. 

    Yehuda Halevi

    Nine Poems

    Yehuda Halevi (c. 1075–1141) was the Hebrew poet who culminated the startling period of Andalusian cultural production that Jewish history calls the Golden Age. In a moment of symbiosis in Islamic Spain from the eleventh to the twelfth centuries, Hebrew poetry flowered as it had not since the Bible and would not again until the modern era. A courtier-rabbi class arose serving Muslim rulers, steeped in their Arabic language and culture. The men of this new Jewish elite adapted Arabic poetics to Hebrew, including quantitative meters and a purist approach to the lexicon, while adopting Arab poetry’s embrace of secular alongside religious verse.

    Halevi was born on the frontier between al-Andalus and Castile, possibly in Tudela, while it was still under Muslim control. As a teenager he went south to Granada, where he rose to prominence as one of the finest Hebrew poets of the age. In addition to over eight hundred extant religious and secular poems, he wrote (in Arabic) The Kuzari, an important work of Jewish philosophy that challenged the rationalism that was ascendant in the Jewish thought of the period. He was also a successful physician and merchant who moved, in the increasing political instability of the period, between a number of communities, including Lucena, Seville, and Christian Toledo.

    Late in his life Halevi repudiated Andalusian cosmopolitanism and prepared for a rare and dangerous pilgrimage to Palestine, which he undertook shortly before he died. His poems from this period are known as his Shirei Tzion (Songs of Zion), and they reflect the “mystical geography” (in Hillel Halkin’s phrase) laid out in The Kuzari, which holds the Land of Israel as the site of utmost Jewish holiness. While many modern thinkers have claimed this work of Halevi’s as proto-Zionism, it resists such retroactive categorization. The scholar Raymond Scheindlin has argued persuasively that Halevi was following the period’s Islamic pattern of the mutawakkil, an ascetic withdrawal from society and dramatic altering of one’s life.

    In keeping with their Arabic models, the Golden Age poets wrote for the most part a highly, almost extravagantly, ornamental verse. Their work is sonically lush with alliteration, assonance, and interwoven consonants and vowels; and syntactically dense with double and triple puns, homonyms, and other wordplay. It is also written within elaborate formal constraints in a Biblical Hebrew which layers it with intertextual references to that canon (as well as the contemporary Arabic poetry on which it is modeled). It thus presents an extreme case of the inadequacy of translation in general. 

    My particular focus as a translator is on the sound, or what poets like to call the music, of this poetry. As a poet reading these poems I experience above all an utter reveling in the materiality of language. My goal is to create versions that approach some of this sonic richness. In this light I have chosen the music over form and precision of content. I aim to render this music as immediate as possible, and so I sometimes adapt archaic images and terms to ones with more resonance in contemporary language.

    Dan Alter

     

    [Beloved did you forget . . . ]
    Beloved did you forget how you slept between my breasts
    & why have you sold me forever into chains

    Didn’t I chase after you once in an untamed land
    Mountains & sand, Dead Sea & Sinai my witnesses

    You had my love & I was your desire, so how
    Can you share out my riches without me?

    Pushed into Seir, forced toward Kedar,
    Turned in the furnace of Greece, abused in the bond of Iran

    Who besides you could set me free
    Who but me, so caged by hope?

    Lend me your strength,
    I will give you my tenderness

    [Oh homeland don’t you wonder]

    Oh homeland don’t you wonder on your captives,
    who call to you, the ones left of your pastures

    From seaward to sunrise, tree-line to barrens,
    calling from far or close by from all sides 

    Call of a captive of desire, who sheds tears like dew
    on Hermon’s peaks, & longs to let them fall on your mountains

    When I wail out your torment I’m a hound, & when I dream
    your homecoming has come, I’m strung with your songs

    My heart beats to Beth-El & louder at P’niel
    & Makhanayim & every point your pure ones touched

    Where the holy nearness filled your winds, your maker
    opened your windows to the windows of skies

    & only God’s glow for your light, no
    sun moon or stars to illuminate you

    I would let my last breath spill out right where
    the divine spirit overflowed your chosen ones

    You’re the royal home & the holy seat & how
    have servants sat down on your heroes’ thrones?

    If only I could wander in the places
    God was shown to your envoys & seers

    & who will make me wings & I will range
    with my heart in pieces between your ragged peaks

    I would fall to my face on your ground & thrill
    to your stones & feel your sweet dust with my fingers

    & then standing on my ancestors’ gravestones,
    I’d be stunned in Hevron how your finest are buried there

    I would cross your forests & terraces & pause
    in awe at the Gilead ridgeline where Moses was buried

    His & Aaron’s burial mountains, those two huge
    lights shining on you, showing you the way

    Living breath — the air of your soil, & myrrh fragrance
    the grains your earth, & honey-flow for your rivers

    It would soothe my mind to go shoeless, naked
    in the waste & ruins that were your shrines

    Where your ark was hidden away, where cherubim
    stayed in your innermost chambers

    I’d shear off my hair grown in devotion, curse the years
    in unclean lands that fouled your most devoted

    How fine can the food on my plate taste when I see
    your young lions dragged along by dogs?

    Or how will daylight sweeten my eyes, while
    I watch crows carry away their kill of your eagles 

    Slow now, cup of suffering, ease up, my belly
    & soul are swollen with your bitterness

    When I remember Jerusalem straying, I drink,
    & her fallen sister Samaria, & I drain it

    Zion loveliest crown, how you weave love & grace
    as of old & your friends’ souls woven through you

    Those who smile if you’re at peace & ache
    at your desolation & weep for your shattered pieces

    From a captive’s pit yearning for you, each
    in his place bowing toward the arches of your gates

    Your numberless herds driven out & scattered
    from mountain to hilltop but your fence-lines not forgotten

    Who cling to your fringes & fight to climb
    & grasp onto your date palms’ canopies

    Could Babylon & Egypt ever match you, their hollow
    prayers compare to the gemstones you told truths with?

    Who will compare to your nobles & holy men,
    your chanters in the temple, singers in the choir?

    False-god kingdoms will fall & be gone — your force
    is forever, through the generations your jewels

    God sought you for a home, & joy to a human who
    has chosen, draws closer, to dwell in your courtyards

    Joy to one who awaited, arrives, lays eyes
    on your light dawning & your sunrises burst upon him

    To see your chosen thriving, to thrill in your joy
    as you come back to your long-ago bloom

     

    [West, this is your wind, wings]

    West, this is your wind, wings      perfumed with balm & apple
    You come from the trader’s warehouses,       not the storehouses of wind
    You lift the swallow’s wing calling me to freedom      like the fragrance of just-picked aloes
    How we all long for you, by whom      we ride a few boards over the sea
    Please don’t ease off your touch on the ship      when day catches up or settles down
    Level the depths, spread the sea open,       don’t stop until you reach the holy mountains
    Rebuff the easterly that thrashes up the seas      until the waves seethe like a pot boiling over
    But what can the wind do, bound in God’s hand,       now restrained, now he sends it?
    My only wish: in the hand of the One on high      who raises high mountains & makes the wind

     

    [In distances a dove]

    In distances a dove strayed into a forest
    lost, not knowing how to recover

    Hovering, wings swinging spun
    wheeling circles around her beloved

    Counting down the millennia to redemption
    but her figuring leaves her dumbfounded

    Having suffered for her lover long years
    of wandering, soul exposed to the grave

    Saying I will no more remember his name
    but it lights in her heart like flame:

    Why would you be her enemy, her beak
    parted for your salving spring rain

    She believes to her depths, turning from despair
    whether she’s raised up or suffers by him

    Let our God come, no longer silent
    fire rising on all sides of him

     

    [Gently, with your soft waist]

    Gently, with your soft waist but heart      hard, gently with me, I surrender.

    Straying only with my eyes, my heart      pure but my eyes drunk on you.

    Let those eyes gather roses      & lilies where they grow together

    in your cheeks, from which I rake fire      to fight fire & if I’m thirsty find water there.

    I would sip the red lips glowing      like coals, & my mouth the tongs. My life

    suspends between those crimson lines      but with sundown my death comes. Then

    I find only nights with no end where once      was no midnight for days, time

    was in my hands like clay & constellations      spun like the potter’s wheel.

     

    [Cry out, forest, for a cedar]

    Cry out, forest, for a cedar, for one
    who waited for daylight, but night fell. 

    No, I didn’t know the Seven Sisters
    could die with him, & the morning star.

     

    [Lately with winter rains]

    Lately with winter rains      the land has nursed on a cloud like a baby
    or like a bride shut in by the cold      longing for days of love,
    hours of touching, until spring      comes to ease her aching heart.
    Dressed in flowerbeds of gold & brocade      the way a girl thrills
    at her clothes as she changes      & shares them with everyone.
    Day by day the hues go      amber to ruby to pearl,
    turning pale or green & then      reddening as if kissing.
    So beautiful I wonder      if they’re stars stolen from heaven.
    At dawn we come where the trees lean      with wine & our hearts flaring,
    wine snow-cool to the fingers      that lights fire inside us
    rising from its jug like a sun      to flow into our fine glasses.
    As we stroll under the garden’s shade      it laughs at a rain shower weeping,
    smiles when a cloud cries shaking off       droplets like a burst necklace,
    savoring the swallow-call like liquor      or when the dove behind leaves
    tells a secret like a singer       swinging her body behind a screen.
    How I miss the breeze of those dawns      where my friends’ scent lingers,
    wind which sways the myrtle, wafts      its fragrance to them far away
    lifting & lowering the branches      as palm fronds clap along to bird song.

     

    [She, washing dresses]

    She, washing dresses in the rain
    of my tears, then spreading them in her sunlight

    to dry, has no need of springs with my
    two eyes, nor sun, given how she shines. 

     

    [My heart is in the east]

    My heart is in the east, while I’m in the far west
    how can I savor the food in my mouth?

    How make good my pledges & vows, with Zion
    tied down by Edom & I’m in Arab bonds?

    As easy to leave all the fineness of Spain
    as it would be sweet to see the ruins of the Shrine

    The Adults in the Room

    I was a liberal before I knew what the word meant, before I had read a word of Locke, Mill, Berlin, and Rawls, before, in fact, I knew anything about the world at all. Liberalism was not a political idea; it was a family loyalty, born in the blood, and it became a way of life. We liberals commonly tell ourselves that, unlike the far right and the far left, we reach our beliefs through a rational inspection of the world as it is, but I didn’t get my ideas that way. I didn’t form my convictions through a critical evaluation of evidence about life as it actually was. I was born a liberal. 

    My parents were liberals, their friends were liberals, and my father worked for thirty years for liberal governments in Canada. Some of my earliest memories are political: at the age of five, in 1952, watching the Republican convention with my parents on the first fuzzy black-and-white TV we ever owned. My parents were Canadian diplomats in Washington, and they were for Adlai, not Ike, and like their American friends they were horrified by McCarthy, the scowling Republican bully who presided over the Senate Army hearings. So before I knew anything at all, pretty much as soon as I could stand up and put on my own clothes, the label had been sewn into the shirt on my back.

    While other kids had baseball or hockey stars for heroes, mine were Jack and Bobby Kennedy, Bayard Rustin, Martin Luther King, Jr., and Rosa Parks. By the age of twelve, I was copying Jack Kennedy’s mannerisms. I couldn’t do the Boston Brahmin accent, but I could put my hand in my blazer pocket, with my thumb down the front seam, the way he did. By the time I was twenty-one I knew by heart Bobby Kennedy’s improvised speech in Indianapolis on the night of King’s assassination, to comfort a shocked and grieving black crowd, quoting Aeschylus about the “awful grace of God”. In those terrible months of spring and early summer in 1968, when both King and Kennedy were murdered, I campaigned for Pierre Trudeau, bringing delegates over to our side in the tumultuous five-ballot struggle at the convention that elected him leader of the Liberal Party and then traveling with him on the cross-country campaign that elected him Prime Minister in June 1968. I was twenty-one years old. Bliss it was in that dawn. It was the only political campaign I have ever been part of where we knew we were going to win; the only question was by how much. It was also the only political campaign where I saw what winning meant. Two nights after his victory I was invited out to Harrington Lake, the Prime Minister’s country residence, to dine with him and one of his then-current girlfriends. Instead of exhilaration, there was exhaustion in Trudeau’s eyes, and I thought I saw fear too, in his dawning realization of what it meant to hold power.

    My heroes may have been Americans, but my liberalism was Canadian all the way down. Liberalism prides itself on its cosmopolitanism, but in truth all liberalisms are local, since, as the man said, all politics is local. Canadian liberalism had all the self-congratulatory earnestness particular to a small official elite, to whom my parents belonged. It was a managerial doctrine of moderation appropriate for a small country, with no imperial destiny like its neighbor next door, but instead trying to muddle through, holding together a continental nation-state the size of America but with a tenth of its population in a harsh but beautiful landscape where, as Margaret Atwood said a long time ago, the name of the game was survival. 

    Yet, in its muddling way, Canada did more than survive. In the surge of postwar prosperity Canadian liberalism did some great things: a new national flag, a new constitution and charter of rights, a new immigration policy, a national pension and a national health care program. The canard that liberalism never dares to take on big enemies is false. To make all this happen, liberal governments had to take on provincial governments, resurgent Quebec nationalism, and vested interests coast to coast, chief among them the pharmaceutical companies and the doctors’ lobbies. So I grew up with a liberalism that knew how to fight. It was unafraid to tame capitalism and to “socialize” medicine and pensions in order to take the fear of catastrophic illness and poverty in old age out of people’s lives. Liberalism’s victories in the 1950s and 1960s laid the foundations of a welfare state not just in Canada, but also in Europe and America. Lyndon Johnson’s administration secured Medicare for elderly Americans and Head Start for poor children.

    We liberals of the 1960s thought we had laid the granite of basic security under everyone’s feet. Sixty years later, the granite is cracking, the liberal state is frayed, contested, underfunded, straining at the seams, and we are defending our achievement, and none too successfully, against populists and authoritarians who want to take it apart. They have mobilized resentment at the price of social solidarity, but they offer no solutions, or solutions so drastic, such as the forcible deportation of millions of migrants, that they would tear society to pieces. A politics that stokes anger without proposing solutions is not a politics. It is only manipulation, and we like to think that we are in the solution business. 

    We are right about that, but we keep on defending achievements of long ago instead of raising our sights and finding a way to fund and reinvent social solidarity for the twenty-first century. For my heyday — 1945 to 1975, what the French call les trente glorieuses, the glorious thirty years of robust growth and relative equality — has gone forever. Beginning with the oil crisis of the 1970s, an abyss slowly opened up between a credentialed elite and an uncredentialed working class whose steady union jobs were stripped out and shipped overseas. Those of us who got the credentials to enter the professional classes did well, but plenty of our fellow citizens fell behind. We didn’t notice this in time, and our failure opened up a chasm between who we were, what we believed, and the people we represented. We kept offering “equality of opportunity,” a chance for the credentialed few to enter the professional elite, without tackling capitalism’s remorseless distribution of economic disadvantage itself. 

    By the late 1990s, the conservatives began to gain power by playing to the resentments of the ignored. The authoritarian right, especially, understood that they could build an entire politics on mocking the blindness of the liberal elite. They didn’t need solutions; stoking the rage was enough. We are now the embattled object of that rage. What will it take to earn the trust of those whose discontent we ignored? Liberalism in the next generation will need to save social solidarity from the “creative destruction” of the market, by rebuilding the fiscal capacity of the liberal state and investing in the public goods that underpin a common life for all. Saying this, at a high level of generality, is easy enough: the tougher part will be finding the language and the cunning to convert a radical liberalism into a politics that wins elections and a governing strategy that pushes change through 
the veto-rich thicket of interests waiting to derail our best-laid plans.

    In the meantime we lament the “identity politics” of our populist and authoritarian competitors, when it would be more honest to admit that identity is where all political belief actually comes from, including our own. My identity — charter member of the white professional classes of Canada — defined my liberalism. What the liberal critique of identity politics does get right, though, we owe to our much-maligned individualism. Identity is not destiny. Every formative confrontation with reality presents each of us with political choices. We can either make up our own minds or borrow someone else’s beliefs. The convictions that stick are the ones that we decide for ourselves. The beliefs that we hold onto are the ones that first required a primal Yea or Nay to the allegiances we started life with. In the 1960s, I could have rebelled against my parents’ liberalism. Many of my generation 
did. Instead I said yes to the world I was born into and to the parents I was lucky enough to have. 

    Black friends of my generation also said yes to their parents’ allegiances, and they remain committed to delivering the still-withheld promise of American equality. But this liberal inflection isn’t a racial obligation. The black entertainment superstars of succeeding generations, with their bling and their Bentleys and their “attitude,” appear to have emancipated themselves from the entirety of their civil rights inheritance and its liberal conscience. So no, identity does not give us our politics. I was born a liberal, but I stayed one for life because I chose the liberal tribe as my own.

    How tribes shape you depends on the times that shape the tribe. My liberalism’s primal beginning was World War II. My parents were in their twenties when the war picked them up by the scruff of their necks and changed them forever. They found each other in London in the midst of the Blitz and the V-2s. They came to maturity during the most dramatic expansion of state power in history. In the space of five short years, Canada, like the United States, became an arsenal of democracy and fielded an army that landed in Normandy and liberated Europe. Their generation discovered the power of government, and the idea that government could be the problem and not the solution was inconceivable to them. There seemed to be nothing that a democratic government under arms couldn’t do, even defeat absolute evil. 

    Because they had watched their world burn down, theirs was a liberalism with internationalism at its heart. Human rights, the United Nations Charter, and the spider’s web now known as the “rules-based international order” were not the vapid bromides that they have become, but my father’s life-calling. He was part of an international generation of public servants who believed that the United Nations system, with its rules and its treaties, would tie down the predators in the international system and keep the small fry safe. 

    Their generation also knew what it was to hunker down in a bomb shelter with strangers, trying to keep the talk in the darkness light while the ground shook. They had lived the cross-class solidarity of those wartime shelters, and they came home from the war believing that liberal government could bind the classes together in peacetime. My left-leaning generation was just as sentimental about working people, except that we didn’t know any actual workers. As a student journalist in the 1960s, I once spent a morning on a picket line with printing workers who had been locked out of the plant that printed our university paper. It was a cold autumn morning, and we walked up and down in front of the plant carrying picket signs, and what I remember best was not the warm glow of solidarity but a red-faced feeling that I had nothing but good intentions to share with the big men who knew in their bones that they were marching not to victory but to the unemployment line.

    Looking back now, I see that liberals of my generation didn’t realize that the welfare state we grew up in did not unite classes. It interposed a state bureaucracy between classes, and its programs divided those who earn salaries from those who claim benefits. The rising costs of social solidarity split citizens into warring camps. Exiting from liberal arrogance means finding a way back to a liberal politics of cross-class solidarity.

    Exiting from arrogance also means, even if this sounds contradictory, recovering what that inheritance actually believed, before liberalism slipped into the suave managerial discourse it became in the Clinton and Blair years. For my parents’ wartime generation, it was a fighting creed. They knew what they were fighting against: fascism’s cult of death, its loathing of Jews, its national and racial hatred, its lust for conquest and domination. Against these forces of darkness, there was no place for compromise, moderation, or splitting the difference, all the liberal virtues. This was a fight to the death that had to be won.

    We can still learn from this intransigence. The Nazi marches in East Germany, the re-packaged Vichyite racism in the National Rally of Marine Le Pen, and the jeering anti-Semitism in Charlottesville, Virginia, show us that malignity never rests. Liberalism today would do well to be less self-deceiving about its opponents. I used to believe that liberalism only faces adversaries who could be allies tomorrow. I have had to learn there are some enemies in the house, dangerous to democracy, fatal to every liberal achievement, who simply have to be defeated, over and over again. 

    With the onset of the Cold War, my parents’ generation’s anti-fascism turned into anti-communism. By lining up against the Soviet threat, recent revisionists have argued, their generation abandoned their progressive New Deal beliefs and became apologists for American hegemony. According to Samuel Moyn, liberal thinkers such as Isaiah Berlin and Judith Shklar — who happened to have been my teachers — let their anti-communism blind them to the ugly violence of America in its imperial heyday. This progressive critique is meant to chide contemporary liberalism into learning from its mistakes, but it has the opposite effect. It severs the liberalism of today from potential sources of renewal. For if there is anything that Cold War liberalism can teach the next generation, it would be its unflinching opposition to authoritarian tyrannies and a determination to contain and deter their expansionist march. 

    Canadians, like Mexicans, do not need progressive Americans to tell them that America is an imperial power with blunt unilateralist instincts when it comes to defending vital interests. But we never forgot that America had fought fascism, stationed troops and weapons in Europe to deter the Soviets, and ensured that Western Europe stayed free, and we didn’t care overmuch that its motives, like any great power’s, were bound to be mixed. We also did not forget how long it took for America to enter World War II or to let in the refugees — too long, on both counts. So yes, the liberalism that became mine in adulthood was human rights universalist, militantly anti-communist, strongly internationalist, and pro-American at its core.

    Besides, America also shared with the rest of the world an exuberant popular culture created by artists of genius. My parents’ heroes were Louis Armstrong, Fats Waller, and Ella Fitzgerald. Mine were Buddy Holly, Bob Dylan, Marvin Gaye, Sam Cooke, Wilson Pickett, and the Four Tops, names that still conjure up, across sixty years, what it felt like to be sixteen. It would be comical to call this music liberal, but it was certainly liberating, and it was profoundly American. This was a music of freedom and soulfulness, and tenderness too, and the fusion of black and white music promised something at once exciting, terrifying, and new: a truly inter-racial society. A young white teenager like me didn’t know a single black person well, but at our high school dances we danced to black music. We went to basement clubs and listened to grizzled old bluesmen, up from the deep South, and we knew by heart all the haunting and apocalyptic lyrics of Robert Johnson. This was still an innocent time when whites and blacks could learn what they wanted from each other’s culture, before the ban on “cultural appropriation” forced us all back into the false authenticity of our exclusive tribes.

    We embraced black music, but we had no real idea about what was at stake in the struggles to the south of us. In the summer of 1963, President Kennedy gave his first television address on racial justice. I was sixteen, at my aunt’s house, having dinner with her sister’s husband, Clark Foreman, a New York lawyer, who did pro bono work for the National Association for the Advancement of Colored People. I proclaimed how impressed I had been by Kennedy’s speech. Foreman lowered his bifocals, stared me down across the table, and shook his head. I insisted Kennedy was going as fast as he could. Fast enough? Bull Connor’s dogs were tearing the clothes off black demonstrators in Birmingham parks; Governor Wallace was standing at the door of his state university, barring entry to a qualified black man who wanted to study. Black churches were being dynamited, and young children were dying. 

    My inter-racial enthusiasms were too cautious by half, but the civil rights struggle gripped me. It offered a vision of inter-racial harmony as well as a politics of how to get there: through non-violent civil disobedience, mass rallies, legal challenges in the courts. In the space of a decade, black men and women perfected the whole repertoire of liberal politics for my generation, from the sit-ins and freedom marches in the American south to Rustin’s organizational effort that produced the March on Washington. It was a transformative experience to see that liberalism could be a fighting creed again, as it had been in my parents’ time — to see a young man named John Lewis, no older than I was, daring to cross Selma Bridge and being beaten bloody by the troopers, an image so resonant in memory that when, decades later, on Capitol Hill, I was introduced to Lewis, all I could do was clasp his hands and thank him, inarticulately, for the lesson in courage that he had taught us all. It was impossible for someone of my generation to hate an America that produced such a man.

    When we entered college, we marched against the war in Vietnam, and borrowed the entire repertoire of black civil disobedience, not to denounce America, as our leftist friends wanted us to do, but to redeem it. Looking back now, I see that it was precisely then that a pro-American liberal like me made the emotional commitment that decades later led to a mistake that haunts me to this day: support for the war in Iraq. We had opposed the war in Vietnam to call America back to its better angels, and fifty years later the same instinctive belief in America, despite Bush, Cheney, Rumsfeld and all the avatars of doom I might have noticed, led me to support an operation that has become synonymous with imperial folly. 

    When I entered graduate school in history at Harvard in 1969, I continued marching against the Vietnam war, often with people far to the left of me, but my chief political commitment of the time was prison visiting at the Massachusetts Correctional Institution, Norfolk. My doctoral thesis was on the punitive side of Enlightenment liberalism, how intellectuals and philosophers such as Jeremy Bentham and Benjamin Rush sought to replace the arbitrary violence of ancien régime punishment with the new carceral technology of the penitentiary. By day, I read historical documents in Langdell Law Library, while by night I sat in a poorly lit room, sixty miles south of Boston, behind bars with a dozen young black men doing time for murder, rape, and a host of other crimes.

    For four years, I came out every Tuesday night and sat across the table listening to them arguing, joking, and just being themselves with profane exuberance, often at my expense since I was the only white man in the room. As I got to know some of them, I helped them to get parole, or to get jobs, only to see them skip town or skip parole as soon as they tasted freedom. Some of what I learned was shocking, such as discovering that the handsome and articulate young man across the table was doing life for having thrown a skillet of boiling fat all over 
his girlfriend, blinding and scarring her forever. I had no conception of where such life-devastating rage could come from, or how a man could ever repent or repair such lethal harm. My years at Norfolk were the first moment in my life when I could see, plain as day, that a liberal upbringing was insufficient for me to understand the world I had entered.

    In 1973, in the wake of the Attica prison riots in upstate New York that claimed forty-three lives, a similar riot devastated Walpole, the maximum security prison next to Norfolk, and I volunteered to go in and mediate the stand-off between police and inmates that followed the uprising. When I did get inside and toured the cell blocks, the anger that had consumed the place seemed elemental in its force. In the weeks that followed, I played out a classic liberal role — mediating between the prisoners barricaded inside the ruined prison and the state police, waiting outside, tear gas and truncheons and shotguns at the ready, to retake the institution. I did a night shift on the segregation ward, reserved for prisoners too dangerous to be in general circulation, and I remember sitting on a chair in near darkness at one end of the cell block while black men on lockdown poked mirrors out between the bars of their cell doors to keep me under observation. I was twenty-five years old.

    For a young graduate student in the dark and turbulent 1970s, the education that changed me most came not at Harvard but at Norfolk. It led me out of innocence. Here was rage beyond understanding, directed obliquely at me because of my race. I had never experienced the impersonality of racial hatred, its fixation on your skin, its indifference to who you actually are. After the prison riot of 1973, I retreated into academic work and wrote my dissertation on the origins of the penitentiary from more privileged precincts: the stacks and cushioned reading rooms of the Harvard library system.

    I earned my living teaching the first generation of young students to benefit from Harvard’s venture into affirmative action. Now, fifty years later, affirmative action and race-based admissions have been outlawed by the Supreme Court, to loud expressions of liberal despair, and so it is worth recalling that the first students to benefit from affirmative action were often miserable. I remember a young female student from South Carolina, the first in her family to attend college, saying between choked-back tears that liberal good intentions had gifted her a place in an elite institution, but not the belief that she had a right to be there. 

    Without realizing it, my generation of young white liberals was witnessing the problematic unfolding of a multi-dimensional and all-encompassing revolution. Affirmative action for black students was followed by the feminist upsurge. In the space of a generation, women went from being a minority in university classrooms to the majority. The girls we had dated in high school, with their gardenia corsages and “good girl” proprieties, now became the young women discovering their sexuality and challenging our own. Outside the university, in the wider society, a little-noticed change in immigration law opened the door to Asian, Caribbean, and African immigrants who had been barred since the 1920s. In the same period, liberal democracies began decriminalizing homosexuality. As Pierre Trudeau said in 1967, “there’s no place for the state in the bedrooms of the nation.”

    The Canada I grew up in had been white and aggressively heterosexual. By 1980, I was living in a multiracial and sexually pluralistic society, teeming with new citizens from every corner of the globe. The contrast is captured in a comparison of my University of Toronto graduation class photo of 1969 — mostly male, at least professedly straight, all white — and the graduation photo of the same age group, at the same college, in 2024 — majority female and every color of the rainbow, turbans, hijabs, and skullcaps all expressive of a new diversity which we liberals quickly turned into a religion of its own.

    This still-unfolding multi-dimensional revolution turned out to be the cardinal liberal achievement of my era, but it enormously complicated the liberal task of finding the middle way between the Scylla and Charybdis of extremisms. We were naïve about the nature of this problem, preferring to believe that all reasonable human beings would embrace a revolution of inclusion, when the reality was that our generation had upended the entire social order, and even our own place in it. Diversity — of gender, sexual orientation, race, religion, and class — was a virtue in comparison to the dire cantonment of peoples in silos of exclusion, but liberals turned diversity into an ideology. Once an ideology, it quickly became a coercive program of invigilation of speech and behavior in the name of dignity and respect. 

    Credentialed whites of my generation welcomed the revolution because we could invite new recruits of color into our ranks without ever feeling that our own elite status was being challenged. We didn’t seem to notice that non-elite whites were threatened, even betrayed, by the new multiracial order. Faced with what we thought was white racism and sexism, when it was mostly fear, we began promulgating codes of speech and conduct to impose diversity as a new cultural norm. New bureaucracies in universities, corporate headquarters, 
and government offices enforced diversity at the price of freedom, the freedom to defend unpopular loyalties, to freely dislike others, to be funny at other people’s expense, to be critical of the pieties of others but especially our own. A liberalism whose defining value should have been liberty invented a diversity and inclusion industry, whose guiding principle may have been justice, but whose means of enforcement included coercion, public disgrace, and exclusion. 

    Worst of all, we censored ourselves, willingly turning off our bullshit detectors, and stilling the inner doubts that might have made us confront our mistakes. We abandoned the truism that arguments are true or false, irrespective of the race or the origins of the person who makes them. We began promoting arguments as true based on the gender, race, class, origins, or backstory (oppression, discrimination, history of family violence) of the person uttering them. The value that we placed on diversity and inclusion led us by stages to jettison a care for truth itself. We ended up compromising the very epistemological privilege that had provided us with such unending self-satisfaction. 

    In failing to pay heed to the fears of displacement that the liberal revolution created, we ended up creating a vital political opening for every strand of extreme opinion queuing up to speak on behalf of everyone whom liberals had stopped listening to. By the 2020s most liberals were walking back, at first nervously, and then with increasing speed, from our own self-righteous politics of virtue. First we made everyone else sick of our virtue-signaling and then we became sick of 
it ourselves.

    The irony was that the liberal revolution destabilized liberals as much as it upset those who were resisting it outright. For it was the liberal revolution of inclusion that fragmented the centrist consensus that had made the liberal revolution possible in the first place. Once each group — black, female, gay, and trans — achieved emancipation, many of them began to identify with their own group to the exclusion of wider civic-sized political aggregations of interest. The old political parties — Liberal in Canada, Democrat in the United States, social democratic in Europe — that had presided over the liberal revolution now saw their white working-class base heading for the exits, and their multicultural support splintering into autonomous groups each beginning to make a strange new epistemological claim: you can only understand me if you are like me. Only black people can understand the black experience of racism and police violence. Only women can understand the tyranny of patriarchy and the fear of male sexual violence. Only gays can understand what same sex love truly means. 

    The old liberal epistemology at least rested on egalitarian and universal premises. We believed that everyone was capable of entering to some degree into the mental worlds and lived experiences of others, because all of us, regardless of race, creed, ethnicity, or sexual orientation, were rational human creatures. This rationalist universalism disintegrated in the 1980s and 1990s, attacked by a new generation of “progressive” scholars as masculinist, colonialist, racist, and fundamentally condescending. This assault was supposed to awaken us to “intersectionality” — the interaction of disadvantages — but instead of drawing hurt constituencies together it fragmented them into highly sectarian and identity-based political groupings that foreclosed on alliances, shared understandings, and common political projects across race, class, and gender. So now liberals denounce the prison house of identity politics, without realizing the degree to which this new self-defeating politics is a consequence of the very revolution that we helped to foment. 

    Needless to say, at the time I understood little or nothing of this, but these were some of the factors — the complacent politics of virtue, the blindness to the new inequality, the conceit that ours was the only rational politics — that began to erode the electoral base that had sustained the center ground of Western liberal politics. The convictions of my youth had survived intact from 1968, sheltering me from any mind-changing encounter with the world that had metamorphosed around me after the end of the Cold War. When, in 2005, I left behind a professorship at Harvard and took the plunge into Liberal party politics in Canada, it didn’t feel like a crazy departure from security, tenure, and privilege, but as if my feet had been carrying me homeward all along.

    I had no idea of what I was letting myself in for. I had no understanding of my own inexperience, and no grasp of how weakened and debilitated the liberalism of my party had become. We were a party that kept winning elections and governing the country, but with a vote-share slowly declining in the small towns and rural districts and piling up in the downtown urban centers where the professional and commercial elites liked to live. When I led the party into an election in 2011, truth be told, the liberal platform had little to offer a people still shocked by the financial crash three years earlier. Our message, though we never said so directly, was “trust us, we are the adults in the room.” We even called ourselves “the natural party of government.” On election night in 2011, our party suffered the worst defeat in our history, and I lost my seat in parliament — a verdict that all these years later reads to me like a judgment on me, but also on a liberalism that had allowed itself to be captured by its own self-regard.

    Defeat is a great teacher. It taught me that liberalism endures because it is a way of being and a set of values that tell us who we should try to be. This is what gives liberalism its hidden resilience, its capacity to rebuild after political reversals. If we want to rebuild, we will need to recover what the word used to mean. It once was a synonym for generosity. In the old days, a liberal gentleman was a generous man. We will want to discard these male, elitist associations by marrying generosity to the egalitarian individualism at the core of the liberal creed. The creed tells us that we are no better than anybody else but also no worse. What liberals value should be within everyone’s reach. A liberal person wants to be generous, open, alive to new possibilities, willing to learn from anyone. We want to share whatever wealth and fortune we have, to welcome strangers to our table, to stand up for people when they are in trouble. We know we have to change our minds when someone’s idea is better than ours. We have faith that history rewards those willing to fight for what they believe. Now, none of us is ever as generous as we would like to be and no liberal has a monopoly on generosity, but the largeness of spirit it calls us to does define our horizon of hope. 

    Such values are embattled today, and they need defending because our societies so desperately need largeness of spirit, together with a revived liberal ideal of solidarity. We need to be filling out this vision and bringing our citizens to believe in it. Defeat has taught me that we cannot afford to jettison our values when the tides of politics turn against us. Liberalism’s incorrigible vitality comes from the fact that it tells us who we most deeply want to be, provided that we are willing to fight for it and never surrender to the passing fashions of despair.

    Egalitarian Idealists and Authoritarian Zealots: A Cautionary Memoir

    In 1952, a year after I was born and a decade and a half before I became an active participant on the American left, Daniel Bell published a book called Marxian Socialism in the United States, the first serious scholarly examination of the subject. He considered, among other questions, why the traditional Marxist parties in the United States had by then descended into abject political isolation. The Socialist Party of America (SP), which four decades earlier had enrolled over a hundred thousand members and attracted nearly a million voters in the presidential election of 1912, was reduced to fewer than a thousand aging stalwarts by 1952; its youth affiliate, the Young People’s Socialist League (YPSL), had fewer than a hundred. Further to the left (or to the east, given its affinity for the Soviet Union), the Communist Party, USA (CP) still counted roughly twenty thousand members, but many of its leaders were imprisoned or about to be imprisoned for violation of the Smith Act, a federal law making it a crime to conspire to teach or advocate the desirability of overthrowing the government. According to public opinion polls in the early 1950s, a clear majority of Americans believed that the Party should be outlawed entirely.

    Bell, who had joined YPSL at the age of thirteen in the early 1930s, and who in the first election in which he was eligible to do so voted for Socialist Party presidential candidate Norman Thomas, now concluded that the American Marxists of all persuasions had been destined for failure from the beginning, their fate “rooted in [an] inability to resolve a basic dilemma of ethics and politics”:

    The socialist movement . . . in its rejection of the capitalist order as a whole, could not relate itself to the specific problems of social action in the here-and-now, give-and-take political world. It was trapped by the unhappy problem of living “in but not of the world,” so it could only act, and then inadequately, as the moral, but not political man in immoral society. . . . A religious movement can split its allegiances and live in but not of the world . . . ; a political movement can not.

    Bell continued in later years to describe himself as a socialist in economics if not in political allegiances, and his verdict was delivered in a regretful tone, at least in regard to the Socialist Party’s fate. In any case, from the perspective of the early 1950s, it would have been hard to disagree with his judgment that the Marxist left’s moment as a meaningful player in American political life had come and gone. 

    Fast forward a dozen years to 1964, when I turned thirteen, the age that Daniel Bell was when he joined the socialist movement more than three decades earlier. This was also a coming-of-age moment for me, as I first began paying attention to what was happening in the broader world outside of family, neighborhood, and school. In that year’s presidential election, I was a fierce (if still eight-years-under-the-voting-age) partisan of the candidacy of the Democratic incumbent Lyndon Baines Johnson, and took great satisfaction in his trouncing of his arch-conservative rival Barry Goldwater in November. But something else caught my attention in 1964 that was destined to have a lasting impact on my political trajectory: a sub-drama within the Democratic camp playing out in Mississippi. There, from June through August, about a thousand young civil rights volunteers from around the country were taking part in the Freedom Summer Project directed by a remarkable twenty-nine-year-old activist named Bob Moses, a leader of the Student Non-Violent Coordinating Committee (SNCC). At the risk of their own lives (three would be kidnapped and murdered by the Klan at the very start of the project), the volunteers conducted a voter registration drive among the disenfranchised black population, and helped organize a new political formation called the Mississippi Freedom Democratic Party (MFDP) as an alternative to the regular and staunchly white-supremacist Democratic Party in the state. 

    At summer’s end, the MFDP sent an integrated delegation of Mississippi residents to the Democratic National Convention in Atlantic City to challenge the seating of the all-white regular delegates. Although President Johnson had overseen passage of the Civil Rights Act earlier in the year, fearful of losing the state and perhaps the entire south to Goldwater, he dispatched a crew of established liberal and civil rights leaders to persuade the insurgent Mississippians to give up their challenge. In exchange, they were promised that the MFDP would be awarded two at-large delegates to the convention, along with the assurance that from 1968 onward all Democratic state delegations would be required to be open to black as well as white delegates. From the perspective of realpolitik, or what Bell would describe as the necessity of acting as “political man in an immoral society,” it was not an unreasonable offer. Dr. Martin Luther King, Jr. was among those initially urging the MFDP delegates to accept it. Yet the MFDP delegates refused to do so. As Bob Moses declared, after King had spoken in favor of compromise, “This reasoning you’ve been giving us here is inaccurate. We’re not here to bring politics into our morality, but to bring morality into our politics.”

    To bring morality into our politics: this was something new in mid-twentieth-century American major party politics, where compromise and consensus, defined as accepting the recognition that you were not going to get everything you initially asked for, were considered the fundamental rules of the game. Some things, the MFDP delegates decided, were not up for compromise, such as their claim to full and equal rights as citizens in the United States. Theirs was an example of the “in but not of the world” stance that, just a dozen years earlier, Bell had described as leading to inevitable political irrelevance. And yet, in this instance, it worked. Less than a year later, building on the uncompromising groundwork laid by Freedom Summer, the MFDP credentials challenge at Atlantic City, and the Selma voting rights campaign of the following spring, Congress enacted a Voting Rights Act signed into law by President Johnson that would transform southern politics. And in 1968 some MFDP alumni would be seated at that year’s Chicago Democratic convention as members of the official Mississippi state delegation.

    Of course, unlike the radicals Bell was analyzing back in 1952, Bob Moses was not a Marxist. Nor, apart from some Red Diaper babies (that is, children of Communists), were most of the Freedom Summer volunteers in 1964. Their political outlook, including their insistence on a morally driven politics that refused to compromise basic principles, came out of a different and older tradition in American radicalism. It stretched back to Anne Hutchinson in the Massachusetts Bay Colony in the seventeenth century insisting on her right to question the authority of the colony’s Puritan ministers based on what she described as direct revelations from God. (For this she was branded an “antinomian” by both the political and religious establishments.) 

    A similar commitment to what in the nineteenth century was called obedience to a “Higher Law” was central to what was seen at the time as the wildly irresponsible demand for the “immediate” end of slavery by the abolitionist movement, as it was shaped by leaders such as William Lloyd Garrison and Frederick Douglass. Later still, in the late nineteenth and early twentieth centuries, this same uncompromising, morality-infused radicalism could be found in the women’s suffrage movement, represented by such figures as Elizabeth Cady Stanton and Alice Paul. This American radical tradition also had some influence within the Marxist left, most notably on Eugene Debs, who read his Marx and Engels and believed in class struggle and international working-class solidarity, but whose aspirations for the American cooperative commonwealth were based on the moral principles about democratic citizenship and individual conscience that he had imbibed as a young man growing up in Terre Haute. “Despite his Socialism,” Debs’ biographer Nick Salvatore argued, “a fierce individualism fueled his core vision,” evident in his most famous speech. In 1918, upon being convicted in federal court of speaking out against American participation in the First World War, he declared:

    Your Honor, years ago I recognized my kinship with all living beings, and I made up my mind that I was not one bit better than the meanest on earth . . . While there is a lower class, I am in it, and while there is a criminal element I am of it, and while there is a soul in prison, I am not free.

    Time and again in the history of the United States, outsiders to the political mainstream, raising what were regarded as unreasonable demands and speaking in a moral vernacular that owed much to that antinomian strain of American Protestantism, brought to the fore issues such as slavery, women’s rights, and opposition to war that found few if any supporters in more conventional political circles. The role of this left-wing prophetic minority, which Daniel Bell did not sufficiently appreciate, is part of a vital political tradition that has over the centuries enhanced the freedoms and opportunities of millions of Americans. It is a tradition that I consider worth honoring and emulating — so long as one recognizes that it has also led on occasion to unforeseen and less inspiring outcomes.

    The 17-year-old Bonnie Raitt, and beside her the then 16-year-old author, American Friends Service Committee project volunteers, Indianapolis, July 1967.

    When I turned sixteen in the spring of 1967, I was a high school junior living in a small town in rural Connecticut, desperately anticipating graduation the following year, and with it the opportunity to move away to a big city to attend college. With a Quaker mother and a Jewish father, coming of age in a community where what passed for ethnic diversity ran the gamut from Yankee Protestant to Irish Catholic, my early desire to fit in with my neighbors and classmates gave way sometime post-puberty to an even stronger desire to escape the whole lot of them, and in doing so lay claim to a new sense of independence. As I identified entirely with the heroic selflessness of students taking part in the Freedom Summer project (some of whom also went on in the fall of 1964 to play leading roles in the Berkeley Free Speech Movement), my otherwise conventional adolescent rebellion took on an increasingly political edge. Of course, those history-making freedom struggles were happening so far away from my tedious isolation in Coventry, Connecticut, that they may as well have been set on Middle-earth (it was about this time I also was immersing myself in the paperback edition of The Lord of the Rings).

    But in 1967, with the advent of anti-Vietnam war protests in cities relatively nearby, my life changed in ways that felt most welcome, adventurous, and authentic. In mid-April I took the train to New York City to join in the “Spring Mobilization to End the War in Vietnam,” a demonstration called by a recently assembled coalition of radical, pacifist, and student groups, which proved to be the most massive anti-war gathering to that point in American history. Back home, nobody in my high school or community seemed to be against the bloody and unjust and unwinnable war in Vietnam but me. (My parents had their doubts, but prudently kept them to themselves.) But when I reached Sheep Meadow in Central Park on the morning of April 15, I no longer felt quite so lonely. With a quarter of a million other protesters, I marched downtown to a rally at the United Nations building. There, surrounded by my newfound fellowship (no elves, dwarves, or hobbits involved, but lots of students and hippies, as well as veterans, trade unionists, and other grown-ups), I listened intently to speeches by Dr. Benjamin Spock, Stokely Carmichael, and, most memorably, the Reverend Martin Luther King, Jr. He was not calling for compromise this time. “Let no one claim there is a consensus for this war,” he said as he began his remarks. “No flag-waving, no smug satisfaction with territorial conquest, no denunciation of the enemy can obscure the truth that many millions of Americans repudiate this war and refuse to take moral responsibility for it.” Maybe not King’s greatest speech and certainly not his best-remembered one, but it spoke to me then and continues to do so today, as part of the great American radical tradition that we now call “speaking truth to power,” as exemplified 
by Garrison, Douglass, Stanton, Paul, Debs, Moses, and King himself.

    I spent the summer that followed in a down-at-the-heels neighborhood in Indianapolis, Indiana, one of a group of teenage volunteers enrolled in an American Friends Service Committee (AFSC) project. An aspiring seventeen-year-old folksinger named Bonnie Raitt, bound for Radcliffe, and then for greater things, was among our number. Working with a local settlement house, we did various good works in the community, while learning about poverty, race, and the Quaker vision of conscience-driven social change. (Our texts ranged from Michael Harrington’s The Other America to Malcolm X’s Autobiography to Richard Gregg’s The Power of Non-Violence.) Bonnie and I and our compatriots organized a silent vigil in downtown Indianapolis on Hiroshima Day to protest the war, to the discomfort of the adult project leaders who were worried about hostile local reaction. (If they had preferred that we stay home that day, they should not have assigned us Gregg’s book.) By summer’s end, under the influence of Dr. King and the Quakers, I considered myself a committed pacifist, and what’s more — with the success of our little vigil — an organizer.

    That fall, a senior in high school, still short of my seventeenth birthday, I traveled from New York City with my Uncle Abe and Aunt Joan to Washington, D.C., to take part in the March on the Pentagon on October 21, organized by the same coalition of anti-war groups that had staged the Spring Mobilization. Although not a “Red Diaper Baby,” I did grow up knowing that people to whom I was related and respected had indeed been members of the Communist Party back in the years of the Great Depression — and some, like my father’s brother Abe, remained so. In fact, he had been one of the lawyers for the eleven Communist leaders convicted in 1949 for violating the Smith Act, and like the defendants he also went to jail, in his case for contempt of court. Despite his continuing allegiance to the Communist cause, he scrupulously refrained from trying to recruit me to his own corner of the organized Left (which, in any case, seemed to me too old and stodgy for serious consideration), but was happy to encourage my growing radical inclinations, wherever they led me.

    I had never been to Washington before, but sightseeing was not on our agenda — it would be years before I had the occasion to travel to the nation’s capital without attending a protest of some kind or another. We stayed in a downtown hotel, and on the morning of the march we joined the mass and legal part of the day’s protest, with about a hundred thousand people attending the rally at the Lincoln Memorial. Afterwards we marched across the Memorial Bridge to the Pentagon, where Secretary of Defense Robert McNamara (who by then had his own private doubts about the war he was so instrumental in launching and escalating) was nervously looking out the window of his upper-story office at the gathering crowd. When we got there, I impulsively decided to part company with my uncle and aunt and ran with a group of other young participants past surprised lines of military police to the very steps of the building. Totally unplanned, this was my first venture into militant but non-violent civil disobedience, which I imagine was the case for most of the rest of the few thousand protesters who now found themselves hemmed in by soldiers and federal marshals. There, at the steps of the Pentagon, we listened to impassioned speeches against the war, chanted anti-war slogans — and did not, contrary to subsequent accusations, spit on anybody. Mostly, astonished at our proximity to the planning center and symbol of an evil and unjust war, we wondered what would happen next. In my case, the answer came shortly before dusk, when a burly federal marshal pulled my feet out from under me and dragged me roughly down the adjacent embankment before depositing me on the pavement of the Pentagon’s north parking lot. I picked myself up and limped back across the bridge connecting Arlington to Washington. All in all, I thought it had been the best day of my life. 

    Among the speakers I remember listening to that afternoon at the Pentagon was a young Swarthmore College graduate named Cathy Wilkerson, then working as a regional organizer for Students for a Democratic Society (SDS) in Washington, D.C. Many years later I read Flying Close to the Sun, her memoir of the years she spent as a radical activist in the New Left. She began the decade of the 1960s as a Quaker and an admirer of Gandhi, and she ended it as a fugitive in the Weather Underground. When I encountered her in 1967, she was half-way through that unfortunate transition, but some part of the earlier Quaker influences apparently lingered. She urged us that afternoon not to take unnecessary risks. “While I had been excited by Debray and Fanon,” she recalled, “here in the heat of confrontation it was the model of the non-violent confrontations of the civil rights movement that seemed most powerful. To the extent we had any power at the Pentagon, which didn’t feel like much, it was the power of a moral witness.” 

    Wilkerson’s life story is, among other things, a cautionary case study of how even the best instincts can sometimes lead to terrible outcomes. The early civil rights movement’s emphasis on “putting your body on the line,” as in the lunch counter sit-ins of 1960 and the Freedom Rides of 1961, took on a darker meaning for some activists by the late 1960s and early 1970s. In Wilkerson’s case, it led to the “townhouse explosion” of 1970, which left three of her bomb-building comrades dead in the rubble of her father’s expensive Greek Revival home in Greenwich Village, while she stumbled out of the wreckage and took flight into the underground.

    Wilkerson’s memoir provides an illuminating glimpse into the fatal trajectory of the largest radical student movement in American history. I find her particularly astute in describing the internal dynamics of SDS in 1968–1969, the year I joined as a freshman at Reed College in Portland, Oregon (having finally realized my ambition of escaping small town life). I was part of a flood of new recruits to SDS, pushing its numbers that academic year to the vicinity of a hundred thousand strong. “This infusion of young people,” Wilkerson wrote in her memoir, “drawn in more by culture than politics, was becoming the norm” in SDS: “They weren’t looking for a complicated discussion about how to bring about change, but for validation, for community, and for a way to express their anger about the war.”

    That seems exactly right in my memory of the moment: validation, community, anger, all understandable reasons for joining a radical movement. Embracing left-wing causes, historically, has not been restricted to “a complicated discussion” about either abstract ideals or political strategy. It also often includes, especially for young people who provide the majority of converts to such causes, the forging of a new personal identity, as Wilkerson — and my own example — suggest. What matters is how the motivation and spirit that go into shaping that self-redefinition are then channeled.

    A similar insight from an earlier generation comes from another memoir, Starting Out in the Thirties by Alfred Kazin, which I read while writing a senior thesis at Reed on the history of Communist-organized literary groups in the Depression decade. “History was going our way,” Kazin wrote of his own youthful attraction to the radical Left (and briefly Communism) in those years:

    Everything in the outside world seemed to be moving towards some final decision, for by now the Spanish Civil War had begun, and every day felt choked with struggle. It was as if the planet had locked in combat . . . There seemed to be no division between my effort at personal liberation, and the apparent effort of humanity to deliver itself. Reading Silone and Malraux, discovering the Beethoven string quartets and having love affairs were part of the great pattern in Spain, in the Valley of the Ebro, in the Salinas Valley in California . . . Wherever I went now I felt the moral contagion of a single idea.

    Substitute Vietnam for Spain, and, perhaps, the White Album for the Beethoven string quartets (we weren’t quite as attuned to high culture as Kazin’s generation of young Jewish intellectuals thirty years earlier), and that pretty much sums up my own sense of historical destiny in my year in SDS. I don’t know if “contagion” is quite the right word, but even though I and most of the Reed SDS chapter avoided the temptation to follow Cathy Wilkerson into the violent and radical Weather Underground, student revolutionary politics in 1968–1969 did, at times, have a feverish quality.

    Thinking back on those years, the problem was not so much that I had a “single idea” governing my political choices (and certainly not “Communism” as it would have been recognized by a veteran of the 1930s); rather, I had too many contradictory and muddled ideas, which in my youth and inexperience I found impossible to sort out. Like Wilkerson that day at the Pentagon a year earlier, I was still partially under the sway of the Quaker ideal of “moral witness” as espoused in Gregg’s The Power of Non-Violence. But I was also attracted to the in-your-face confrontational politics being pushed in New Left Notes, SDS’s weekly newsletter. The ideal of non-violent “moral witness” was giving way to the vague but not necessarily non-violent ideal of “resistance.” I remember sometime that fall reading The Port Huron Statement, the founding manifesto of SDS, written (largely) by Tom Hayden in 1962, calling for the creation of a left with genuine intellectual skills, and being entirely persuaded. But also, that same fall in 1968, I read his essay in the radical monthly magazine Ramparts called “Two, Three, Many Columbias,” celebrating the strike at Columbia University the previous April, and essentially dismissing higher education in the United States as a wholly owned subsidiary of the war machine, which, he implied, did not deserve to survive. I found that equally persuasive. In sum, I was confused, an intellectual mess. I found wisdom in the works of Lenin and in the works of Lennon (and McCartney, Harrison, and Starr). I somehow believed in all these things simultaneously, and that they all went nicely together. I failed, or refused, to notice any contradictions.

    The author, age 20, at an anti-war protest in Portland, Oregon, 1971 (photo by Michael Kazin).

    As Wilkerson noted, my cohort of young radicals had few fixed political ideas beyond opposing the Vietnam War. And that sentiment, completely justifiable in itself, could lead in any number of directions, some entirely sensible and decent, including marches and vigils, draft resistance and other forms of peaceful civil disobedience — or, in Wilkerson’s case, to her embrace of lethal terrorism. “People make their own history,” as Marx famously observed, “but not under conditions of their own choosing.” I hardly mean to excuse poor choices made by Wilkerson, or myself, or others at the time — but it does suggest why so many individual actors tended to make worse choices at the end of the 1960s than, at an equivalent age, they might have made eight or nine years earlier. 

    SDS’s evolution in the course of the 1960s resembled a streetcar that, depending on the year you climbed aboard, carried you where it would with no fixed route. Had I been old enough to climb aboard in SDS’s early days, from 1962 to 1965, our destination, at least in the short run, would have been what the Port Huron Statement had described as a “participatory democracy,” our chief activity supporting the southern civil rights movement. Climbing aboard in 1968, however, in the midst of an ever-escalating and ever more destructive war in Vietnam, plus domestic warfare in the streets of Newark, Detroit, and Washington, D.C., I found that SDS’s destination had become the revolutionary transformation of, well, everything — the details a little vague, to be achieved through means that were not clear, but in my imagination bearing some resemblance to Paris, May ’68, if only American streets were lined with cobblestones. I could have gone either way, and looking back I much prefer the former to the latter destination. But instead, I climbed aboard the streetcar in what proved to be SDS’s final calamitous year.

    I did learn some valuable lessons in the late 1960s and early 1970s that, when the dust had settled, informed my subsequent career as a historian exploring the fate of twentieth-century American radicalism. Between 1982 and 2000 I published four books on the history of the American left, and then a fifth, Reds: The Tragedy of American Communism, this past year. If I were to summarize the thesis of that latest effort, it boils down to a single sentence in the book’s preface, arguing that the Communist movement “attracted egalitarian idealists, and bred authoritarian zealots.” Although I was never a Communist as such — I was too much immersed in the counterculture of the era to go that route — I have to confess there is at least a little submerged autobiography informing that thesis.

    The lessons I learned in the late 1960s also informed my own subsequent political choices. In 1982, the year my first book came out, I was a founding member of a new left-wing group, Democratic Socialists of America (DSA). DSA’s most influential figure was Michael Harrington, whom I had first encountered as an authority on poverty back in that formative summer of 1967. DSA, in Harrington’s vision, would strive to be “the left wing of the possible.” By that he meant that it should devote itself to building the broadest possible coalition of progressive groups within and alongside the Democratic Party, including the labor, feminist, environmental, and civil rights movements. Along with its socialist aspirations, DSA was thus firmly committed to “relate itself to the specific problems of social action in the here-and-now, give-and-take political world,” to borrow Bell’s formula (although Bell, by then describing his own politics as neo-conservative, had no interest in Harrington’s project). DSAers also laid equal stress on both words in the organization’s name, combining an absolute commitment to democracy as well as to socialism, in the United States and internationally. Given my experience with the crack-up of the New Left a decade earlier, which led to SDS’s splintering in 1969 into a host of competing would-be revolutionary vanguards, I found DSA’s pragmatism and its democratic principles reassuring. No more charging down blind adventurous alleys for me, literally or figuratively, thank you very much.

    The author, age 31, in 1982, the year he joined Democratic Socialists of America. (Photo by David Weintraub.)

    Over the next three decades, DSA’s membership hovered between five and ten thousand, making it the largest socialist group on the American left, but one that was unable to attract sizeable numbers of new (and younger) recruits in the years between the Reagan and the George W. Bush administrations, and so played at best a minor role in the nation’s politics. Harrington died in 1989 of cancer at the age of sixty-one, and no one of comparable public stature (and intellectual skill) replaced him as a widely recognized symbol of what it meant to be a democratic socialist.

    And then, unexpectedly as these things tend to happen, new opportunities arose. In 2016, Senator Bernie Sanders, a self-described democratic socialist (although not a DSA member), ran a spirited campaign for the Democratic presidential nomination, and attracted many younger voters to his cause. That was followed two years later by Alexandria Ocasio-Cortez’s election to Congress — a young woman of color with extraordinary political skills, who proudly proclaimed her own DSA affiliation. People, especially young people, began to Google this unfamiliar term, “democratic socialism,” and up popped the DSA website. By the early 2020s, forty-four percent of Americans between the ages of eighteen and twenty-nine reported positive views of socialism, according to polling by the Pew Research Center. (Unfortunately Pew failed to ask what they mean by socialism. Probably not Soviet-era Communism.) As a result, and in some ways comparable to the boost that SDS experienced in the mid-1960s driven by the escalation of the war in Vietnam, DSA underwent an enormous expansion in membership, peaking at over ninety thousand by 2020. 

    Most of those new members were in their twenties, with a lot of enthusiasm, energy, and raw political talent, organizing chapters in cities and states across the country that had not seen an active socialist presence since the Debsian era. Soon scores of DSA members, running as Democrats, had won election to state legislatures, city councils, and other offices. By January 2021 there were four DSA members sitting in Congress, more socialists than had ever served in that body at the same time. Harrington’s “left-wing of the possible” was really happening, at long last. Or so it seemed. 

    But in keeping with the history of the modern American left, DSA’s future trajectory and prospects would prove neither uncomplicated nor untroubled. Two factors began to tug the organization in a very different direction politically from its earlier identity. The first was that, like those of us who came of age in the 1960s, the younger radicals now pouring into the organization, who soon vastly outnumbered DSA veterans, were impatient with their elders. In the early 1960s Michael Harrington had served for a few years as a mentor and role model to Tom Hayden and other SDSers, but by the latter part of the decade was either regarded as a sell-out by my generation of SDSers or forgotten entirely. “Perhaps it is inevitable,” he observed ruefully in 1967, “that young people come to the radical movement with the fervor of catechumens,” that is, converts to the Faith preparing for confirmation in the early Church, “and always believe that the veterans of past struggles are tired and going soft.” So it proved in DSA. By 2019 or so, “Harringtonite” was becoming a term of abuse within the organization (although no one I knew in DSA ever labeled themselves as such — cults of personality were never a particular feature of democratic socialism). 

    DSA’s years of maximum growth were also, significantly, years of despair rather than hope on the left, coinciding with Donald Trump’s election in 2016 and the subsequent four years of his dystopian presidency. Those four years climaxed with the killing of George Floyd in 2020. To many of the young people joining the Black Lives Matter demonstrations that ensued, their nation seemed stained with the sin of absolute and irredeemable racism, not unlike the way it seemed to abolitionists in the 1850s and New Leftists in the later 1960s. 

    It might be useful here to think back on the events that took place in Atlantic City in August 1964, when the Mississippi Freedom Democratic Party rejected the compromise offered them by prominent liberals, and chose instead to be ruled entirely by their principles. They were right to do so. But that moment had other, more protracted, and unintended consequences, especially for young radicals, as Bob Moses’ biographer Eric Burner argued in his perceptive account, And Gently He Shall Lead Them: Robert Parris Moses and Civil Rights in Mississippi: “In giving compromise a bad name the maneuvering of the liberals at Atlantic City . . . contributed to a mentality, increasingly aggressive in the years that followed, that purity is to be measured by how many people the pure refuse to cooperate with . . . It was purity of a sort that has since come to define the later sixties, at once fortifying and destructive.” 

    At once fortifying and destructive. What is fortifying for the already persuaded does not necessarily build bridges to those who still need persuading — quite the opposite. That tendency for “purity . . . to be measured by how many people the pure refuse to cooperate with” can be considered historically the Achilles’ heel of the antinomian tradition of American radicalism. All too often in recent decades, those on the American Left — and this is especially true on campuses, among faculty as well as students — seem to be engaged in a competition among themselves to demonstrate that they have individually arrived at a state of political grace, as measured by the ability to deploy ever more esoteric language and the embrace of ever more marginal niche issues. Failure to attract supporters outside the core group of believers can then be attributed to backsliding within the congregation, a morally satisfying but politically self-defeating habit. And seen in this light, Bell’s point about the political problems of “living in but not of the world” is not without insight. What worked in Atlantic City in 1964 decidedly did not work in the streets of Chicago in 1968, or in the Weather Underground in the years that followed. Or in too many left-wing circles today.

    The other factor complicating DSA’s future was that not all of the organization’s recruits were twenty-somethings new to the organized left. Starting in 2016, hundreds of veterans of left-wing groups that had nothing in common with democratic socialism joined DSA, with the intention of turning this now sizeable but somewhat inchoate group into something very much at odds with its founding vision. These included Trotskyists, Maoists, and others who traced their political inspiration and organizational lineage to the Leninist vision of (and here forgive the parade of venerable and stale clichés) creating a “party of a new type” of “professional revolutionaries” who would devote “the whole of their lives” to the cause. This is a vision of revolutionary change whose highest virtues are discipline and hierarchy, neither of which was much in evidence or much valued in DSA before this.

    This reverence for authority and structure, for all its ideological variations, at its core differs very little from the model promoted by Joseph Stalin in an ideological primer that he composed in 1924 entitled The Foundations of Leninism, in which he asserted that “the working class without a revolutionary party is an army without a General Staff.” An essentially militarist mentality unites Trotskyist, Maoist, and other Leninist cults operating in the United States, which can be seen in their fondness for terms like “cadre” to describe their hard-core supporters. Accordingly, post-2016, various would-be General Staffs set to work to recruit that wave of younger members pouring into DSA, and in doing so convert the organization into the vanguard party of the American proletariat that they had never succeeded in creating when they were operating under their own banners of deepest red. DSA’s dwindling core of aging veterans who adhered to the traditional “left wing of the possible” vision, and who did not think of or conduct themselves as a “cadre,” were not up to the ensuing struggle for control. Once again, as in so many factional wars on the left in the past, power became the real prize. In August 2023, at the organization’s biennial national convention, a coalition of self-described communist factions effectively gained control of DSA’s ruling sixteen-member National Political Committee (NPC).

    Here I will allow the new leaders of DSA’s NPC to introduce themselves. One of the dominant factions, the Red Star Caucus (with three members elected to the NPC, including one who was appointed co-chair of the national DSA), ran an article in its August 2024 online newsletter entitled “Communists Belong in DSA,” announcing that as “a Marxist-Leninist DSA caucus” it was calling “on all communists in the United States to join us” in the struggle to capture DSA and “move toward a revolutionary horizon.” Allied (as well as competing) with the Red Star Caucus is the Marxist Unity Group (MUG), which describes itself as “particularly inspired . . . by those that kept [the movement’s] revolutionary spirit alive in the face of political capitulation: Lenin and the Bolsheviks.” MUG’s vision of the transition to socialism in America is one in which “our class” (that is, the working class, in whose name it presumes to speak — another trope of radical history) will take advantage of a future crisis of capitalism

    to topple the old order and convene a revolutionary Popular Assembly . . . Under the democratic leadership of a victorious socialist party [led by the Marxist Unity Group or its political heirs], the Popular Assembly will proceed to construct the socialist order. It will dismantle the slaveholder constitution and write the founding documents of the new republic. . . .  All parties that accept the laws of the new revolutionary order will be free to operate. 

    Tough luck for those Americans who will not be on board with discarding the old (and certainly flawed!) U.S. Constitution for the one imposed by “the new revolutionary order.” Embracing this vision of an extra-constitutional and almost certainly violent road to power at home, the communist majority on the NPC scrupulously avoids any criticism of anti-democratic actors abroad, at least those that are anti-American. As the Red Star caucus explained in its “points of unity”: “We see no benefit in levying public criticism of states or movements that are opposed to US empire, as such critique in effect serves no purpose except to create consent for empire.”

    Thus no criticism can be heard in DSA’s ruling circle of Vladimir Putin’s Russia, Xi Jinping’s China, Kim Jong Un’s North Korea, Nicolás Maduro’s Venezuela — or Hamas. (It was DSA’s response to Hamas’ October 7, 2023 pogrom in Israel, uncritically celebrating it as an act of legitimate “resistance” to Zionist “settler colonialism,” that led to my own resignation from DSA two days later; I believe in Israel’s right to exist and defend itself, but condemn the humanitarian catastrophe that Israel’s subsequent military response has unleashed on Gaza). 

    Daniel Bell had experienced the likes of the Marxist Unity Group and the Red Star Caucus many decades before these latecomers to the perennial authoritarian strain of ultra-leftism came into existence. In Marxian Socialism in America, he described the equivalent kindergarten Leninist fantasies entertained by left-sectarian groups of his own era as consisting of “the illusions of settling the fate of history, the mimetic combat on the plains of destiny, and the vicarious sense of power in demolishing [other left-wing] opponents.” Since Bell’s day, however, the advent of social media has fostered and simplified the process through which the sectarian left can with a few keystrokes go about settling the fate of history on the plains of destiny. Consider the explanation for Kamala Harris’ decision to choose Tim Walz rather than Josh Shapiro as the vice-presidential candidate offered by DSA’s NPC in a statement posted to X on August 6, 2024:

    Harris choosing Walz as a running mate has shown the world that DSA and our allies on the left are a force that cannot be ignored. Through collective action, DSA and the US left more broadly have made it clear that change is needed. The Uncommitted movement, in which DSA members played crucial roles nationally and in multiple states, pressured the Democratic establishment into choosing a new candidate and backing down from a potential VP [Shapiro] with direct ties to the IDF and who would have ferociously supported the ongoing genocide in Palestine. 

    Most political activists engaged in real-world Democratic Party politics would find the claim that DSA somehow determined Harris’ choice of running mate an absurd example of childish boasting. Fox News, however, which has different priorities, really liked that tweet, amplifying DSA’s claims to exercising such powerful influence over Democratic campaign strategy on “Fox and Friends,” with a chyron on the screen reading “Tim Walz: A Radical, Far Left Ideologue,” and adding on its website, “Walz is not a member of the DSA, but has made favorable comments regarding socialism.” 

    As for the left sectarians’ vicarious sense of power in demolishing opponents within their own camp, consider how the Marxist Unity Group, in a posting to “X” on July 21, 2024, described the two individuals who more than anyone else were responsible for DSA’s growth between 2016 and 2020:

    AOC and Bernie Sanders have abandoned the working class to exploitation and the Palestinian people to genocide. Rather than lead the working class in the battle for democracy, they tried to tail Biden to win bandages for a disintegrating capitalism.

    Once again, in the time-honored radical tradition, the sectarian impulse is to eat their own. 

    While the Republic may well be facing its moment of maximum danger since the election of 1860 and its aftermath (see the events of January 6, 2021 and November 5, 2024), the danger does not come from this quarter. DSA’s new amateurish proprietors have already managed to reduce the organization’s membership from its peak of ninety thousand to seventy thousand. Or perhaps lower: it has been some time since they have offered any authoritative figures.

    That is the bad news, I mean for them. The good news, such as it is, is that the views of the NPC sectarian majority likely do not represent those of the actual majority of rank-and-file DSAers. Bernie Sanders and Alexandria Ocasio-Cortez, safe to say, are not seen as traitors to the cause by most democratic socialists. The communist caucuses that gained control of the NPC probably count no more than a few thousand members, if that, between them. Most of the remaining dues-paying DSAers are not active participants in any of the rival caucuses or in their local chapters.

    There still remain sane caucuses in DSA that plan on challenging the control of the ultras at the national convention in 2025, and I wish them well, but without much expectation that they will succeed. One thing left sectarians are good at is stacking meetings and manipulating votes. And even if the silent majority of DSAers, let’s call them the “democratic socialist caucus,” do regain power, the damage has been done. Thanks to the Red Star Caucus, and others who have succumbed to the Leninist temptation, DSA has become toxic in the eyes of many on the democratic left, for its juvenile ideological posturing and for lending ammunition to the Murdoch media empire. That legacy of would-be Bolsheviks romping across plains of destiny will prove hard to erase.

    David A. Bell, a historian at Princeton University and a contributor to these pages, and the son of Daniel Bell, penned a fine article for Dissent magazine a few years ago marking the centennial of his father’s birth. He noted that, despite his own repeated urgings, he could never persuade his father, who certainly did not suffer writer’s block, to write his memoirs. And he ends with a telling anecdote about his father’s relationship with my generation of young radicals in the late 1960s. His father, he tells us, “worried about the student movement, feared its wildness, looked askance at the hedonism associated with it, but still could not help sympathizing with its political radicalism.” The senior Bell was then teaching at Columbia University and was one of a number of faculty who tried to negotiate an agreement between the students occupying buildings during the Columbia strike of April 1968 and the administration. “But on the night of April 29,” the junior Bell recalled:

    negotiations broke down, and the police moved in with nightsticks and tear gas. Many of the students were badly beaten, and hundreds were arrested. I remember waking up early on the morning of the 30th — I was six years old at the time — and finding my father, fully dressed, on the couch. He had been up all night and he was weeping uncontrollably.

    For whom did Bell weep during that long and long-ago night? The students, beaten and arrested? Possibly. Columbia’s damaged reputation and future as a learning community? Also possible. His son suggests an additional possibility — that he wept out of frustration and confusion about how to understand the terrible and stark choices that he, his colleagues, and his students were forced to confront in the tragic spring of 1968: “He could never quite reconcile the Jewish conservative and the Yiddish radical within him — never quite decide from what perspective to judge and interpret the times he had lived through.” The tension between the two perspectives, one formed in his youth, the other in adulthood, had a positive side, the son wrote, for it kept the father politically “sensitive to the dangers of extremism, but also to the dangers of injustice.” 

    I did not know Daniel Bell personally, and I cannot say whether that is the case or not. But if it were true, it would make me think back on him with considerable sympathy. Remaining alert both to “the dangers of extremism” and to “the dangers of injustice” is a tough balance to maintain. After a lifetime of engagement, sometimes hopeful, sometimes despairing, with the American left, I can but aspire, imperfectly, to achieve the same. Can a disabused idealist still be an idealist? How zealously should one oppose zealotry? From one old radical to another, rest in peace, Daniel Bell.

    The Master of Attention

    It would be silly to call William Wyler underrated — he was one of the most acclaimed and commercially successful movie directors in American history. A staple in every American film canon, he was my favorite director long before I knew his name. Growing up I watched Dodsworth, The Little Foxes, The Heiress, and Jezebel dozens of times without noticing that four of my favorite movies were directed by the same person. His legacy is really remarkable: the same person who directed these close and complex dramas, and who was Lillian Hellman’s favorite collaborator, also made movies such as Funny Girl, Ben-Hur, and Roman Holiday. The man whom we have to thank for the stardoms of Audrey Hepburn and Barbra Streisand also taught Bette Davis how to act and Laurence Olivier how not to overact. And yet he does not inspire the kind of cultish attachment that other directors do. Perhaps that is because his style is almost imperceptibly subtle. It is not clear whether there is such a thing as a “Wyler touch.” We love Wyler movies, but we don’t love Wyler.

    Wyler famously quipped that although he was not an auteur, he was one of the only American directors who could pronounce the word correctly. Critics said that Wyler had an “invisible style.” André Bazin, in Cahiers du Cinéma, called him the “Jansenist of mise-en-scène,” meaning that Wyler was controlled and even self-denying in his visual style, in contrast to the personal and easily identifiable signatures of John Ford, Fritz Lang, and Alfred Hitchcock. Yet there is nothing austere about Wyler. He was no shrinking violet as a director, sacrificing his own artistic individuality for the good of the picture. No, the Wylerian style is not a particular visual brand. It is a special kind of attention. As he himself explained, “I have never been as interested in the externals of presenting a scene as I have been in the inner workings of the people the scene is about.”

    Wyler films are not iconic. They do not dazzle us with spectacular images that stand on their own, that can be borrowed and parodied. Greta Gerwig’s Barbie can “quote” Kubrick, aping the imagery from 2001: A Space Odyssey and Dr. Strangelove for a gag — images so powerful that they transcend tone, context, and even the stupidity of a movie like Barbie. Wyler, by contrast, cannot be quoted. His genius is quiet, specific, unspectacular, and intimate. If Kubrick’s movies are operatic, Wyler’s are novelistic. Films like Kubrick’s are spectacles suspended in time. The composer of an opera manipulates the audience by controlling the music. Viewers are transported into the pace and mood dictated by the composer; they are held steady, made to wait, and then finally allowed the relief of climax: the aria. Just as an aria can be excerpted from an opera and still retain its power, the images from a Kubrick movie can be stolen, mimicked, and interpreted while still recalling the original. Not so Wyler’s.

    Wyler’s films have the bounded and internally elaborated intimacy of a novel. Characters are shaped supremely in relation to one another. Wyler’s delicate details — slight movements of flesh, diffusions of light, fluid camera motions — reveal the inner worlds of each character. His greatest stylistic flourishes work on the viewer almost imperceptibly. They are meant to be discovered. Picking one up feels like eyeing a stranger on the street — say, watching as they grapple with some inconvenience. You recognize a slight shadow of annoyance cross their face, and for an instant you feel that you know exactly what they are thinking. The ephemeral intuition of other minds is so powerful precisely because it is so gentle and because it reveals our collective interest and investment in each other. Wyler is always counting on our capacity for close attention.

    Put simply, the “Wyler touch” is a prodigious gift for people, for understanding and conveying on film the truthful appearance of inner experience. He makes real the distances between people, the way they bounce off each other and retreat into themselves, the way they work at themselves and forge each other. In this regard he is one of cinema’s supreme humanists.

    Wyler does not fit into any of the familiar categories of American filmmakers, or rather he seemed to have a foot in all of them. He was a European Jew who came to America as a young man, like Ernst Lubitsch, Billy Wilder, and Michael Curtiz. Unlike them, he never worked in European film or theater; he got his start in Hollywood. He was the son of middle-class Jews, born and raised in Alsace-Lorraine, speaking both German and French (but not Yiddish). As a child during World War I, he remembered his family spending nights in the cellar — huddled with their Protestant and Catholic neighbors — waiting for the battle outside to end so they could find out whether the town was French or German that day. When he came to America, he did not come fleeing war or antisemitism. He sailed first-class in 1920, with his violin and his skis. His uncle Carl Laemmle had come over in steerage in 1884 and gone on to found
Universal Pictures. Laemmle offered the eighteen-year-old Wyler a job in the mailroom in New York. 

    Wyler’s start in Hollywood looked more like John Ford’s than his German Jewish contemporaries’. Both he and Ford started out making two-reel quickie Westerns for Laemmle, churning out simple stories starring any guy who could ride a horse, making as many films as possible, and hoping to slip in something creative somewhere to get noticed. Directors were paid less than cameramen, but they were free to change the scripts and try new techniques without asking permission. Like Ford’s, Wyler’s quickies received positive attention, and he was promoted to making full-length features in 1926, mostly more Westerns. His first talkie was the first all-sound outdoor film Universal made, a Western called Hell’s Heroes, shot on location in the Mojave Desert and Death Valley. Wyler insisted on the location for its flat landscape and cloudless sky. The film was Wyler’s first major success, and it was lauded, at home and in Europe, for its bleak realism and for the innovative directorial choices Wyler made.

    In 1933, Wyler got the opportunity to direct a prestige picture, an adaptation of an important and successful play written by Elmer Rice, who also wrote the screenplay for the film. The protagonist of Counsellor at Law was a well-respected lawyer named George Simon, a Jew from the Lower East Side who first crossed the Atlantic in steerage and then made himself a prominent member of New York society. Naturally, none of the Jewish actors whom Laemmle and Wyler wanted for the film would agree to play the part, for fear of being ethnically typecast, so, less naturally, in an early example of non-traditional casting, the Jewish immigrant was played by, of all people, John Barrymore. Barrymore was glad to have Wyler as a director: “Because you’re Jewish,” he told him, “you’ll be able to help me a great deal with the character.” Barrymore was determined to incorporate “Jewish gestures” into his performance, to the point that Wyler had to assure him that there is no Jewish way to pick up the phone.

    However crude Barrymore’s understanding of his character was, Wyler knew how to build George Simon with his camera, using the interplay of fluid, constant motion with sudden, suffocating zooms to lay bare the paradoxes in the man’s life. As Simon hums along, Wyler’s camera sweeps around his office, following his characters as they rush through a dizzying labyrinth of side rooms and halls, waiting rooms and libraries. Clients and colleagues step in and out of Simon’s office, and as his attention settles on each, Wyler’s camera slowly rests on them, before Simon’s mind hurries along and they are sent bustling out into the hall. As long as Simon can keep moving, as long as his energy fills the succession of rooms that make up his world, he is content. 

    This office, and the lateral motion of the camera as it roves its corridors, embodies life as Simon has constructed it for himself. But these jumbled forms distract from the agonizing contradiction which he endeavors to ignore. His life is divided against itself: he is a man of principles but he is striving to strike it rich; he adores his wife and his mother, but his wife is too snobbish to make civil conversation with his mother; he wants to help the kids from his old neighborhood, but this work jeopardizes his career, and they are not always grateful enough to make it worth it; he is surrounded by people who are fanatically loyal to him, but he is also out on a limb all on his own, risen above his station.

    As the fates close in, Simon’s world becomes suddenly airless, and collapses in on itself. The camera stops roving and instead it rushes at Simon from below, isolating him in the frame. The claustrophobia of these shots forces us inside Simon, deprived of all the trappings of the bustling life that ordinarily keep him safe. Through movement, Wyler creates an effect reminiscent of the moment in Anna Karenina when Anna’s husband realizes that his wife may be in love with another man: “Now he experienced a feeling akin to that of a man who, while calmly crossing a precipice by a bridge, should suddenly discover that the bridge is broken, and that there is a chasm below. That chasm was life itself, the bridge that artificial life in which Alexey Alexandrovitch had lived.”

    As this torment becomes intolerable for Simon, we start to notice that there is one feature of the world that relieves him of isolation. The camera zooms in on the back of his head as he sits framed in a window, looking out at the city. We have heard, a few times, that his plucky receptionist is feeling unwell because she saw a man jump from a building this morning. It slowly occurs to us that a dark impulse has drawn Simon to the open glass. At the climax of the film, Simon looks out the window, and for the first time in the film the camera leaves the office and we look in at Simon from the outside. We hear the sounds of the city, watching the man inside sit alone in the dark. Slowly he starts to move toward us, toward the window, and in turn the camera accelerates toward him from below, exaggerating grotesque shadows in his face. In one swift motion he opens the window and starts to climb out.

    Though Counsellor at Law is based on Elmer Rice’s play, it does not feel like a filmed stage play, even if that shot into the window is the only time we leave the office. Wyler refuses to open it up, taking the audience inside Simon’s world by trapping us in the office, heightening the mania and the claustrophobia. The film is a masterclass in thoughtful adaptation, a Wyler specialty. Authors and playwrights knew they could trust him with their work. Lillian Hellman, his lifelong friend and collaborator, remarked that “it was Wyler who taught me about movies.” Their first collaboration was his adaptation of her play The Children’s Hour in 1936. This was a play that should have been unadaptable, because censorship under the Hays Code made it impossible to discuss, allude to, or otherwise hint at its subject: lesbianism. But Wyler understood that the core of the story was the horror of being at the mercy of a careless lie that hits dangerously close to an unspeakable truth.

    Wyler’s straight interpretation of the story, These Three, was his first of seven collaborations with the great cinematographer Gregg Toland. In 1936, Toland had not yet perfected the deep focus technique for which he would become famous, but he and Wyler used blocking, the arrangement of actors in the frame, to allow the viewer to watch the unfolding drama of people watching each other. In the heterosexual version of the story, we watch a lifelong friendship fall apart when one woman is falsely accused of having an affair with the other’s fiancé. As much as the three victims of the lie declare that they are determined to stand together, Wyler’s blocking shows us how isolated Martha — the alleged guilty party — truly is, by how she stands and moves, how she peers at her friends and how the world glares at her. When the three confront Mrs. Tilford, the woman who has spread the rumor of Martha’s affair, Martha is positioned in the center and alone, pacing and turning in between two poles: Mrs. Tilford in the foreground at the far left side of the frame and Joe and Karen, the engaged couple, together on the far right. At times Martha stands in the background, wobbling slightly, at times she lurches forward as if about to faint. The physical separation between Martha and the others is so stark that when Mrs. Tilford addresses the two women separately, we know who she is speaking to in the close-up shots of her face because we can follow her eyes. She is firm, looking directly ahead when she speaks to Karen and saving contemptuous sidelong glances for Martha.

    Martha herself cannot look at her friends straight on. Wyler trains his camera on her as she searches silently, furtively, for signs of a coming betrayal. Before they leave Mrs. Tilford’s house, Joe makes one final appeal. He stands positioned in the foreground with Mrs. Tilford, while Karen and Martha stand in the background, out of focus but clearly visible standing by the door. Karen turns her whole body to watch directly, proud and fortified by Joe’s loyalty. Martha, however, turns her head slowly, with a hint of shame, her body still angled toward the door as she watches. We know the lie has a kernel of truth: she is in love with Joe, and so his heroism is bittersweet, corrupted by her guilty desire. Even in the blurry background, it is painful to watch her twist and turn out on her own, believing herself undeserving of loyalty.

    Each of Wyler’s adaptations has this inventive quality, giving life to the essence of the story through the creation of a cinematic language. In The Little Foxes, Wyler translates the world of Hellman’s celebrated stage play into a series of intricately sculpted compositions. In one scene, a father and son test the depths of each other’s depravity, facing away from each other but watching each other carefully, each in their own shaving mirror. To allow the audience to clearly see both men’s faces through this maze of mirrors was a technical achievement that drew bewildered admiration from no less an artist than Sergei Eisenstein. In The Letter, a genuinely extraordinary film from 1940, Wyler builds the strictly segregated but intimately intertwined worlds of British Malaya entirely out of light and shadow. The cutting, brilliant moon burns through the blinds of the carefully arranged, electrically lit colonial society, calling Bette Davis into its penetrating, almost transfiguring light. Dodsworth, in 1936, is a delicately observed portrait of an American marriage on the rocks, a study of the transformation of a middle-aged man (Walter Huston) in thoughtful pursuit of his own happiness, which deserves the compliment that Virginia Woolf paid to Middlemarch: It is one of the few movies about marriage for grown-ups.

    But perhaps Wyler’s most stylistically daring adaptation is his Wuthering Heights, in 1939, which focuses more on the characters’ emotional states than the actual plot of the story. He almost entirely ignores the wild natural setting of the moors in which Emily Brontë’s story is set, and places most of the action indoors. But despite this choice — the screenplay was by Ben Hecht and Charles MacArthur — he uncannily recreates the dense and savage feeling of the novel, the gothic idealization of the dark and romantic and the contempt for the false brightness of ordinary life.

    The camera drifts carefully through the interiors in which Cathy and Heathcliff spend their days: the bright, opulent halls of the society world of the Grange and the chiaroscuro gloom of Wuthering Heights. But it is not that the Grange is light and Wuthering Heights is dark. Things are not so simple. There is light in Wuthering Heights — light that obscures, that blinds. This is not the illuminating light of the sun, nor the soft light of lamplight, but the flashing, obliterating light of lightning. There, fire makes the darkness blacker. 

    Cathy says, “Whatever our souls are made of, his and mine are the same.” The strange material is lit with an obfuscating light. Wyler allows us to discover this secret vibrancy as the camera presses into Cathy’s and Heathcliff’s faces, or it glints and pools on their lips, their eyes, their teeth. Both Cathy and Heathcliff are often lit from behind, their profile framed, their features concealed. When Cathy declares, “I am Heathcliff,” lightning strikes behind her, engulfing her, so that for an instant she can barely be seen.

    Carefully and dramatically stylized as Wyler’s Wuthering Heights is, what it delivers is not the director’s artistic signature but his subject. Wyler crafts a visual vocabulary for the world-annihilating passion between two people. When Cathy and Heathcliff think of each other, their faces don’t glow, they shine, as if feverish. Both can move through the ordinary world and appear normal, even beautiful, but in this strange other light they are transformed. When Cathy, played by Merle Oberon, is in the world of the Grange, playing the society lady, her lashes cast soft shadows over dark, intelligent eyes. When Heathcliff calls her out of this comfort, her eyes widen and flash with an unearthly glow. The effect is maniacal. The light is so intense that her eyeballs strike the viewer as horrifically wet and round in their sockets. As Cathy and Heathcliff embrace on her deathbed, one of Cathy’s teeth catches this otherworldly light, and throughout the scene the tooth shines grotesquely in its winking glare. We get to see the animal that Heathcliff loves, sickening and dying inside her expensive cage.

    Laurence Olivier, brought off the stage where he felt he belonged to play Wyler’s Heathcliff, found the director sneering and sarcastic. Wyler, for his part, felt that Olivier was overacting, and after days of frustration brought the issue to a head in front of the whole crew. “Tell me Larry,” he barked, “what dimension do you reckon you got to now?” Olivier shot back, “I suppose this anemic little medium can’t take great acting.” Everyone on set burst out laughing, Wyler especially. Olivier was mortified. Wyler later took Olivier aside to talk earnestly about the potential for great film acting. Later in life Olivier would say, “If any film actor is having trouble with his career, can’t master the medium, and, anyway, wonders whether it is worth it, let him pray to meet a man like Wyler.” 

    Wyler believed that working with actors was one of the most essential jobs of the director, and he could be gentle and encouraging with his actors when he thought it would get results. Audrey Hepburn was a newcomer to acting when she was cast as the lead in Roman Holiday in 1953, and although possessed of a certain kind of natural charisma, she was shy and unsure about how to behave on camera. She was initially terrified of making mistakes, and Wyler set out to calm and secure her with a steady drip of praise. But he had a reputation for harsher treatment. In Jezebel, in 1938, the other — the better — Gone with the Wind, Wyler’s heroine is a vivacious and impudent southern belle named Julie played by a young (that is, exceedingly vivacious and impudent) Bette Davis. Viewers meet Julie arriving late to her own engagement party still dressed in her riding clothes. In her first few seconds on screen, Wyler wanted Julie to hike up the train of her dress with her riding crop and hook it over her shoulder. Easier said than done: it is a nifty bit of choreography and it needed to have the ease and feel of a gesture that she had performed hundreds of times. So, Wyler instructed Davis to repeat the scene over and over again. When an exasperated Davis demanded to know what she was doing wrong, Wyler merely told her, “I’ll know it when I see it.” Years later Davis herself admitted that Wyler’s perfectionism paid off. The gesture is powerfully sexy, tinctured with a masculine vigor and informality and yet elegant and indisputably feminine. In a film that forces viewers to wrestle with this woman on the way to accepting her, this single motion shows us at the start exactly who she is and leaves us as much enthralled to her as every character she bewitches. Julie hooks us with the crop, too.

    There were instances when Wyler used cruder methods to get what he wanted out of an actor. The Heiress was Wyler’s adaptation in 1949 of a play based on Henry James’ novel Washington Square. It stars Olivia de Havilland as Catherine Sloper, the plain single daughter of a wealthy family. Catherine is seduced and then jilted by a handsome but mercenary suitor (a young Montgomery Clift) who had promised to elope with her when he thought her money would be coming with them. He had not been prepared for her to renounce her fortune when her father, one of the cruelest characters in the history of film and played with blood-chilling elegance by Ralph Richardson, threatened to cut her out of his will. She waits by the door for the fiancé who had promised he would come. Hours crawl by.

    The playwrights, Ruth and Augustus Goetz, who had adapted the James novella for the stage, had written a speech for the moment when Catherine realizes in agony that her suitor is not coming. They considered the speech essential to their interpretation of James’ story, which focuses on the tragedy of Catherine being so cruelly treated — by her father, her aunt, and her unscrupulous suitor — as unlovable. (Cruelty is the tale’s true subject.) The play called for her to say: “I used to think my misfortune was that Mother died. But I don’t think that anymore . . . If she had lived, she too could not have loved me.” The speech remained in Wyler’s script, but on the set he decided to cut it out. Instead he instructed De Havilland to silently, laboriously drag up the staircase the suitcase she had packed for her elopement. He made De Havilland repeat the scene again and again, waiting in vain for her frail form to communicate the desperate heartbreak which must have tormented the character. Finally he filled the suitcase with heavy books, such that De Havilland could barely lift it. With unfeigned exhaustion, she lugs the laden suitcase up the stairs, and we realize she is bereft of any hope of comfort or strength or love, that she must carry on living alone in a world that will only ever treat her unkindly. She doesn’t need to tell us; she shows us. We are expected to see it because we are expected to look carefully.

    Wyler’s novelistic films allow us into the internal world of the individual through these delicate moments of discovery. It takes a special talent for collaboration, for other people, to construct a story out of the flesh and movement of others. This was Wyler’s special gift: recognizing, respecting, and at times manipulating the gifts of others. He was very demanding, but his style of work was designed to allow for as much cooperation as possible. He would schedule a day of rehearsals before shooting, which would begin with the cast gathering to read and discuss the scene they would film the next morning. Then he would stage the scene, allowing the actors to play it the way that was most comfortable for them. This also gave the cinematographer an opportunity to observe the staging and think through the photography. Then, before shooting, Wyler would have the cast rehearse the scene again, this time giving the actors suggestions on their performances. Shooting would not begin until every person involved had the opportunity to come to their own understanding of how it should work. Film is always a collaborative art, of course, but Wyler’s craft of collaboration was at another level altogether.

    The most important collaboration of Wyler’s career was his partnership with Gregg Toland. They worked together on seven films, including These Three, Wuthering Heights, and The Little Foxes. Toland, who may be the most famous American cinematographer of all time (particularly for his work with Orson Welles on Citizen Kane), worked with Wyler more than any other director. Some critics have declared that all there is to the “Wyler touch” is the Toland touch. In 1955, in an issue of Cahiers du Cinéma devoted to “The Situation of American Cinema” (dedicated to Orson Welles, “without whom the new American cinema would not be what it is”), the French critics offered a simple explanation for what they saw as Wyler’s artistic decline after 1948: “Gregg Toland était mort.” Short, ruthless, and to the point: no Toland, no worthwhile Wyler. The American critic Andrew Sarris, who lived under the spell of the French critics, wrote: “Subtract Gregg Toland from Welles and you still have a mountain; subtract Toland from Wyler and you have a molehill.” 

    But comparing Citizen Kane with The Best Years of Our Lives, the apotheosis of the partnership between Wyler and Toland, it is clear that Wyler has a different use for Toland’s powers than Welles did. Both films make extensive use of the deep focus shot, Toland’s signature achievement — and a real challenge considering the limitations of contemporary cameras. This technique allows the background and the foreground of a scene to be clear and uniformly visible simultaneously. In Citizen Kane, Welles uses deep focus shots for dramatic irony, the action in the background giving poignant context to the events in the foreground. The shots are highly stylized, giving voice to the director’s God-like view of the characters. The image is an icon of the story. 

    For Wyler, by contrast, deep focus photography had a more practical, and more poignant, purpose. Having more than one area of focus allowed Wyler, as he wrote in 1947, to have “action and reaction in the same shot, without having to cut back and forth from individual cuts of the characters.” Even before Toland was technically able to render both action and reaction in focus, he and Wyler worked together to achieve this effect, particularly in These Three and The Little Foxes. The viewer feels that we are taking in a scene the way that we take in life, shifting our attention from one person to another. There is no irony, there is no dramatic distancing, no iconography in these shots. Wyler was not creating myths, he was seeking the textures of human feeling. He composed the actors in the frame to encourage the viewer to watch the characters react to each other. A man in the background is placing a difficult phone call. A man in the foreground is watching as a friend plays piano, but he keeps glancing back at the man in the background. We know that our man in the foreground is troubled; he cares for the man in the phone booth, and he worries about him, but he feels that the phone call must be made nonetheless. We stumble into this private moment of internal conflict and watch with him as the man hangs up the phone and leaves. 

    The Best Years of Our Lives is Wyler’s masterpiece. In 1946, filmmakers around the world sought to understand what the recent war had made of their societies. What was left after the wreckage? How could they distinguish between what was essentially Italian, Japanese, German, and therefore unchanged, and what had been transformed forever? The Best Years of Our Lives is the American contribution to this exercise in global introspection. Through the stories of three men returning from war to the families they left behind, Wyler reveals a society weighed down with a new awareness of the horror of the world. His characters struggle with disability, posttraumatic stress disorder, reemerging antisemitism, surging reactionary anticommunism, and pervasive fears of nuclear annihilation, as they try to discover who they — and we — are going to be now that we are on the other side of an apocalypse. And he does it by allowing the audience to watch the characters watch each other. The three men keep looking behind the eyes of the people around them for signs that they are not welcome, that they cannot be understood. As we catch characters regard each other, or pointedly avoid each other’s gaze, we feel that we are discovering for ourselves something that is happening inside them, the surging and subsiding feelings they have about each other.

    One of the most moving scenes in The Best Years of Our Lives comes when Al, the bank vice president turned infantryman turned bank vice president again, is invited to give a speech at a banquet, and he is so miserable about his job denying small loans to veterans like himself that he gets outrageously drunk. Wyler shoots his speech in deep focus down the head table. As Al staggers through his speech at the center of the table, we see audience members listening with anger and astonishment in front of us, and his wife Milly (Myrna Loy, exquisite as usual), almost at the end of the table, sitting in pained anxiety, occasionally exchanging nervous looks with Al’s boss. But as Al continues, he becomes more eloquent:

    I wanna tell you that the reason for my success as a sergeant is due primarily to my previous training in the Corn Belt Loan and Trust Company. The knowledge I acquired in the good old bank I applied to my problems in the infantry. For instance, in Okinawa, a major comes to me . . . he says, “Stephenson, you see that hill?” “Yes, sir. I see it.” “All right,” he said. “You and your platoon will attack said hill and take it.”

    So I said to the major, “But, uh, that operation involves considerable risk. We haven’t sufficient collateral.” “I’m aware of that,” said the major, “but the fact remains, that there is the hill and you are the guys who are going to take it.” So I said to him, “I’m sorry, major. No collateral, no hill.” So we didn’t take the hill, and we lost the war. Uh, I think that, uh, little story has considerable significance. But I’ve, uh, forgotten what it is.

    He finishes his speech by professing his belief that the bank will end up granting so many small loans to returning servicemen “that people will think we’re gambling with the depositors’ money.” He concludes: “And we will be. We’ll be gambling on the future of this country.”

    As Al speaks, we watch his boss look up sharply and furrow his brow, and the audience look on in confusion. Milly, however, is transformed, now staring at Al proudly and lovingly. We watch her realize that, unhappy and drunk as he is at this moment, Al is becoming a better man than he was before the war. He had been a selfish man and had not understood sacrifice or fellowship. He had no sense of civic responsibility and he, like his bosses at the bank, was happy to disguise his greed, to himself and the world, by recourse to economic principles and their alleged moral neutrality. Now, through all his pain and confusion, he believes, profoundly and idealistically, that Americans do share a collective destiny, and a collective commitment to the betterment of the world. All this Milly understands, as we can see in her eyes, because she is not so unchanged as she appears, and because she loves him.

    Mrs. Miniver was released in 1942, but Wyler directed it before the United States entered the war. Even Wyler would — and did — acknowledge that the film was propaganda, intended to stir American sympathy for the British under Nazi attack by showing life on the home front as the war becomes increasingly desperate. The film ends with a speech given by the local priest after the town — the fictional village of Belham, near London — has been disfigured by Nazi bombs, when the community comes together to mourn. Wyler himself worked on the speech, which was later translated into French, German, and Italian to be broadcast throughout Europe on the Voice of America, and airdropped in millions of leaflets into German-occupied territories. This time Wyler did want an iconic moment, poignant but clear, to send the audience a forceful message. And that is what he achieved. As characters take a break from burying townspeople whom the audience, over the course of the film, has been taught to love, we are told: “They will inspire us with an unbreakable determination to free ourselves and those who come after us from the tyranny and terror that threaten to strike us down.” When the priest declares that “this is the people’s war, and the people must fight it,” audiences watching today are no less stirred than the many who watched the film when it came out. 

    That tone was profoundly unlike the tone of The Best Years of Our Lives, the keynote of which was not righteousness but goodness. Goodness is not iconic. Goodness cannot be broadcast over radio waves to move men to war. Goodness is even banal. The last scene of Wyler’s great film is a small wedding, in which all these people who were mauled or magnified by history are gathered. Homer, the football star who lost his hands in the Navy — played, unforgettably, not by a professional actor but by a war veteran with prosthetic arms and hands — has finally stopped shutting himself away in shame from the people who love him, and allowed his high school sweetheart Wilma to show him that she is as devoted to him as ever. Fred — a moving and only intermittently tough Dana Andrews — is a heroic fighter pilot who came home to find himself trapped in a bad marriage and a humiliating job, and by the end of the film he is still adrift. The wedding is the first time he has seen Peggy, the girl he really loves, and who truly loves and understands him, since he broke things off with her and was subsequently dumped by his frivolous wife. She is played by Teresa Wright, as uncannily subtle here as she is everywhere. Fred and Peggy stand separated — him closest to the camera in the foreground and her in the background in brilliant white — locked in their own internal struggles on the left side of the frame.

    We, with the rest of the wedding guests, are on edge as the ceremony unfolds, afraid that Homer will not be dexterous enough with his hooks to place the ring on his bride’s finger. The crowd watches as Homer promises to be with Wilma for richer or for poorer, but we start to notice a private moment unfolding. Fred turns slowly to look back at Peggy. Their eyes meet, but they grimly look away. As Wilma returns Homer’s promise, Fred looks back again and we realize Peggy is staring at him with tears in her eyes. Their gaze holds, and we can see clearly in Peggy’s eyes that it is full and confident, echoing the loving look that Wilma is giving Homer on the right side of the frame. As the ceremony ends, and the rest of the wedding party swells to the right to hug Homer and Wilma, Fred and Peggy stand still, in their own world. Fred crosses the room back to Peggy, slowly and deliberately, and they kiss in the background. Even as we share in the relief and joy of the rest of the wedding party, we alone catch Fred and Peggy’s private moment of loving understanding.

    We are utterly in thrall to the powerful emotions of the characters, but we barely notice the technique that is making it possible for us to feel that we have stumbled upon the intimate pains and joys of an entire generation. There is nothing brash and operatic about the blocking; almost nothing is said. The pinnacle of Wyler’s and Toland’s collaboration, the apotheosis of their powers, is this quietly climactic moment, when we can at last watch two good people meet each other’s eyes and see each other.

    Is it ironic or perfectly correct that the great American postwar epic is a simple story, an intimate encounter with ourselves as we find each other — unsure, foolish, hurt — and good? Without affectation or swagger, Wyler builds a new country stumbling into a new world, fat and safe, apparently untouched by man’s greatest atrocities, and yet firm under the weight of a grandiose but clear-eyed sense of responsibility. This delicate idealism would be strangled by spectacular iconography; the brand of the artist would crush it. Here is the gentle profundity of Wyler’s humanist generosity. He gives us to each other to discover, teaching viewers nothing less than how to recognize the human, how to live in community with one another, how to watch, to listen, to pay attention. There is no shorthand for the quality of Wyler’s attention, which is, ultimately, the Wyler Touch. Attention, after all, is not only the condition of cinematic experience, but also the beginning of all our moral and emotional duties.

    Blood Stains

    Horror — like beauty, passion, and all states in extremis — confounds the habits which regulate the human mind. Before articulation — which is to say, before the experience or the witness of horror is transformed from something beyond our ken into a verbal artifact manufactured by reason or insight or prejudice or all of these things — we are alone with something unlike the materials we are able to know. When we are in shock we remain fixed before the horror without the means to investigate it, such that the time that a human mind is forced to spend with inhuman action is discomfitingly distended. A terrible alchemy preserves the horror, which yields a pure, uncomplicated, and correct pain — the pain of considering horror squarely, without evasion or escape. Some acts should not be metabolized and smoothly fitted into ordinary life. But the human mind resists shock; shock must be melted like ice in the sun. What do we do with what we have witnessed? How do we alter it so that it fits inside the boxes which order our imaginations?

    Whoever manages articulation first secures a strange power. The race out of horror into language is a power race. First-speakers have the job of first and forever transforming what could not be absorbed. That lucky winner, whose haste is more often a symptom of unintelligent compulsion than it is of reason and wisdom, sets the conditions for everybody else’s confrontation with the horror. At least, everybody unlucky enough to have to think about the horror at all. Thanks to these “first responders,” whatever the intellectual quality of their response, the rest of us are granted the gift of mediation. And if we later attempt to unknit the mediation and gain a more immediate relation to the horror, we will find it is no simple emancipation. 

    But not all horror is seen, and some horrors are seen more than others, and of those some remain alive in public consciousness for longer stretches. The ones that remain alive the longest are corrupted the most, corrupted beyond recognition. They feel nothing like the precipitating horror which catalyzed the fascination. They have been interpreted and put to use. Once horror is described it becomes what we call it, and slips further and further away from what it is.

    On October 7, 2023, the entire world was enlisted into a new era of discourse about horror, in which atrocities of this order were perpetrated in the Middle East, recorded on cameras, and shared around the globe along with “analysis” that hardly rose above a snarl, every day for over a year. For over a year an enormous subsection of the human population has participated in or at least witnessed and been degraded by this poisoned discourse. After the sun set on the evening of October 7, the locus of the horror migrated westward. Now the images of mutilated bodies depict Palestinians instead of Israelis. And the discourse remains every bit as degrading. 

    Over the past year many Palestinians and Israelis have howled that their people are brutalized and the world is silent. It is a profoundly human cry no less stirring for being patently untrue. Is there any other sliver of land in the world about which the global population is less capable of silence? And while we type and glance and glance away, we gauze the horror in verbal and visual buffering. Thus, for example, when an American reads the phrase “the rapes perpetrated by Hamas on October 7,” she does not mentally resuscitate the horror. Instead, dependably, her mind moves to the arena of political discourse in which horrors of this kind are debased by conversion into talking points and dogmas and slogans. She does not smell smoke and blood or shiver at the recollection of a woman’s splayed naked body and a face burned so badly she can hardly make out its features. Too many headlines, talk shows, late night hosts, speeches, op-ed pieces, podcasts, tweets, and heated or fortifying conversations come between her and the thing itself. She is protected by language and time. Words do the work of balm even when we would prefer that they did not. She can hardly see the scar, and anyway it was always someone else’s.

    But words do not have to serve as a tool for diminished understanding. As certain writers and poets have shown, language is not helpless before its own numbing power: it can preserve the shock. Language — more even than images — can find ways to maintain the aura of awe and fear and disgust which is proper, which is moral. There is a tradition into which the ghastly accounts can be placed which inflicts no disrespect. Human beings have had abundant opportunity to grapple seriously with the ugliest expressions of human brutality. There are examples of how to respond to horror while preserving the dignities that have been debased.

    Every single instance of brutality perpetrated over the last year on that strip of land between river and sea — the strip of land which Jews and Palestinians share — demands attention, analysis, compassion, and respect. The photos of mutilated Palestinian bodies flood our feeds every day as they have for over a year, like the dying leaves which litter the streets in the autumn. Each should be raised and cleansed, the faces should be remembered and the life stories and the names. The violence responsible for their debasement should be studied. We know how to do this. We have done it before.

    But there is so much wreckage, and too little is known about each instance. How can the personal hell of two million Gazans be paid proper respect?

    The rapes perpetrated by Hamas on October 7 are anomalous among the brutalities committed since the start of this bloody nightmare because they have been studied — there has been no silence. There has been erasure, denial, and extenuation, and there has been documentation and the repeated proliferation of evidence. All this went on for months. While other atrocities came and went, the rapes of October 7 retained global attention. And so, in the grisly competition between horrific crimes, these, the rapes, were distinct. This is true despite the fact that they were not the only instances of rape committed in the region over the past year. Reports of Israeli guards raping the Palestinian detainees at the Sde Teiman detention center, for example, rattled Israelis and others last summer. The Intercept, one of the publications at the forefront of the smear campaign against the rape victims of October 7, published extensive evidence documenting the rapes that were committed in Sde Teiman. But after a short while the next horrific crime shifted global attention.

    Nor were the rapes of October 7 the only examples of crimes which people afterwards insisted did not take place. When footage from Gaza of a young boy being burned alive in fires ignited by Israeli strikes spread across social media in October, Israeli propagandists immediately insisted that the videos were fake. Commentators then denounced the deniers, but after just a few days even the image of the teenage boy’s body burning had joined the millions of others we have time to pick up and put down but by which we refuse to be appropriately shocked.

    When so much horror is force-fed to so many people over so short a period of time, all of us ingesting the obscene deluge engage in coldblooded triage. This is not conscious. We hardly know why some horrors haunt and others fade from memory.

    “Men rape in war” is a truism, and the fact that it is a truism is its own trauma, which schools women in their subhuman status. Wartime rape is barbaric, but women have been conditioned to believe that rape in war is no more or less barbaric than all the other horrible things that happen during armed conflict. War is barbaric. Of course men should not rape, but is that injunction any more essential than those against a host of other war crimes, to say nothing of the acts committed during war that are not violations of the laws of war? Is wartime rape worse than an entire family being killed in a rocket attack as happened in Gaza just a few hours before this writing, or worse than the displacement of three million people as has happened in Sudan since 2023, or the slaughter of nearly half a million Syrians? When attempting to explain exactly why rape in war is a phenomenon worthy of special consideration, one is forced to situate the act of rape in the context of numerous other bloodcurdling horrors. And so if society does not affirm that wartime rape is not just an act like any other awful one, that it is cursed with a peculiar repugnance, it might seem natural that it would not be considered especially grave.

    But “not especially grave” is rarely the designation. Wartime rape is either ignored altogether, or treated as profoundly horrific. It is a peculiar category of crime: for most of human history it was dismissed as merely a woman’s issue. Thus, in 1996, when a briefer was presenting a report to the Austrian government which emphasized that widespread rape was occurring at that time in Kosovo, she was told by a senior official in an Austrian humanitarian agency that “it is not a story” because “men are being killed.” Twenty years later, we live in a bizarre era in which the crime is both too horrific to believe and unworthy of serious consideration.

    In April 2022, American readers read and forgot that twenty-four Ukrainian women in Bucha were “systematically raped” by Russian soldiers. The veracity of this account did not become a lightning rod for international debate. Americans believed it. They just didn’t care that much. Ukrainian women were spared the debasement of having reports of their rapes analyzed to pieces and questioned endlessly, but they were dealt another indignity: the world simply moved on. And some of the Americans who insisted that the rapes on October 7 did not happen and that the rapes in Sde Teiman did also casually wondered whether Ukrainians really require more aid than America has already doled out to them, as if the possibility of Russian victory is an eventuality at which leftists in the strongest democracy in the world can merely shrug their shoulders. As a friend of mine recently put it, it’s not only that there are good victims and bad victims — there are also good aggressors and bad ones. Sometimes rape is unconscionable and sometimes it is literally forgettable.

    2023, I mean before October 7, was a record year for violence against Israeli women. Twenty-three women were murdered in Israel that year, a slight increase from 2022. Of the women murdered, nineteen knew their murderer. Ninety-two percent of rape investigations in the country are closed without charges. Seventy-five percent of those charged with sexual crimes are released before the end of their sentence. But rape happens in ordinary life even more than rape happens in war, and those are the rapes we learn to live with. The genocidal rape and genocidal murders that were committed on October 7 are different in kind, not just in degree, from the rapes committed by Israelis against Israelis. The men and women who were savaged by Hamas terrorists on that day were slaughtered and brutalized because of their ethnicity. Rape was used as a tool for crushing Israeli identity.

    As I say and as we know, rape happens in war. It always has. Ancient Greeks considered women the property of their husband or their father, and the pillaging of enemy property was reason enough to go to war. Rape was considered a property crime, one that war rendered permissible because the loser loses ownership claims. (Marital rape, still legal in certain states in this country and regularly practiced in all of them, was an oxymoron.) The ancient Romans, who in this macabre category distinguished themselves, considered rape not only a weapon of war but an expression of victory.

    Humanity did not attempt much progress on this issue in the intervening millennia. In the Second World War rape was used as a weapon by the German army in every country it violated. German soldiers tortured, raped, and forcibly sterilized women in torture-brothels and in concentration camps. The rapes committed by Japanese soldiers in Nanking are infamous, but less well known is the macabre history of the “comfort women.” Between 1932 and 1945, Japan kidnapped thousands of “comfort women,” many of whom were children, from China, Korea, and other occupied countries and sold them into sexual slavery. A survivor recalled that “it was no place for humans . . . there was no rest. They had sex with me every minute.” She was regularly beaten by her captors. American soldiers frequented the brothels where these women were imprisoned. When Russia invaded Germany in 1944, Russian soldiers punished German women with a frenzy of systematic rape which crescendoed when they invaded Berlin in 1945 and raped an estimated two million women.

    The Allies established trials at Nuremberg and Tokyo to prosecute and punish those responsible for perpetrating war crimes, but neither of these tribunals designated rape as a war crime in its charter. No victims of rape were called to testify before either tribunal. At Nuremberg evidence of rape was submitted but not prosecuted, whereas in Tokyo evidence of the Rape of Nanking was given and the defendants were convicted but no mention was made of the comfort women. Perhaps this had something to do with the fact that American authorities allowed the brothels to continue functioning even after the war was over; American soldiers could continue raping the comfort women until General MacArthur ordered the system to be shut down in 1946.

    The International Criminal Tribunal for Rwanda, established in 1994, ruled in its 1998 Akayesu judgment that rape could constitute an act of genocide. Judge Navanethem Pillay, who later became UN High Commissioner for Human Rights, said in a statement after the verdict: “From time immemorial, rape has been regarded as spoils of war. Now it will be considered a war crime. We want to send out a strong message that rape is no longer a trophy of war.” And yet. Even at the time she made her statement, rape was considered a war crime only if and when it was genocidal. The qualification belies her assurances.

    Nowhere in the world do women securely enjoy the status — socially or legally — of full human beings, and so rape during war (like all other kinds of rape) has not been treated as a distinct crime for most of the period during which human beings have been committing it and bearing it. Concerning the rapes that were perpetrated on October 7, a variety of specific questions have been asked and answered and asked again and answered again, and analogous volleys have been performed before about other analogously horrific crimes, and all of them have been met with the same ritual questions. These are some of the questions.

    How do we really know rape was committed? 

    This is a question that victims of rape are always asked, regardless of whether the rape took place during war. And in war, as in life, people respond to allegations of rape as if the crime is so horrific it can hardly be believed. Thus, in the progressive outlets which set out to discredit the claims of the rape of Israeli women by Palestinian men on October 7, sentences such as these appear:

    The cornerstone of that report is Gal and Nagi Abdush, a couple killed on Oct. 7. The Times says Israeli police believe Gal Abdush was raped. But the only evidence given is a “grainy video” of Gal’s burned corpse, “lying on her back, dress torn, legs spread, vagina exposed.” Gal became known as “the woman in the black dress.” . . . 

    PHRI references the video of Gal Abdush as evidence of possible “sexual abuse.” 

    The Times mentioned messages that Gal and Nagi, parents of two children, sent to their family during the attack. After Gal was killed, Nagi sent “a final audio message” to his brother Nissim Abdush at 7:44 a.m., “Take care of the kids. I love you,” right before he was killed. 

    But the Times fails to mention other text and phone messages that make it almost impossible Gal was raped. She messaged at 6:51 a.m. about intense explosions on the border, based on an Instagram comment by Miral Altar, Gal’s sister. 

    Nine minutes later, at 7:00 a.m., Nagi Abdush called his brother Nissim to say Gal was shot and dying. . . . 

    The Times never explains how Gal could be captured, raped, fatally shot, and burned to death in nine minutes while Nagi messaged his family and never mentioned any physical contact with Hamas forces.

    How long, reader, would you guess it takes to shoot and burn a person? How long, for that matter, would it take to rape in war? Surely longer than nine minutes?

    Human beings are not supposed to have to know the answers to these questions. This “investigation,” published in Yes! Magazine, evinces a strange reverence for the status of rape. That is because there is no just reason to rape. This tendentious author can argue that Hamas “freedom fighters” had a right to fatally shoot a young woman, but rape? In war, “freedom fighters” are supposed to explode and maim and kill other human beings, but what sound reason could there be for sexual exploitation? 

    When criminals in international criminal courts are tried for war crimes, the prosecutor is tasked with proving beyond a reasonable doubt that the crime in question was committed. Reasonable doubt. Thus in February 2001, when the Trial Chamber at the International Criminal Tribunal for the Former Yugoslavia sentenced Dragan Kunarac to twenty-eight years’ imprisonment, it was because the chamber had been persuaded beyond reasonable doubt that he had, in July 1992, taken two girls to a house in which several soldiers awaited them. He personally raped one of the girls and aided and abetted the gang rape of the other. In August of the following year he repeated the practice, this time with four girls, one of whom he raped himself. On at least two other occasions he took another girl to an apartment for the same purposes.

    Can you imagine how Yes! Magazine would have analyzed such information? Would its writers have asked how, for example, almost ten years later, such facts could have been persuasively established? Would they have demanded to know how long the girls were imprisoned in the room so they could be satisfied that gang rape had truly taken place? Or would casting doubt of that sort on crimes of this kind seem to them unreasonable?

    Why are they doing it? 

    For those willing to accept that the rapes of October 7 had in fact been committed, another of the sickening questions raised about these particular rapes was whether or not rape was committed enough times for it to be called systematic. The Israeli government did immediately call it that, invoking a term which is an important part of the young international legal tradition on this subject, developed after the Rwandan genocide and the Bosnian wars, to try to force humanity to confront war rape as a distinct crime. In the Bosnian war, rape was used by both Serbs and Croats, though Croats to a lesser degree, as an instrument of war. Serbian soldiers committed mass rapes in Bosnian refugee camps, as those who managed to escape reported. The Bosnian foreign minister called them “rape camps.” Impregnated victims were forced to bear their enemies’ babies and were often ostracized by their own families afterwards, as often happens in such terrible cases.

    In the Bosnian genocide in 1992, more than two hundred thousand women were raped. The rapes served different functions. Some were especially sadistic and were perpetrated as acts of terror and humiliation; some were perpetrated with the aim of forcing pregnancy; some were forms of sexual slavery. In the Rwandan genocide in 1994, roughly a quarter of a million women were raped, after which they were often sexually mutilated, thrown into sex slavery, and forced to birth resultant children. In response to both of these genocides, international tribunals, the Yugoslav Tribunal (ICTY) and the Rwandan Tribunal (ICTR), were established and tasked with determining how to punish war crimes. The ICTY convicted twenty-three men of rape and/or sexual violence, many of whom were convicted in the Foca case, the first international case to prosecute sexual violence exclusively. When the Israeli government said that rape was systematic on October 7, it was invoking the tradition of both of these tribunals and others like them, and one of its intentions was to communicate that Hamas had genocidal intent, even if it lacked genocidal capacity.

    But what does it mean to say that rape was committed with genocidal intent? According to international law, it means that the rape was committed on a large scale, systematically, by a belligerent force, with the intention of decimating the enemy culture in this way. This framing considers the crime as a crime against a people and not a crime against a woman — but of course it is both these things. It is both a crime against a woman and a crime against a people — and the fact that rape in war is overwhelmingly a crime against a woman is the primary reason that, for so long, it was regarded as an insignificant act, unworthy of special attention. “Just a woman’s issue.” Today the opposite is true: the crime is considered primarily a crime against Israelis, which is why leftists who ordinarily abhor the sort of barbaric interrogation of rape victims that they model so enthusiastically in this instance felt entitled to overlook the fact that these Israelis happened to be female victims of male ferocity.

    Who were they harming and what was the nature of the harm?

    If rape is a war crime because it is genocidal, then the crime considers the harm done to the nation, not the harm done to a human being. When the Israeli government immediately insisted that the rapes were systematically committed, it was invoking the vernacular employed to prosecute genocidal rape, which is of course what the rapes had been. They were first and foremost an attempt to crush Israeli identity. And when progressives consider whether or not Israeli women could have been raped by Palestinians, their neat syllogism, which dictates that Israelis are oppressors and Palestinians are victims who either resist or submit, concludes that a victim could not have committed an act of violence that could harm a human being. They insist that all violence on the part of Palestinians against Israelis is justified because it is an act against the state. In Israel there is mandatory conscription. Progressives who insist that the rapes were fabricated often point out the connections that all those making allegations have to the Israeli army, as if connection to the IDF necessarily disqualifies the testimony. So in a country in which military service is legally mandated, who can be believed? All doubt, or a lot of doubt, is reasonable. And at the same time all acts of violence against Israeli citizens are acts of violence against the state. There are no innocent civilians. There are only colonizers.

    The progressive left believes that Hamas is a resistance movement and that October 7 was a victory for resisters opposing Israeli oppression and colonization. The young Palestinian writer Ihab Hassan disagrees. In “Hamas Has Led Us To Slaughter,” an essay published on the website of this journal in October, he writes:

    On October seventh of last year, Hamas perpetrated horrific crimes: crimes against Israeli civilians the likes of which a healthy mind can hardly bear to imagine. In response, Israel has unleashed its own horrific retaliation on Gazan civilians, inflicting horrors that the same healthy mind cannot even begin to imagine. And the horrors multiply by the day and these bloody days have stained an entire calendar year and more. But Israelis were not the only victims of Hamas cruelty that day. Hamas also betrayed their own people. Gazans have been punished for Hamas’ crimes. Hamas knew that this is what would happen but did it anyway and they would do it again, just as Israel would repeat every crime it has committed against Palestinians if given the opportunity.

    In reply to the tweet in which Hassan shared this essay an especially sick individual wrote: “This is like condemning those that led the Warsaw uprising.” A respondent said: “The Warsaw uprising didn’t involve the people from the ghettos committing rape and child murder. You have a warped view of reality.” To which someone new chimed in: “Never happened.”

    This awful thread captures all the relevant reflexes and regurgitations: for those who esteem Hamas terrorists as freedom fighters, Hamas could not have raped because raping women cannot possibly help overthrow an occupation, just as slaughtering children cannot possibly quash a terrorist organization. Defenders of Israel cite the rapes in the same spirit as the defenders of Gaza cite the number of women and children killed in Gaza, both equally and tacitly affirming that the others who are killed could reasonably be considered dangerous and so worthy of death or violation.

    The exchange on Hassan’s tweet occurred over a year after Israel first began to bomb Gaza. At least forty-two thousand people had been killed already. At the time of the exchange there were already no functional hospitals in Gaza, and nowhere in the entire strip could the two million people trapped inside expect to secure safety, food, and water. No matter how horrific the crimes which catalyzed this onslaught, nothing justifies the magnitude of the carnage wrought on the people of Gaza since October 2023. So why insist that the rapes did not occur, as if their occurrence would justify everything else? Both things are true: Hamas terrorists are barbaric and motivated in their crimes by a repugnant ethnic hatred far dearer to them than their own people’s safety and freedom, and what Israel has done and is doing and intends to do to the people of Gaza is unconscionable.

    Hassan concluded his essay with this chilling paragraph:

    There are still outsiders who dare to characterize Hamas’ attack last year as an act of resistance. As a Palestinian, I hold you in contempt. I condemn your narcissistic usurpation of my people’s pain for your own shallow and cruel political posturing. What you call an act of resistance will be remembered forever by the Palestinian people as the day the gates of hell opened on the people of Gaza. Those gates still have not closed.

    For Hassan, who grew up in the West Bank and who has watched the lives of his friends from Gaza sink into utter immiseration, the hell which scorches the entire strip is not abstract. When he thinks about Gaza he can hear the screams. Images of Gaza represent a specific place, and the bodies in the images are mere references to the actually existing human beings of whom the photographs were taken.

    But for Americans like the ones in Hassan’s X thread, the images are an end in themselves. They refer to nothing outside of the digital universe. The photographs of dead children which they see on their social media timelines do not rattle them out of their screens. These bodies are made of pixels, they bleed pixelated blood. There is no original, no flesh, bones, earth, or concrete. It is not a category mistake to treat these bodies as something other than human, to treat them as a factor in a complicated political algorithm. That is literally what they are. (Musk makes sure of that.)

    We learned in primary school that Vietnam catalyzed mass protest in the United States because of the photographs which forced Americans to reckon with the vicious bloodshed. Today photography reproduces images of the destruction at such a high rate that the bloodbath hardly registers. This is not genocide, it is content. The image is the image alone. Rationalizing mass carnage cognitively, owing to distance and habit and speed, is now the human default, and there is no technology which mitigates our automated rationalization which does not also, and more powerfully, accelerate it.

    This is a disgraceful way to live. What can we do? How do we cleanse ourselves of these toxins leaching from every one of our public squares? Our days are laced with a repugnant, militant contempt for other people’s humanity. The flip side of this is the omnipresent and poisonous certainty that the people we love and care for are in a category apart from all other human creatures. This conviction stains us. The blood, the pixelated blood, stains us. Anyone numb to the sanctity of human life is incapable of treating the horrors of war with the requisite shock. And those are the people quickest to speak.

    By what right do we pick up and put down the facts of other peoples’ terrors and humiliations? What narcotic convinced so much of the human population that they have license to speak with certainty about the dismemberments of other peoples’ bodies, states, and souls? Where is our shame? Where is our shock? Shame and shock are not possible in a culture in which the only tools we have for communicating information about horrors committed around the world are tools which necessarily blind us to their horror. 

    Walter Benjamin famously observed that when a work of art is reproduced the quality of the original is always depreciated, and that 

    This holds not only for art work but also, for instance, for a landscape which passes in review before the spectator in a movie. . . . And what is really jeopardized when the historical testimony is affected is the authority of the object.

    One might subsume the eliminated element in the term “aura” and go on to say: that which withers in the age of mechanical reproduction is the aura of the work of art. This is a symptomatic process whose significance points beyond the realm of art. One might generalize by saying: the technique of reproduction detaches the reproduced object from the domain of tradition. By making many reproductions it substitutes a plurality of copies for a unique existence.

    Isn’t this true also of people? Do human beings not have auras? Think of the millions of copies of photographs of dead bodies — dead bodies pulled from rubble, dead bodies hustled onto stretchers, dead bodies lowered from the remains of a bombed out building. What is real? The photographs or the bodies? They are not the same thing. Ceci n’est pas un cadavre. 

    And Benjamin points out that, in the age of mechanical 
reproduction, art that can be reproduced is made to be reproduced — “the work of art reproduced becomes the work of art designed for reproducibility.” And if the “art” in question is the artifacts we make of war as the war is happening? We have done, we are doing every day, every moment that we contribute to or accept the discourse, a terrible thing. We are converting other people’s destruction into content. These are not human beings, they are retweetable, command-C-command-V-able vehicles for likes and replies and reposts. Engagement! 

    Benjamin, later in the same text:

    During long periods of history, the mode of human sense perception changes with humanity’s entire mode of existence. The manner in which human sense perception 
is organized, the medium in which it is accomplished, 
is determined not only by nature but by historical 
circumstances as well.

    And we have done this. We have altered our own minds, stunted them, hardened our hearts, changed their very function. How can a citizen of this digitized century resuscitate a faith in human worth?

    We have to preserve our horror, and not only our horror at the brutalities of war. Like Ihab Hassan, we must scorn all the journalists and reporters and politicians and diplomats and dinner guests and strangers in line at the grocery who, in their callous conversation, evince abject disregard for the human reality of all this content. That callousness spreads and the stain defies all manner of bleach. Beneath or beyond or outside of the rehearsed lines and prepared speeches, we have to cultivate and safeguard a human pulse, a redemptive repulsion. The horrors must provoke horror. Our horror is an antidote. It will not save us from evil — there is no sure escape from it, it is everywhere — but it will keep us from becoming evil ourselves.

    Sylvia Plath Turned 89 Today

    Sunrise lines a cloud: flamingo silk
    in an old fur coat.
    Seagulls catch the wind like scraps of paper. 

    She’s been up for hours, wrapped in her old plaid bathrobe.
    Migraine. Lightning
    flashes behind her eyelids.
    She drinks her coffee and writes: the tattered world

    Later there will be champagne and candlelight
    Phone calls from overseas.
    She’ll watch the full moon rise over the blue deeps
    and write a dreaming Fury.

    Who said to kill yourself for a man
    Is a waste of a good suicide?
    She hasn’t thought of Ted in years.

    Reading Marcus Aurelius in September

    Ripe olives drop from the tree, grapes glow
    green in the sun, bursting, already
    smelling a little of decay.
    A thousand miles away, the emperor slogs
    through mud with his grumbling soldiers.
    His son is worthless, possibly insane.
    At night in his tent he writes
    of odd accidental pleasures:
    bread splitting its crust in the oven —
    why do we love it?
    and urges himself to embrace his fate,
    like a rock in the ocean, waves boiling.
    Nature, he writes, holds us like a breath
    but also: a philosopher died eaten by worms.
    Seen rightly nothing is evil. Or lasts.
    His son watches, pretending to be asleep
    already seeing himself in the arena, slaughtering
    that strange and innocent creature, a giraffe.

    Going Gray

    My witchy hair
    so furious and alive
    stands up and crackles
    like a scratched 78
    Galli-Curci singing from the moon
    Sempre libera!
    It’s an owl’s nest
    twigs and feathers and bones and rain
    a straw broom
    forgotten in a corner
    but still capable
    of spontaneous combustion
    so watch out.

    There’s joy in taking your final shape
    if it’s what you’re meant to be.
    Who needs color anyway
    when the ocean in winter is so voracious
    and so beautiful?

    Ancestors

    Behind us, centuries of child brides
    split open in childbirth
    peasants fleeing bent under sacks of grain
    small boys who hid in the outhouse
    when the soldiers came for the family.

    Every one of us
    a survivor of survivors.
    Ishmael waves
    as he floats by on a coffin.

    Even the one-eyed cat
    slinking off round the corner
    had to have come
    from a lucky line of cats.

    My dear, my dear
    how is it we are here
    drinking green wine
    at the café in the square
    under the autumn leaves?

    The Woke Couch

    We refused most emphatically to turn a patient who puts himself into our hands in search of help into our private property, to decide his fate for him, to force our own ideals upon him, and with the pride of a Creator to form him in our own image and see that it is good.

    Sigmund Freud

    On April 6, 2021, Dr. Aruna Khilanani, a psychoanalyst, addressed a group of mental health experts at the Yale School of Medicine’s Child Study Center. The invited speaker titled her talk “The Psychopathic Problem of the White Mind” and delivered her remarks from New York City via Zoom. As she settled into her presentation, Khilanani told an audience of psychiatrists, psychologists, and social workers about her murderous impulses. “I had fantasies of unloading a revolver into the head of any white person that got in my way, burying their body and wiping my bloody hands as I walked away relatively guiltless with a bounce in my step, like I did the world a fucking favor.” Talking with white people, she said, was a “waste of our breath. We are asking a demented, violent predator who thinks that they are a saint or a superhero to accept responsibility.”

    When the Child Study Center invited Khilanani, they knew what they were getting — and many in the audience welcomed it. One black woman thanked Khilanani for giving “voice to us as people of color and what we go through all the time”; a psychologist deemed the talk “absolutely brilliant”; and one man in the Zoom audience said he felt “very shook in a good way.” These details were gleaned from a leaked audio recording of the talk made public by The Free Press two months later. Days later, The New York Times, The Washington Post, and NBC News reported on Khilanani’s talk, citing as well the statement issued by the Yale School of Medicine, calling the “tone and content” of the presentation “antithetical to the values of the school.” Having myself been a resident and then a faculty member in the Department of Psychiatry at the Yale School of Medicine, I knew that Khilanani’s lecture, its crass unprofessionalism aside, flouted the very purpose of Grand Rounds, which is to impart scholarship, clinical wisdom, and original analysis.

    Around the same time, another New York City psychoanalyst, Donald Moss, came to attention for his article, “On Having Whiteness,” published in the Journal of the American Psychoanalytic Association. Moss, who is white, wrote that whiteness is “a malignant, parasitic-like condition [that] renders its hosts’ appetites voracious, insatiable, and perverse.” These appetites, once established, “are nearly impossible to eliminate . . . there is not yet a permanent cure.” As one disenchanted reader of the paper remarked, “it is unfortunate that psychoanalysts like Donald Moss, who express their views in a more temperate fashion [than Khilanani], still espouse a kind of racial essentialism to explain extremely complex social realities.”

    In the years since Khilanani and Moss held forth, more and more practitioners of psychotherapy — psychoanalysts, psychologists, social workers, and counselors — have become vocal about approaching their work as a primarily political, rather than clinical, undertaking. Indeed, according to the 2023 report of the Holmes Commission on Racial Equality in the American Psychoanalytic Association, both Khilanani and Moss might well be regarded as role models for their boundary-pushing. “To live up to its fuller potential, psychoanalysis must imaginatively, thoughtfully, and self-reflectively move beyond the boundaries set by racism and white supremacy,” said the Commission. (The Holmes report has been criticized for its myriad methodological errors.)

    Social justice and “decolonizing” psychology are the twin missions of the American Psychological Association, the APA. The association has vowed to “work [to] dismantle racism in important systems and sectors of society.” A 2021 APA report on racism within its own ranks confirmed its commitment to “a critical examination of how the discipline structures opportunity in ways that uphold White supremacy.” Cited in the report was the association’s Chief Science Officer, who stated that “until we can embark on scientific practices that are not dominated by White supremacy, we’re only going to be getting part of the truth.” In a piece last year called “Psychologists Must Embrace Decolonial Psychology,” Thema S. Bryant, president of the Association, explained that “decolonial psychology asks us to consider not just the life history of the individual we are working with but also the history of the various collective groups they are a part of, whether that is their nationality, ethnicity, gender, sexuality, religion or disability.” 

    The code of ethics of the National Association of Social Workers requires all members “to practice through an anti-racist and anti-oppressive lens.” The Association now stipulates that “antiracism and other facets of diversity, equity and inclusion must be a focal point for everyone within social work,” and has expressed its commitment to “confronting and working to change policies, practices, and procedures that create inequities amongst racial groups, understanding these systems of oppression are based in and uphold white supremacy.” In 2015, the American Counseling Association (ACA), representing over sixty thousand professional counselors, published a document called Multicultural and Social Justice Counseling Competencies, which divided counselors and clients into “privileged” and “marginalized” groups and encouraged them to “possess an understanding of their social identities, social group statuses, power, privilege, oppression, strengths, limitations, assumptions, attitudes, values, beliefs, and biases.” It identified “social justice” as “one of the core professional values of the counseling profession.” My own professional organization, the American Psychiatric Association, issued a report in 2021 that called for a four-year curriculum to teach trainees “skills [to] address racism in the clinical setting and in-patient care.”

    Whether the social justice imperative will eventually dominate psychotherapy remains to be seen, but clearly it is already tainting the practice. These national organizations mandate the standards for training program accreditation, and the programs, in turn, dictate required curriculum for their students. Accordingly, faculty in psychology, social work, and counseling programs are populating their curricula and workshops with the popular rhetoric of progressive movements. 

    At the same time, trends in program composition are setting the stage. In psychoanalytic training, for example, fewer psychiatrists are entering as increasing numbers of applicants from the humanities and other professions are arriving with certain progressive proclivities. And with many new and radicalized graduates joining the professions at the same time that seasoned and senior clinicians, the sole bulwark against eroding professional and clinical standards, are silencing themselves or choosing to retire early, I foresee a troubled future for the talking therapies.

    When I was a resident in psychiatry, learning to become a skilled therapist was an all-consuming ambition. Take it from me, the process is harder than it looks. My colleagues and I met weekly with experienced faculty to discuss our cases. We learned to keep our private passions, neuroses, and blind spots from distorting the work. We were in therapy ourselves to better understand those enthusiasms, flaws, and biases. We were vigilant about countertransference, Freud’s term for psychiatrists’ own emotional reaction to a patient, which could cloud our clinical judgment; and even our professors, we were relieved to learn, hired trusted supervisors to help them manage their own countertransference.

    Essential to our work with patients was the development and maintenance of the “therapeutic alliance,” a core bond of trust nurtured through a non-judgmental, empathic approach, mindful about not imposing our own values on the patient. We were taught, as well, to reach an agreement at the outset of therapy about treatment goals and about the way therapy is supposed to work. Freud called it the “analytic pact.” Volumes of data confirm that the rapport between patient and therapist is a reliably strong predictor of positive results.

    Enter Critical Social Justice–driven therapy (which I will call CSJT). The British therapist Val Thomas first used this term to indicate “a practice that views people not as individual actors but rather as representatives of particular groups which are nested within systems of power and trains therapist-activists to diagnose patients through a collective lens.” Though many years in the making, Thomas says, it seemed to blindside conventional practitioners when it emerged as a finished strategy. “Therapy would no longer be focused on helping individuals,” she writes. “Instead, it would be reframed as a political practice, a means of dismantling systems of power believed to be oppressive.”

    The education of Leslie Elliott shows how CSJT is taught to fledgling counselors. In the winter of 2019, Elliott enrolled as a graduate student in the Mental Health Counseling program at Antioch University. At first, she found it to be a stimulating master’s program — informative and clinically relevant — until she took a required course in multicultural counseling. “We were taught that race should be the dominant lens through which clients were to be understood and therapy conducted,” recalled Elliott, a mother of four who had majored in psychology. Race was to be broached early in therapy, regardless of clients’ stated goals and needs. The point, Elliott explained, was to increase the degree of importance that clients place upon race. 

    Thus, if a client were white, the counselor’s job was to help them see how they unwittingly perpetuate white supremacy. “We were encouraged to regard white clients as reservoirs of racism and oppression,” Elliott told me. If the client were black, Elliott was instructed to ask how it felt to sit with her, a white counselor. If the client felt at ease, “my job was to make him more aware of how being black compounded, or perhaps caused, his problems, regardless of what brought him to therapy.” White women, one professor informed a class, were “basic bitches,” “Beckys,” and “nothing special.”

    Elliott was also struck by the degree to which her program inculcated “selective empathy” in the students. A faculty adviser told Elliott in an unapologetic manner that the program was producing counselors who were not going to be able to work with Trump supporters. (If Trump supporters are so deranged, as a cynical colleague of mine pointed out, don’t they need more mental health care than others?) After the death of George Floyd, Antioch’s three-year program intensified its focus on race and oppression, making clear that counselors were to be foot soldiers in the culture wars. “Incredible as it sounds,” said Elliott, “we were encouraged to see ourselves as activists and remake ourselves as social change agents.”

    How could a therapeutic alliance ever blossom when patients are labeled oppressors by their therapists? They will feel alienated, or at least deeply confused, about the function of therapy. How can therapists ever maintain what the psychologist Carl Rogers called “unconditional positive regard” for afflicted patients who happen to be white, male, religious, gun-owning Trump-voters, whom many young therapists unabashedly say they are averse to treating? (As a cultural matter, it is unlikely that such individuals are stampeding to treatment anyway — but, in fairness, they sometimes do seek therapy, and they should not be ideologically disqualified from it.) In this way, CSJT is the glaring antithesis, the mirror image, of legitimate psychotherapy.

    Where traditional therapy regards each client as a unique individual and works with them in collaboration, CSJT reduces the patient to an avatar of gender, race, or ethnicity. Where responsible therapy helps patients cultivate an aptitude for self-observation and introspection, encouraging them to experiment with new attitudes, perspectives, and actions, CSJT foments grievance and feeble victimhood. Where traditional therapy helps clear a path to autonomy, social justice therapy convinces patients that they have little choice or agency. As for exploring the serious consequences of a patient’s poor choices — a waste of time, after all, as the patient is little but a passive vessel roiled and manipulated by malign external forces.

    The violations of sound practice are self-evident. Under no circumstances should a therapist derive personal or professional gratification from imposing her own worldview on a vulnerable patient and directing them to assume an activist role. Nor should she determine the agenda of the therapy, compel the patient to focus on their ethnic or gender identity, or disclose her own ideological affiliations. I am reminded of Donald Winnicott’s warning to psychotherapists — which sounds almost quaint these days — to avoid becoming enchanted with imposing their interpretations on the patient. He argued that a major part of the analyst’s role is to immerse oneself in the patient’s own particular subjectivity and not to be too quick to offer one’s own views.

    How is it possible that therapists increasingly believe that they are political activists rather than healers? Val Thomas suggests that the answer lies mainly in the deployment of sophisticated rhetorical strategies. Critical Social Justice Therapy, she says, “does not advertise itself as a new modality; if it did then it would be subject to the usual testing of new therapeutic approaches. Instead, activist clinical theorists positioned it as the natural evolution of the field.” This clever move, she continues, puts anyone who criticizes CSJT or asks for evidence of its therapeutic value at risk of shunning, derision as a bigoted reactionary, or reputational damage that could lead to a loss of employment. “Without public debate and critique, therapy could then be subverted and harnessed to a political agenda, as happened in other domains such as education; the label on the therapy tin is retained but the contents are being radically changed,” Thomas explained. As if contemporary psychotherapy were nothing more than a contest between discourses upon which nothing empirical or evidentiary can intrude.

    Let me pause to set out the somewhat confusing professional typologies at play here. The word “therapist” is generic. Anyone who talks to patients or clients with a view toward providing psychological aid is a therapist. The term “psychotherapy” may sound more specific — but, in practice, it, too, is loosely applied. I tend to think of psychotherapists as individuals with formal degrees and professional licenses, but again, no hard rules prevail. Analysts, by contrast, can come from a variety of educational backgrounds, but generally must attend a lengthy program of formal psychoanalytic training at a recognized institute. Until 1992, several years after the American Psychoanalytic Association settled a lawsuit charging the association with violating antitrust law, only medical doctors could train and practice as analysts. As for counselors and social workers, they usually have a master’s degree. Their patients tend to be looking for a therapist who is very engaged and who offers emotional support and practical advice and shores up their coping skills. Sophisticated counselors will also pay attention to the patients’ self-sabotaging routines and self-defeating patterns in relating to others. And lastly, I use “client” and “patient” interchangeably, although, more formally, counselors and some psychologists help “clients,” while psychiatrists treat “patients” and analysts treat “analysands.”

    Established therapeutic approaches fall within three basic schools: psychodynamic therapy (which is aimed at helping patients understand how their past experiences and unconscious processes influence their present behavior and relationships); cognitive-behavioral (treatments that seek to change maladaptive behaviors and dysfunctional beliefs through learning); and humanistic-existential (unstructured exploration of issues such as life, meaning, freedom). By locating a person’s difficulties within the self, these methodologies focus on helping the patient to achieve insight, agency, and accountability. The ultimate purpose is emancipation from constricting beliefs and behaviors.

    Critical social justice therapy, by comparison, identifies external forces as the most determinative or even sole cause of the patient’s problem. Its origins can be traced to two conceptual root systems. The more visible of the two was the postmodern project that flooded academia with the idea that a person’s identity is a near-exclusive product of cultural conditions and social dominance. Little surprise that such a philosophy seeped into psychoanalytic training — a study as likely to be facilitated these days by humanities professors as by psychiatrists — and into university-based graduate programs in clinical psychology. 

    The other root system is a practice called multicultural counseling, which is taught in psychology, social work, and counselor training. The first textbook on the subject, Counseling the Culturally Different, was published in 1981, and was grounded in the idea that conducting therapy with minority populations required a distinct set of competencies. By 1992, the ethics code of the APA held that a psychologist could be sanctioned if he or she did not behave in a manner that could be considered “culturally sensitive.” The APA’s “Guidelines on Multicultural Education, Training, Research and Organizational Change for Psychologists,” from 2002, set a perfectly sensible standard for culturally sensitive practice, stating that “psychologists are urged to gain a better understanding and appreciation of the worldview and perspectives of those racially and ethnically different from themselves.”

    Indeed, there are broad variations in culture, such as individualist versus collectivist values, and variations in levels of acculturation within immigrant groups, as well as family-of-origin differences. Some ethnic and racial groups are more likely to report emotional distress in the form of bodily sensations; sometimes culturally specific metaphors allow therapists to make a point more clearly. Such cultural adaptations have been incorporated with success into well-tested cognitive behavioral therapy strategies. A “culturally competent” practitioner is, in reality, little more than an otherwise competent therapist who has made necessary and thoughtful accommodations to patients with different traditions of disclosure, habit, and help-seeking.

    Less recognized as potential key aspects of identity are sociopolitical values. “This element may form the core of a client’s personality and identity,” I am told by the psychologist Richard E. Redding of Chapman University, one of the first scholars to research political values in psychotherapy. “Because mental health professionals overwhelmingly tilt to the left politically, they should be cognizant of the fact that their politically conservative, libertarian, and centrist clients will not share many of their values.” Redding refers here to the moral intuitions driving attitudes about issues such as abortion, affirmative action, welfare policy, crime-control, immigration, or gender politics. “Clinicians must be sensitive to the impact this may have on the therapeutic alliance and the ways in which this influences their diagnostic and therapeutic choices,” he cautions.

    Attention to myriad aspects of the patient, from ethnicity to sociopolitical values, is part of the routine methodology of conventional treatment. The relative weight, or insignificance, of various dimensions of the patient will announce itself in the course of treatment, a collaborative enterprise informed by liberal values of patient choice, autonomy, and truth-seeking. By stark contrast, CSJT represents an authoritarian regime. Not only is the patient compelled to conform to the unyielding social vision of the therapist, but CSJT also feeds off the misbegotten notion, as my colleague, the psychoanalyst Ira Moses, puts it, “that innate attributes are the core driver of one’s experience of himself and his world.”

    Moses warns, too, about the outsize emphasis placed on the idea of patient–therapist racial/cultural matching, an arrangement presumed by champions of CSJT to facilitate therapy. “No doubt the patient might feel more comfortable starting with a clinician of similar race or culture, but therapists should realize how they place themselves in an untenable position if they believe that they have a special understanding or a unique empathy with patients who share these external similarities.” It is a fallacy, he notes, that our identities give us wisdom: “Therapists who share a similar background or identity as their patient,” he argues, “should be cautious about over-identifying or assuming they have an increased likelihood of understanding the patient.” Moses calls this a “symbiotic fantasy” of understanding each other without communicating. A related paradox of CSJT is captured by the psychologist Craig L. Frisby. “It doesn’t acknowledge universals, because groups are supposedly too distinct from one another,” he says, “and it doesn’t acknowledge individuals’ uniqueness, because only group affiliation matters.”

    To see what CSJT looks like in the real world of the clinic, imagine a depressed white man in his twenties talking to his therapist, a psychologist, about career woes. He has just been turned down for a coveted research fellowship and speculates that he lost out because of affirmative action. The hunch so unnerves the therapist, who is non-white, that he looks for guidance from colleagues during a weekly staff meeting where difficult cases are shared. In the Brooklyn clinic at which this scenario played out in real life, a colleague of mine, another psychologist, was at those meetings. “The group discussed the patient’s comment about affirmative action and the consensus was strong,” recalled my colleague. “They strongly advised the therapist who consulted them to tell the patient that if he didn’t overcome his biases, he would be transferred elsewhere.” The rationale? The group argued that it would be unfair for a clinician of color to be asked to treat a “racist” patient, my colleague explained.

    Andrew Hartz, a psychologist in New York City, recently published an account of his experience in City Journal:

    A few years ago, I provided therapy for a young heterosexual white man . . . he told me that he had experienced pervasive racially charged bullying at both his elementary school and his high school. . . . Much of it was explicitly racial, including comments like “white faggot” and “white bitch.” . . . He said that he had held back from telling me about it in part because he worried that I would frame him as privileged or “just not get it” — reactions he had experienced in the past from his friends.

    The patient had grown so used to keeping this experience buried that he became numb to it. He was, in Hartz’s telling, “[i]n some ways more upset at the current cultural attitudes about race than about the bullying he had endured [and] the inability of the culture to express concern for white people who were attacked.” As therapy went on, Hartz writes, the patient became “more relaxed, more reflective, more open, authentic, and assertive.”

    Consider, also, the case of Paul O. Having read of my interest in the issue of politicized psychotherapy, the fifty-three-year-old from Sturbridge, Massachusetts, emailed me to share his experience. Several years ago he suffered a pulmonary embolism and spent a week in Mass General in Boston. Paul’s health deteriorated, and he had to quit his job. “My physician recommended that I speak with a counselor due to the dramatic changes I was going through. After a few visits with the psychologist, he got to know me and he had the nerve to ask me how I could possibly take public funding since I was a conservative and Republican (I’m actually an independent here in Massachusetts). I was shocked by his lack of empathy. Needless to say, I never went back to him or any other counselor.”

    Some are alienated before ever setting foot in the therapist’s office. I learned about a politically conservative patient who saw a Black Lives Matter poster on the wall of a psychologist’s office and simply turned on his heel and walked away. In another case, a young Christian woman was alienated by the they/them pronouns that her assigned therapist used on her website and stationery. Strictly speaking, of course, the poster and the pronouns say nothing about a therapist’s capacity for empathy for patients who might not share their politics. But in the face of such thoughtlessness, a patient could be forgiven for suspecting this would be the case. Even more repellent, I would surmise, is hearing your therapist refer to women as “AFAB people with vulvas” — assigned female at birth — as one of Elliott’s professors told her class to do.

    In other instances, patients are rejected out of hand by the people assigned to treat them. I spoke to a newly minted psychologist who works in a Veterans Affairs medical center in Florida. His peers, he told me, are not interested in treating combat veterans; “they’d rather deal with ‘racial trauma’ and ‘LGBT issues.’” I have heard of psychiatric residents refusing to treat patients whose politics they dislike, patients who, in the throes of psychosis, uttered a racial slur, and veterans who are too white, straight, and out of touch with the advanced opinions of the day.

    The magnitude of the betrayal inflicted by this new species of therapist cannot be overstated. Imagine yourself arriving at the clinic for your first visit. You are demoralized, in distress, perhaps in crisis. You are summoning the nerve to share with a stranger your most intimate, mortifying, and traumatic experiences. And instead of encountering a wise and empathically attuned presence, you are met with a therapist who seems to think that progressives are the only ones who need psychological safety and understanding. A therapist who forgot that she exists to heal pain, not to propagate doctrine.

    The mental health professions today are home to therapists who are overwhelmingly female, liberal, and politically aware. As self-declared enemies of privilege, they are primed to imbibe the social justice narrative and accept it as the proper objective of therapy. They reflexively impose the narrative on individuals who seek their help and react harshly to those who resist their efforts. The talking professions, I’m afraid, seem to be attracting as trainees people least suited to the job — and making that job inhospitable to would-be therapists who do not wish to be part of a highly politicized profession, one where therapy becomes politics by other means.

    The pipeline to the professions is skewed from the outset. A Yale psychologist colleague told me that he was “struck” by the number of applicants to his program “who were unabashed activists with their minds made up about best practices in psychology.” One of them declared that she had already staked out black feminist theory as her template for practicing therapy. “If what I saw is at all representative of incoming graduate classes, the future of psychology doesn’t look good,” my colleague said. Signing diversity statements and pledges is now part of the application process at many training programs. But perhaps the most potent deterrent is the exposure of poor training and psychological abuse in some programs.

    Which brings us back to Leslie Elliott. To warn would-be graduate students as well as potential clients, she sought to expose what she calls “the ideological capture” of the counseling profession. She began to create YouTube videos and to post Substack commentary in the fall of 2022. She also recounted how the dean of Antioch’s counseling graduate program reacted to these online postings and to her refusal to sign a “civility pledge.” Not only did he urge students and faculty not to watch her videos, he also asked that they reach out to an ad hoc “crisis team” to help them handle their reactions to the “hate speech” — his term — that Elliott, by now labeled a “transphobe” and a “white supremacist,” had disseminated. She decided to leave the program, and has hired a lawyer so that she can complete her master’s degree without signing the pledge should she choose to return. Leslie is now working as a wellness coach in Seattle and continues as an active YouTube presence, spreading the word, in a preternaturally calm and measured style, about the corruption of the counseling profession.

    Thousands of miles away in Knoxville, Suzannah Alexander enrolled in the University of Tennessee’s Clinical Mental Health Counseling Master’s Program in the summer of 2022. For six months, she endured colleagues and professors implying that she should be ashamed because she was white. “Professors taught us,” Alexander relayed, “that if you’re white, you are privileged and you need to ‘do the work,’ but at first it was never clear exactly what the work was or how we were supposed to do it.” Later, it did become clear: doing the work, Suzannah said, “really meant assuming that black or brown clients had more difficult lives due to their skin color, and it must be awful for them to have to be in therapy with a white counselor.” What’s more, she explained, “we learned that it was not okay to ask a marginalized person, meaning someone whose skin was tan to black, not hetero, or disabled, about their experience. Why? Because that put an additional burden on them while they are already working hard to tolerate your whiteness.” The idea of treating individuals without delving into their unique experience makes a mockery of treatment, unless of course the therapist is more concerned about where the patient is located within the hierarchy of privilege relative to the clinician’s position in it.

    In one of Suzannah’s classes, a professor asked the students who they thought their most difficult client would be. “To a person, the class said a bigoted white man was their nightmare client,” she told me. In class she had mentioned that the Buddhist practice of reducing focus on one’s self could make it easier to act on one’s values — a tenet that Suzannah saw as consistent with the goals of secular psychotherapy. After all, suspension of obsessive self-regard is an element in cognitive behavioral therapies, she pointed out, further arguing that it could help therapists foster compassion for even the most challenging client. The professor disagreed.

    After several months in the program, her professors told Alexander that her thinking was, as she puts it, “too concrete.” They also objected to her allusions to Buddhism, calling it “bad thought,” and they resented her refusal to concede that she should be ashamed for being white. “I knew this was abusive,” she later wrote in a wrenching account. “I was determined not to quit until I absolutely had to. But I was discouraged.” Eventually Alexander’s adviser told her that she would not be able to take practicum (hands-on clinical experience), an activity without which she could not graduate and obtain a counseling license. Alexander left the program and is now seeking legal redress for her wasted tuition. “I doubt I’ll ever be a counselor now. I’m not even sure I still want that. More’s the pity, so many have told me I would have been great at it, and I do feel for the many men who find suicide to be their only outlet.”

    In 2020, Lauren Holt enrolled in a mental health counseling program at a Jesuit university in New Orleans where “social justice indoctrination consumed a great deal of the training.” Many of her teachers were chronically unprepared, presented course material that was superficial, failed to grade assignments in a timely fashion, and ignored student emails — derelictions of duty that other students experienced as well. In response to complaints, the program held a mediation session between staff and students. Grievances were aired, though no faculty were in attendance to hear them or to respond to them — apparently, faculty members were to be briefed later on the complaints. Subsequent mediation sessions would be held, but only students who were marginalized (minorities, or gender non-conforming, or disabled) were allowed to participate. Lauren asked: “What about those of us who are not in ‘marginalized groups’? Do our concerns no longer matter? I find that difficult to swallow.” All hell broke loose. “Within minutes, I received a mountain of emails from other students calling me a bigot, a racist, a white bitch, all sorts of heinous things,” Lauren wrote in an article describing her ordeal. Eventually, she was told by a lower-level administrator that the department head had decided she could not return for her second year unless she fulfilled the hours of therapy he requested she attend to manage her, as he put it, “incompetency” as a counselor and her “inability to listen to people.” The head also expected her to sign documentation stating she was, at that juncture, unfit to be a counselor. Lauren was not allowed to state her case or to defend herself.

    Unwilling to be bullied by him, she filed a grievance. Though her complaint was successful and she was technically permitted to resume her coursework, faculty members were icy to her, her new advisor ignored her emails, and so she left school. Now she lives in Asheville, North Carolina, where she runs her own accounting business and teaches English to immigrants. As for the students who called her racist, she says, “they have presumably completed the program and are now collecting their hours towards licensure.” Lauren is seeing a therapist. It took her over two years after leaving the program, she said, “to feel comfortable seeking help from a mental health professional after my experience in counseling school.”

    By no means are all training programs so ideological, but the experiences of Leslie Elliott, Suzannah Alexander, and Lauren Holt are not rare outliers. In the years since Val Thomas, the British therapist, launched Critical Therapy Antidote in 2020, an online community for practitioners and clients dedicated to “protecting the integrity of talking therapies,” she has posted dozens of articles written by trainees who resorted to self-censorship (and near-nervous breakdowns) upon finding themselves the targets of indoctrination by professors, intimidation by faculty, mobbing by fellow students, and retaliation by their schools despite Orwellian reassurances that their programs were “safe spaces.” They also include many testimonies about professors scrimping on the basic facts and models of human psychology in favor of teaching dumbed-down mental health propaganda.

    Many graduates of these debauched programs will go on to occupy slots at public mental health clinics, university mental health clinics, schools, and other institutions. Surely some of them will be well-prepared — not every single school is infested, and even marginal programs still have a remnant of qualified professors — but too many American therapists will base their work with patients on a distorted idea of their roles.

    The practices of CSJT roundly violate the code of ethics adopted by the American Counseling Association in 2014, which states that “counselors are aware of — and avoid imposing — their own values, attitudes, beliefs and behaviors.” Counselors are obliged to respect the diversity of clients, trainees, and research participants — and more, “to seek training in areas in which they are at risk of imposing their values onto clients, especially when the counselor’s values are inconsistent with the client’s goals or are discriminatory in nature.” The American Psychological Association also has a solid code of conduct that counsels psychologists to be “aware of and respect cultural, individual, and role differences, including those based on age, gender, gender identity, race, ethnicity, culture, national origin, religion, sexual orientation, disability, language, and socioeconomic status.” And yet, the major governing entities in the field have turned a blind eye to blatant ethical breaches, because — this conclusion is impossible to avoid — what they believe in most of all is the primacy of the political. 

    What to do? One strategy is to warn prospective students while shaming poorly performing programs. A cohort of dissatisfied (former) trainees and a handful of disillusioned counseling professors are doing this on social media and in online postings. Another option is robust referral networking for people searching for “non-woke” therapists, as the requests are generally phrased. The website of “Conservative Professionals” provides names of conservative therapists because, as it says, “half of Americans have Conservative values, yet the majority of professionals in various occupations are guided by a Liberal mindset.” (Unfortunately, this “matching” service reinforces the notion that only like can counsel like, an ironic and unattractive complication of the effort to make the couch a safe space for conservatives, too.) Another new site is called ethicaltherapy.org, established this year by a former professor of counseling at the University of Vermont to “help new and existing psychotherapy patients find psychotherapists who endeavor to leave ideology out of therapy.”

    Parallel institutions can play a role. In 2021, a professor of clinical mental health counseling at Florida Atlantic University and former president of the American Counseling Association launched the International Association of Psychology and Counseling. Its mission is to “promote critical thinking over indoctrination” and help the mental health field to return to “its roots of liberal education” and to “professionalism where advocacy should be the domain of individual conscience, not one’s professional identity.” Andrew Hartz, mentioned earlier, launched the Open Therapy Institute in 2023 to restore trust in the professions and to “foster open inquiry in mental health care and support those underserved in the face of politicization of the field.” The institute will offer professional development for therapists and promises to provide therapy from professionals who “strive to be open, curious, and empathic,” he told me.

    Administrative and legal avenues also exist. A group of already certified counselors could appeal to the state legislatures or licensing boards to tighten accreditation standards or introduce an alternative accrediting body. Wronged or dissident trainees could undertake legal action of some sort, either individually or as a class action with others in the same program or across programs. It is hardly an exaggeration to allege that some training programs are perpetrating educational fraud or malpractice.

    Transforming therapy into a vehicle for political change fails on yet another count: there is no evidence for the effectiveness of an approach that conceives of patients’ problems as a function of oppression. By contrast, a robust research literature exists on the generally positive to very positive impact of behavioral and psychodynamic interventions. There is substantial literature purporting to show that dynamic psychotherapy is as effective as or more effective than cognitive-behavioral therapies, and also a very strong body of research suggesting that all therapies are effective and at about the same level. Certainly private insurers and Congress should be alerted to the fact that they are paying for a lot of therapy that is unproven and, worse, potentially harmful. 

    If I have not said much about organized psychiatry in the context of CSJT, it is because psychiatry is, foremost, a medical specialty. Psychiatrists do offer psychotherapy, but it is not the defining activity of the field. Consequently, traditional approaches remain largely intact. And yet in 2021, the American Psychiatric Association issued an apology to black Americans, announcing that the association “is beginning the process of making amends for both the direct and indirect acts of racism in psychiatry.” Three years later, it remains unclear what amends were made and whether anyone was helped. I have a better proposal. The field should compensate for historical offenses — and make no mistake, transgressions, including overtly racist ones, were indeed committed in the past — by educating the therapy-seeking public about what they deserve: practitioners who are free of ideological agendas; who see themselves as healers, not activists; who extol the primacy of the individual; and who inspire their patients to participate in their own flourishing.