I Was There

    She was looking like trapped meat. I’m talking about the pretentious freak of nature stuck in her mother’s washing machine down the street. 

    It was yesterday afternoon. I’d dropped by to return borrowed eggs. I heard screaming, begging, laughter. 

    We used to play house. I was eleven. She was ten. I was a bank teller. She was a secretary. We were sick and tired of the grind, but happy together. After a long day, she fixed me cranberry juice on the rocks, and I gave her a foot massage. It was a good marriage. 

    Nowadays, she reads European philosophy and dresses only in Goodwill, and I tend to keep my distance. 

    I’ve always managed to avoid the knives that come out of her mother’s mouth. 

    I mean, that woman vomits knives. 

    And yesterday, those knives were pointing at the trapped meat in the washing machine. 

    So, after a pink sandwich, which I ate only because I was hungry, I left.

    Muddy

    I run into my therapist from seven years ago. He’s standing around, still on the young side of middle-aged, face blank, totally unimpressed.

    But he’s not as I remember him. For instance, he has a fever. He’s glistening. His spectacles: gone. 

    Man, our old sessions. He was strict, for real. Even coffee was off the table. Anything I could hold was a distraction, a crime. It was always late afternoon, often raining. 

    I used to be much younger than he was, but we are the same age now. 

    Anyway, the hours change, and we eventually share the same dilemma — we are stuck right in the middle of some vast muddy field. 

    Boy, this bespectacled fucker used to catch me inside a million lies. 

    We are sinking, tiring awfully by the end, contemplating together. He clutches my arm just above the elbow. And doesn’t let go. We pray for the way God made us to finally mean something. 

    Through the wall, we hear the kids next door. They are calling each other names again. They will never grow up.

    Living

    On a rainy Sunday afternoon, pre-workout, I approach the girl behind the counter at my gym. 

    She mostly deals in fresh towels and electrolytes, and she doesn’t like me — not sure why, but she’s blatant. 

    And so right in front of her I go, “I’ll take the red Gatorade — the fruit punch flavor . . . ?” 

    And right in front of me she goes, “We’re all out of the red Gatorade. The. Fruit. Punch. Flavor.” 

    “But it’s right there, I can see it,” I say, beginning to point. 

    “No, you’re color blind,” she says, beginning to point at living.

    Ecstasy and the Englishwoman: Charlotte Brontë

    Charlotte Brontë and George Eliot shared a quirk in their literary careers: after penning their respective masterpieces — Jane Eyre for Brontë in 1847, Middlemarch for Eliot in 1871 — both lived to publish deeply strange and religiously preoccupied novels a half-decade later. While the Jewish mysticism and Wagnerian scope of Eliot’s Daniel Deronda have prompted reams of scholarship, and its melodrama and finely wrought heroine have broken through to popular consciousness, Brontë’s final published novel has met with relatively muted fanfare. Few beyond Brontë completists, academics, eccentrics, and those otherwise in-the-know seem to read Villette these days, let alone consider its aesthetic and ideological contours. Its heroine, Lucy Snowe, is criminally under-celebrated. The novel possesses a kind of secret-handshake status, seeming to subsist by virtue of the whispered interpersonal recommendation.

    This state of affairs is perhaps unsurprising, since Villette strikes a dissonant chord in the annals of Victorian fiction. A rough summary of the novel goes something like this. A Protestant Englishwoman, poor, friendless, and plain, crosses the Channel to the fictional kingdom of Labassecour; there, surrounded by French-speaking Catholics, she becomes an English teacher and eventually headmistress of her own school. Despite its inoffensive Bildungsroman-like frame, Villette was regarded with distaste from the start, disliked for the morbidity of its “unamiable” protagonist, as a contemporary take in the Dublin Review put it; for its hints of perversity; and for its mood of pungent defiance. Brontë had partly modeled the tale after a wretched year spent in Brussels in 1843. There she had herself been friendless, had been dizzyingly alone, had pined fiercely after her charismatic — and married — professor. Perhaps the novel’s early readers could smell its author’s acute bitterness. Perhaps they, like Virginia Woolf a century later, shrank from that “jerk in [Brontë’s novels], that indignation” rendering them “deformed and twisted.” (Woolf, for her part, seems to have preferred Villette to Jane Eyre.) Yet unlike Jane Eyre, which wears its turn to the domestic on its sleeve (“reader, I married him”), Villette “jerks” its reader still further, permanently deferring marriage for its female lead and closing instead with an ambiguous evocation of shipwreck. On this score the novel is certainly an outlier, most comfortably framed as the exception that proves the rule of Victorian novels.

    Villette’s deafeningly absent marriage plot and the “hideous . . . convulsed” spirit that so baffled and disgusted its contemporaries — those were Matthew Arnold’s words in a letter, calling the novel “one of the most utterly disagreeable books I ever read” — were seized on in the twentieth century by second-wave feminist critics. Sandra Gilbert and Susan Gubar, in The Madwoman in the Attic, set the tone, seeing in Lucy a casualty of patriarchal constraint, and scholarship since has tended to follow suit. Brenda R. Silver pointed to the sphere of readerly sympathy as a form of surrogate liberation for the novel’s protagonist, and Joseph Allen Boone maintained his predecessors’ language of empowerment and subversion. These canonical responses to the novel have tended to presuppose an essential, autonomous self, which is either being oppressed and disfigured or else requires some kind of emancipation (if only at the hands of a congenial reader). They have also largely sidestepped the question of religion, consigning Villette’s Catholic theme to a peripheral position or glossing over it entirely.

    Brontë’s final published novel scalds the fingertips and sharpens the mind, and it deserves to be grasped by different means and with different language. One should not take for granted that its protagonist desires autonomy as we might understand it, nor should one see the religious dynamic in the novel as simply an intriguing sideshow. Though indebted to the aforementioned readings, I see Lucy differently — as in some sense in flight from subjectivity, especially in its self-contained and autonomous Protestant mold, and in search of forms of release, mediation, externalization, even annihilation — experiences that the novel codes Catholic. Lucy seems often to find unfettered interiority an impossible burden to bear, to be drawn to a paradoxical “freedom not to consent,” in Julia Kristeva’s words. This tendency is most apparent in the frequent irruption of ecstatic states in Villette: ecstasy becomes the novel’s structural principle, contributing to its murky religious aesthetic. The tendency then appears to regroup in the realm of erotic love, to channel itself into a relationship that annihilates private inwardness. While Lucy first pines after a Protestant doctor, she ultimately emotionally entangles herself with an imperious Catholic man. 

    Why should a nineteenth-century English novel, with the genre’s overriding concern with selfhood and development, be interested in experiences that nullify, deflate, or otherwise jettison selfhood, albeit temporarily? And why should Catholicism be the terrain upon which questions of selfhood are pursued? It is tempting to speculate that Lucy’s Catholic fantasy life evinces a need to transcend a Protestant-cum-secular, perhaps bourgeois, conception of the individual — a need to transcend the conditions of the novel in which she finds herself. Indeed, in its fascination with the Catholic question, its scorn for cheerful individualism, and its appeal to the mystique of authority, the novel chimes with some of the louder Catholic-talk in our own intellectual air today. The much-covered rise of “tradcath” aesthetics and ideology need not be rehashed here, but consider it distilled by the New York Times headline of a few years back: “New York’s Hottest Club is the Catholic Church.” Paper of record aside, many of us have by now encountered the type: those who, afflicted with the malaise of the secular, have been drawn to the Catholic Church in part through the medium of ideology. (I observe the phenomenon without contempt, gathering that many such cases are on-ramps to genuine conversion or even constitutive of it.) Yet one senses, in some prominent cases, that religion and the culture war have mingled behind a curtain of mist — that the worldview in question consists mainly in a post-liberal pull toward unfreedom. 

    Is Lucy Snowe an early prototype of this phenomenon? A woman who suffered secular modernity’s birth pangs, whose anomie prefigured ours, who sought to curtail her freedoms in the face of the abyss? Like many backward-looking and ideologically smooth readings of imaginative literature, this partly satisfies, but it also requires a good deal of squinting. Lucy, though attracted to forms of externalization and erasure, is equally attracted to the idea of an intact self and keen to keep a tight lid on her consciousness. In the novel’s closing pages we glimpse a final swerve from the self’s disintegration, and perhaps a genuine horror of it — and the triumph of a decidedly Protestant ethos. In Brontë’s vision of selfhood, in other words, one finds only contradiction: an oscillating relationship between ecstasy and containment, between the not-self and the self, between autonomy and heteronomy. And the novel can be persuasively read by honoring both impulses. I see in Lucy Snowe neither a victim nor a liberator, neither a misogynist nor a spotless feminist. I wish to take Lucy’s ambivalence at its word — to do justice to the irreconcilable drives that hold sway in Villette.

    Midway through Villette — weary from insomnia and devastated by the isolation endured on her school’s long vacation — Lucy Snowe is lost and alone when a powerful storm engulfs her. She experiences it as a self-fracturing, and she sets the stage for the novel’s first ecstatic episode: “It was cold and pierced me to the vitals. I bent my head to meet it, but it beat me back. . . . I only wished I had wings and could ascend the gale, spread and repose my pinions on its strength, career in its course, sweep where it swept.” Lucy sketches a moment in which the boundary between herself and the not-herself is partly dissolved — her body is porous and the storm is felt within it as a piercing of the vitals — and partly not yet dissolved, triggering a desire to vacate her physical frame and merge with the gale’s turbulent course. 

    Volume I of Villette shudders to a halt and Volume II opens with Lucy’s account of the ensuing out-of-body experience. In it, she appears to consummate that very desire to sweep where the storm swept: 

    Where my soul went during that swoon I cannot tell. Whatever she saw, or wherever she travelled in her trance on that strange night, she kept her own secret; never whispering a word to Memory, and baffling Imagination by an indissoluble silence. She may have gone upward, and come in sight of her eternal home. 

    Lucy’s lexicon is idiosyncratic and allegorical, but the exalted transportation she describes — the thrusting of the soul or the self outward — calls up the state of ecstasy, from the Greek ekstasis, to stand beyond, above, or beside oneself. She may have “come in sight of her eternal home,” she hedges, but of the journey she simply cannot tell: this moment of non-presence cannot be captured in recollection or language; it is shrouded in an “indissoluble silence” whose depths she cannot plumb. Just before blacking out, Lucy had scornfully alluded to “a certain Carmelite convent,” a veiled reference to the Spanish Carmelite nun St. Teresa of Ávila, who spoke in her autobiographical writings of ecstasy as a penetration of the entrails and who, through Bernini’s sculpture, has become all but synonymous with ecstatic vision. Beyond the obscure and sarcastic allusion, the ascent of the soul in an obliterating “swoon” may for Lucy be understood, however superficially, as a Catholic configuration. She has just undergone a purposefully botched confession, wading into a Catholic rite only to brazenly halt its course: “Je suis Protestante,” she had announced to her confessor.

    The self-externalization of confession and the self-erasure of ecstasy share a common narrative logic, though one lets her speak and one renders her silent. Both episodes dramatize a release from the self and speak to an ambivalence about the sort of interiorized and contained selfhood that is for Brontë Protestant and English. Protestantism places the individual subject at the authoritative center of experience, as does first-person narration, but Villette is invested in moments when the self seems temporarily to come apart or become opaque. If Catholicism poses a problem for Brontë, it is not only on the surface, in her occasional conversion plots, anti-Catholic tirades (“God is not with Rome,” Lucy cries at one point), or even the novel’s lightly satirical Gothic set-up — the fact that Lucy’s school is built on the grounds of a martyred nun. Lucy’s Catholic forays seem to reflect above all an exhaustion with self-sufficiency and self-representation, and offer her a temporary release from subjectivity. Even so, such moments of ecstatic melting or release are for Lucy often opportunities to step back, embrace renunciation, and set limits against which the self can again stand whole and intact — a cycle of explosion and containment that creates a kind of narrative whiplash. 

    Lucy’s ecstatic blackout forms the bridge between the two volumes and between the novel’s past and present. She wakes up and is nursed to health by strangers, setting in motion the novel’s ensuing plot machinations: the strangers turn out to be her old playmate Polly, now Paulina, and her old love Graham, now Dr. John. It is striking that so much narrative work should be accomplished by a blackout that forms a lapse in its narrator’s consciousness and memory. On one level this is simply efficient plotting: a satisfying contrivance or gimmick fitting for a novel structured around mysteries, suspense, and sensation. Perhaps one can conclude, like Elizabeth Hardwick, that these “large, gaping flaws in the construction” of Brontë’s stories are “gothic subterfuges [that] represent the mind at a breaking point, frantic to find any way out.” 

    But one might also consider a more specific intention behind the blackout, one with implications for the novel’s conception of the self and of what can and cannot be worked out in language. The episode that remains fuzzy and sunk in an “indissoluble silence” gives Villette a forward motion and a deeper structure than its first-person narrator can understand or account for, let alone consciously witness. Lucy blinks and misses, so to speak, an event that comes to define the course of her life: it is not a moment of sudden vision that sets things forward, but one of occlusion and blindness. The blank space at the core of this story — the moment that does away with linear and legible selfhood — seems almost to operate as its connective tissue.

    The fact that Lucy’s ecstatic episode concludes with her “re-enter[ing] her prison with pain” speaks to the troubled relationship between ecstasy and embodiment in Villette. The novel seeks ecstasy as a temporary absolution from selfhood, here conceived as an out-of-body experience and ending in an agonizing return to the prison of the flesh. On balance, however, Lucy does not wish to be a bodiless, floating soul — and how could she, while always resisting being typecast as an undetectable “shadow”? While one pole of Villette pulls away from contained selfhood, another insists on the self’s solidity and gravity. Lucy’s confessor advises her that “Protestantism is altogether too dry, cold, and prosaic for you,” but Lucy vehemently disagrees: she aspires to be, and often is, just those things. A recurring tag for this aspirationally compact and static self is her selective use of her full name: “I, Lucy Snowe, plead guiltless of that curse, an overheated and discursive imagination”; “I, Lucy Snowe, was calm”; and finally, “complicated, disquieting thoughts broke up the whole repose of my nature. However, that turmoil subsided: next day I was again Lucy Snowe.” The fullness of the name is contrasted with the self’s potential overflow or excess (its “overheating”) and potential fragmentation (its wholeness being “broke[n] up”). The pronominal force of “I, Lucy Snowe,” with its vow-like affirmation of subjectivity, seems to pledge and enact, not merely to describe, an experience of stable identity. 

    Whether “Lucy Snowe,” the contained self, is embodied and externally legible or is instead emphatically internal is ambiguous. The blackout scene, for example, aligns self-containment with embodiment and sensationalizes the disowning and recovery of both. On the other hand, the levitating “soul” lamenting its return to its “poor frame” would seem to imply a chasm between bodily frame and spiritualized interior. The “soul” is a nebulous term in Brontë and gestures to an altogether different vision of the self, one that is detachable from flesh rather than contained in it. Brontë’s notion of selfhood appears sometimes — like the soul, though distinct from it — to exclude what is material and external and to constitute itself on the inside, to regard itself as a private sanctum unknowable by others. It takes pleasure in privacy and secrets; it delights in the dissonance between interior and exterior and in being misapprehended. We see this predilection in Lucy’s relationship to Dr. John (and to the reader), where she cultivates and relishes a protective cloud of anonymity that allows her to see while unseen. We also glimpse the tendency in her “struggles with [her] natural character,” which she frames in a mood of proverbial wisdom as a matter of “surface” and interior: internal conflict enables life “to be better regulated, more equitable, quieter on the surface,” she muses, “and it is on the surface only the common gaze will fall. As to what lies below, leave that with God.” Much seems to depend on the tone in which one reads these dense, epigrammatic, often sententious asides; possibly they are themselves a form of self-dissociation. 

    The physiognomist, on the contrary, maintains that the body’s surfaces speak, that one’s body — or head, at any rate — encodes and communicates one’s true nature. Brontë, who was deeply invested in physiognomy and in the related science of phrenology, makes clear that intimacy and eroticism work along those lines. Is the self then defined by its obscurity — by the gap between what can be perceived and what is — or is the real self visible and readable? At times the former is presented as cheap compensation for the latter. Lucy sums up this form of substitute gratification in the general, evasive first-person plural: “In quarters where we can never be rightly known, we take pleasure, I think, in being consummately ignored.” She utters it with regard to Dr. John, with whom, we already know, she enjoys her protective “cloud” of anonymity, and with whom her attitude is essentially: if you will refuse to see me, to “rightly know” me, then I will seal myself up — I will define myself by what is private and hidden in me. Viewed in this light, her retreat into interiority and her “pleasure in being consummately ignored” are points against the Protestant Dr. John, signs that he is not and never was a viable partner in Brontë’s romantic universe.

    Invisibility and non-participation are certainly out of the question under the Catholic M. Paul’s severe gaze: in his orbit there can be no hidden self and indeed no privacy. He enters Lucy’s life with brute force by “burst[ing] open” her “closed door,” after which “a paletôt and bonnet grec filled the void; also two eyes first vaguely struck upon, and then hungrily dived into me.” She receives his presence in breathlessly conjoined fragments, in visual flashes, and there is a quality of thickness to that presence, a sense that it physically implicates her (she is “struck upon . . . hungrily dived into”). M. Paul is later shown to have rifled through Lucy’s papers and personal items: she has long known, she admits, “that that hand of [M. Paul] was on intimate terms with my desk; that it raised and lowered the lid, ransacked and arranged the contents, almost as familiarly as my own.” The erotic charge of the image barely needs pointing out, but more surprising is Lucy’s claim that M. Paul arranges her private materials almost as familiarly as her own hand would. 

    When she finally catches him in the act of ransacking, she is “provoked at this particular, and yet pleased to surprise him.” Thus “provoked” by M. Paul, Lucy takes pleasure in confronting him in turn, heartened by his audacity to test the limits of her own. This is the threatened narrative space in which Lucy paradoxically thrives. Her agency is thrown into relief by its being tested — her bounded and delimited bodily self emerges when its boundaries are breached, when it must resist invasion. Invigorated by M. Paul’s aggression, she states emphatically that his “scorn gave me nerve,” while the interiorized “to view him . . . myself unseen” characteristic of Dr. John’s benign neglect of her had resulted in voyeurism and paralysis. The static, blazon-like descriptions of Dr. John’s chiseled features here give way to dynamic movements of stirring, flowing, and soaring: when M. Paul sneers at her, “his injustice stirred in me ambitious wishes — it imparted a strong stimulus — it gave wings to aspiration.” 

    Beyond this foreclosure of the private realm and its blood-warming dynamism, Lucy’s relationship to M. Paul is marked by surface legibility and a counterintuitive kind of reciprocity, touched off by “certain vigorous characteristics of his physiognomy, rendered conspicuous now by the contrast with a throng of tamer faces.” On the level of physiognomy, he and Lucy are on equal footing, for she glimpses “fire” in his face as he had earlier glimpsed it in hers: “I watched you, and saw a passionate ardor for triumph in your physiognomy. What fire shot into the glance! Not mere light, but flame,” he had thundered at her. A physically legible self — with fire shot through its glance, at once an internal blaze and an external effusion — is within reach for Lucy, but only the man with the right sort of narrative force can draw it out. 

    To that end, Brontë makes the dichotomy between the two male stars of the novel inescapable. Dr. John is a “cool young Briton,” a Protestant and a bourgeois doctor who had earlier diagnosed Lucy’s “fever of the nerves and blood . . . scientifically in the light of a patient.” “The old symptoms are there,” he informs her, at which point she digresses to the reader with some bitterness: “Not one bit did I believe him; but I dared not contradict; doctors are so self-opinionated, so immovable.” With Dr. John fever is illness and pathology, reducing her to silence and evasion, while with M. Paul — a Catholic European — flames signify deep compatibility. It is worth noting too that the former is tall and blond, the latter short and dark. By contrasting the two men with schematic and even overdetermined precision, Brontë links their respective modes to corresponding expressions of Lucy’s character. In Brontë there is an erotic dimension to character expression, or maybe it’s the other way around: quite simply, Brontë insists, Lucy is not the same Lucy in the company of these two men.

    But is M. Paul drawing out Lucy’s passionate nature, or is he himself creating it (“arrang[ing] the contents” of her character almost as familiarly as her own hand would)? Is he a good reader of Lucy — her only good reader — for discerning the flame buried within, or is he the very precondition of that flame? And if Lucy’s surname is “Snowe,” are we meant to grasp that what he generates is something somehow contrary or even destructive to her nature, or that he has brought to light the inner nature that her surface conceals? These may be distinctions without a difference, or impossible to determine, in a novel that is always negotiating between interiorized and exteriorized forms of selfhood. Like ecstasy, confession, and the mediation that Lucy associates with the Catholic Church, the forfeiture of private interiority with the Catholic M. Paul delivers Lucy from the burden of a wholly interiorized self. It grants her the pleasure of self-externalization, of exerting gravitational weight and not hovering as “a mere shadowy spot on a field of light.” In romance and elsewhere, Brontë suggests, one needs a stumbling block or mediating force for one’s contours to take shape: perhaps this is why the Protestant shows up to confession only to haughtily declare herself a Protestant. 

    Imposition and encumbrance impart “a strong stimulus” in Lucy’s scenes with M. Paul — a texture of health, strength, and aliveness. “Scout the paradox,” then, as Lucy later puts it: with him, Lucy is at once relieved of autonomous selfhood, effaced and totally erased, and yet never more solidly there, never more present. For Brontë it is a sincere question: what good is freedom without the friction of other people, freedom in which nobody notices your presence enough to impose on you? Or, to paraphrase Sondheim, don’t we all require someone to sit in our chair, ruin our sleep, and make us aware of being alive? The novel, then, has moved thus far in a particular direction: from interiority and sickness with the Protestant Dr. John to exteriority and health with the Catholic M. Paul. Why, if so, does Lucy end up alone?

    In their final scene together, M. Paul ushers Lucy into a pleasant house with a schoolroom attached. What he grants her is property with her name on it: a room of her own, financial security, a degree of autonomy. His preparations signify much more to Lucy than her own bourgeois ascendancy. “It was his foresight, his goodness, his silent, strong, effective goodness,” she writes, “that overpowered me by their proved reality. It was the assurance of his sleepless interest which broke on me like a light from heaven.” Surrounded by this proved reality of his affections, freed from having to search for signs of affection, Lucy closes out the novel with a pivot: “[M. Paul] was away three years. Reader, they were the three happiest years of my life. Do you scout the paradox? Listen.” 

    The paradox consists in this: Lucy works diligently to cultivate her school, taking on more students, expanding her property, coming into possession of some capital. She attributes her material prosperity and personal contentment to a “relieved heart” whose “energies lay far away” — to the legacy of and hoped-for future with M. Paul. “The secret of my success did not lie so much in myself, in my endowment, any power of mine,” she relays, “as in a wonderfully changed life, a relieved heart,” for “the spring which moved my energies lay far away beyond seas, in an Indian island.” The paradigm that Lucy describes is one of outward-facing solitude: while she plods away in healthy, satisfied employment, her inflowing sources of emotional energy “lay far away” and securely outside her person, with M. Paul in the West Indies. She lives autonomously, but that living is permanently mediated by an external force: her beloved is externalized into the general design of her life, and his absence is her sustenance. Unlike Lucy’s prior, bereft freedoms, this freedom allows for more than precarity and loneliness. Villette puts forward this fusion of structural autonomy and emotional heteronomy, material independence and mental interdependence, as its horizon of ultimate possibility. 

    M. Paul imparts his last words to Lucy in a letter: “Remain a Protestant. My little English Puritan, I love Protestantism in you. I own its severe charm. There is something in its ritual I cannot receive myself, but it is the sole creed for Lucy.” Lucy goes on to reflect that his Catholicism might in fact “be reckoned amongst the jewels” of his character. To end their relationship on this exchange is to insist that the Catholic and Protestant modes of the novel have indeed been vital to its central dynamic, indispensable to its central quest for selfhood and love. Rather than iron out their opposition, it indicates that Lucy and M. Paul have reached a tenuous equilibrium in and through each other, completed in each other’s absence.

    A final deus ex machina preserves the equilibrium indefinitely. Rather than tip the scales in a consummated union — would they have married? — Brontë pulls the narrative to an abrupt halt. Lucy intimates that M. Paul has died at sea, signaling his fate through apocalyptic language but refusing to say so outright. She deprives the reader of that certainty, cagily concluding on a negative image instead: “Here pause: pause at once. There is enough said. Trouble no quiet, kind heart; leave sunny imaginations hope. . . . Let them picture union and a happy succeeding life.” A negative image is one the reader must hold in mind despite rationally grasping its untruth; it is like seeing double or going cross-eyed. By overlaying a seemingly tragic ending with its cheery obverse, Lucy “lets” us imagine a happiness that we cannot rid ourselves of, are unnervingly trapped in even as we grope in the dark to envision happiness’s opposite. In this finale Lucy rather mercilessly unsettles hope for a happy ending, but also withholds the catharsis of conclusive tragedy, instead leaving her reader to flounder in the unknown. M. Paul’s fate, and for that matter Lucy’s, is left in the dark. 

    And what, if anything, remains of ecstasy? Lucy implicitly fails to experience a standard referent of ecstasy: sex. Whether the novel’s swerve from the final implications of its marriage plot constitutes a tragic failure — or a kind of escape route — is left ambiguous. In either case, Lucy’s implied virginity evokes the nuns that haunt Villette. Throughout the novel Lucy is visited by a ghostly nun, an apparition later revealed to be a practical joke played on her by a schoolmate’s lover but one that nonetheless retains its charge as a psychological double. And M. Paul’s former beloved, we learn, had apparently died in a convent after forswearing him. Does Lucy then mysteriously duplicate the arc of the nun for herself — or is her seemingly celibate fate a decidedly Protestant mirror image of it?

    More to the point, one might wonder, is M. Paul’s assumed death by shipwreck a final step back, a final gesture of renunciation — that is, does he become collateral damage against which “Lucy Snowe” can hold fast to her full name and to all that it entails in perpetuity? The repeated reverberation of “Lucy Snowe,” like “Jane Eyre,” imprints the maiden name firmly in the reader’s mind, seeming almost subliminally to foreclose or complicate the possibility of marriage. And the thirst for a “freedom not to consent” may only go so far when one’s freedom to consent faces up to real and solid limits: in Lucy’s case, the prospect of marriage to a domineering man, one who might reasonably be expected to efface her too thoroughly. As it happens, Lucy’s sarcastic quip before blacking out had run in full: “I might just now, instead of writing this heretic narrative, be counting my beads in the cell of a certain Carmelite convent.” Though dripping with sarcasm, the phrase hints at a binary opposition between the convent — or, broadly speaking, a perceived Catholic subjugation — and the “heretic” composition of an autobiographical tale. If it is in his absence that Lucy pens her tale, perhaps the permanence of that absence clears the way for the story: crudely, he dies so that she can write. Self-erasure here gives way to autobiography, the self’s ultimate solidification.

    The novel’s interpretive possibilities are vast, however, and one might consider whether Lucy — and Brontë by extension — has set up a hoax for the reader. Perhaps Lucy does eventually marry, but prefers in her autobiography to cultivate an aesthetic of opacity, to draw a closed circle around herself and her fate that no prying eye could possibly puncture. Lucy’s “here pause: pause at once” is after all a statement of aesthetic intent: a declaration of a cutting-short, a freeze-framing, that can preserve the past and fossilize it in an eternal present. Lucy declines to publicly subject her relationship to the element of the historical, the everyday, or perhaps, the novelistic. It endures as the exceptional, the monumental, as the having-been of greatness, and it closes out with a bang and not a whimper. The frozen images of “wild ecstasy” on Keats’s Grecian urn come to mind — those fair youths who, although they can “never, never” kiss, as a consequence “cannot fade” and are “for ever piping songs for ever new.” Like the ecstatic moment sealed in silence, M. Paul’s fate now lies outside the novelistic frame. These fissures are constitutive of the novel as much as they point outside it, gesturing outward to what cannot or will not be novelistically contained. If ecstasy endures, it is perhaps as this, as an aesthetic commitment to what will not be contained. 

    In Brontë’s final novel, as in select spheres today, Catholicism becomes the arena in which questions regarding the value of selfhood, of expressive individualism, appear to work themselves out. Yet Brontë resists the easy conflations that all too often threaten to poison such discourses, and throws a wrench in our retroactive ideologizing, by presenting us with a tangled knot of paradoxes and not a thesis statement. While her protagonist pushes up against the demands of bourgeois selfhood — and often finds genuine psychological value in constraint, limitation, and obstacle — Brontë nonetheless suggests that not all constraints are created equal, that some exact costs we would not wish to bear. She must have grasped that the romance of unfreedom meant one thing in theory and quite another in practice. For she holds both in mind at once, honoring the psychological value of the former while seeming, though we cannot know for sure, to flee the implications of the latter. We close Villette, then, suspended in the fog of Brontë’s ambivalence. The novel’s abrupt and ambiguous climax is almost a confession on the part of its author: I do not yet know how this ends. Where, indeed, do such contradictory impulses lead? Lucy Snowe’s existential drama is perhaps ours to live now. 

     

    The High Art of Distance

    “Art, of course, lives in history,” said Elizabeth Hardwick. By which she meant that a novel emerges in its own time, and changes in its passage to our own. This — the likeness which is also an unlikeness, the unfamiliar familiarity — is the shock of reading classic literature, of literature even a generation or two removed from one’s own. We understand that a novel is essentially a historical survivor, written in one moment, picked off the shelf in another, yet we want it also to enlighten us about our own lives, of which the author necessarily knew nothing. Astoundingly, it quite often does. And yet it is in those gaps, those absences, that the real excitement lives. We should not recognize ourselves, and yet we do. We should not be moved, but we are. And then we are offended, or struck, or in some other way expelled, and the gap expands, and the past and the work and the author come to seem the distant shore they really are. We can visit, but not to stay.

    I have a theory: the more we recognize in an era, a place, an artwork, the stranger its differences strike us. This is perhaps especially true for the novel, whose most familiar forms can be used to convey so much that we do not understand. “So much of a novel, after all,” observes Hardwick, “is information, necessary fact that gives a floor of understanding from which the flights of inspiration are launched.” Reading a novel from another country, another century, requires you to set a new foundation, plane a new floor — and to surrender yourself to the novel’s “subtle time,” that “spiritual and intellectual lengthening, extending like a dream in which much is surrendered and slowly transformed.” Yet for even the most sympathetic reader, this process is never complete. Your surrender becomes a kind of suspension, slack or tense, between your time and the novel’s, your era and the author’s, communicating across the years like a current shooting down a wire. You connect, and you don’t. You feel, you sense, you embrace, but always at a distance. There is always some gap.

    Yet a reader’s life, too, has its seasons. At a certain time in your life, you encounter a work written at a certain time in the author’s, and you understand, or you don’t. The work, the reader, the writer are like dancers moving across the floor; all three must make a trio for the dance to continue. Thirteen years ago I first tried to read The Savage Detectives, but this past spring I read it at a sprint. All that intertextuality, all those fractured, puffed-up perspectives: I needed a decade-plus and hundreds of other books to begin to approach them. So, too, can a writer miss his or her moment, and be recovered later. Robert Walser’s obsessive self-obscuring semi-fictions sing clearer in our deeply pessimistic age than during the course of his indigent life.

    To say nothing of chance, when we are made to encounter the unexpected, and are made to change. In January 2017, I was twenty-five years old and in Melbourne, Australia. One day I was wandering north of the river when I passed a bookshop that was going out of business. I was with a friend then stationed in Okinawa, and in a week I would fly to Tokyo, and so I picked up, at a deep discount, a slim Japanese novel called Snow Country.

    Published in 1948, Snow Country tells the story of Shimamura, a young man from Tokyo, and his relationship with Komako, a geisha who works at a hot springs resort in the mountains. Shimamura is a cold man, ambivalent to the point of cruelty; in the first pages, he reflects that only a single forefinger remembers his lover. And yet again and again across the seasons he finds himself drawn away from his family, and back to Komako and the mountains. The novel proceeds as a series of piercing images: a woman’s complexion melting into a snowy mirror, a train window in which the reflection of an eye is superimposed on a light burning deep in the mountains. Komako will not let go of Shimamura, who, whatever his apathy, cannot summon the strength to escape. The novel concludes suddenly, and with great violence: Shimamura arrives at the site of a fire, turns upward, and feels the Milky Way roaring down into his body. 

    It was a startling book, a vision of the novel as something both shaped and shattered. By chance, it was also my first encounter with the great Japanese writer Yasunari Kawabata. A master of compressed forms and oblique endings, Kawabata helped introduce modernism to Japan, and published a number of significant novels, as well as more than a hundred and fifty short and ultra-short stories. For this he became, in 1968, the first Japanese writer to be awarded the Nobel Prize in Literature. From his time to mine: I have been reading him ever since. 

    In the early 1920s, when he was a student at Tokyo Imperial University, Kawabata lived above a hat shop in the northeastern neighborhood of Asakusa. The neighborhood was then one of the liveliest and most Westernized in Tokyo, and the indifferent student preferred to wander the modern quarter, taking in the revues, going to the movies, and soaking in the public baths. He seemed determined to engage in all that was new and exciting, at the expense of his studies. 

    Along with much of Tokyo and Yokohama, Asakusa was leveled in the Great Kanto earthquake of 1923. Viewing the ruins, the novelist Jun’ichiro Tanizaki reveled in the possibilities available for reconstruction in the Western style. “How marvelous!” he wrote. “Tokyo will become a decent place now!” Kawabata’s building withstood the shaking, and he spent the following days wandering the wreckage, a jug of water and lunch in his backpack, writing down his observations. 

    In the first decades of the twentieth century, Japanese literature felt like a similarly cleared space. The autobiographical form known as the “I-Novel” was in decline, and a battle was being waged between the Marxist writers of proletarian literature and the modernists inspired by translations of Valéry, Marinetti, and Morand. Eminent writers such as Tanizaki and Ryunosuke Akutagawa argued in print about the future of literature. Would it lie with Akutagawa’s “pure” fictions, “close to poems in prose”? Or with Tanizaki’s plotted works, “complicated things embellished with maximum intricacy”? Were the answers to be found in the Japanese and Chinese classics, or in Tanizaki’s occidental fixations? 

    Kawabata and his cohort found themselves suspended between these positions. In 1924, he co-founded the literary journal Bungei Jidai with Yokomitsu Riichi, a fellow writer and the founder of the Shinkankaku-ha, the New Sensationalist School of writing. Heavily influenced by European modernism, the Sensationalists emphasized the primacy of form over content, the practice of detached observation, and the personification of objects and the natural world. Their stories are full of fragmented narratives, found documents, and streams of consciousness. “We have become quite weary with literature that is as unchanging as the sun that comes up from the east today exactly as it did yesterday,” Kawabata wrote. “Our eyes burn with desire to know the unknown.” 

    Though his English was middling, Kawabata attempted to read Joyce and Woolf, and his earliest stories were in a distinctly modernist vein, employing fragmented forms. “A Saw and Childbirth,” first published in 1924, narrates a dream that begins in Italy and moves to the narrator’s hometown, where he finds he has to urinate and does battle with a woman holding a saw. Kawabata folds the act of interpretation into the narration, asking again and again what is happening, and what it means. In the final lines, he lies in bed, reflecting (in J. Martin Holman’s translation): “Somewhere would she bear someone’s child?” 

    “The Dancing Girl of Izu,” published in 1926, tells the story of a walking trip that Kawabata took across the Izu peninsula in 1918. When the narrative begins, the narrator is twenty years old, and has been on the road for several days. While climbing the Amagi pass, he reunites with a group of itinerant musicians who make their living performing at hot spring inns. He has already seen them twice before, and found his eye drawn to a young girl carrying a drum whom he believes to be about seventeen. The young man falls in with the group, traveling down the mountain and into Yugano. He tries speaking with the shy girl, and even dreams of inviting her to his room. Yet when he glimpses her coming out of the bath, he realizes that she is much younger than her dress had implied, and he is relieved. “I felt pure water flowing through my heart,” he reflects. His affections can remain unrequited; he will not have to allow another into his life. 

    Such distance came easily to Kawabata. “For me,” he wrote in 1934, “love, more than anything else, is my lifeline.” And love for him was a history of loss. Born in Osaka in 1898, he was an orphan by the age of three, and by 1914 had lost his grandmother, his sister, and the blind grandfather who raised him. In a 1922 story of that name, he reflects on being christened “The Master of Funerals.” His family life is retained as a series of fragments, each memory tied to a particular death. His parents exist as photographs on the family altar. He can only recall his sister as she appeared on the day of their grandmother’s funeral, carried on a relative’s back in white mourning clothes. The young man is so composed that he finds himself invited to the funerals of strangers. Yet his decorous behavior was never feigned, he writes. “Rather, it was a manifestation of the capacity for sadness I had within myself.” 

    His first love was for a male high school student, and in the early 1920s he proposed to a young woman who broke off their engagement. Even his most passionate male characters tend to keep their distance from life: they recognize emotions, but do not seem to feel them. Shimamura’s love draws him back to Komako, yet his apathetic treatment enrages her. In the 1952 novel A Thousand Cranes, an orphaned man named Kikuji drifts between various women, including his late father’s mistress and her young daughter, with a nearly existential level of indifference, a ghost in his own life. Even Kawabata’s happiest characters seem unwilling to act on their intuitions or feelings; when old Shingo, in The Sound of the Mountain, hears a distant rumbling, he senses death. Yet this does not alter his conduct, and he proceeds through his family’s many crises without acknowledging it. These men keep everything inside. 

    As an editor at Bungei Jidai, Kawabata helped to shape and to promote New Sensationalism and its tenets. Yet his own early work only lightly resembles that of his peers. There is a directness to the writing which heightens every elision. Rather than circling each absence, he lets them stand. Except in his interesting but unsuccessful modernist novel The Scarlet Gang of Asakusa, Kawabata presents these fragments implicitly, as fissures within the psyche of his characters, rather than overtly, in the jagged structure of the text. 

    “Dancing Girl” is built around precisely such withholding. We wait until the story is almost over to learn that the narrator has gone walking to overcome the “stifling melancholy” of orphanhood. Early on, the male musician reveals that his wife has lost two children, one by miscarriage, another born prematurely. Only some pages later are we told that the premature baby in fact died within the last couple of months, while the performers were still on the road. As they head south to Shimoda to commemorate the forty-ninth day since the baby’s death, they talk freely, and without sentiment. “They said the baby was almost as transparent as water at birth, and it did not even have the strength to cry.” Like the death of the narrator’s parents, this tragedy rests always beneath the surface of the story, evoked through their conduct. His love for the dancing girl is not consummated, or even acknowledged. She dotes on him, but when she sees him off at the ferry, she refuses even to speak. When the boat sets off, he begins to sob, “a sweet, pleasant feeling,” as though he might drain away, and “nothing would remain.” 

    “Dancing Girl” and subsequent publications earned Kawabata substantial acclaim, and in the 1930s he moved from the avant-garde to the mainstream. He served as a judge for the inaugural editions of the Akutagawa Prize, and in 1934 he was appointed to the Bungei Kondan Kai, the Literary Discussion Group assembled by a former head of the Public Security Division of the Home Ministry. This was a period of increasing control within the arts: the literary fervor of the 1920s had given way to the increasingly militaristic and authoritarian 1930s, and many of his former rivals in the proletarian literary movement were jailed, tortured, and forced to make tenkō, a public rejection of their Marxist principles. 

    Given his frequent statements on behalf of artistic independence, Kawabata’s cooperation with an organ of censorship and control might seem awkward. He published articles insisting on freedom of speech and the rejection of social norms, and when in 1935 the BKK chose not to award its own significant prize to the tenkō writer Shimaki Kensaku, he protested publicly. Yet he also seems to have used his position to firm up his own place in the literary landscape, knocking down upstarts such as Osamu Dazai and achieving financial security. When an early version of Snow Country won that prize in 1937, he used the proceeds to purchase a villa in the mountain town of Karuizawa. 

    Like all his novels, Snow Country was published serially, before being compiled into a revised text. Because it is such a slim, exacting novel, it would be easy to think of it as a perfectly conceived work. In the introduction to his translation, the great Edward G. Seidensticker compares it to a haiku. Yet this was never Kawabata’s method. Snow Country was originally serialized between 1935 and 1937, but he returned to it in 1939 and 1940 and added a final chapter in 1947. Kawabata once remarked that it could have been broken off at any point — a harsh, fragmented quality that could describe all of his best stories. 

    His long works all began as short stories, often published without promise of future installments. Tanizaki theorized the Japanese novel as a work of architecture, requiring a carefully reinforced floor plan. Kawabata rarely thought ahead, writing on deadline for whatever newspaper or magazine would ask him, and essentially all of his major works were first published in installments, a common practice for Japanese writers at the time. Yet where Tanizaki used the extended gestation to construct a sturdy foundation, Kawabata leapt sharply from installment to installment, proceeding by non-sequitur, often skipping over major events to focus on stray details: the eye in the windowpane, the play of light on the Kamakura hills, the deep black of a camellia blossom. His practice was to “sound the overtones” of that first chapter, until the full harmony emerged, or he gave up; his career is full of abandoned works. They often end on a piercing image: the roaring Milky Way, the tea bowl broken across paving stones, the boy whose sorrow drains him dry. The effect is startling, and the lack of resolution lingers. 

    In his best books you sense him ranging across the course of a life, fusing his biography and the currents of his time into the thing called style. All those early deaths wounded Kawabata profoundly; and for all his philandering, you sense a man who held himself at a great distance from his own life. His characters, too, reside at a calculated remove from their own circumstances. Snow Country was based on an affair Kawabata had in the mountain hot springs town of Yuzawa, and he began writing the novel there, too. If it was anything like the fictional relationship, this affair must have been disappointing for all involved. Shimamura holds himself back from Komako, preferring to observe her from a distance so that he can keep from plunging into the warmth of real passion. He describes the world, so that he will not have to reach it.

    One morning, Shimamura awakes to find his lover preparing herself in his frigid room. “The white in the depths of the mirror was the snow, and floating in the middle of it were the woman’s bright red cheeks.” He remarks on the “indescribably fresh beauty in the contrast,” yet Shimamura is also reducing each element — the morning, the woman — down to their essence: blood and snow, red on white. In the process he is arranging them all within his own memory, perhaps hoping to step back from the scene and, like a man arranging flowers, to discover some harmony in it. 

    You see here Kawabata’s distance, but also the way he privileges intensity of focus: when his characters notice something, the story reorients towards it, following their associations across space and time. This continues into the best of his late work. In The House of the Sleeping Beauties, from 1961, an aging man named Eguchi, not yet entirely impotent, visits the seaside establishment of the title. In this place, old men spend the night beside beautiful young women who have been drugged to sleep. Night after night Eguchi returns, closely observing each girl’s skin, her hair, the feeling of her toes, how her body smells. These observations lead him to remember his past, the first lover taken away by her family, an affair with a married woman, the camellia tree in the garden of a Kyoto temple he had visited with his youngest daughter. 

    In his hands, this free associative movement is clean, effortless. On the first night, the smell of the girl’s breathing causes him to think of milk, which causes him to think of his grandchildren, which brings to mind two affairs from his own past: a geisha who could not stand the smell of his grandchildren, and then his own first love, from whose breast he had once drawn blood. Over only a handful of pages, Kawabata slips easily back and forth between Eguchi’s present and past, blending the scent of the sleeping girl and the sound of the waves below into the memory of his youthful flight from his family, the girl of his first love by his side. “The facts were different, but in the course of time Eguchi’s mind had made them so.” The loneliness of the character combines with the restlessness of the style: Eguchi can never remain with anyone; like the narrative, he is always passing on. 

    The effect is like a cold flame, an emotion held unsustainably in check. The women’s bodies are described in steady, precise detail, and yet there is no familiarity to them: they might be statues or painted figures for all he can reach them. They exist, in an objective physical sense; but Eguchi can only access the women who exist in his own past, who become real in the course of his recollection. As in the late work of Kawabata’s protégé Yukio Mishima, something from beyond the human world is required to pierce the veil, to touch them at all. In order for Shimamura to admit any deep emotion, he must allow the Milky Way to flow into him. The novel ends with a gesture, turning away from the human story to face something abstract, an ideal as pure and as violent as a mountain river. 

    There is a condescending idea that Kawabata’s brevity, his aloofness, are somehow “quintessentially Japanese.” But his work stands apart from his predecessors and contemporaries. Perhaps no one makes a better contrast than Osamu Dazai. Across a short but prolific career, Dazai mined his own dissolute life in a series of confessional novels and stories. These are stories of self-styled bohemians, many of them drug addicts, most alcoholics, who alternate between states of ecstasy and debasement. “I am the sort of person,” confesses the narrator of No Longer Human, “who can forget even the name of the woman with whom he attempted suicide.” Though he was once a Marxist, by the time of his brief fame in the late 1940s Dazai could more accurately be described as a nihilist. “Philosophy?” declares a character in The Setting Sun. “Lies. Principles? Lies. Ideals? Lies. Order? Lies. Sincerity? Truth? Purity? All lies.” “There is something fundamentally cheap about such awareness of genius,” Dazai writes elsewhere in the book. “Only a madman would read a novel with deference.” In life and on the page, Dazai played the part of the brilliant clown, the man who writes his novel “clumsily, deliberately making a botch of it, just to see a smile of genuine pleasure on my friend’s face — to fall on my bottom and patter off scratching my head.” 

    Dazai believed that Kawabata hated his work — he was right — and that Kawabata had shut him out of the Akutagawa Prize in 1935. In response, he published an open letter mocking the older writer’s work. “Does keeping small birds and watching dancers perform,” he wrote, “constitute such an admirable life?” He accused Kawabata of feigning a cold, emotionless exterior, an obsession with essences and ideals rather than the brute facts of life. And Kawabata’s works do conflate people with the weather, the landscape, and the seasons. In his Nobel lecture, Kawabata finds this same quality in the poetry of a Zen monk who, in “seeing the moon, becomes the moon.” His novels are often grounded in rituals and traditional arts such as Go, lending a refinement and a purity to human affairs. A Thousand Cranes filters its erotic tensions through the tea ceremony, imbuing each act of prostration and consumption with the significance of tradition. Even the most abject debasements take on their own cold beauty. 

    There is little beauty in Dazai, and no refinement. For his narrators, society’s charades mask the real and unendurable agonies of existence, a performance which we bear only out of our own ignorance. As one character writes in his suicide note: “When I pretended to be precocious, people started the rumor I was precocious. When I acted like an idler, rumor had it I was an idler. When I pretended I couldn’t write a novel, people said I couldn’t write. When I acted like a liar, they called me a liar. When I acted like a rich man, they started the rumor I was rich. When I feigned indifference, they classed me as the indifferent type. But when I inadvertently groaned because I was really in pain, they started the rumor that I was faking suffering.” 

    This clownish despair brought Dazai fame and success, but it was short-lived: he killed himself alongside a mistress in 1948. Yet in recent years Dazai has surged in popularity. His heightened emotionalism has found a following on TikTok, a perfect home for such piercingly direct statements as “learning is another name for vanity. It is the effort of human beings not to be human beings.” Kawabata’s work might be modern, but it is of a restrained modernity; Dazai overflows, rushing on into our own time, obsessed with the illusion of connection, the theater of confession. Kawabata’s elusive, opaque fictions cannot compete in such an exhibitionist contest. 

    “I am one of the Japanese who was affected least and suffered least because of the war,” wrote Kawabata in 1948. He joined several patriotic writers’ associations, and was sent to Nagano prefecture to discuss literature with farmers. He wrote for newspapers in Manchukuo, and visited Mukden with other prominent Japanese writers. Unlike his old friend Yokomitsu Riichi he did not become a rabid anti-Westernist, and he did not join the Pen Brigades sent abroad to write propaganda. “I was never caught up in a surge of what is called divine possession,” he recalled, “to become a fanatical believer in or blind worshiper of Japan.” He served as an air raid warden in Kamakura; he spent the blackouts reading The Tale of Genji. After the defeat, he declared that he would live only to maintain the traditions of Japan. 

    The postwar period was probably the most productive of Kawabata’s entire life. In 1948 he became the fourth president of the Japanese PEN Club, and traveled to PEN congresses across the world to promote Japanese literature. He wrote frequently for newspapers, and serialized numerous novels simultaneously. In the ten years after 1945 he published a revised edition of Snow Country as well as The Master of Go, A Thousand Cranes, The Sound of the Mountain, and numerous “Palm of the Hand Stories.” These post-war works deploy a simplified, refined version of his pre-war modernism to address traditional Japanese arts in a thoroughly Westernized context. Other writers suffered under the U.S. Occupation’s Civil Censorship Detachment, which forbade, among other things, “Criticism of the Occupation Forces,” “Third World War Comments,” “Glorification of Feudal Ideals,” and “Overplaying Starvation.” But as under the military government of the 1930s and 1940s, Kawabata’s personal remove and his quiet, private subject matter largely evaded scrutiny. Even The Master of Go remains ambivalent in its symbolic depiction of Japan’s defeat. Unlike in The Setting Sun, an aristocratic tradition simply slips away, too refined to insist on its own defense. The world had changed, and so should literature. Akutagawa’s “pure” fiction must give way to something else. 

    “If there is to be a ‘renaissance of literature,’” Kawabata wrote in 1935, “it will have to take place in works that are at once of pure literature and aimed at a mass audience.” He applied this theory in earnest through the 1950s and 1960s, writing great quantities of “middlebrow literature” for the highest-paying major newspapers and magazines, long novels, such as Tokyo People, which remain untranslated to this day. For Kawabata’s English-language admirers, this trove can seem more like a hoard, waiting for excavation. The Rainbow, an intermittently effective recitation of his core preoccupations, recently translated by Haydn Trowell, is the latest exhumation. Originally serialized in 1950–1951 in one of Japan’s largest women’s magazines, it tells the story of Mizuhara, an architect with three daughters by three women. Momoko, the eldest, and Asako, the middle child, live with him in Tokyo in the immediate aftermath of the war. Asako wants to find their missing sister, but Momoko is indifferent, caught up in a vicious romance with a teenage boy. 

    In works such as The Master of Go, the Japanese defeat is addressed indirectly, through a meditation on other subjects. Kawabata visited Hiroshima as a representative of PEN, and though he said that he would one day write a novel on the subject, he never did. The Rainbow is as close as he came. Five years before the novel begins, Momoko was in love with Keita, a schoolboy and member of a kamikaze Special Attack unit. On their final night together, Keita made a mold of her breast, from which a teacup would be cast for his final drink of sake before death. Realizing they might never see one another again, she gave herself to him, and he took her virginity. She rejoiced in the feeling, like “a flash of lightning in the overcast sky of her long love; a radiant, scorching cause of joy.” His response was immediate. “‘Ah,’ he spat out softly, turning his back to her. ‘Ah. How dull.’” He found her pathetic, violated, and he died in Okinawa without seeing her again. This is one of the few explicit references to imperial war-making in Kawabata’s work, in part because he remained at as much of a distance as an ambivalently pro-Imperial writer could. 

    From 1942 to 1944, Kawabata commemorated the outbreak of the Pacific War in the Tokyo Shimbun newspaper, publishing articles on the writings of soldiers killed in action. “I have always grieved for the Japanese with my private grief,” he wrote in 1948, “that is all.” Wartime literature portrayed Japanese soldiers as brave recruits, solid men spreading enlightened Japanese culture across the Pacific. Yet Keita is an unsentimental depiction of a Japanese soldier, caught up in fear and self-loathing, a death in search of a purpose. Late in the novel, his father reflects: “The dead escape condemnation. But it’s fine to put the blame on them.” 

    Kawabata’s serial plots are never terribly strong; they are propelled by thematic resonances rather than narrative drama. But The Rainbow is a particularly overextended beast. The plot is driven by a string of coincidences, and despite being only a little over two hundred pages long, it is drawn far too thin. It is full of characters who do little but explicate their motivations, and at great length. Mizuhara is often present to deliver lectures on Japanese architecture, but does little else. Kawabata’s best work is defined by reserve, a nearly perverse unwillingness to state the obvious. Here, however, characters talk and talk, expressing everything, suggesting nothing. 

    Asako is a particularly failed creation, prone to sudden distressed exclamations, as if incapable of thinking even five seconds into the future. Her virginity is contrasted with Momoko’s bitter, wounded state, driven from a shattered love towards manipulative sex. This dichotomy recurs frequently in Kawabata’s work. As others have remarked, he prized beauty, “fresh” beauty, above all things, with virginity signifying the ultimate in beauty. Asako, Momoko, the sleeping beauties: all are ideal women who cannot be touched. In the autobiographical Letters to My Parents, Kawabata declares: “I always fall in love with women who are in between a child and an adult in age.” “I am all but moved to tears of gratitude that such a girl exists,” he writes, “but I could never love her.” His men keep their distance from such women, as if afraid of corrupting them.

    Yet for all his devotion to the pure, Kawabata’s virgins are often his worst characters, either fading into nothingness or remaining as symbolic foils to more complex women. Asako is not only weak, but passive; where Momoko pursues a series of destructive love affairs, her sister’s one attempt at romance lands her in the hospital, before she disappears from the novel entirely. Whatever his moralizing intentions — such as his claim that A Thousand Cranes was written to illustrate “the vulgarity into which the tea ceremony has fallen” — his work becomes electric once his characters have been in some way compromised. I am thinking of Shimamura’s apathetic philandering and Komako’s stubborn love, the diverse lusts of young Kikuji and Old Eguchi. Without such stains and blemishes, Asako and her father are lifeless. 

    When we pick up the failed works of a major artist, we glimpse the hard limits of the artist’s project, and with them the worldview behind it, and the result can be disorienting. The outlook that gave us such beautiful insights also gave us a host of inconsistencies and contradictions. We might want to dismiss a disappointing work as minor, insignificant, or we might use it to demolish the writer’s perfectly constructed canon from the inside. The flaws give the lie to the concept of brilliance; if the writer fails here, imagine where else the work might fall short. Yet it seems to me undeniable that such faults are essential components of an artist’s worldview. Without them, we see the reflection, not the landscape. Kawabata’s intuitions are not presented piecemeal; they arrive as a total worldview. A general allergy to plot created narratives driven largely by aesthetic associations. His profound emotional distance gave him a unique vantage on how passion ravages the mind. His apoliticism made him both the beneficiary and the critic of the imperial government and its militarist mentality. And he saw that the same impulses which seek to preserve purity wish even more to destroy it. This was not all conscious, yet it is expressed again and again across his oeuvre. In the work of a truly great writer, even the flaws cohere. 

    So it is in The Rainbow. In the final quarter, the focus narrows to Momoko, and Kawabata achieves passages of immense power. The eldest sister becomes pregnant, and receives an abortion which neither her father nor Keita’s seems willing to mention; even the novel describes it only as “the operation.” Too weak to return home, she stays in Kyoto with Keita’s father. Time begins to collapse, and the narration leaps from one reflection to another: her birth mother’s suicide, the death of her first lover, the arsenic pill which her adoptive mother swapped out for sugar. How much of life turns on such small actions, she wonders, and how much misery do we unknowingly perpetuate? Her furious heart is empty, and like a true Kawabata protagonist, she submits herself to the will of the world, unable to act upon her fury. Yet that rage is still there, like a taut string quivering at the heart of the novel, always ready to snap. 

    In the final pages, she is brought to meet her youngest sister, a geisha named Wakako, in a restaurant in the northwestern neighborhood of Arashiyama. As they are walking along the Oi river, she stops before a pool and sees a small tree reflected in the water: 

    It was a web of fine branches, drawn clearly over the water. What kind of tree was it? Above the embankment, its intricate, delicate lines were difficult to distinguish among the surrounding foliage, yet they stood out perfectly on the surface of the river. It was as though she was staring not at a reflection but at a tree growing inside the water. 

    The world is clearer in reflection, as life is more vivid in art. Momoko goes off to see her sister. In the final moments she slides open a shoji screen, to hear the river’s flow. 

    I have been trying to read Kawabata’s works in his own time, but now I must write about him in mine. 

    In 2017, I took my copy of Snow Country from Melbourne to Tokyo. I read it quickly, incompletely. I remember feeling at a loss, held apart from the characters, both thrilled and disoriented by the conclusion. So over the next few weeks I found copies of A Thousand Cranes and The Master of Go, as well as books by Mishima and Endo, and I devoured them as I went. I read them in Nikko, across the mountains from the Yuzawa snow country. I read them in Kyoto, in a coffee shop in Arashiyama. And I read them in Kamakura, where, on April 16, 1972, Kawabata went to an apartment in Hayama and drew a bath. He unhooked the gas line — on purpose? by accident? — and died. 

    When I think back on that time, I remember being shocked by the suddenness of Kawabata’s revelations: the roaring milky way, the broken tea bowl. I was young, I was in a foreign country, I was open to everything, like a house with all the windows flung wide. I took it all in, and reflected later. Several weeks later I came to Tsuwano, in the mountains west of Hiroshima. One night, I was eating a simple sushi meal by myself when a pair of men approached me. They were English teachers and were celebrating a colleague who was changing schools. Would I like to join them? 

    I spent that night with perhaps twelve other teachers, and after many drinks they asked what I was reading, and I told them. They didn’t think much of Endo, Mishima was too patriotic, Kawabata far too old-fashioned. A middle-aged man from Izumo described him as a “classic” whom few people actually read. He wrote his email address on a piece of paper and told me to write him. I folded the slip, put it in my pocket, and lost it on the way back to my room. 

    Their words stuck with me. I had spent so much time in Japan, had witnessed so much, yet what had I understood? Kawabata’s books had surprised me, sure, but had I read them properly? What had I gained from them, really? Perhaps an openness to surprise, and to shock. Near the end of my trip, I arrived at the Koya-san temple complex in the mountains south of Osaka. It had snowed heavily, and during my days there tree branches snapped and roofs rumbled. I was walking alone through the Kongobu-ji when I came upon a small side room, its paneled walls bright with gold leaf. All through my trip I had come across such artworks, and marveled at how the gilding set off the painted landscapes, abstract fields leafed across deeply detailed scenes. Yet I had not seen them. For as I stood there I saw, I really saw, that the screens formed a long landscape of mountains and waves, with a flock of cranes soaring across it all. I looked closer, and saw cloud patterns dimpling the edges of the gold leaf, and all at once I realized that the cranes had been scattered by the winds, separated in the field of clouds, calling to one another but lost, and lost forever, in this field of great beauty, and it was as if the light were blinding me, as if my body were collapsing, and a great wave of beauty and sadness flooded through me, a Kawabata feeling, a feeling I have not forgotten in all the years since. 

    “I Am an American Day”

    “In the huge gathering . . . there were, according to the official estimate . . . 1,250,000 persons. So far as available records indicated last night, this was the largest crowd that has ever assembled at a single point anywhere in the world.” This New York Times report from May 1942 refers not to a military parade in Nazi Germany but to a celebration in New York City. “The magnet that drew this astonishing turnout to Central Park, where it filled out not only the five acres of the Mall but the thirteen acres of the sheepfold, was the local observance of ‘I am an American Day,’ which, by proclamation of President Roosevelt, was marked yesterday in hundreds of other American communities, great and small.” 

    “I am an American Day” was a freshly instituted national holiday. It had started as a grassroots initiative before it was adopted by Congress and signed into law by President Roosevelt in 1940, and it was honored in small, local celebrations in cities and towns across the country, in schools and community centers, following special guidelines and utilizing educational materials that were put together by the Immigration and Naturalization Service under the Department of Justice. “Many desirable values result from such public ceremonies,” the government’s handbook from 1944 states. It continues, “The community ceremony lends dignity to the new citizenship status. Through the public ritual of oath or pledge, loyalty is cemented and the individual’s feelings are stirred by group honors paid to the flag.” “I Am an American Day” existed until 1952, when it was renamed “Citizenship Day,” moved to September, and merged with “Constitution Day.” But it was in its early days, the 1940s, that the occasion was freighted with public meaning and celebrated on a national scale. This was true also of the centerpiece of the celebration — large public naturalization ceremonies. In May 1942 in Central Park, forty-six thousand men and women became new American citizens and made this country their home. And the country welcomed them. 

    This little-known episode in the country’s long relationship to immigration is full of contradictions. Jewish and political refugees fleeing death and persecution in Europe unavailingly tried to secure visas to the United States. At Ellis Island, just a few miles away from Central Park, people awaited deportation back to the European horror. America did not adjust its closed-door policy and its quota system in the face of mass statelessness and murderous oppression. And yet the naturalization numbers increased sharply. Fewer immigrants came in, but more people than ever before became new citizens. Those new Americans, many of them former refugees, became messengers of a new patriotism. As Americans by choice, they embodied the ideal of citizenship and loyalty. 

    Naturalization procedures were restructured to publicly express these messages. Up until the beginning of the twentieth century, naturalization was not a standardized process in the United States. It was performed in five thousand federal, state, county, and municipal courts across the country. Every court determined its own procedure, requirements, fees, and naturalization papers. In 1906, however, the Basic Naturalization Act established the Bureau of Immigration and Naturalization under the Department of Labor and provided a “uniform rule for the naturalization of aliens throughout the United States.” This put in place a standard procedural framework that governed naturalization for most of the twentieth century. The authority to grant or to deny naturalization continued to be vested in the courts, but duplicates of every naturalization form had to be filed with the newly founded Bureau of Immigration and Naturalization in Washington, and standardized forms and fees were instituted. 

    In 1940, the Nationality Act transferred the Immigration and Naturalization Service from the Department of Labor to the Department of Justice. The act established requirement standards such as periods of residence, proof of good character, and special provisions for spouses of American citizens. It revised and detailed standardized guidelines for citizenship and its acquisition through birthright or naturalization, and it outlined the procedural framework for the process of naturalization. It also continued the United States’ Asian exclusion policy in terms of naturalization rights, clearly stating that the right to become a naturalized citizen extends only to white persons, persons of African descent, and to races indigenous to the Western Hemisphere. The only exception to these exclusions was Filipinos who served in the United States Army. 

    Naturalization consisted of two steps, colloquially referred to as filing “first papers” and “second papers.” Both were now made to take place in “open court.” And, for the first time, the ceremonial and performative elements of naturalization were coded into legislation. Interestingly, the Nationality Act of 1940 also included a series of recommendations regarding the “education” — a euphemism for indoctrination — of prospective citizens and the public on the meaning of American citizenship. Henceforth the presiding judge was required to deliver a “patriotic address to new citizens.” The larger celebratory occasions used their patriotic addresses to encourage enlistment and raise support for America’s participation in the war. An “I Am an American Day” address by Judge Learned Hand in 1944 touched many and was printed in newspapers in the following days under the title “The Spirit of Liberty”: 

    What do we mean when we say that first of all we seek liberty? I often wonder whether we do not rest our hopes too much upon constitutions, upon laws and upon courts. These are false hopes; believe me, these are false hopes. Liberty lies in the hearts of men and women; when it dies there, no constitution, no law, no court can even do much to help it. . . . in the spirit of that America for which our young men are at this moment fighting and dying; in that spirit of liberty and of America I ask you to rise and with me pledge our faith in the glorious destiny of our beloved country. 

    Between 1795 and 1952, a “Declaration of Intention” was the first step in attaining American citizenship. The language in both the 1906 and 1940 acts does not make a clear distinction between filing the declaration in terms of papers and making the declaration orally in front of a judge or a clerk. The material part of the declaration was a signed form complete with a pledge under oath to “. . . renounce absolutely and forever all allegiance and fidelity to any foreign prince, potentate, state, or sovereignty of whom or which at the time of admission to citizenship I may be subject or citizen.” The text also included a clause on the commitment to organized government and a sworn oath to permanently reside in the United States. 

    The final stage of the ceremony, the oral oath of allegiance — the act of enunciating a commitment of loyalty in the company of witnesses and thus rendering that commitment binding — is a bit reminiscent of a marriage ceremony. Here, too, an individual binds themselves to another (or another entity) through the act of speech. Ever since the inception of naturalization policy in the newly founded United States of America, the transformation from non-American to American citizen required public language — the recitation of an oath. “I Am an American Day” celebrations from the early 1940s are, by far, the largest and grandest naturalization rituals in American history. 

    The largest of them was held in Central Park in 1944. In those festivities one and a half million people surged into the city to bear witness and promise loyalty. As in previous years, Mayor Fiorello La Guardia presided, and political speeches and musical performances accompanied the ceremony. Soldiers were in attendance, including veterans of World War I and even of the Civil War. Two years earlier, General Charles de Gaulle, then leading his occupied country from exile, was a surprise speaker at the event; he was invited by Mayor La Guardia to address the crowd over radio from London. “General De Gaulle told the gathering that millions of Frenchmen were placing their hopes of freedom in the efforts of the United States,” the Times reported. It must have been quite a moment. 

    Enthusiasm for “I Am an American Day” waned after the war. By 1946, attendance at the Central Park festivities dropped to one hundred and fifty thousand, and the merriment hit a low point as well. The new Americans in the crowd listened while Mayor William O’Dwyer, who had been a distinguished soldier during the war, warned them not to import any “dangerous, foreign ideologies to the U.S.” The robust patriotism of the war years was being drained by new Cold War anxieties, and immigrants bore the brunt of the new dread. The pomp and circumstance was replaced by foreboding. Was it inevitable that American pride would shrivel into fear as more and more newcomers found shelter within our borders? Is heterogeneity an inspiration or a threat? We are still asking ourselves that question. 

    Sewn Close to Pascal’s Heart

    “Man is but a reed, the weakest in nature, but he is a thinking reed.” The line appears halfway through Pascal’s philosophical work, Pensées (“Thoughts”), quiet as a whisper, final as a verdict. In around a dozen words, he captures both our fragility and our strange dignity. This is Pascal’s gift: the ability to distill what is vast into a sentence, and make the infinite startlingly present. To read him is to encounter a mind that recognized truth had to be lived, suffered, loved. His words carry the heat of a soul exposed to something greater than itself, whose whole being seems to burn through the page. 

    I discovered Pascal in my early twenties, like many do, through his profound and evocative collection of aphorisms. I had no religious education to speak of, and certainly no theology. Still, there was something in those complete fragments that reached past my youthful skepticism, addressing my inchoate longing. His voice seemed to emerge from the edge of two worlds: the measurable and the mysterious. 

    Pascal was born into the age of Richelieu and Louis XIV, when France trembled between the old certainties of faith and the new promises of reason. The Fronde civil wars of his youth taught him that human institutions, however grand, could crumble overnight. Perhaps this instability shaped his urgent spiritual seeking. When the ground shifts beneath your feet, you learn to look up. 

    He was a mathematical prodigy, reformulating geometry at twelve and inventing the mechanical calculator at nineteen to ease his father’s tax duties. His work on probability theory laid foundations that would stand for centuries. But to remember only his genius is to forget his gravity. He also knew what it meant to live close to death. From childhood onward, he suffered from chronic stomach ailments and nervous disorders that grew worse with age. He lived only thirty-nine years, much of it in bodily misery. Yet from this wounded life came spiritual clarity. In one of his memorable petitions, he asks God to “teach us the proper use of sickness.” 

    His mathematical precision never abandoned him, even in matters of the soul. Consider how he approaches the question of God’s existence. Where others built elaborate proofs, Pascal offered what became known as his “wager.” “Let us weigh the gain and the loss in wagering that God is,” he writes. “If you gain, you gain all; if you lose, you lose nothing.” Critics have dismissed this as cold opportunistic calculation, but they misunderstand. His wager was never meant to be a trick. He was trying to shake his reader out of indifference. If God exists, then eternity is at stake. If not, then nothing you love will last anyway. Rather than a syllogism, it was a cry from a man on the edge, pleading with others to look up before it was too late. 

    On the night of November 23, 1654, that cry was answered. Pascal experienced what he could only describe as fire, a torrent of divine presence that lasted two hours and changed everything. Here was a man who had spent his life measuring, calculating, proving, suddenly confronted with something that rendered his sophisticated vocabulary utterly inadequate. He wrote it down immediately in a text now known as the “Mémorial.” It begins with a single word repeated: “Fire.” 

    The word stands naked on the page, stripped of the elaborate reasoning that had defined his intellectual life. “God of Abraham, God of Isaac, God of Jacob, not of the philosophers and scholars.” That last phrase reveals everything. His conversion was not intellectual; it was volcanic. The God he encountered was not the prime mover of Aristotle or even the necessary being of Aquinas, but the living God who spoke to prophets and burned in bushes. 

    What follows is perhaps even more telling. He sewed the document into the lining of his coat, near his heart, and carried it with him until his death. Think of it: this master of public discourse, this defender of doctrine, reduced to the wordless intimacy of a hidden document pressed against his chest. The gesture seems almost superstitious, deeply personal, at odds with his rationalist reputation. How does one return to mathematics after touching eternity? How does one debate theology after encountering the God who exists beyond all categories? Pascal lived the rest of his life in this tension, caught between the measurable reality he had mastered and the immeasurable mystery that had mastered him. The “Mémorial” was discovered by accident, stitched inside the fabric, long after the body had cooled: a final secret, a private fire that had never been extinguished. 

    Soon after his conversion, Pascal withdrew increasingly from Parisian society and became closely associated with the Jansenists of Port-Royal. These rigorously Augustinian Catholics believed in the absolute sovereignty of divine grace and the profound corruption of fallen human nature. Their theology suited Pascal’s temperament perfectly. Where mainstream Catholicism often spoke of cooperation between human will and divine grace, the Jansenists insisted that salvation was God’s work alone. Man could neither earn it nor resist it. 

    Pascal lived among them for extended periods, embracing their discipline of prayer and study. When the Jesuits attacked Port-Royal’s theology, Pascal defended his friends in the Provincial Letters (1656–1657), a masterpiece of polemical literature that combined theological precision with devastating wit. He wrote under the pseudonym Louis de Montalte, crafting letters supposedly sent from Paris to a friend in the provinces. The device allowed him to appear as an innocent observer gradually discovering Jesuit moral casuistry. “I had thought that I was merely ignorant,” one letter concludes. “But I find I have been deceived.” 

    Yet even among the Jansenists, Pascal remained inward, solitary. He had tasted something no system could fully contain. The God of his “Mémorial” demanded more than correct doctrine. He demanded everything. Even Pascal’s compassion bore the marks of his rigor: “We must have pity for one another,” he writes, “but we must feel for some a pity born of tenderness and for others, a pity born of disgust.” It is a hard saying, but not a cruel one, blending a sense of mercy with moral exactitude. 

    It is in the Pensées, unfinished and fragmentary, that his genius remains most alive. He died before they could be completed, leaving behind bundles of notes written on scraps of paper, organized by theme but never synthesized into a single argument. What survives is not a book but a soul in pieces. There are no conclusions, only openings. “Man is neither angel nor beast,” he writes. “And the misfortune is that he who would act the angel acts the beast.” The line lasts because it is true. We recognize ourselves, caught between pride and appetite, and feel in his words both judgment and mercy. 

    What makes Pascal enduring is not only what he believed, but how he spoke. Where Descartes built systems and Spinoza constructed geometries of the emotions, Pascal worked in lightning strikes of insight. “The heart has its reasons which reason knows nothing of.” “We run carelessly to the precipice, after we have put something before us to prevent us seeing it.” “The eternal silence of these infinite spaces frightens me.” Each sentence carries the weight of a meditation, the clarity of a mathematical proof, the urgency of a man who knew the soul could be lost. 

    Perhaps Pascal is a master of brevity because he lived with the pressure of death and the presence of the divine. When time is short and eternity is long, every word matters. His fragments read like prayers in disguise, subtle invitations to kneel, arguments that double as psalms. Even his opponents recognized the force of his prose. Voltaire, no friend to Christianity, remarked that “one must agree that Pascal was a man of extraordinary eloquence. His style is like his thought — original, profound, often sublime.” Likewise the combative atheist Nietzsche, who admitted: “I have a predilection for Pascal.” 

    Maybe what draws us to Pascal across the centuries is his refusal to choose between reason and faith, between the life of the mind and the life of the spirit. He shows us that a thinking person need not abandon thought to believe, nor abandon belief to think clearly. His God was encountered not despite his mathematics but through it, not in opposition to his learning but in its depths. 

    Even his silences speak. In the fragments, there are notes to himself, lists of themes, broken beginnings. They remind us that the greatest truths are not always delivered whole. Sometimes they come as fire in the night, stitched close to the heart, hidden until we too are ready to see. Sometimes they arrive as a whisper. He shows us what it looks like to think with one’s whole being. His legacy is not a system but an example: the spectacle of a brilliant mind undone by love, remade by grace, and given back to the world as a gift. 

    In Tehran Under Fire

    It was still early in the war. After four days of internet blackout — and, God knows, testing countless VPNs — I finally reached a stable enough connection to check my email. The last one was from a prestigious university in New York, where I was scheduled to begin my Ph.D. in the upcoming academic year. They had asked why I hadn’t confirmed my admission yet, warning that if I delayed any further, I might lose my spot. I responded politely that these nights I can’t sleep from the sounds of Israeli rockets landing right and left across my city, and that during the day, I’m constantly trying to make sure the many people I know across town are still alive. News that someone two oceans away was thinking about my fall plans felt like a comforting distraction. But truthfully — even if I had not been under rocket fire — the new travel restrictions against Iranian citizens would have made it impossible to attend that program anyway. I typed up a version of these sentiments, hit send, and then I stared at my phone screen, watching the VPN wheel spin, waiting for the email to leave “draft” status and finally be sent. The wait, of course, wasn’t short. As with all attempts at action during the days Tehran lost to war, it dragged on. 

    I’m not the only one who, in those black days, stopped thinking about the future or set aside carefully made plans. After the first wave of Israeli attacks on Iran — which mainly targeted western parts of the country and densely populated cities such as Tehran, Isfahan, and Tabriz — many others, too, collapsed under the weight of seeing mangled bodies or even just images of bombed-out homes on TV screens and social media. 

    Naturally, fears of the future felt very real. They were triaged on a sort of macabre sliding scale. Immediate fears: where will the next missile land? whom will it kill, and whom will it leave in mourning? Near-future fears: what happens if food and medicine shortages hit? And more distant ones: when will this war end, and under what circumstances? And if it ends, will ordinary people — social, cultural, political actors — have any right to live? And if so, who will protect them? 

    For years, Iranians have feared becoming the next Syria or Afghanistan. In those days, more than ever, that fear breathed down our necks. Still, what most occupied Iranians — day and night — wasn’t what might happen in ten days or ten years. We were most occupied with counting the dead while dreading the next strike. 

    And it was not only the fear of death. It was also the fear of losing the last means of connection, solace, and sanity — the internet, which had become a lifeline during the Covid-19 pandemic’s isolation. In the days we lost to war, it became common to begin sentences with If I die during internet blackout . . . Audio recordings, snapshots of handwritten wills, Telegram notes — these were shared openly online, often with heartbreaking directness. 

    A father told his child to seek out help with funeral arrangements. A mother, in a voice message to her daughter living abroad, said, “If we die, make sure you request reparations — maybe it’ll make your life easier.” Another young woman, addressing the unborn child she was due to deliver in four weeks, wrote: “If I die, I want you to know I never thought I’d be giving birth to you in a war. Had I known this would happen, I might have chosen not to bring you into the world.” 

    Many just didn’t want to be forgotten. For younger left-leaning Iranians, this sentiment was particularly inflected with memory of the mass of faceless fatalities in Gaza, so many of whom were killed by Israeli airstrikes and then forgotten. In their digital pleas to be remembered, members of the younger generation invoked the memory of Fatima Hassouna, a Palestinian photojournalist and artist from Gaza, born in 1999 and killed on April 16, 2025, in an Israeli airstrike that also took the lives of ten of her family members. She achieved international recognition for her powerful documentation of civilian life during the Gaza war, especially after foreign journalists were barred from entering the region. Young Iranians’ online wills began with “Don’t let us become numbers,” referring to how the names and stories of those killed in Gaza have often been lost. Except for a few, most are remembered simply as part of a rising death toll — hundreds more each day — rather than lives, dreams, and human histories. This was the fate that these young Iranians feared. 

    They were right to dread a similar oblivion. In the first ten days of the war that began with Israel’s attack on June 13, at least 400 people were killed. Despite relentless efforts by journalists and citizens, fewer than a hundred of them have been named, and about half as many photos have been made public. The fear of dying and becoming a statistic is painfully real. But there’s another, quieter fear: of leaving lives unfinished. 

    My mother, a woman in her early fifties, who left Tehran at my insistence during the bomb-filled days, told me as she departed, “I hope if someone must die, it’s us old folks. You’re still young. You haven’t lived. You have so much left to do.” I joked: “Out of a population of ninety-two million, around 935 people have been killed in under two weeks. The daily odds of dying are one in several million. You’re more likely to die from cancer than from a missile.” She didn’t find it funny. And she was right. There’s not much to laugh about in war. 

    A few weeks before the sirens started, Nazli, a close friend of mine, had told me she’d been diagnosed with a rare, untreatable cancer. Her doctor had bluntly told her that she likely had two years to live. Even for someone who has spent much of her adult life studying death, the news was hard to hear. I didn’t want to respond with hollow comfort. I couldn’t decide what to say. After the shock passed, my mind kept circling the same question: if I were in her position, what would I worry about most? An unlived life? An unfulfilled love? A long, raw intimacy never granted? My family? The pets and plants I care for? Probably all of that and more. 

    Sometime after she shared the news, I emailed her: “Sudden death robs us of one thing — time. Time to fully form ideas and complete projects. If there’s anything that needs time — unfinished research, a puzzle not yet solved, something that needs building — I’ll do it. I’ll give you my time, now or after you’re gone.” 

    We always think that we have plenty of time until we are suddenly — it always feels sudden — disabused of that illusion. During the first days of the war, with forced closures and the internet down, even with long days ahead of me, I had no time. The hours couldn’t be filled with reading. Even writing the shortest lines — even describing the weather — became impossible. After a few days of staring at my laptop and writing nothing, I gave in and wrote a letter to my publisher. It included final drafts of some texts with necessary edits, as well as a portion of a book I had been writing about the protests in Iran in 2022, with a note stating: “If I don’t finish it, the introduction explains the concept — I hope someone else can complete it later.” I felt lighter. 

    But there is a category of creation that cannot be delegated. And I was in the middle of such a project: a half-finished novel. I’ve been working on it for over five years. When I look at its pages, the weight of its incompleteness sits on my shoulders. The plot is straightforward, but a novel can’t be handed off to someone else. A not-so-famous Iranian author, who is also a renowned writing teacher, says, “Every writer’s duty is to write one great love story and then die.” And this novel, full of ellipses and question marks, is mine. My one great love story. Still unfinished. 

    On the Sunday that fighter jets and missiles bombed Tehran more than twenty times, my resistance to writing a will on social media finally broke. I typed: “If I get killed, some of my translations will be published eventually; the unfinished book on 2022 will remain, and a half-written novel I haven’t been able to add a single line to in days will remain unfinished.” 

    To friends who asked how I was doing after the Sunday attacks, I repeated the same silly calculation about cancer vs. war. I even laughed. But later I realized what I’d missed in my conversation with my sick friend — what is missing from the will-writing culture — is this: the deepest fear is not the absence of a future. That is inevitable, war or no war. What haunts us — my friend back then and I now — is the absence of the present. The days that could have been spent writing, creating, and loving are now swallowed up by endless explosions. 

    War doesn’t steal our future — it steals our present. Not our becoming, but our being. 

    The Revolutionary Synagogue: Notes of a Grateful American 

    Pedro Álvares Cabral was the first human being in recorded history to have been on four continents. He set foot on each of them — Europe, Africa, America, and Asia — in a single year, 1500, which was the same year that he led the first extensive European exploration of the northeast coast of South America. He “discovered” what we know today as Brazil in April of that year, and wrote home to King Manuel I notifying him of Portugal’s brand new territory, theirs by virtue of the authority vested in him by the King and his interpretation of the divine will. Cabral sailed on from Brazil, but he left behind the seeds of what would become the first robust Jewish community on the American continent. Among Cabral’s crew was a man who went by the name João Faras. Faras was an astronomer, astrologer, physician, translator, and — most importantly for our purposes — a member of the community of Portuguese Jews who had been forcibly converted to Christianity by the King of Portugal just a few years after the Spanish expulsion.

    Cabral’s men had reached what would become Brazil on April 21, but Faras remained on a boat offshore for six days — he had developed an irritation which made it impossible for him to walk. On the twenty-eighth of that month, finally upright and capable of studying the stars to determine location, Faras and two assistants set up a wooden astrolabe on the beach and attempted to establish the altitude of the midday sun. After some days of study, he drafted a letter to King Manuel I, which included a sketch of the stars of the southern sky. He explained with apologies to the king that, due to his lame leg, he could not identify the precise height of the stars, but he did identify a new constellation, which we now know as the Southern Cross. 

    The king to whom that letter was addressed was the very same monarch who, on December 5, 1496, demanded that all his Jewish subjects leave the country. The following year this edict was rescinded; instead, Jews were prohibited from fleeing the country and forced to convert to Catholicism. These decrees were issued just four years after the Jews of Spain had been forced from their homeland due to King Ferdinand’s and Queen Isabella’s genuinely world-altering antisemitism — in fact, Manuel issued the decrees in order to satisfy the Spanish monarchs, whose daughter he was hoping to marry. There were few options for relocation for Spanish Jews in 1492 — England and France had already instituted country-wide bans (in the aftermath of expulsions) against Jews in 1290 and 1306 respectively. Many countries that did not ban Jews wholesale prohibited Jews from owning land and required Christian oaths for vassals under the feudal system. The easiest place for them to go was Portugal, and many of the Jews who were tormented by Manuel’s oppression were originally from the Spanish Jewish community that had just been violently dispersed. Scholars judge from João Faras’ weak Portuguese and preference for Spanish that he was likely among these Jewish Spaniards. In the intervening years, Faras had become a “converso,” or hidden Jew — living publicly as a Christian but privately as a Jew — and so he and his descendants must have remained for as long as the Portuguese maintained control of Brazil. 

    Jewish responses to the Inquisition differed. Some, like Faras, chose to stay in Spain or Portugal, convert to Catholicism, and brave the anti-Semitism which stalked them even after baptism. Rumors swirled that the “new Christians” practiced Judaism in secret, adulterating Catholic purity with atavistic practices. Many chose to leave the Iberian peninsula altogether, and some part of that group travelled north to the newly independent Dutch provinces, which permitted Jewish immigration and Jewish practice. In this period Jewish fate was overwhelmingly determined by the governing power’s caprice, which often swung between prejudice and avarice: prejudice because anti-Semitism is a weed that flourishes under every sun, and avarice because the Jews repeatedly proved themselves lucrative residents, and in the host countries money-lust rivaled xenophobia in ubiquity and in power. There was not a single state which granted Jews rights because it understood that Jews were owed rights. The best Jews could hope for were privileges granted by opportunistic and self-regarding leaders. Privileges could only be acquired and maintained for as long as the governing authority could be persuaded that the deal was a good one. (Authoritarian dealmakers can be like that.) 

    The Dutch accepted the Jews because the Jews promised wealth, and they made good on that promise. The Muslim rulers of Spain had permitted Jews to participate in trade and business, and for the Jews it was a more or less benevolent period. But when the Christians came and the Jews were eventually forced to flee, they carried their business acumen on the road with them and, more importantly, they brought connections to the many Jewish businessmen who had been dispersed by the Inquisition and were now scrounging for residence in port cities around the world. The Jews in exile constituted a kind of international business network owing to their relations with each other. By 1600 most of the discriminatory laws that were enforced in other European countries were either not on the books in Amsterdam or ignored there. When, in the early seventeenth century, the Dutch West India Company dispatched ships to conquer territory across the ocean, some of these prospering Jews went with them. When they got to South America, they met the community that João Faras had helped found. 

    The descendants of João Faras and his community lived as crypto-Jews for generations. In the years after they first arrived in the New World, the conversos flourished financially, so much so that Adam Smith observed in The Wealth of Nations that 

    The Portuguese Jews, persecuted by the inquisition, stript of their fortunes, and banished to Brazil, introduced, by their example, some sort of order and industry among the transported felons and strumpets by whom that colony was originally peopled, and taught them the culture of the sugar-cane. Upon all these different occasions, it was not the wisdom and policy, but the disorder and injustice of the European governments, which peopled and cultivated America. 

    Note that Smith referred to them as “Portuguese Jews” — it seems those Jews who converted learned what so many have had occasion to discover: antisemites don’t want Jews to be Jews, but they don’t want them to be anything else either. Wealth did not gain the Jews toleration, and these Jews knew nothing like the religious freedom that their brothers and sisters enjoyed in Amsterdam — the conversos in Brazil would continue to live in “hiding” for one hundred and thirty years. But apparently hiding was not enough. Between 1593 and 1595 an Inquisitional Commission was established in Olinda, near the port of Recife in Brazil, where conversos were arrested and tried. When the court was dismantled, Jew-monitoring was taken over by the local bishop. 

    All this changed in 1630, when the Dutch West India Company wrested control of Brazil from the Spanish. They had come to South America with a specific interest in the cultivation of sugar cane, a trade dominated by the conversos, as Adam Smith noted. The Dutch brought with them members of the powerful Jewish community in Amsterdam, and also the freedom and toleration which was the law of the land back home. 

    Thus began the first openly practicing Jewish community in the Americas. This is where we started. It was, for a time, the freest Jewish community in the world. Less than ten years after the Dutch arrived, Brazilian Jewry built the first American synagogue, Kahal Kadosh Zur Yisrael, in Recife, which was responsible for maintaining a school and a cemetery for the community. For twenty years it seemed that true freedom was possible as a way of life. But it was too good to last — in 1649 the Portuguese instigated a war to win control of northern Brazil, which they managed to do within six years. After victory they instructed the conquered that the Jews, like the Dutch, had three months to pack their things and go. 

    Most of the Jews returned to Amsterdam, some went to Curaçao, Surinam, and Barbados, and twenty-three boarded a ship called the Saint Catherine bound for New Amsterdam, a Dutch colony in North America known today as New York City. Their welcome was not warm. Peter Stuyvesant, the director general of New Netherland, began a campaign against the Jews which would last for the rest of his time in office. Stuyvesant met the twenty-three Jews at the dock and tried to prevent their disembarking. He wrote home to his superiors in Amsterdam hectoring that the Jews were “deceitful,” “very repugnant,” and “hateful enemies and blasphemers of the name of Christ,” who ought to be made to depart lest they “infect and trouble this new colony.” He also warned that granting the Jews liberty would force the Dutch Reform community to do the same for “Lutherans and Papists.” (Intolerance is paradoxically inclusive.) Perhaps Stuyvesant had in mind the revolutionary liberal colony recently founded in Rhode Island by the radical Roger Williams, who had written a charter for the city which promised that no one would be “in any wise molested, punished, disquieted, or called in question for a difference in opinion in matters of religion.” This goodwill in Rhode Island did indeed extend to Lutherans and Catholics and it would eventually include Jews. Some of the Jews living under Dutch protection in Barbados, Surinam, and Curaçao got word of the freedoms offered in a faraway place called Providence and ventured there as early as 1658. 

    The Dutch West India Company (which included Jews among its founders and its directors) was not especially interested in protecting its Jewish subjects, but neither was it swayed by Stuyvesant’s warnings of a pollutant Jewish mass. His superiors had more pressing concerns involving their pocketbooks: a group of Spanish-Portuguese Jewish merchants in Amsterdam had sent the company a letter outlining several reasons why the Jews should be granted entry in New Amsterdam, and prime among them was that “many of the Jewish nation are principal shareholders” of the company itself. The leadership was convinced, and instructed Stuyvesant accordingly. Yet the conditions they demanded were a far cry from the freedom these Jews had grown accustomed to in Recife: they were permitted to travel, trade, live, and remain in New Netherland only so long as “the poor among them shall not become a burden to the company or to the community, but be supported by their own nation.” And they were prohibited from practicing their religion in public. These Jews had plenty of experience with precisely this form of religious practice: clandestine and implicitly shameful. 

    The first documents that spurred on the Enlightenment, which rattled Western culture and altered the course of modern history, were written in the early seventeenth century, roughly concurrent with settlement in the New World. The proto-liberalism that they espoused was in part the breakthrough accomplishment of the legacy of Amsterdam, particularly the revolutionary thinking of one of Amsterdam’s Jewish sons, Baruch Spinoza. These were the ideas that eventually culminated, almost two centuries later, in this thunderous sentence: “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.” A rather spectacular historical irony was about to play out: whereas the Enlightenment was kindled an ocean away, it came to fruition nowhere with greater force than in the country that would blossom out of Colonial America, the country whose promise was sealed in the Declaration of Independence. 

    A miracle in Jewish history, in sum, was about to take place. The name of the miracle was the United States of America. The attainment of American independence from Great Britain marked the creation of the only polity in human history in which Jews, along with all other human beings on the planet, were considered the just recipients of rights which were owed them purely by virtue of their humanity. These rights were not a charter or a privilege; they were axiomatic. 

    But there was a problem. By the time of the first Jewish settlement in Colonial America, the ideas that would coagulate into the rich soil from which the possibility of political liberalism would eventually spring were only just beginning to form and were certainly in their infancy in the New World. Anti-Semitism was an early import with which Jeffersonian ideals would have to contend. Peter Stuyvesant was early in his legislation of antisemitic laws — Jews in New Amsterdam were prohibited from purchasing land, serving in the guard (they were forced to pay a tax in lieu of guard duty), voting, holding public office, engaging in retail operations, and trading with the Native Americans — but New Amsterdam was hardly an outlier in terms of codifying antisemitism in the New World. The Catholic community of Maryland instituted a ban on Jewish settlement years before any Jews ever moved there. After Roger Williams’ death, Rhode Island became an inhospitable home to its Jewish residents and the Jewish community that had flourished there shriveled. Even in Pennsylvania, which had been founded by the Quaker William Penn as a “holy experiment” of religious tolerance, Jews had no citizenship rights before the revolution — Penn’s founding charter required all voters and public office holders to profess faith in Jesus Christ. And to some of the Enlightenment thinkers who laid the groundwork for the earth-shaking intellectual revolution consecrated in the American Declaration of Independence, Jews were not obviously worthy of any rights or privileges at all. Thomas Paine — the Voltaire of America in this respect, too — violently opposed both Christianity and Judaism, but believed that Christianity could be unlearned whereas Judaism was biological, and that Jews had to be cured of their Judaism or they would endanger enlightened society. 

    In this social pre-Enlightenment and pre-American context, the few Jews of North America who arrived before America’s founding followed the pattern of settlement that Jews had repeated in most of their new host countries — they settled primarily in major port cities and worked within the established international, and then national, network of Jewish merchants. Newport, Philadelphia, Charleston, and Savannah became centers of Jewish life, and insofar as Jews enjoyed the hospitality of their Christian hosts, “tolerated guest” was the highest status to which they could aspire. And here again they relied on their financial success for their security. 

    The New World did not extend rights to its Jews until long after Jewish settlement first began. This, despite the fact that the idea of “rights” as such, the idea that a person could be owed anything, as a matter of right, by the state and its governing officials, was percolating in the American intellectual welter. The stirrings of egalitarian idealism were beginning to be felt. Those ideas did not come to the fore until the American Revolution. And the American Revolution was also a revolution in Jewish history. 

    By the time the Revolutionary War began, more than a hundred years after a Jewish community settled in New Amsterdam, the Jewish population in North America had grown to roughly two and a half thousand people. That Jewish community, which lived through the years of war, was the first Jewry to see a country founded with the promise of a place in it for all people, and so for them, too. (I hasten to add that the new republic was, as a matter of principle and practice, kinder to them than to its black population.) Descendants of that early New Amsterdam community, the congregation Shearith Israel, were still living in New York when British fleets approached New York Bay in 1776. Half of them, led by the cantor Gershom Mendes Seixas, fled to Connecticut; Seixas managed to snatch one of the congregation’s two Torah scrolls, lovingly referred to as “the revolutionary Torah,” and took it with him to his father-in-law’s home in Stratford, Connecticut. A few years later he and his flock moved to Philadelphia at the invitation of a congregation called Mikveh Israel, which would become Philadelphia’s first permanent Jewish congregation. 

    When the patriot Jewish refugees of New York, led by Seixas, arrived in Philadelphia, the Jewish population of the city had about tripled in size. Israelite patriots from Richmond, Charleston, Savannah, Lancaster, and Easton had already come to the city fleeing the British. These Jews had come, while other residents stayed behind, because they believed in the Revolutionary project, in the philosophical premise of freedom and democratization. As two Philadelphian Jews put it in 1782, American Jews had “fled here from different parts for refuge,” and on March 24 of that year they reconstituted the fledgling Mikveh Israel congregation as a “revolutionary synagogue,” which is how the congregation describes itself to this day. The move to the City of Brotherly Love was symptomatic of deeply held commitments to the possibility of a country founded on the ideals of the Enlightenment, a possibility that was at that point in the war far from certain to come to fruition. 

    Jewish American patriots understood from the first that the Enlightenment had implications for how modern Jews ought to relate to their own religion. The first recorded meeting of Mikveh Israel mimicked Jefferson in declaring independence for American Jewry. These Jews claimed that the Philadelphian Jewish community that they were replacing had enjoyed “no right or legal power,” since it was founded only according to the custom of ordinary and familiar congregations and not as a conscious group of free rights-bearing individuals. But now, the document declared, the group must “bind ourselves one to the other that we will assist to form a constitution and rules for the good government of the congregation” in order to “promote our holy religion and establish a proper congregation in this city.” This declaration formally united two traditions for the first time: Judaism and liberalism. The alchemy of that precious admixture altered the course of human history. (It became the civil religion of American Jewry for a long time, and like all civil religions it came to be taken for granted and in need of refreshment. In our time it may be gravely in crisis.) 

    The affecting thing about the Jewish kindling to the new American dispensation is that Jews were championing their claim to rights about which they did not yet feel completely secure. In the matter of democracy they were rather like Nahshon, the Israelite at the Red Sea whose faith was so great that he dove into its waters before Moses parted them. Before the war was over, the freedom that Jefferson described was very young and (as we would say) aspirational. The Jewish patriots were taking a great risk in throwing their lot in with the revolution: it was not at all clear in 1782 that freedom from Britain would come, or that if it did it would bring with it rights for all citizens regardless of religion. In fact, the same year that the Jews of Mikveh Israel drew up their constitution, they also bought a plot of land to build their first synagogue on a street called Sterling Alley, which happened also to house a church belonging to a German Reformed congregation. The church complained: they did not want their community polluted by Jewish proximity. The Jews had a choice: they could adjudicate the case in court, or they could placate the bigots by buying a new plot of land a safe distance away with money they hardly had to spare. They chose not to bet on the American courts and found another location. In April 1782 it was not at all clear that a Philadelphian court would honor a Jewish community’s right to erect a synagogue wherever it pleased. 

    The principles were sterling, but it was a struggle. Religious rights were not quite an existential certainty for American Jews, but they were prepared to fight for the American promise. The next year the patriot Jewish congregation in Philadelphia, with Seixas at its helm, petitioned the state of Pennsylvania to formally alter the religious oath required of all elected officials so that Jews would not be prohibited from running for public office; the Pennsylvania Council of Censors considered and then tabled the request, but news of the petition was favorably reported in three newspapers of the time — it was a salient social issue, and the Jews were on the liberalizing side of it. When the war ended, many states heeded Seixas’ plea and rescinded their test oaths of their own accord. As the Jews who had been holed up in Pennsylvania returned home after the war, the states that accepted them back all dropped their test oaths: Georgia, South Carolina, Pennsylvania, Delaware, and New York had done so by 1792. Virginia, which had not been a Jewish center before the war, got its first synagogue shortly after victory was declared; its test oath had been abolished in 1786 in the extraordinary Virginia Statute for Religious Freedom, drafted by Thomas Jefferson and husbanded into law by the like-minded James Madison. All this was not, to put it mildly, the familiar Jewish experience of the exile. 

    In the last decade and a half of the eighteenth century, the laws of state constitutions, and the freedoms owed to all citizens of the newly won country, were in far greater flux than we may realize. True, the Constitution of the United States, and the amendments to it, forbade the federal government to make laws establishing religion or limiting the freedom of religious expression, and it prohibited religious tests for federal offices — but much discretion was still left up to the states. Some states enforce blue laws — laws which prohibit work on Sundays, the Christian Sabbath — to this day, and blue laws were used in the early decades of the country to coerce working-class Jews to abandon Jewish practice. Blasphemy laws, always the spear point of illiberalism, were in place in many states into the nineteenth century as well. The federal Constitution of 1787 allowed all states to decide for themselves who could and could not vote — the ideas which would make up the country’s legal skeleton had not yet calcified into shape. In New Jersey, for example, the state legislature amended the state constitution in 1790 to formalize a right that had already been in practical operation. It added the phrase “he or she” to a clause regarding enfranchisement: “. . . no person shall be entitled to vote in any other township or precinct, than that in which he or she doth actually reside at the time of the election.” It thus made explicit a property-owning woman’s right to vote, which that state had considered implicitly secured through an ambiguity in its original constitution. And seven years later, the state removed a different phrase, regarding property ownership, after which the number of women voters increased dramatically, as did the number of voters amongst free people of color. And so it remained in the state until 1807. 

    The Jews were among the scrappy minorities eager to secure for themselves the rights which were not yet clearly stipulated and honored by all. The Jews of New York immediately took advantage of that state’s decision, in 1784, to automatically recognize all religious societies that applied for incorporation. The New York congregation Shearith Israel was incorporated that very year, thus securing the same legal benefits and status as churches in the state. The same legal benefits: almost a secular salvation. 

    The few thousand Jews who benefited from the birth of political liberalism in those early years were a tiny fraction of world Jewry. At that time the largest share of the Jewish population — roughly one million people — was concentrated in Eastern Europe and Russia. After the partition of Poland in the final decade of the eighteenth century, at about the time that America’s Jews were reaping the first fruits of American liberalism, Jews in Russia were restricted to a large region known as the Pale of Settlement, established in 1791, essentially an enormous ghettoized community in which Jewish life was governed by strict Russian laws. This geographical sequestration remained in force for over a hundred years (it was dissolved shortly after the abdication of Nicholas II during the Russian Revolution). In Russia there had been no Enlightenment, even if Voltaire and the “enlightened despot” Catherine the Great voluminously corresponded. 

    But in the late eighteenth century and early nineteenth century, the new liberal dispensation began to make itself known to Europe’s Jews. The phenomenon known as “emancipation” was born, with all its mighty imperfections, first in Austria with a proclamation by the emperor in 1782. In France, the French Revolution and then the rule of Napoleon allowed Jews to become full citizens of the state, annulled the laws requiring that Jews live in a ghetto, and enacted other liberalizing measures, which were extended to the countries that Napoleon conquered. (There were also outbursts of anti-Semitism among the revolutionaries, and Napoleon also enacted some reactionary measures against the Jews. Progress is never unalloyed, especially for Jews.) As is well known, the phenomenon known as “emancipation” also led to the phenomenon known as “assimilation,” with its many psychological agonies and social anxieties. Liberalization came to be rightly seen as a threat to tradition; the founder of Habad Hasidism, in Russia in 1812, wrote that “should Napoleon be victorious, wealth among the Jews will be abundant and the glory of the children of Israel will be exalted, but the hearts of Israel will be separated and distanced from their father in heaven. But if our master [Czar] Alexander will triumph, though poverty will be abundant and the glory of Israel will be humbled, the heart of Israel will be bound and joined with its father in heaven.” This same plot — the joy of liberalization and the fear of liberalization — would later play out in the years of the great immigration to America, when millions of Jews who had never experienced it before came to the shores of a liberal order. 

    And there were other liberalizing developments for the Jews — for example, the creation and distribution across borders of a Jewish press. This innovation in Jewish life created the possibility for there to exist such a thing as a global Jewish community. Jewish newspapers detailing the goings-on of local Jewish life were soon enhanced by a more ambitious Jewish journalism designed to disseminate news of Jewish communities in far-flung parts of the world. The first known Jewish newspaper produced by and for Jewish readers was the Gazeta de Amsterdam, which was printed in Spanish for the converso community of the Dutch capital. Between 1835 and 1840 eighteen Jewish newspapers were founded in five different countries. Over the following five years, fifty-three more were founded in thirteen different countries. The circulation of news and opinions strengthened the new liberal muscles. 

    Scholars argue that a single event catalyzed this steep rise: the Damascus Affair. In 1840, thirteen members of the Jewish community in Damascus were accused of kidnapping and murdering a Catholic priest and his servant and draining their blood to bake matza — the blood libel that had haunted Jews throughout the medieval period. The global Jewish community was shocked to discover that the old antisemitic trope still had currency in the modern period, and that in a city as cosmopolitan and significant as Damascus, political and religious leaders could permit — indeed, sanction — such a thing. The French consul in Damascus, Ulysse de Ratti-Menton, supported the libel and ordered an investigation in the Jewish quarter. Ratti-Menton swayed the governor of the city, Sharif Pasha, who also happened to be the son-in-law of Muhammad Ali, the viceroy of Ottoman Egypt who governed Ottoman Syria at that time. 

    The accused were imprisoned and violently tortured until they confessed to the crime. Some were forced to convert to Islam on penalty of death. Others were sentenced to death. Some of the seven prisoners who were forced to confess died during the interrogations. At the same time state authorities kidnapped sixty-three Jewish children and held them hostage until the entire Jewish community collectively confessed and also brought proof of the murder forward. Bones were unearthed somewhere in the Jewish quarter, on the basis of which Ratti-Menton and Sharif Pasha declared that this was “proof” of the ritual slaughter, and more Jews were arrested on charges of abetting the murderers. Christians and Muslims in the city united to unleash violence on their Jewish neighbors. A synagogue in a suburb of the city was pillaged and desecrated. 

    One of the arrested was a man named Yitzchak Picciotto. His brother, Eliyahu, happened to be the Austrian consul in Aleppo. It was thanks to him that world Jewry was made aware of the catastrophe. Dissolve to American Jewry. Then numbering about fifteen thousand, the Jews of the United States protested the persecution in six different cities. For the first time in American Jewish history, Jews demonstrated on behalf of their own interests and exercised influence in foreign policy — they pressured President Van Buren to protest, and the United States consul in Egypt did so at the President’s bidding. As Hasia Diner, the historian of American Judaism, put it, 

    For the Jews, the Damascus affair launched modern Jewish politics on an international scale, and for American Jews it represented their first effort at creating a distinctive political agenda. Just as the United States had used this affair to proclaim its presence on the global scale, so too did American Jews, in their newspapers and at mass meetings, announce to their coreligionists in France and England that they too ought to be thought of as players in global Jewish diplomacy. 

    At the same time, the most powerful Jews in the world — among them Moses Montefiore in England, the Rothschild family, and the lawyer Isaac-Jacob Adolphe Crémieux, who went on to serve as the Minister of Justice in France — questioned the integrity of the investigation. The pressure worked, and the Syrian authorities were forced to lift the death sentence for those who had not succumbed to torture. 

    The Jewish press came into being in part because the Damascus Affair made clear how uneven the quality of life for Jews was depending on where and under whose authority they lived. Suddenly it was possible for Jews in Cincinnati to know that a Jewish child had been secretly baptized, kidnapped, and forced to be raised as a Catholic in Rome — as happened in 1858 in the infamous Mortara Affair. And when pogroms started rolling through the Pale of Settlement at the end of the nineteenth century, Jews were aware that in a place across the ocean called America it was possible to assert one’s right to live wherever one pleased, to vote, to buy land, and to bring a violent antisemite to court with the confidence that the court would rule in a wronged Jew’s favor. And so it made sense that between 1880 and the start of World War I approximately two million Eastern European Jews came to America. By 1914, when the outbreak of World War I squeezed Lady Liberty’s arms shut, the American Jewish population had swelled to 2,349,754 — more than the entire Jewish population of Austria-Hungary — according to the American Jewish Yearbook. 

    The Declaration of Independence changed Jewish history and it changed Jewish identity. Suddenly Jews possessed a sense of possibility that was not delusional and not limited to messianic redemption. Democracy promised nothing less than to sever the old equation between exile and suffering; and while antisemitism has never been expunged from American life, and has recently become more prominent, it generally did not express itself violently. (That is why the Tree of Life massacre and the subsequent violent assaults on Jews in America were so terrifying.) 

    America is not the only modern revolution in Jewish history, not the only experiment in rejecting the unjust terms of Jewish existence in the violent diaspora. The same dissatisfaction with the oppressive reality that led to the emigration of millions of Jews, that awakened a new sense of historical agency in the exiled Jews, issued also in the other great experiment: the restoration of Jewish sovereignty in the land of Israel, the creation of a Jewish state. You might say that America and Israel, democracy and self-determination, are the two competitors for the Jewish future. Yet it is important to be clear that there are many differences between these two dispensations, and those differences are intensely significant for their respective citizenries. The United States is blind to chosenness, or Chosenness. No people who live inside it are esteemed or despised as essentially other in the eyes of the state. Neutrality about difference, and a skepticism about historically and biologically inherited privilege, are corollaries of the American system. America was a nation founded on the principle that all people are equal: there are no exceptions in this all-encompassing proposition. 

    But Israel — and in this sense Jews are no different from every other ethnically defined nation — was conceived as a Jewish state, whatever the definitions and the difficulties of such a conception. As in all other nation-states, such a definition is a secularized version of Chosenness. When the Jews lost their state the first and then the second time, they were forced to reckon with the plight of every minority in an alien state, in an exile, up until the founding of America: the people who belong, the majority, are held to be historically and sometimes even ontologically better than the people who do not. The state of Israel sought to modify the ugliness of this deep distinction by introducing democracy into the bargain, and making an explicit promise of complete equality to all; but still it lives within the old European framework of a majority and a minority, which, when the minority reaches a size that frightens the majority, has always led to violence. Even as Israel promised equality, it promised favoritism. Is a liberal ethnically-based state a contradiction in terms? Whatever the situation in theory, in practice the government that presides in Jerusalem today certainly thinks so. Netanyahu and his fevered cronies have drowned the possibility of real democracy in blood. Israel promised equal rights to all citizens. In this, it has failed. 

    So here I am, even in these days, to praise America, and what it has done for the Jews, and why. And to implore my fellow Jewish Americans to remember what we owe American liberalism, what it cost us in life and in pain to get here, and what it will cost us if the dream of American liberalism is allowed to wither. Before the United States, no country had ever accepted liberalism as the groundwork upon which to build a polis and to justify itself. And certainly no liberal society ever made the next essential breakthrough that America did, which was to add pluralism — the recognition of the blessings of difference and of the axiomatic rights of groups as well as individuals — to its public philosophy as a corollary of liberalism. 

    The magnitude of the struggle that we face in America now is owed precisely to the high moral bar that a liberal order sets for itself. The practice of political liberalism is always incomplete, always asymptotic, because there will always be citizens and non-citizens. The promise of universal equality will always be denied full fruition, and so, in some sense, liberal states are hypocritical by definition. (The moral revolution that is America long ago committed a genocide on its own ground.) Yet there are worse things than stubbornly aspiring to justice, especially when one prefers it to come from below rather than from above. Not doing so is far worse. Better to fail trying to institutionalize equality than to build a political framework on the premise that like should favor like. Tribalism is a constant seduction, but that does not make it a good. And tribalism in a heterogeneous society may quickly become an evil. 

    The commitment of American Jews to America was well founded. We need to remember this now, as everything is shaking, and as so many of our brothers and sisters link arms with the cruel reactionaries destroying the American Revolution and offer them “antisemite” as a useful spin-word for fascism. The Declaration of Independence is indeed based on, and necessitated by, self-evident truths. It would be a terrible delinquency to shut our eyes to its promise. 

    The Cheeseman

    A little lectionary:

    Every human being, no matter how slightly gifted he is, however subordinate his position in life may be, has a natural need to formulate a life-view, a conception of the meaning of life and its purpose. — Kierkegaard

    The world has become “infinite” for us all over again, inasmuch as we cannot reject the possibility that it may include infinite interpretations. — Nietzsche

    A man’s vision is the great thing about him. Who cares for Carlyle’s reasons, or Schopenhauer’s, or Spencer’s? A philosophy is the expression of a man’s intimate character, and all definitions of the universe are but the deliberatively adopted reactions of human characters upon it. — William James

    The anarchy of the philosophic systems is one of the most effective reasons for continually renewed skepticism. Historical consciousness of the limitless variety of philosophic systems contradicts the claim that each of them makes to universal validity, and this supports the skeptical spirit more powerfully than any systematic argument. — Dilthey

    Worldviews can engage in controversy, but only rigorous knowledge can decide, and its decision bears the stamp of eternity. — Husserl 

    The ever more exclusive rooting of the interpretation of the world in anthropology which has set in since the eighteenth century finds expression in the fact that man’s fundamental relation to beings as a whole is defined as a worldview. It is since then that this term has entered common usage. As soon as the world becomes picture the position of man is conceived as a worldview. Within this, man fights for the position in which he can be that being who gives to every being the measure and draws up the guidelines. — Heidegger

    From the father the child has a right to demand a view of life, that the father really has a view of life. — Kierkegaard

    If I want to have a worldview, then I must view the world. I must establish the facts. The smallest fact from the connection between the soul and hormonal balance gives me more perspectives than an idealistic system. But the facts are not finished, they are hardly even begun. A worldview that waits for facts believes in progress. — Musil

    The schoolboy believes his teacher and his schoolbooks. — Wittgenstein

    The most practical and important thing about a man is still his view of the universe. We think that for a landlady considering a lodger, it is important to know his income, but still more important to know his philosophy. We think the question is not whether the theory of the cosmos affects matters, but whether, in the long run, anything else affects them. — Chesterton

    Lavoisier makes experiments with substances in his laboratory and now he concludes that this and that takes place when there is burning. He does not say that it might happen otherwise another time. He has got hold of a definite world-picture — not of course one that he invented: he learned it as a child. I say world-picture and not hypothesis, because it is the matter-of-course foundation for his research and as such goes unmentioned. — Wittgenstein

    Principles, guidelines, models, and limitations are storehouses of energy. — Musil

    A person, to be a person, must have a worldview. — The Cheeseman 

    I was a young man when the cheeseman unexpectedly addressed those words to me. The setting was completely unphilosophical. The words from across the counter startled me: was it possible not to have a worldview? Certainly not where we were, where I was growing up, which was a thicket of convictions. I was raised in a neighborhood of verities, Brooklyn, New York 11235. The streets flowed with answers. I was always already in possession of a worldview and could not even picture the emptiness of the opposite condition. I never met a nihilist. The worldview that held me in its grip, sometimes too tightly, was not anything that I had chosen: I was born into it and was systematically schooled in it. We are all born into a worldview; we receive it, we do not invent it; we are not Prometheans who begin ex nihilo and operatically create the terms of our selves and our world. 

    The most pressing business for a thinking mind is what to do about what it already believes. Credence must be earned by more than fidelity, which is not an intellectual virtue. The intellectual melodrama of my youth was whether I could find a way to assent to what had been bequeathed to me, to accept it not only as mine but also as true. Would I want it to be mine if it was false? I have intense feelings of affection for what my ancestors believed, because they believed it and I am their son, and for what their beliefs, true or false, enabled them to achieve (a belief system need not be true to issue in beauty and goodness, or to strengthen the spirit under duress); but neither filial duty nor the stewardship of tradition requires that we adopt the errors or the illusions of those who preceded us. 

    The vindication of a worldview that we did not elect must be accomplished in a manner that is spiritually richer than a mere reconciliation with facts, with the accidentals of one’s birth. Decades later I came to cherish this sentence at the beginning of The Guide of the Perplexed: “certainty should not come to you by accident.” Accident had to be elevated into necessity, which could then be celebrated as luck. But I was taught to begin with the feeling of luck, which had the unpleasant implication that everyone who was unlike me was unlucky, when of course they all ardently believed in their own luck, too. Chosenness, specialness, distinctiveness, uniqueness: irony does not flourish in the hothouses of self-love. In their insistence upon our own possession of the only truth, my rabbis outwitted themselves: their confidence about truth was designed to unburden us of the obligation, or the need, for a critical examination of first principles, but instead it seduced some of us in its direction. What was it that they did not want me to know? How can one live in the kingdom of truth and not use one’s mind? What is truth if not the harvest of examination and reflection? I suppose there are two kinds of people, those whose minds are started by a claim to truth and those whose minds are stopped by a claim to truth. 

    So the inherited scheme that makes the world constantly meaningful is only the start of a life in beliefs. Passive assent brings no glory to what is assented to. The clarification of conviction should be a matter of personal honor for an individual; otherwise he has no bragging rights for what he believes. I do not mean that we must become a population of philosophers, but surely we must all cop to a strain of doubt, a bout of obscurity, a run of uncertainty, even if it has passed or we have mastered it. There is no shame in human finitude. The important point is that beliefs are paltry things without their reasons. A weak faith is one in which there are more beliefs than reasons. (A “heretic in the truth” is how Milton acidly described the man who holds a correct opinion without knowing why.) 

    The history of received intellectual frameworks, even before the legendary convulsions of modernity, has been turbulent and wounding. Do not think for a second that it was not ever thus, that people in the fourteenth century harbored no doubts and posed no objections: that is the escapist fantasy of reactionaries. Wholeness — the seamless fitting together of all that is human, and then of all that is human with the cosmos — was never for creatures such as ourselves; or to borrow the current cliché, it is, however ennoblingly, forever aspirational. 

    A society of perfectly contented and perfectly coherent people has never existed. I cannot believe that there ever lived a completely integrated individual; and if there did, then the thoroughgoing absence of alienation in such a person should be regarded as a flaw, as a disorder. Moreover, nothing provokes a doubting mood as much as a claim to certainty. And a culture of certainty is not only edifying, it is also asphyxiating. A final completion of the search for what is really the case is not a human option. The history of ideas and the history of religions demonstrate that all the traditions that solemnly instruct us never to change have themselves changed, to a greater or lesser degree, with greater or lesser integrity. Rabbi Akiva was one of Moses’ most illustrious successors as a teacher of the law, but there is an astonishing midrash which describes Moses visiting Rabbi Akiva’s classroom in the first century — sitting in the eighth row, the midrash adds with wanton hermeneutical imagination — and not comprehending a word that was being said. When Moses hears Akiva tell his students that what he has just taught them is the law that was given to Moses at Sinai, he is reassured. It is the very definition of a tradition to be transmitted, handed down, passed on — which is to say, to make accommodations for its survival. Every tradition that prides itself on its rigidity but has made it all the way to us is clearly deceiving itself in this regard. Traditions are inheritances that must continue to be inheritances, and this is not possible without a capacity for honorable adaptation. 

    The continuity of tradition demands a measure of discontinuity, carefully and knowledgeably managed, with a clear assessment of historical circumstances and intellectual flexibilities. This, the distinction between bending and breaking, between developing and dissipating, is not easy: too much discontinuity is as lethal to tradition as too little discontinuity. Going into the future with only the old is as stupid as going into the future with only the new. But if you live according to a tradition, if you love it with all your heart, but you do not pass it on, you have betrayed it. It is not fulfilled only by your own practice of it. It was not created just for your own enjoyment of it. What I cherish must not end with me; or so I must resolve. 

    What thou lovest well remains, 

    the rest is dross 

    What thou lov’st well shall not be reft from thee 

    What thou lov’st well is thy true heritage . . . . 

    Here error is all in the not done, 

    all in the diffidence that faltered . . . 

    The enterprise of perpetuation is not only a matter of having children and educating them adequately. It is also a matter of building and sustaining institutions, those allegedly soulless entities without which the most poetical accomplishments of the human soul would not stand a chance against time. In a free society, certainly, there is no excuse. But in our free society an excuse is desperately needed, because here we doom traditions with ignorance and indifference; here we have more important things to do. 

    Precisely because the acquisition of a worldview seemed moot to me, like settled business, and precisely because I inhabited a universe of previousness, the notion that a worldview is somehow lacking, and therefore an obligation, something for which I was responsible, was odd to me. The cheeseman saw the woolen yarmulke on my head; he knew that I was not wanting in an encompassing scheme; and yet he pronounced his admonition. And thereby he shook the settledness of things. The sufficiency of the received, one of the dogmas of my upbringing, would no longer suffice. And if one must have a worldview, then which worldview? Even without a familiarity with the shock that anthropology delivered to the West in the eighteenth century, I knew that there were many pictures of the world on offer. They were everywhere. I lived not only among synagogues but also among churches, though I left the neighborhood before it included mosques. Inside those churches and mosques, no doubt, they wrestle with the same problem of the threat to the validity of belief that is posed by the multiplicity of beliefs. A church down the street from a synagogue always thickens the plot. A sense of exclusiveness invites a siege mentality. 

    The cheeseman made belief seem less like an inheritance — a marvelous consequence of genealogy — and more like a task. In the matter of the most profound commitments, there was suddenly the suggestion of a choice. This was a little alarming: wasn’t the purpose of a patrimony, and of loyalty to it, to relieve me of choice? Moreover, I was quite sure that I was thoroughly unequipped for such a decision. (It was not until many years later that I understood that obedience, too, is a choice. The thing about voluntarism is that you can check out but you can never leave.) Yet there I was, on Brighton Beach Avenue, experiencing the vertigo from which modern Western thought has never recovered: the recognition, vexing and then intoxicating and then vexing again, that there are many pictures of the world and that all of them are held to be true by the people and the communities who espouse them. What did the diversity of fervently championed truths say about the possibility of truth? Maybe my view is correct and your view is incorrect, and there the story ends. No, too simple. But why too simple? Somebody must be wrong! My head, loyally and disloyally, swam. 

    This teaching — my early induction into the arcadian anxieties of a philosophical existence — did not take place in a classroom. It took place in an appetizer, which is what we called a delicatessen that did not serve meat, so as to respect the ontology of the Jewish kitchen. And my tutor was not, professionally, a teacher. His job was to run the cheese counter in Mr. Haber’s market on Brighton Beach Avenue in Brooklyn. The cheeseman was slightly stooped, balding, soft spoken, with a heavy accent and a gentle smile. I never saw him without his white apron. He stood behind a weathered cutting board that bordered a long refrigerated counter with a glass window that displayed the rows of culinary delicacies. (The appetizer was well named.) He handled the blocks and the wheels expertly, slicing them with a ruthless wire that he manipulated with the dexterity of a craftsman. The slices fell gracefully onto a waiting sheet of wax paper, which he then folded speedily and with geometrical precision. And all the while he talked and he taught. He cautioned me about hollowness and shallowness; as I stared at the sturgeon, he worried me about nothingness. 

    His accent was the tell. This was not supposed to have been his fate. Like almost all the adults I knew when I was growing up, the cheeseman was a displaced and disrupted individual. He was living his second life in his second world, because his first life in his first world had been annihilated. When the Germans invaded Poland, he was a graduate student in philosophy, and a Jew, at the university in Warsaw. Those were good years for philosophy in Poland, a flourishing era of logicians and phenomenologists. He devoted himself to the advanced study of Kant. It was at the cheese counter that I first heard the word “Kant.” But it was all blown to bits by the war. He never told me how he survived it, or which of its many hells he endured, but like Mr. Haber, and like my parents, he was “a survivor” — which is to say, he had a spirit more powerful than history. His mind emerged intact from an apocalypse. He had carried philosophy with him. He never abandoned it. Did it abandon him? 

    Years later I wondered whether philosophy, or the philosophical attitude, had stood him in good stead during the atrocities. I prayed that it had. For philosophy can surely serve as an instrument of human perdurability, though you would not know it from the philosophy departments. I pictured him starving and thinking, hiding and thinking, running and thinking, weeping and thinking. But I wondered also about the limits of mental detachment in the face of catastrophe. Elias Canetti once remarked that during the war he did not save Goethe but Goethe saved him. It is a geistliche sentiment, but I wonder. Aren’t there circumstances in which the equilibrium of a philosophical mind is not merely impossible but also suicidal? What are abstractions in a world on fire? What is an idea compared to a crumb of bread? 

    Should reflection upon existence teach interest or disinterest, and in extremis which is wisdom? 

    In any event, he made it out alive, and Mr. Haber, a man of such bountiful kindliness he could have been mistaken for a simpleton, gave him work. When the cheeseman discovered that one of his regular customers, a young man whose mother sent him frequently to his counter with the same order, was eager to hear what he had to say, he was delighted to instruct the lad. He cut the cheese and discoursed on philosophy. I had the impression that he cut the cheese more slowly in those pedagogical moments, so that we would have more time for the seminar. I developed the habit of doing my errands at hours when the store might be less crowded and he could elaborate more fully, in his heavy accent, on his themes. 

    O those accents! They are almost all gone now. It is a commonplace of the literature of American immigration that the children were often ashamed of how their parents sounded, but for me it was Mozart. The accents were proof that everything that we were told about a vanished world had been exactly so. Not that I doubted it, not at all; but the accents were a sensual link, the actual sound of their world audible in my world, so that I sometimes felt almost as if I could travel backward along the accents, like traversing a rope stretched over a hideous drop, to the time and the place before the extermination of my people was attempted. I love refugees: these people of the before and the after, they gave me life. Refugees are the aristocrats of human fragility. They know more than we do, even if their knowledge is not power. In shul there were moments when the sound of those accents was almost too much to withstand, as when, in the afternoon prayer on Yom Kippur, old broken-down Mr. Frost sobbed as he prayed: “Do not cast us aside in our old age, as our strength wanes do not forsake us . . .” I never saw the cheeseman in shul: a philosopher. Mr. Haber was there often. 

    I hope that we will one day come to see the folly of our acceptance of disruption as an ideal of life. I refer not only to the Darwinism of the technologists: owing in part to their successes, a much wider social and cultural prestige has been conferred upon a cataclysmic view of change. (Is arrogance the condition of innovation or its consequence?) The only thing that we seem to produce in larger quantities than data is disrupted lives. In our romance of shattering we have come to scorn the shattered. If you have known shattered people, then you will think twice about all the shiny advanced rubbish about the virtue of hastiness, the idolatry of the will, the worship of the macro. After all, a dissatisfaction with contemporary conditions can take many forms and have many consequences: there are myriad ways to manage necessary and justified change, and heartlessness is not a requirement of progress, even if there will always be “losers.” Cataclysmic change is always sloppy change and always cruel change. I am so sick of living in the rubble of my culture and the rubble of my politics. Isn’t destruction exciting? Cataclysmists are pathologically at peace with human costs. To the social breakdowns that preceded them they add the social breakdowns that they devise, and the population of broken people grows. A new electorate! Like all revolutionaries they are vandals, except that they enjoy the further conceit of having circuitry on their side, as if science makes a moral difference. And now we have a government that looks down from its morally moronic stratosphere and gleefully breaks people. The viciousness is not a by-product of the policy, the viciousness is the policy. 

    The population of broken people grows beyond our borders, too. In 2024, according to the Population Division of the United Nations Department of Economic and Social Affairs, the global number of international migrants was 304 million people, or 3.7 percent of the world’s population. These immense movements of human beings are all attempts to escape varieties of horror and danger; nobody, except the rich, uproots themselves recreationally. There is an unprecedented amount of vulnerability and instability, of homelessness and statelessness, in our world — disruptedness is a dark norm, and we must therefore take the trouble, and not only in southwest Texas, to understand disrupted people, people who will never know what the Stoics called a smooth flow of life, fractured and fissured people, people who need help. They are not alien, they are hurting. All hurting people are alien, I suppose, if you are fine; but wellbeing can itself be a myopia. In their indifference to the most elementary human considerations, it is the nativists who are the aliens. Syria is an entire nation of disrupted people, and the contemporary locus classicus of the savagery of disruption is Gaza. Cleansings and expulsions are nothing other than disruption as an instrument of war. I am not ranting. I am shocked again by all this damage, which in a certain light coheres as an era. And I am recalling the cheeseman and what he represented, the sweet obscure contemplative man whom they failed to kill, who for the rest of his exiled days could aspire to nothing better than the rent and the attempted mastery of his own memories. Healing must not be confused with happiness. There is no return from disruption, there is only the arduous labor of creating yet another life. Did the cheeseman have a family? There was something isolated about him, the halting tone of the solitary. When he stood before me and talked about moral theory, I heard justice but saw injustice. You cannot pretend to be familiar with the world unless you know post-catastrophic people. The man was a living monument to the futility of the categorical imperative. 

    “Kant” was not the only word I first heard at the cheese counter. “Descartes” was another. As he cut through a brick of muenster one afternoon, he explained the cogito, and more generally the question of certainty. “Imagine a man suspended in space,” he said. “There is no wind, no light, no sound, no external stimulation of any kind, nothing that could be registered by his senses. How would he know that he exists?” All around me the wrinkled Jews of Brighton Beach were fussily buying herring and rye bread and seltzer. (“The plebeian bubbly,” as Irving Howe delightfully called it.) My aproned teacher continued: “He would know that he exists because he is thinking. Thinking!” The squares of muenster were piling up on the wax paper. “And not only would he know it. He would know it with certainty.” The cheeseman’s eyes were hot. For emphasis he would sometimes come out from behind the display case that separated him from his customers, wiping his hands on his apron, and give me the intellectual climax again. “Thinking!” “Certainty!” I stalked his every syllable. When he handed me the artfully packaged cheese — all those tucked-in angles, muenster made mathematical — I knew that the lesson was done. Years later I learned that the story of the floating man was Avicenna’s invention, and that Descartes’ argument was somewhat more complicated, and somewhat less persuasive, than my teacher’s version of it. Perhaps he had a more sophisticated account that he could have provided, but there was another customer waiting behind me, and maybe she was getting cranky. I wouldn’t have blamed her. Still, it was he who gave me the gift of the problem of doubt. It was at the cheese counter, in a hundred afternoons of provisions, that I acquired the conviction that the stakes in matters of belief are high. 

    In those years, before the Russian Jews arrived, Brighton Beach was a tired place, populated largely by retired workers from the garment industries in New York. The avenue was a block away from the sea, and they had come there for the beauty and the breeze, they were not floating men, and for the boardwalk, which was a stupendous incubator of community, particularly on summer nights, when the heat in the uncooled dwellings drove them outside to the shore beneath a Yiddish moon. (Later the Russians would bring their balalaikas.) Mr. Haber’s establishment was one of a series of storefronts on the ground floor of one of the many low, glum apartment buildings that sometimes seemed as hunched and as weathered as the people who inhabited them. (The camera recently panned past Mr. Haber’s block in Anora, which was set in these homely streets.) Beyond the appetizer, a few steps past our Sunday night deli, was a movie theater called the Oceana, where I began my education in American cinematic kitsch. There was no Antonioni in my Brooklyn. And further up the avenue was the elevated subway, which regularly overwhelmed the neighborhood with its nasty screeching noise and sent sparks falling to the unignited street below. The din of the train was the soundtrack for my tutorials. 

    Along with “Kant” and “Descartes,” I heard “Leibniz.” Even in my undeveloped state I recognized that there was something unbearable about a survivor teaching theodicy. I do not mean that the cheeseman had found comfort in his own mind for his own suffering. I never knew him well enough to take the measure of his wounds. Had he uncovered, for his own adversity, what Leibniz called a sufficient reason? But there he was, the pounded man expatiating upon the idea of the best of all possible worlds. When he explained the Leibnizian idea of God, I was reminded of what little I knew of Maimonides. (The cheeseman never discussed Jewish philosophy.) I remember him insisting that Leibniz’s optimism was derived not inductively, as a conclusion to be drawn from experience, but rationally. The world has to be this way — reason says so; and never mind the testimony that he himself could have given that the world is not this way. The cheeseman was certainly an expert on the frustration of the rational by the real. But at the same time he was an ardent exponent of reason. He adored it. Its concepts were stations on a magical journey away from spurious magic. In the company of reason the ordinariness of his surroundings melted away and an antique grandeur was recovered. I am embarrassed to recall that I did not adequately appreciate his reverence for reason. I was an adolescent reading Nietzsche, who is very bad for adolescents, and more generally I had fallen under the spell of existentialist paperbacks. I believed that reason was the problem; I was a hormonal fool. Reason, unlike logic, is not for the young. 

    What did the cheeseman mean by a worldview? I cannot say for sure. He almost certainly did not have in mind the organizing apparatus of the mind, the operating systems of categories and conceptual schemes that make experience legible: the cheese did not come with a side of epistemology. He preferred to impart to me moral and metaphysical ideas. He failed to warn me that all these ideas might not go together even when the individual pieces seemed proven and right. I had the impression that he was telling me, instead, to choose among philosophical packages; and for a young man packages are more exciting than pieces. Packages can make you feel brilliant. Yet perhaps he was not at all recommending a package, as if slices of thought could be neatly joined together like slices of cheese: the skeptical and even playful tone of his little lectures belied a young man’s hope for a single answer to all the questions. He was not preaching dogma, or any variety of mental convenience. He was soft, but he was strict, as if to say: young man, there is no haven from complication. A worldview, in his account, or in my understanding of his account, was not an evasion of intellectual labor. It was an invitation to intellectual labor. He was advising me of a duty. I sometimes associate the stirrings of my mind with the smell of smoked fish and pungent cheese. 

    A worldview is certainly a package, because it purports to be comprehensive, and sometimes even totalistic. The advantage of its scope is that it equips you for every contingency. It anticipates every confusion and every surprise. For this reason, whatever its intellectual merits, it has an undeniable psychological utility. “A Weltanschauung,” Freud wrote in an undelivered lecture not long before he fled Vienna, “is an intellectual construction which solves all the problems of our existence uniformly and on the basis of one overriding hypothesis, which, accordingly, leaves no question unanswered and in which everything that interests us finds its fixed place.” In subsequent years such unifying and simplifying explanations came to be most commonly described as ideology, and associated with tyrannical political systems, though they may tyrannize just as easily over an individual mind in an open society as over an entire society in a closed one. When ideology is thwarted by the openness of a society, it often curdles into conspiracy theory. 

    Modern philosophy, mainly in Germany, includes something known as “Weltanschauung philosophy,” which was largely the creation of Wilhelm Dilthey, the heir of Vico, whose immortality is owed to his prescient and extended defense of the humanities against the natural sciences. Dilthey classified and inventoried the many types of Weltanschauungen; it was his tribute to the natural diversity of the human mind. (The term Weltanschauung was coined, but not developed, by Kant in the Critique of Judgement in 1790.) Dilthey was one of those thinkers who, after Nietzsche, the self-styled “psychologist,” were happy to collapse philosophy into psychology, and so he shrank from calling a worldview a philosophy. In an especially provocative remark, he declared that “worldviews are not products of thought.” For this reason, worldviews are felicitously “undemonstrable and indestructible.” The origin of worldviews, in his telling, was to be found in human need: “the formation of worldviews is determined by the will to stabilize the conception of the world.” Americans might describe this as the will to closure. One way to preempt spiritual crisis is to remove foundational concepts from the teeming domain of the mind — to hold them with intellectual immunity, as it were. 

    But where in the hierarchy of intellectual virtues does stability belong? What is the mortal terror in inconsistency? Yet it was the unphilosophical, or even anti-philosophical, character of worldviews that lent them to the purposes of culture, I mean culture defined in the anthropological sense, as the given outlook of a community or a society or a nation, the sum of its assumptions and its axioms — the spirit of an age. Indeed, all that is required to come into possession of a worldview is to speak a language. (God, said Nietzsche, is in our grammar.) Thus Karl Mannheim seized upon Dilthey’s remark to emphasize the irrelevance of “theory” to the creation of a Weltanschauung. Philosophy, he observed, is “merely one of its manifestations.” In his account, a worldview is “the basic impulse of a culture” that manifests itself in all the culture’s expressions, high and low, and represents the “global unity” that runs through the entirety of a way of life in all its material and non-material aspects and knits them together. Hegel strikes again! Mannheim’s objective was not psychology but sociology — more specifically, the sociology of knowledge, which was invented by Marx but codified by Mannheim a hundred years later. 

    We might even say that a worldview is a degraded form of philosophy, an intellectualization of what we now like to call a habitus, a polished collection of doctrines with no rough edges, easy to swallow, allergic to paradox, bored by variations, of no intellectual distinction, handy for the education of children and the advancement of demagogues, a social asset, perfect for nothing more ambitious than a personal identity. Foremost among the attractions of identity is that it confers intellectual invulnerability. It takes guts to keep allowing ideas in; such porousness can be mistaken for vacuity, when in fact it represents a stubborn determination to keep checking the merits of one’s ideas, because justification is the work of a lifetime. In 1938, in an essay called “The Age of the World Picture,” Heidegger declared that “all great philosophy culminates in a worldview.” He did not mean it as a compliment. He wished to secure the superiority of philosophy, especially as he construed it. Yet as a factual matter — and he was always insulted by the grubby concern (“positivism”) with facts — Heidegger was himself in the worldview business, and his statement is false: there are great philosophies that cannot be in this way reduced. Their difficulty and their depth make them impossible to package and useless to cultural and political programs — as Heidegger’s own doctrine of Being should have been, except that the phenomenologist-turned-mystagogue found a way to place Being at the service of the Führer. When a way to grovel is sought, a way to grovel is found. Is there anything more poisonous than identity made ontological? (Heidegger had a Romanian contemporary named Constantin Noica, a metaphysician who also lived in the woods and also sided with the fascists, who wrote a tome with the risible but chilling title The Romanian Sense of Being.) The closer one studies worldviews, the more facile they seem, and the more sordid their history. 

    Can we agree that there is no such thing as the spirit of an age? There are many spirits in any age. The impulse to dissolve and to merge, the lust for oneness and for sameness, the entire Parmenidean enterprise, has had an awful blinding effect. Monism is sublimely satisfying, but it leaves out too much. Empiricism, before it is magnified into a philosophy and enters the lists, may be just a fancy term for alertness, for the scruple about paying the requisite attention. The issue is the philosophical and even spiritual significance of the details. We have been taught, especially under the influence of the social sciences, that knowledge advances by means of subsuming the particulars, by means of generalizations, but so, too, does ignorance. Common features are not always more revealing than uncommon features, and the unification of a manifold is not always the surest method of grasping it. Enter art. 

    The love of one thing may be owed to the fear of many things. The rampant variability of reality must be desperately brought under control. Unfortunately the pluralists, whose fundamental intuition against the monists seems unimpeachable, get too rattled by the task and go too far. They believe that the only way to relieve the pressure of the diversity and the opacity of the world is to surrender the possibility of objectivity altogether and to confine assertions of meaning and truth to the more lenient realm of subjectivity. Philosophical claims, they rule, will henceforth be regarded as self-expressions. And how they have prospered in America! William James was so convinced that serious thought was entirely an activity of personal temperament that he concluded that philosophy consists of “a few main types” — “cynical characters take one general attitude, sympathetic characters take another,” and so on. Argument is usurped by personality. Cogito ergo sum? He would say that! 

    Dilthey’s conclusion that the historical multiplicity of worldviews leads inexorably to relativism is similarly overwrought and unfounded. Why, in the end, can one view out of a thousand not be the true one? Why does it matter that a true opinion is surrounded by many false ones? Why is critical reasoning helpless before a large field? Shouldn’t it, in the name of its purposes and its methods, relish the challenge? Or is the anxiety about philosophical commitment political — that the rational justification of one position among others would be disrespectful or hegemonic? But the respect that we owe other believers — though not other beliefs — must be apparent long before the exercise of criticism begins; it must always be prior. I respect many people whose views I hold to be nonsense. (Evil nonsense, however, makes cordiality much harder.) There is something immature about being scared of other people’s philosophical choices; it is like being scared of other people’s happiness. And it is even more abject to be afraid of choice itself. The wounded pride of the eighteenth century — the rudeness of its discovery that what Hamlet told Horatio was correct, there are more things — is not enough of a reason to give up on truth; the anthropological revelation demanded only an expansion of the search for it. Anyway, the Western thinkers who were so unnerved by the existence of natives and aboriginals were never going to seriously consider their views of the world. (A few lines after his remark about cynical and sympathetic philosophies, James declared that “the thought of very primitive men has hardly any tincture of philosophy.”) The plenitude of philosophies and worldviews means only that people everywhere have been contemplating difficult questions and ultimate realities, and that is not bad news. We must tread carefully here, because from the truth that nobody can believe everything the lie that nobody can believe anything may quickly follow. The history of ideas is not the meal, it is the menu. 

    The cheeseman’s insistence that I gain a worldview may have been his way of teaching me the belief in belief, and his successive presentations of the philosophers may have been his way of suggesting that one day I would have to make a choice. And so it may not have been a worldview after all that he was commending to my attention, if a worldview is a summary of prevalent but unexamined beliefs, an arrangement of platitudes, a perquisite of membership in a culture, a badge of belonging. Whether by means of heritage or contagion, people usually hold the picture of the world that is held by people like themselves, but the cheeseman was not prodding me to select a conformity. He wanted me to stretch, not to wallow. 

    I came away from the appetizer also with another treasure: a lasting impression of one man’s amor intellectualis, and therefore with the notion that thoughtfulness is an essential element of dignity. It would be hard to exaggerate the cheeseman’s dignity. His circumstances never levelled him. All of his dreams had been destroyed, except his dream of understanding; and that dream could be realized anywhere, even in a small food shop in a distant corner of the world. He had suffered, but he had not lived stupidly. The cheeseman was the first genuinely philosophical individual I ever encountered, the first one who exemplified — or modelled, as we would say — the intrinsic satisfaction of serious thought. In this way he prepared me for my low and stupid age. He gave me a course in the experience of mental independence. It is unlikely that he had many people with whom he could explore the contents of his educated and energetic mind, and it pleases me to imagine that our conversations may have relieved his loneliness. I hope I helped. All hail the solitariness of the thoughtful! And the master whose name I never knew.

    Staying Decent in an Indecent Society

    To grow up, as I did, in a country that had been under Nazi occupation less than a decade before I was born, was to be very sure about who had been good and who had been evil. Where I lived, in The Hague, we refused to buy candy from a local tobacconist, because the woman who worked behind the counter once had a boyfriend in the occupying German army. The butcher shop around the corner was out of bounds, because the owner was rumored to have been a Nazi collaborator. Most of our primary school teachers had been on the side of the angels, of course — or so they said. This one had been a brave resister, sending German soldiers who asked for directions the wrong way. That one had punctured the tires of a German army vehicle.

    Whatever the truth of these claims and rumors may have been, the central moral yardstick with which we grew up was the question of whether a person had been a resister or a collaborator. It took some time for us to realize that people who had willingly collaborated or actively resisted were in the minority — less than ten percent either way, and there were far more collaborators than resisters. Most people had kept their heads down and tried to survive as best they could. If unpleasantness happened to others, to Jews in particular, it was more comfortable to avert one’s eyes. That way one could pretend not to know.

    For those of us born after the war it was easy to judge such behavior harshly. But it might have been wiser to heed the words of Anthony Eden, the former British prime minister, in The Sorrow and the Pity, Marcel Ophuls’ great film about French collaboration. He said in perfect French that he would not presume to cast moral judgment over the French treatment of former collaborators, since he had not had the misfortune of living under a brutal occupation himself.

    But even if one learns to be less quick to judge others, the experiences of World War II still cast a dark shadow and moral questions cannot be dodged, especially now that we live once again in a time of increasing autocracy, persecution, and violence licensed by the world’s most powerful leaders. Whether people can be classified as heroes or villains is less interesting to me than the question of whether a person can remain decent in an indecent society. Is it at all possible, apart from joining the resistance, which puts one’s own life and that of others at risk, to remain uncorrupted by a criminal regime?

    What constitutes an indecent society was succinctly defined by the Israeli philosopher Avishai Margalit in his superb book The Decent Society. An indecent society, in his view, is one whose official institutions are designed to humiliate people, often a minority. A decent society is not quite the same thing as a civilized society. In Margalit’s words, a “civilized society is one whose members do not humiliate one another, while a decent society is one in which the institutions do not humiliate people.”

    A society ruled by Nazis, or Stalinists, or Maoists, or other rulers aspiring to totalitarian control, is of course more than just indecent. But in a society that is merely indecent, as long as one can freely express one’s critical opinions without being killed or imprisoned, it is possible to remain decent. The real moral dilemmas start when one’s livelihood, or even one’s life, depends on whether one is willing to cooperate with an indecent state. Where there is no choice, there is less of a dilemma.

    One choice faced by people in a dictatorship bent on humiliation, or worse, is whether to stay or to leave. Not everyone has this luxury, of course. Moving to another country is always difficult, and for many people unthinkable. Most countries will not let you in without money, documents, jobs in prospect, linguistic skills, and so forth. People who try to leave anyway often end up dead, or in holding camps under appalling conditions. And all this depends on whether you are allowed to leave in the first place. Everything was done to make it extremely hard for Jews to leave Nazi Germany in the 1930s. Once the Germans occupied most of Europe, it became utterly impossible. Less than a decade later, the Iron Curtain, which was not just a metaphor, was designed to stop people from leaving Communist states. 

    These are practical issues. But assuming that a person is famous, or rich, or well-connected enough to get out, there is a moral issue, too. In the case of Nazi Germany, but also the Soviet Union, China, Russia, or any country under dictatorial rule, a rift invariably opens up between those who leave and those who, for whatever reason, choose to stay. Thomas Mann, who was hostile to the Nazis and married to a Jewish woman, fled Germany as soon as Hitler came to power in 1933, first to Switzerland, later to the United States. After becoming aware of the horrifying extent of Nazi crimes, Mann said on BBC radio that “everything German, everyone that speaks German, writes German, has lived in Germany [my italics], has been implicated by this dishonorable unmasking.” He went on to claim that all books published in the Third Reich stank of “blood and shame” and should be pulped. 

    This did not go down well with writers who had stayed in Germany but had not been Nazis, who had tried in their own eyes to remain decent. The novelist Frank Thiess took Mann’s criticism personally. He responded by coining the phrase “inner emigration.” Living through the darkest times at home and shielding oneself from the criminal state by withdrawing into one’s private thoughts was surely more heroic, he argued, than lecturing one’s compatriots from the comfort of Californian exile. 

    A similar conflict emerged from the terrors of Vladimir Putin’s policies in Russia. Hundreds of thousands of Russians have left their country; some, like Thomas Mann in 1933, because they would have been arrested if they had stayed, others because they found life under Putin’s autocratic and belligerent rule intolerable, and still others to avoid being compromised by it. The film director Kirill Serebrennikov, for instance, had wanted to stay in Russia despite being continuously harassed by the government. But the attack on Ukraine was the last straw. “This war,” he said, “is being waged by a president and politicians I didn’t vote for, but in the eyes of many, I am their unwitting accomplice.” 

    The chess champion and political activist Garry Kasparov, who left Russia in 2013, declared that Russians who want to be “on the right side of history should pack their bags and leave the country.” Those who don’t, he said, “are part of the war machine.” And yet some who have opted to stay are indisputably decent people. The journalist Dmitry Muratov won the Nobel Peace Prize in 2021 for trying to uphold freedom of expression in Russia. He has openly criticized Putin’s war. His newspaper, Novaya Gazeta, is now published online from abroad. But he refuses to leave Moscow, where some of his former colleagues on the paper continue to live: “We will work here until the cold gun barrel touches our hot foreheads.” 

    A figure such as Muratov would probably not have survived in Nazi Germany. He would at the very least have been forced to remain silent. Inner emigration, however, can take different forms. Some writers and artists continue to work, while managing to avoid being tools of propaganda. Very few Japanese left their country when it was run by military autocrats who were unleashing wars all over Asia in the 1930s and 1940s. Foreign exile was hard for most Japanese to imagine. Some writers, such as Nagai Kafū, refused to publish during the war. Others avoided being propagandists by concentrating on historical themes or offering harmless entertainments. In 1941 the great film director Mizoguchi Kenji made The 47 Ronin, a long film about a famous samurai legend. It could be interpreted as a patriotic work, but it did not endorse the ultra-nationalist regime. 

    Several famous artists who continued working in Nazi Germany claimed that classical high culture lifted them above the criminal nature of Hitler’s rule. The actor Gustav Gründgens was happy to lead the Prussian State Theater under the auspices of Hermann Göring. He performed the German classics there in the belief — he later claimed — that his theater was a kind of oasis cut off from the terrors of the Nazi state. Wilhelm Furtwängler, perhaps the greatest conductor of his time, could easily have left Germany after 1933. He refused to do so for the same reason that Thomas Mann chose exile. Mann claimed that German high culture went wherever he went. Exile was the only principled way to keep it alive. Furtwängler, too, saw himself as the guardian of high culture, but he felt that his artistry would wither outside his native country. Responding to the criticism of Arturo Toscanini, who said that “everyone who conducts in the Third Reich is a Nazi,” Furtwängler made the following statement: “Personally, I believe that for musicians there are no enslaved and free countries. Human beings are free wherever Wagner and Beethoven are played, and if they are not free at first, they are freed while listening to these works. Music transports them to regions where the Gestapo can do them no harm.” 

    This was astonishingly naïve, and not a little self-serving. But was it indecent? Did it compromise Furtwängler? Since it was the intention of Joseph Goebbels to showcase German high culture, and even popular entertainment, to demonstrate the civilized nature of the Third Reich, one might say that anyone who assisted him in this endeavor was complicit. Furtwängler refused to be a party member, unlike, say, Herbert von Karajan, who had an SS rank, and he protected some Jewish musicians. He remained a decent man, but he was compelled to perform for Hitler’s birthday, and his work as a conductor certainly kept up the pretense that culture under the Nazis still thrived.

    The case of Erich Kästner was even more complicated. He was not only the author of Emil and the Detectives, a celebrated children’s book that appeared in 1929, but also of an anti-Nazi novel, Fabian: The Story of a Moralist, in 1931. His books were tossed into the fire during the notorious book burning in 1933. Kästner, who detested the Nazis, was banned from publishing. But he was determined to stay in Berlin; he refused to let the Nazis chase him out of his own country. And he didn’t want to leave his mother, to whom he was devoted. A good man, no doubt. Yet he needed to work. When he was asked to write a film script for the UFA studio under a pseudonym, he accepted with alacrity. Münchhausen, released in 1943, is not a propaganda film, but a brilliant rendering of the tale of Baron von Münchhausen, the eighteenth-century fantasist. Goebbels wanted the movie to be more lavish, more accomplished, more technically brilliant than anything produced in Hollywood. Shot in glorious Agfa color, it featured many of the major movie stars who still remained in Germany. Not only was there no sign of Nazi propaganda; Kästner even managed to slip in a few lines that anyone who paid attention could construe as barbs against the Nazis. In one scene, a wicked wizard, dripping with malice, suggests to Münchhausen that invading Poland would give them untold power. When Hitler realized who had written the script, he erupted in fury and made sure that Kästner would not work again. 

    And yet, Kästner, like Furtwängler, had cooperated with Goebbels in his aim to burnish the image of the Third Reich with high art and entertainment. This did not make him an active Nazi collaborator. One can excuse his behavior, even if it is hard to justify. But he was still compromised. 

    If this was true of German artists who were decent men, and even hostile to the Nazis, what to think of the French artists and entertainers who continued working under German occupation? Sartre published books and put on plays. Dior designed dresses, Mistinguett sang songs, Henri-Georges Clouzot made (some very good) films, and so on. Their main excuse for doing so was not unlike Furtwängler’s: they wanted to demonstrate that French culture was still alive, despite the humiliation of Nazi occupation. This was actually a source of pride. Some even saw it as a tacit sign of resistance.

    But not everyone. Jean Guéhenno was a teacher and a highly esteemed literary critic. Rather than submit to Nazi censorship, he decided that silence was the only decent response for a French writer. He explained his reasons in the priceless journal he kept during the occupation years, which was published as Diary of the Dark Years. He wrote (in my translation): “What to make of French writers, who, to stay on the right side of the occupation authorities, decide to write about anything but the one thing all French people are thinking about; or worse still, who, out of cowardice, bolster the occupants’ plan to make it appear as though everything in France continues as it did before?”

    Guéhenno’s journal is wise, witty, and scathing about his fellow writers, including some very famous ones, such as Paul Valéry and Henry de Montherlant. “Incapable of being in hiding for long,” he writes, this type of literary figure “would sell his soul just to keep his name in print.” Valéry didn’t write propagandistic poems, to be sure, but he kept safe by sticking to mythological themes to entertain his readers. Still, writes Guéhenno, “if all you can do is amuse us, just shut up.” 

    Such a principled stand was harder to maintain for people who depended on their pen or their artistry to make a living. Guéhenno could still teach at a lycée. But it made sense in a country under a decidedly indecent foreign occupation. In a totalitarian state, even the choice to remain silent is not usually open to people. When Chairman Mao ruled China, everything was done to make everyone complicit in the crimes of the state. People were forced to go on murderous campaigns against “rightists,” “deviationists,” “bourgeois revisionists,” and other “class enemies.” During the Cultural Revolution, many Chinese were both perpetrators of terrible violence and victims of it, depending on the ways the winds blew. For artists, intellectuals, and writers, shutting up was not really an option: they were forced to praise Mao’s infinite wisdom and extoll the party line. Writers who still hoped to remain decent and tried hard not to be compromised often ended up murdered. One of the great twentieth-century Chinese writers, Lao She, who refused to write propaganda, was tortured to death (they called it suicide) during the Cultural Revolution as a so-called “counterrevolutionary.” To have been an Erich Kästner, let alone a Jean Guéhenno, in Mao’s China would have been impossible.

    Stalin, whom Mao much admired, could be just as murderous. The deliberate humiliation of certain categories of people, including in certain periods the Jews, made the Soviet Union under Stalin a patently indecent state. There are, however, examples of public figures who remained decent nonetheless, but almost never without making some compromises, if they wished to survive. 

    Dmitri Shostakovich would not have been able to compose his music under Mao. (The Chinese pianist and composer Liu Shikun was imprisoned for eight years in 1967 for playing Western classical music; guards took special pleasure in beating his arms and hands.) In fact, Shostakovich could easily have disappeared into the gulag at different times in his life. He was denounced in 1936, after Stalin walked out of his opera Lady Macbeth of Mtsensk. When Pravda accused him of composing a “muddle instead of music” and betraying “Soviet art,” friends deserted him and former admirers were quick to pile on the accusations. He was hauled into the NKVD headquarters, the “Big House,” a year later, where he was interrogated and pressed to denounce a close friend. Only the lucky accident that his interrogator was himself suddenly arrested saved the composer. 

    In 1948, Shostakovich was one of the victims of a decree against “bourgeois degeneracy.” His music was attacked for its “formalism.” Again, many former friends and colleagues, terrified of contamination, turned against him. He was forced to repent for his aesthetic and political sins. His teaching positions were cancelled and he considered committing suicide. To make amends and feed his family, he promised henceforth to compose music only for the People, and wrote music for propaganda movies and odes in praise of Stalin.

    And yet, despite the threats to his life and livelihood, Shostakovich also continued to compose serious music and he remained a decent man. There are many instances of bravery and personal kindness, recorded in Elizabeth Wilson’s book, Shostakovich: A Life Remembered. A young composer named Isaac Schwartz, whose parents had been arrested as “enemies of the people,” was taken under Shostakovich’s wing. The composer even secretly paid for his education. At the height of the anti-formalist campaign, Schwartz was ordered by his conservatory to publicly denounce Shostakovich as a bad teacher. He refused to do so. When Shostakovich heard about this, he was moved, but he told Schwartz that he should never have taken such a risk; he had a wife and small children to think about. “If I am criticized, then let them criticize me — that’s my affair.” When Stalin went after Jews in public life to purge “Zionists” and “rootless cosmopolitans” in the late 1940s, Shostakovich stood up for Jewish musicians and composed a cycle of Jewish songs, entitled From Jewish Folk Poetry — which was performed in public only after Stalin’s death. 

    Shostakovich probably could have left the Soviet Union at some point, but he chose to stay. He knew that in order to survive and to keep composing, even pieces that could not be performed at the time, certain compromises had to be made. That is why he agreed to do some hack work to appease Stalin. In the eyes of Russian artists who had been able to live abroad, such as Igor Stravinsky, these small compromises tainted him. This came to a head in one of the most humiliating episodes of the composer’s life. 

    In 1949, just a year after he was denounced as a “formalist,” Shostakovich was sent by Stalin to represent the Soviet Union at the World Peace Congress in New York. Shaking with nerves, lighting one cigarette after the other, he had to read out a prepared statement criticizing such composers as Stravinsky and Schoenberg (both of whom he in fact admired) for their complicity in decadent bourgeois Western culture. He was also forced to express his gratitude to the Communist Party for correcting his own errors. This sorry performance was a punishment designed to humiliate Shostakovich in public. 

    But the rift between those who stayed in the Soviet Union and Russians living abroad was made painfully clear when Stravinsky was asked to sign a telegram welcoming Shostakovich and other artists to the United States. He responded that he was unable to “join welcomers of Soviet artists,” since all his ethical and aesthetic convictions opposed such a gesture. He also turned down an invitation to engage in a public debate with Shostakovich, no doubt to the latter’s relief. Stravinsky stated: “How can you talk to them? They are not free. There is no discussion in public with people who are not free.” To say that Stravinsky was right is not to say that Shostakovich was indecent. But it did point to the compromises a person had to accept if he chose (so long as there was any choice at all) to live in a brutal dictatorship. 

    Stalin’s Soviet Union and Mao’s China are extreme examples of oppression. There were periods in both countries, as well as in peripheral Communist states, when writers and artists could produce serious art, untainted by official propaganda, but they often had to do so in the way Erich Kästner wrote his script for Munchausen, full of subversive hints that demanded reading between the lines. During the Prague Spring in the 1960s, Czechoslovak writers and filmmakers hardly even needed to do that. But once that relative freedom was crushed under Soviet and East German tanks, the same question emerged there, too: to stay and submit to the indignities of public denunciation and censorship, or to leave. 

    Again, the option of leaving was only there for the few, and crossing the Iron Curtain was risky. But those who made the leap into the greater freedom of the West still had to make a living in strange and not always hospitable countries. Milan Kundera succeeded in Paris, but was resented by many who had stayed at home. Of the great Czech filmmakers, Jiří Menzel opted to stay, but at the cost of seeing his work banned, or much reduced in quality. He never again made anything as good as his Closely Watched Trains. 

    Miloš Forman and Ivan Passer made it to the United States. Forman somehow managed to continue making wonderful films inside the Hollywood system. The satire of some of his American work (Taking Off, One Flew Over the Cuckoo’s Nest, The People vs. Larry Flynt) is as biting as his Czech films. He was a survivor on both sides of the Iron Curtain. But operating inside the American entertainment industry exacts its own compromises, which have less to do with relative decency than with the demands of the marketplace. Forman was asked about this often. His answer was always that he preferred commercial constraints to political censorship. He had left his country because in his view Communism “humiliates your pride, because it forces you voluntarily to twist your spine.” If he had stayed in Prague, he would have had to quit making films, or conform and “shit right into my mouth.” In Hollywood, he said, “financing what the ‘money men’ consider uncommercial can be a problem, not a prohibition.” He likened living in Communist society to being kept in a zoo. You get fed, but you are locked up in a cage. America was more like a jungle: “You’re free to go wherever you like, but everyone’s out there trying to kill you.” It is clear which he preferred. But there is of course a price to be paid for commercial conformity, too, expressed by Passer, who was less successful in America than Forman. Passer once said that only he could have made his Czech films of the 1960s, while his American movies could have been made by any capable studio director.

    Whether an authoritarian society, Communist or rightwing, is always an indecent one is a valid question. Not all of them humiliate and persecute minorities. Forman would argue that Communist governments humiliated all their citizens by forcing them to submit, not just physically but also mentally. Vaclav Havel, in his famous essay about “living in truth,” wrote about the way people had to repeat official lies knowing full well that they were lies: “We fell morally ill because we got used to saying something different from what we thought.” The only relative advantage of living in a rightwing or military dictatorship is that a person has a better chance of remaining silent. 

    The humiliation of having to repeat falsehoods touches public figures, such as writers and artists, more than the average citizen. But the complicity which is often forced on people affects everyone. One of the most perverse examples of Communist oppression in Europe was the German Democratic Republic — perverse because many former refugees from Hitler’s Reich saw the GDR as the good Germany, the anti-fascist Germany. Not many German Jewish writers returned to the democratic Western half of Germany. Alfred Döblin was an unusual case, and he never felt at home there. Quite a few Jews did return to the GDR; Stephan Hermlin and Stefan Heym are the most famous examples.

    But even though many former Nazis continued to thrive in the West, sometimes in high positions, the East replicated some of the oppressive methods of the Third Reich. The East German secret police, the Stasi, was more pervasive in daily life than the Gestapo had been. There were no death camps in the GDR, and the rhetoric was all about equality, brotherhood, and world peace. Overt state brutality was avoided, unless strictly necessary, for example in cases where people tried to escape over the wall. But it was common for people to be taken in for interrogation by the Stasi. The conversation might turn to a person’s children who wished to go to a decent school. This could be arranged, of course, but in return the person would be obliged to come in for regular chats to report on what his friends were saying about the state. By complying the decent person would become indecent, denouncing friends and even close family relations. 

    The case of Heiner Müller, the famous playwright, shows how this could happen to the best. He was born in 1929. His father was a Social Democrat, and locked up for a time by the Nazis in a concentration camp. But Müller joined the Hitler Youth, like most boys of his age. After the war, the family lived in Soviet-occupied Germany. They were socialists, but in 1951 Müller’s parents and younger brother decided to move to the greater freedom of the West, when it was still easy to do so. Müller remained in East Berlin and became a successful writer. But he often got into trouble with the party. His plays were sometimes banned. He protested against the oppression and forced emigration of dissidents. And yet he remained loyal to the GDR, as the better, the anti-fascist, Germany. 

    Müller was by all accounts a decent man. Then, after the fall of the Communist state in 1989, documents were found in the Stasi files suggesting that Müller had been a secret informant since 1979, and had possibly even had a hand in the denunciation of a fellow writer. Müller responded that he had never signed any documents or put anything in writing, but admitted that he had been naïve not to realize that conversations with Stasi agents would classify him as an “informal informant.” Perhaps he had been naïve. Perhaps he was coerced for personal reasons. Here, too, Anthony Eden’s refusal of unearned righteousness should be observed. But it does show at the very least how hard it is to navigate the pressures of an indecent state. 

    All the examples above concern the behavior of people in societies where free speech is severely impaired, or non-existent. What about an indecent state which still retains freedoms that people in liberal democracies have taken for granted, such as free elections, a free press, and a degree of judicial independence? Consider two countries in our own time. They may not be the only examples, but right now they are the most prominent: Israel and the United States. Under the government of Benjamin Netanyahu, Israel, though still a democratic state, has taken on Avishai Margalit’s definition of indecency, in which official institutions devise policies to humiliate people and minorities. His cabinet includes men whose views on Palestinians are violently hostile. The Minister for National Security, Itamar Ben-Gvir, has been convicted numerous times for inciting racial hatred. Killing more than forty thousand people in Gaza in retaliation for the terrible violence against Jews on October 7, 2023, was less an inevitable consequence of war than an act of brutal vengeance. Palestinians living on the West Bank have been subject to institutional humiliation for many decades. To say that worse atrocities are taking place in Sudan or eastern Congo will not do, precisely because Israel still is a democracy.

    Many Israelis have left their country as a result. It would be quite wrong, however, to claim, as some commentators have done, that those who stay are therefore complicit in the crimes of the state, for it is still possible to be a decent Israeli. One of them is the novelist David Grossman. His fiction and his political writings are works of great humanism. He has protested vigorously against his own government’s cruel and degrading policies against the Palestinian population. He speaks out against official abuses regularly. Yet he continues to live at home, where he feels he belongs. Despite all its flaws, he still believes that his country, founded by survivors of mass murder and persecution, has a right to exist, and to defend itself. This is not an indecent position. Nonetheless, he has been accused by certain “anti-Zionists” outside Israel of being an apologist for genocide. When citizens can still speak freely, even if they might have to suffer the wrath of their government and some of their compatriots, they should be honored for raising their critical voices, and not found guilty by association.

    The American administration under Donald Trump is doing everything it can to construct an indecent state. Immigrants are insulted by the president himself and threatened with arrest and deportations. Some permanent residents have already been thrown in jail for expressing views the government disapproves of; protesting against the Israeli war in Gaza can be reason enough. Government agencies upon which large numbers of people depend for their health or even their lives are called “criminal” and destroyed. Havel’s dictum of “living in truth” is systematically undermined by demands that men and women in leading government posts repeat lies: the 2020 election was “rigged,” rioters in the Capitol were “patriots.” The independence of the judiciary is being damaged by appointing sycophants who promise to prosecute the president’s political opponents. Journalists are denounced as “enemies of the people.”

    But the indecent state in the United States is not yet a dictatorship. The press is still free to report and publish critical opinions. There are still independent judges. There will be elections, unless Trump is willing to provoke a constitutional crisis. And there is a large opposition party. All this could collapse in time, of course. One way to make that more likely is to behave as though the United States already is a tyranny. Caving in to irrational and sometimes illegal demands without being forced to do so can only strengthen the indecent impulses of government leaders with authoritarian intentions. This has been called “anticipatory obedience.” Instead of resisting unreasonable attacks on their journalistic work, media companies pay large amounts of money to a hostile government to avoid getting sued. Law firms have done the same. Newspaper owners order editors to pull their punches against the president. Politicians indulge in what in Nazi Germany was called “working towards the Führer”: seeking to please the leader by anticipating his most outlandish desires — Trump’s face on Mount Rushmore, a third presidential term. Corporations, universities, and even the American military, in a fit of panic, scour through their records, communications, and academic curricula, so as to scrub anything that might attract the ire of the president and his minions. If “woke” ideology was often used in many of those same institutions to stamp on free expression before, the anti-woke crusade of the far right is even more dangerous, since it has the backing of the state. Universities are deprived of billions of dollars in federal funds if they refuse to let the government decide what should be taught, and who should do the teaching. That Harvard decided to fight back and sue the Trump administration is a welcome sign. One hopes that others will follow.

    Then there is that other inevitable psychological response to any indecent state: internal emigration, the temptation to tend only to your private garden, to shut out the noise of the polis, to refuse taking note of the news. There were more demonstrations when Trump was first elected in 2016. That the far more radical nature of the second coming of Trump has met with less resistance so far could have several reasons: a sense of numbness, a punch-drunk Democratic Party, the lack of a focus that could bring people out into the streets, or a justified fear that mass rallies will provoke state violence without doing much to deter Trump’s extremism.

    But if inner emigration is excusable in a dictatorship, where speaking out carries lethal dangers, there is little excuse for it when speech is still free. Anticipatory obedience is not the way to stay decent. Citizens must protest, in any way they can, against attempts to break down the institutions that protect a liberal democracy, especially when men and women leading those institutions, including the president himself, use them to humiliate people. If citizens fail to do so while they still can, without risking prison or deportation, they will deserve much harsher judgment from future generations than Anthony Eden was prepared to mete out to the French.

    In Blood-Boltered Times: The Northern Poets

    The late Michael Longley told of how, one Saturday morning in the middle of the 1970s, when tribal warfare in Northern Ireland was becoming bloodier by the day, and even more so by the night, he and his wife, the literary critic and academic Edna Longley, were having a weekend lie-in, when he heard from below the rattle of the letter box. He put on his dressing gown and padded downstairs — I can see him, that big soft shambling gentle man — and collected a sheaf of letters and bills, and returned to sit in bed propped against pillows, with his spectacles teetering on the end of his nose, which is how in my memory he always wore them. 

    On one of the envelopes he recognized the handwriting of his friend and fellow poet Derek Mahon. Accompanied by a brief covering note, Derek had sent a poem fresh from the smelter. It was “A Disused Shed in Co. Wexford.” Michael read it, read it again, then in an awful silence let the sheet of paper fall from his hand. Edna asked him what was the matter. He sighed grimly and said, “I’m giving up poetry.”

    This was not the first instance I had of the depth of the rivalry — friendly, sort of — that smoldered amongst the remarkable generation of Northern Ireland poets born just before the war and in the decade or so after, a generation that included Seamus Heaney, John Montague, Medbh McGuckian, Seamus Deane, Ciaran Carson, Frank Ormsby, Tom Paulin, James Simmons, Gerald Dawe, and Paul Muldoon, a list which is by no means exhaustive. Note the preponderance of men. Plus ça change.

    The phenomenon of so much poetic talent erupting — and there was the sense of a volcanic event taking place — is surely remarkable, if not unprecedented. There was of course the unsettling thought that the poetry and the violence might be gushing up from the same creative depths. I put the possibility to Seamus Deane, in trepidation and not without a sense of daring — it was a question, in those troubled days, whether we in the South had any right to offer our thoughts or opinions on the burgeoning Northern disaster. 

    Seamus was born in the Bogside — a radical republican area of Derry city which at the time was under more or less permanent siege by Protestant extremists — and therefore knew a thing or two about the subterranean forces at work in his homeplace. His theory as to the inspiration for so much literary activity on his side of the border was far simpler and far more prosaic than any of my dark speculations. According to him, the phenomenon was mainly due to the Northern Ireland Education Act of 1947, which, despite thunderous protests by the Protestant Orange Order and some Unionist politicians, provided increased support for education generally and for Catholic schools in particular — as Seamus wryly observed, “They graciously allowed us to get an education.”

    Certainly the Northern poets knew their stuff. Longley was a fine classicist, Heaney and Ciaran Carson would both produce translations of Dante, Mahon knew the French poets in French, and Deane was to become one of the finest literary critics of the time; and the rest were no intellectual slouches either. While the sons and daughters of the 1940s, 1950s, and 1960s in the South were pinned under the yoke of the Catholic Church, which dominated the education system in the Republic, the Northerners were benefitting from the postwar British determination to provide equal opportunity at all levels of society. Needless to say, the reformers could not prevail against the rigidity of the English class system, but while the going was good they did many and very good things.

    Were writers in the South shocked by the quantity and the quality of the work that suddenly started pouring out of the North? Certainly the poets were — shocked, and envious. One of them, whom I shall not name, from an earlier generation, born at the end of the 1920s, carried a bitter lifelong resentment, frequently and vociferously expressed, of Seamus Heaney’s lavish successes. But then the internecine struggle among the Northerners themselves was in some instances almost on a par with the warfare going on in the streets of Belfast. Years ago, at a literary festival somewhere abroad, we were being brought on a tour of local beauty spots, and one of the party — I shall not name him either — when he learned that Heaney was on the bus, refused to board and went off and got drunk by himself. 

    Anyone familiar with Derek Mahon’s work will understand Longley’s reaction that Saturday morning in what the newspapers used to refer to lip-smackingly — the hacks do like a good scrap — as “war-torn Belfast,” when he first read “A Disused Shed in Co. Wexford.” It is, simply, though it is not simple, one of the finest poems, if not indeed the finest, written in Ireland since the death of Yeats.

    Mahon himself came to resent the poem for what he saw as its vulgar popularity; on stage once I urged him to read it to the audience, to which he responded with a cold stare and a curt refusal. I, of course, could have kicked myself for my crassness: never ask a poet to have another go at a lollipop that long ago has been licked down to the stick. But what a marvel it is. Mahon had accompanied his friend, the novelist J.G. Farrell, on a field trip to research the latter’s novel Troubles. On a stretch of coast in County Wexford they came upon the ruins of a burnt-out hotel, the once grand Ocean View, where in a tumbledown shed

    Among the bathtubs and the washbasins
    A thousand mushrooms crowd to a keyhole.

    In the poem, the mushrooms become a symbol for universal suffering:

    They are begging us, you see, in their wordless way,
    To do something, to speak on their behalf
    Or at least not to close the door again.
    Lost people of Treblinka and Pompeii!

    There is another Mahon poem, a companion piece to “A Disused Shed in Co. Wexford” but not as well-known, though it is very nearly as fine, and which is one of my own favorites among the poet’s work. The central conceit of “A Garage in Co. Cork” — “Like a frontier store-front in a western / It might have nothing behind it but thin air” — is that the long-departed couple who ran the abandoned filling station have undergone a marvelous Ovidian metamorphosis:

    A god who spent the night there once rewarded
    Natural courtesy with eternal life —
    Changing to petrol pumps, that they be spared
    For ever there, an old man and his wife.

    Derek’s drinking was as much a part of him as his cravats, his blazers, and the drawl which he affected mainly for comic effect. I mention the booze only in order to express my wonderment at how much he achieved despite his addiction — work universally celebrated for its poise and sure-footedness. I recall an evening in the 1990s in Paris, when I stood with him at the bar in a crowded La Coupole while we waited for a table. “Of course, I don’t drink at all any more,” he remarked casually, as the barman handed him the double Pernod he had just ordered. In a way I understood what he meant; compared to the quantities he used to put away, he was by then practically a teetotaller. In a rueful but characteristically jaunty essay, “At Aristaeus’ House,” he bemoaned the price that the drunkard pays for his drunkenness. “Insomniac, disoriented and paranoid, I made the acquaintance of a series of detox-and-rehab establishments. This . . . went on for several years: in and out, in and out, the ‘revolving door’ as they call it. (I’d have to celebrate sobering up, you see.)”

    He did sober up and stayed on the wagon for years, though I heard he occasionally fell off of it towards the end. By then I had lost touch with him. He was living in Kinsale, a pretty seaside village to the south of Cork city. I hope he was happy there. He continued to write, and in 2018, two years before he died, he published one of his finest collections, Against the Clock, which I had the happy opportunity to review, and which I described as “superb, thrilling and fizzingly exuberant” — come on, allow me a bit of smarm: I was writing about one of our greatest poets at the close of his life.

    I knew them all, or almost all, and a few of them were friends. Oddly, in every case I cannot remember where or in what circumstances I met them first. Ireland in the last quarter of the twentieth century was a swirl of literary goings-on: parties, book launches, raucous dinners, and festivals, festivals, festivals. I suppose it was all a little frantic and somewhat got up. Flann O’Brien used to say that Ireland had ten thousand poets — a standing army! — and five thousand readers of poetry. 

    I seem to have been readily accepted by the Northern crowd, or perhaps I just barged in; either way, I saw much of them. In our middle years, my wife and I used to have dinner with the Heaneys and the Deanes every three or four months, in our house or in theirs, turn and turn about, and delightful occasions they were. Seamus Deane had a wonderfully sly, dry wit. One evening, at our place, at a time when we were financially strapped and boastfully home-producing much of our food, Mrs. B. served up a splendid beef roast, at the sight of which Seamus exclaimed, “Ah, I see you do your own slaughtering, too.”

    Marie’s husband was a notoriously good storyteller. When Seamus Heaney walked on to the stage to give a reading, he would have the audience in the palm of his hand within seconds — he was a natural. He read his own work beautifully, in a softly modulated baritone purr, enhanced by the richness and roundness of Ulster vowels. I can still see him, forty years ago or more, on the stage of the old Eblana Theatre in the basement of Dublin’s central bus station, working his magic. 

    In a break between poems he spoke of the subtle and not so subtle distinctions between formal and demotic speech in Northern Ireland, and gave us an example. A schoolteacher friend of his suspected that a little boy in his class was “copying” from his deskmate. So one day he set the class to write an essay, on the theme of “The Swallow.” Shortly after they had got started he interrupted the exercise, and separated the copier from the copied, sending the former to sit at a desk on the far side of the room. At the end of the allotted twenty minutes he gathered up the schoolbooks. The bright boy had written, “The swallow is a migratory bird. At the end of the summer it flies south and winters in North Africa . . .” And so on, for two or three pages of peerless prose. The other, less bright boy’s effort in its entirety read: “The swallow is a migratory bird. He have a roundy head.”

    Lest I give the impression that it was all fun and frolics at the expense of work, I should emphasize, if it needs emphasizing, that they were all, all of them, wholly dedicated to their art. Yes, they craved attention, they wanted adulation, they demanded fame, but the pursuit of excellence came first. Heaney led the way, others went with him, some even went further and outstripped him on occasion, but no one could hit the common note as surely and as often as he did. He was the lodestar in the poetic firmament of the time. There was something in the warmth and inclusiveness of his work that elicited, and elicits, love, not only in Ireland, where he was Our Poet, but all over the world. Wherever I went, wherever I go, Heaney was and is the one, whether in the frozen north of Finland, in the cider country of Galicia, or on the Brazilian pampas. 

    It could be trying, for the rest of us. One day long ago somewhere in the United States I was signing my books after a reading when a large loud man in a Stetson hat came marching up with an arm outthrust and boomed at me, “I want to shake the hand that shook the hand of Seamus Heaney!” I wish, in l’esprit d’escalier, that I had thought to say to him what Joyce said to the young man who approached him in a Paris restaurant and asked if he might kiss the hand that wrote Ulysses: “Yes, but remember, it has done a lot of other things as well.”

    As I have said, the competitive drive was strong in the lot of them, including Heaney, for all his ease and his generosity of spirit. In the 1970s he and Marie and the children moved to the Republic and settled in a fine house in Sandymount on the south Dublin coast, hard by the beach where Stephen Dedalus wrestled with the ineluctable modality of the visible. They were a hospitable couple, and their parties were frequent and famous. I think it was at the very first one of them I attended — perhaps it was our initial encounter? — that Seamus spoke to me about my novel, Birchwood, which had been published recently and wellishly received; he praised the style in particular and gave me a hard look and said that if it had been poetry instead of prose, he and many others would be worried.

    I remember being struck by the notion that he and I, or any of “the others” and I, should be in competition. This was a long time ago; I was young and naively puritanical in the matter of art and the making of it. Weren’t we all in it together? Yes, but the togetherness had spiky edges, and awkward and unaccommodating corners. In time, as Seamus garnered more and more fame, I joined the sullen ranks of the begrudgers who, even those who were friends with him, as I was, agreed with John Montague’s remark that “Heaney uses up all the oxygen.” I wasn’t even a poet, and could only imagine how “the others” felt. It would be many years before I found the courage and the occasion to tell him how much I regretted the jealousy and the rancor I had nursed in my treacherous heart, especially after he was awarded the Nobel Prize in 1995.

    Yes, he was a totemic figure, always there before us, immovable, unavoidable, winning more and more accolades and, infuriatingly, producing better and better work. Was he an encouragement as well as a challenge? Did our modest skiffs rise on the tide along with his ocean-going liner?

    He was always generous, always ready to lend a hand. In 1975 I was given the Irish-American Foundation Award, which came with a check for four thousand dollars, an appreciable amount of dosh in those days. I was by no means the obvious candidate, for I had as yet published only a few things; Seamus, who had won the prize two years previously, was, as we say in Ireland, “well got” with the Foundation, and I am convinced it was he who persuaded the judging panel to choose me. I never asked Seamus to confirm this, and of course he would never have mentioned it himself. He worked by quiet ways.

    It may seem paradoxical to say that one source of Seamus’ strength as a poet was the fact, which he mentioned more than once, that there was a part of him that “didn’t give a damn about poetry.” He was ever on the side of life, a truism that is not true of all artists. He came from a close-knit rural family, and grew up on a small farm, Mossbawn, a name so softly evocative the future poet might have invented it. His father was a cattle dealer as well as a farmer. I was at Seamus’ house one day when he confided to me that when his publishers told him he had come in for some unexpected royalties, he had asked to be paid in cash. “Look,” he said, “here it is,” and produced from his back pocket a warm wad of folded notes curved to the shape of his hip. “This is how my father carried his money,” he said, “and now I’m doing the same.” Filial piety was strong in him, and informed much of his poetry.

    I think of him as a worker-poet, bending to the task, putting his back into it, getting it done: calm, assiduous, patient. Who among us oldsters will forget the first time we read that seminal poem, “Digging,” from his first, luminous collection Death of a Naturalist in 1966? For many of us the poem had the same effect as “A Disused Shed” had on Michael Longley: how could we compete with such early mastery? In the poem the poet recalls the image of his father bending among the potato drills — “By God, the old man could handle a spade” — and closes with the announcement of his aesthetic, ringingly formulated in three short lines:

    Between my finger and my thumb
    The squat pen rests.
    I’ll dig with it.

    Among “the others,” Paul Muldoon was probably the most stylistically adventurous, and is still producing work of great daring, intricacy, and inventiveness. Like his playful poetry, he himself has an ageless quality, and still makes me think of those fictional tales of English public — read: private — school life in which bespectacled chaps get up to jolly japes and have midnight feasts of tinned sardines and cream buns. I believe it amuses him to maintain this persona, one of a number that includes heading up a rock band.

    In fact, one of his most telling poems evokes schooldays. “Anseo” — the Irish word for “present” — from the collection Why Brownlee Left in 1980 — recalls the poet’s childhood classmate, Joseph Mary Plunkett Ward, who was regularly beaten by the Master for his many absences from the morning roll call, when

    You were meant to call back Anseo
    And raise your hand
    As your name occurred.

    Years later the poet encounters the grown-up Joe Ward, who boasts that he is now in the IRA and “had risen through the ranks / To Quartermaster, Commandant,” and tells

    How every morning at parade
    His volunteers would call back Anseo
    And raise their hands
    As their names occurred.

    So much of the origin of the Northern Irish “armed struggle” is encapsulated in this superb, brief poem.

    That struggle was a fearsome challenge to anyone in the North harboring poetic ambitions. A couplet from Shakespeare’s Sonnet 65 that both Heaney and I often quoted pointed up the dilemma:

    How with this rage shall beauty hold a plea,
    Whose action is no stronger than a flower?

    And it was Heaney who addressed the issue most forcefully in 1975 in his collection North. He came in for some bitter criticism over that volume, in particular for daring to frame the violence of “the Troubles” by reference to ancient Norse ceremonies of blood and sacrifice, thereby, as some saw it, evading his duty to speak plainly of the here-and-now. On the other side, too, he was challenged by the “men of violence” for not expressing solidarity with their “struggle.” In the poem “The Flight Path,” in 1996, he describes, in deliberately unpoetical language, an encounter on the Belfast to Dublin train with the IRA volunteer and publicist Danny Morrison, though he does not name him in the poem:

    So he enters and sits down
    Opposite and goes for me head on.
    “When, for fuck’s sake, are you going to write
    Something for us?” “If I do write something,
    Whatever it is, I’ll be writing for myself.”

    In an interview later he said: “After that, I . . . wasn’t so much free to refuse as unfree to accept.” 

    I’m not sure why, but I associate Medbh McGuckian with Paul Muldoon. Her work, like his, often sounds more like magical incantation than poetry — mind you, some would say that poetry is magical incantation. I have only met her on a couple of occasions, and liked her for her humor and friendliness. I have never seen a photograph of her in which she is not smiling. Her poetry baffles me, in the best of ways. When I read her, I have the sense of being told things that I should understand, and that I will understand, with enough application. Here are the opening lines of “Ylang-Ylang,” which is as opaque as it is resonant.

    Her skin, though there were areas of death,
    Was bright compared with the darkness
    Working through it. When she wore black,
    That rescued it, those regions were rested
    Like a town at lighting-up time. 

    Muldoon, along with Derek Mahon and Tom Paulin, became notable exiles, Mahon and Paulin heading off to London, and Muldoon to Princeton. Heaney and Deane chose internal exile, moving from the North to Dublin and taking up academic posts, while Longley stayed on in Belfast, and worked with the Northern Ireland Arts Council. 

    My wife and I spent some delightfully convivial weekends at the Longley home not far from Queen’s University, with good talk, splendid food, jorums of drink, and frequent literary disputes — Edna is a person of strong opinions trenchantly expressed, who championed her husband’s poetry, was critical of it when she thought criticism was required, and was not slow to put others straight in the matter of poetry, its origins and ends and its fit place in society. Many a thick ear I got from her, for my lazy assumptions and lazier assertions; though she was born in the South, she is in ways more Northern than the Northerners themselves. 

    He was so funny, Michael, so funny and mischievous and — the word has just come to me — boyish. He loved jazz, drink, amusing company, fun. One evening he told us of a peace initiative he intended to present to the governments in Belfast, London, and Dublin. He was calling for the mass manufacture of the Northern Ireland Knickers, which would have a Union Jack printed on the front, the Tricolour of the Republic on the back, and, in the gusset, the Red Hand of Ulster . . .

    Longley, who died in January, wrote some of the most subtle yet provocative poems around the subject of the Troubles. In the spring of 1998, when I was books editor at the Irish Times, he submitted to me a poem called “Ceasefire.” For months negotiators had been trying to work out the terms of what would become the Good Friday Agreement, which eventually brought to an end, more or less, the three-way conflict between the IRA, the Loyalist paramilitaries, and the British army. The war had lasted for thirty years and in it more than three and a half thousand people had perished. 

    Michael’s poem makes no mention of the North — it is a retelling of that passage in the Iliad in which King Priam of the Trojans approaches the Greek warrior Achilles seeking the return of the corpse of his son Hector, whom Achilles had bloodily slain. It ends:

    I get down on my knees and do what must be done
    And kiss Achilles’ hand, the killer of my son.

    I held on to the poem through the final tense weeks of negotiations, until, at the beginning of April, word came along the back channels that a deal was very close. The poem would have to be put into the books pages on Thursday, for Saturday publication. We waited and waited. Thursday came and still we were kept in suspense. At midnight I took a deep breath and gave word to the printroom to go ahead. “Ceasefire” was set in large type and inserted prominently in a box in the center of the leading page. Next day, Good Friday, April 10, 1998, the breakthrough was announced. Phew.

    Longley was one of the finest nature poets of the age, as well as an elegist for those terrible times. The poem of his which draws together the two aspects of his genius most movingly is “The Ice-Cream Man,” and I want to quote it in full.

    Rum and raisin, vanilla, butter-scotch, walnut, peach:
    You would rhyme off the flavours. That was before
    They murdered the ice-cream man on the Lisburn Road
    And you bought carnations to lay outside his shop.
    I named for you all the wild flowers of the Burren
    I had seen in one day: thyme, valerian, loosestrife,
    Meadowsweet, tway blade, crowfoot, ling, angelica,
    Herb robert, marjoram, cow parsley, sundew, vetch,
    Mountain avens, wood sage, ragged robin, stitchwort,
    Yarrow, lady’s bedstraw, bindweed, bog pimpernel.

    Ciaran Carson worked with Longley on the Arts Council and later held a professorship at Queen’s University, where, after Seamus Heaney’s death, he headed up the Center named for him. He was something of a Puck among the poets, and would introduce his stage readings with a tune or two on the tin whistle, often accompanied on the melodeon by his wife Deirdre. Behind the impishness, or in the midst of it, he was a scholar, a linguist, and a poet of high achievement. Along with his Dante translation, he made a splendidly vigorous English version of the Irish prose-and-poetry epic Táin Bó Cúailnge. His final book of verse, Still Life, published in 2019, contains some of his most enduring and most moving work. And his study of Irish traditional music, Last Night’s Fun, is fun for any night.

    I have suddenly recalled his stammer. To my ear it was not so much an impediment as a means of emphasis, and an enhancement in particular to his fund of funny stories. I know I have perhaps leaned too heavily here on the topic of humor, but it was an important component of the Northern poets’ practice and program. It kept them buoyed up in atrocious times, it kept them focused, it kept them sane. In this context, I think of Philip Larkin suggesting that one belly-laugh would have brought the entire edifice of the Ted Hughes-Sylvia Plath phantasmagoria crashing down. The Northerners laughed among themselves, at themselves, and with and at us, and were all the better for it.

    Ciaran’s tone was always lovely and light. Here are the opening lines, characteristically long, of the first poem in Still Life, written after a successful medical procedure: 

    Today I thought I’d just take a lie-down, and drift.
    So here I am
    Listening to the tick of my mechanical aortic valve — overhearing, rather, the way
    It flits in and out of consciousness. It’s a wonder what goes on below the threshold.

    Much was going on, all of it awful. His heart was one thing, his lungs another. He died of cancer shortly after Still Life was published.

    And Seamus was six years dead by then. 

    Paris, 2009, and I am waiting on a delayed flight at Charles de Gaulle Airport, reading an advance proof copy of Human Chain, Heaney’s final collection. I had known he was uncertain about the book. He had phoned me one day about something or other, and I could tell he was not himself. In the doldrums, he said, finding it hard to concentrate, hard to keep up. Now, from that dreary airport lounge, I think to send him a phone text: “You have nothing to worry about. Human Chain is masterly.” On the instant comes his warm but worrying reply: “You have no idea how much this means to me.”

    As soon as I got home I rang him and invited him and Marie to come out to us for dinner. It was a long trek, since Sandymount, where they lived, is twelve miles or so from Howth, where we lived. But they came, and we had one of the sweetest occasions ever, the four of us together. That was the evening when I apologized to him for having been so jealous of his success, and for so long. It was childish, I said, and I’m sorry. As our wives looked on I pressed his hand, probably the first time we had touched since the day we first met, in the nearly forgotten long ago. I thought of him writing, in the sequence “Clearances,” of peeling potatoes at the sink with his mother when he was a boy: “Never closer the whole rest of our lives.” Just so.

    The Northern Poets — by now in my mind they have taken on the blazon of capital letters. They flourished in a glorious time, when we were young and the world was there to be written, and now that time is gone. Yes, a few of them are still vigorously present; there is, as Ciaran Carson had it, still life, though often nowadays the life-force feels more like a flicker than a flame. But what wonders they wrought, in blood-boltered times and their aftermath.