Poems | Spring 2026

    Draft of a Letter: Elizabeth Hardwick to Simone de Beauvoir

     

    Dare I say it? He was the best thing

    that ever happened to me. My man, the poet, mine

    then not mine, then briefly and nearly mine again.

     

     Why and how these things happen you would have said

    better than I—don’t you think?—free and unfettered

    as you were, another kind of liberated woman, never—

     

    horrid phrase— male-identified, your man, Jean-Paul,

    small and ugly and never for a moment yours as I thought

    mine now and then to be, though in the end we

     

    both of us knew them as the real distinguished thing,

    well worth the trouble we endured on their behalf.

    On their behalf. How little they knew,

     

    each of them, what they cost us, though here I imagine

    you didn’t much mind what your man put you through,

    determined not to let him demand more of you than

     

    you demanded of him. Tedious, this effort to settle

    accounts, tally up the wounds and losses, to fight free of

    envy or resentment. Funny that I write to you now

     

    as if we had once known each other, had ever met,

    as if I hadn’t, young and maybe foolish, written about

    The Second Sex all those years ago and wondered at

     

    what you knew about women that I never quite believed,

    that we were as strong as men and had only ourselves

    to blame for our doubts and disappointments.

     

    No reason now or need to make amends, with you

    long gone and the occasion for this meeting, if you can

    call it that, a photo you never got to see, still

     

    nubile and naughty, slated to appear on your

    100th birthday—on the cover of Le Nouvel Observateur:

    you in your birthday suit, stark naked, your 1950 ass,

     

    alluring as it surely seemed to Sartre, back when

    he might still have gotten it up

    and wooed you with those inconstant off-center eyes.

     

    Can it be that yet again they’re raving about

    The Second Sex, as if there might be still some scandal left

    to fight about or repel?

     

    I look again at those perfect buttocks,

    the muscular legs and high stiletto heels and wonder

    at the Peeping Tom who got near enough to capture

     

    you and keep the photo stashed and secret,

    so that now the feminists you spawned can rage

    about the photo and the insult to your memory,

     

    while I see you smile and think to fix the stray curl

    you were standing at that dressing room mirror

    to tease or unleash.

     

    So much for the two of us to talk about, more than this photo

    of you standing stark naked for the hidden camera. So much

    still unresolved about the old story: Power and Subordination.

     

    Fancy words you wielded with a confidence and appetite

    I never had, not when it always seemed to me so boring

    to be nattering on about Dominance and Disenfranchisement

     

    at a time — our time — when women like the two of us could do

    as we pleased, take on men at least as brilliant — or more so —

    than the two of us, my Cal coming back to me but dead almost

     

    on arrival, after wasted years with that other woman, while you,

    who knows what you were doing with all those lovers, Jean-Paul

    never jealous or possessive, thinking you too much

     

    for any other man, certain you’d leave them fainting at your feet,

    fit for a good lay or two but not for more,

    the Woman Thing always your reliable calling card.

     

    Why you came on so querulous and angry

    even at the end I can’t imagine.

    Odd, you know, I always like you best

     

    when you tell them, your disciples, that they

    all live too much in their heads, though—forgive me—

    I don’t think you’d have been much of a wife, or a mother.

     

    To Appetite

    That summer of the incessant migraine you left me,

    the meds incremental, appetite weak and depleted,

    my sleep deep, without texture or upswell,

    by day the nocturnal fog lingering, desire at best remembered.

    First to go was my craving for sweets—chocolate, mango, 

    papaya, guayaba, the cherished fruits of childhood.

    Nothing enticed. Swallowing was hard.

    All passion drained from me. And with it the pain.

    All energy—even my longing for the old ways, some

    surge of hunger—was sapped. I did not want you back.

    The famous itch that wants scratching

    was gone. I was at home at the farthest edge of Scythia,

    like Limos in her underworld, too numb to hurt.

    Really I did not pine for you. Rather, I feared your return,

    and with that terror a glimmer of something like strength

    came back: the spark of thought that awakens

    the next glimmer, and the next. From deep

    in the fog the famished brain cried for its want—

    so that at last and again the migraine came back,

    savage and tyrannical, came back, and with it, so did I

    come back to myself, desire indecent,

    resurgent, hunger insatiable, the wanting and the

    not having a torment and a gift, a restitution.

     

    Obliterated Light

                                          —to Seamus Heaney

    I am practiced in the art of goodbyes.

    You might say I do it for a living.

    No tears, no breast-beating laments,

    no wailing or gnashing of teeth: only a calm embrace,

    a soft kiss on each cheek, some simple words.

     

    But the dead, closer now, keep pressing

    —first one, then at once another, and

    another—Russell and Barry, Louise and now Pat—

    all dear to me, so that I want to ask—but won’t,

    knowing the answer—Where will it end?

     

    Today I awoke to the sound of a dog barking relentlessly.

    For hours he howled and complained

    to no avail. His suffering, so it seemed,

    was unanswerable.

     

     I was reminded of the unhappy dog

    in your exuberant English rendering

    of an African folk tale, replete with

    desperate crows, and one mischievous frog.

     

     That stupid frog you wrote about

    is still stuck in the mud, Seamus,

    and the dog, crying out in vain, still howls

    all night behind the corpse house — inconsolable — 

    for the House of Life — too late,

    too late!

     

    Last time you left us you said — was it a promise? — we’d meet

    again, that you would fly in, read us your poems, enchant

    us over dinner and walk us through the morning woods. 

    We said our goodbyes, confident of the next meeting.

     

    Of course, there are always endings, Seamus, early deaths

    and late, pointless murders, sectarian assassinations you know

    too well — but what had they to do with you?

    You, our jovial big-hearted genius. You,

    our irrepressible, earth-loving friend.

     

    In that clearing, just outside the House of Life, dear Seamus,

    the grass brims over its sacred cargo,

    and the company you keep is growing.

    The gate at the portal is open and the dog still wails at the wall;

    the new arrivals, slipping through, will be glad to see you.

    Shine your light on them.

    Tom Wolfe, Graduate Student

    I.

    Live long enough and eventually people will come calling and start asking you questions. In the last decades of his very long life — ninety years, from 1898 to 1989 — the editor and critic Malcolm Cowley became an increasingly attractive target of opportunity for a phalanx of inquiring minds who wanted to pick his brain about his personal history with the marquee names of American literature in the twentieth century. One of the dubious honors that attend literary longevity is that the survivor comes to be regarded by biographers, historians, literary scholars, graduate students, and the merely curious and intrusive as something like a vending machine of anecdotes and facts about the distinguished departed.

     They were right to do so. Cowley had been in the thick of the action in American literature ever since he arrived at Harvard from Pittsburgh in 1915 as a scholarship student. He had driven a munitions truck on the Western Front as one of the many literary-minded “gentleman volunteers”; lived in France as the first wave of American cultural pilgrims hit the Left Bank (and wrote about it, first and perhaps best, in his classic memoir Exile’s Return); replaced Edmund Wilson as the literary editor of The New Republic, a true power seat in American letters; and made himself all too visible in the 1930s when American writers took a hard left turn politically. The breadth of his associations and his significant actions as an editor and a critic — and participant, observer, and all-round tastemaker and operator — positioned Cowley as a central player in the long-delayed rise of American literature to world stature. 

    This wave of curious and importunate callers began to accelerate in earnest in the 1950s, as the 1920s and the figures of the Lost Generation that Cowley knew so well and wrote about so consistently were coming into academic and popular vogue. Cowley took on this burden diligently and for the most part with good humor, with the occasional irritated complaint registered. In 1978 he wrote to Allen Tate that “People come round here with tape recorders to tap, tap, tape my memories as if I were a National Scholarly Resource. The National Endowment for the Humanities ought to provide me with a secretary. I should be funded before being embalmed.” But Cowley’s cooperation with them, sometimes willing and sometimes weary, is testified to by his prominence in the pages — text, acknowledgments, endnotes, and indexes — of unnumbered biographies, cultural histories, literary studies, journal articles, and long unread graduate theses.

    One of the earliest of the graduate students to contact Malcolm Cowley was a twenty-one-year-old by the name of Thomas Kennerly Wolfe, who was pursuing a degree in the still new and swiftly blossoming field of American Studies at Yale. In early January 1953, he wrote to Cowley at his Sherman, Connecticut, home, hoping to arrange a meeting on the Yale campus when Cowley was scheduled to be in New Haven. For a seminar that his adviser, Professor Ralph Gabriel, was teaching on twentieth-century American intellectual history, Wolfe was writing a paper on the League of American Writers and the Writers’ Congresses that it had sponsored in the 1930s and early 1940s. As he put it in his letter, “What I want to find out is what documents, minutes, or perhaps other sources there may be, bearing on the Congress and the League of American Writers, besides the speeches that were published. I figured you might be in a good position to help me out on that.”

    Tom Wolfe — to call him by the name the world now uses — had come to exactly the right person for his purposes. Malcolm Cowley was one of the most knowledgeable people on the planet about the League of American Writers, having been from its founding a prominent, energetic, and influential member for most of its seven-year existence. But in 1953, and for some years before and after, the League of American Writers and his activities on its behalf would have been the very last thing Malcolm Cowley would have cared to talk about. His involvement with the League had been one of his most disastrous failures of political judgment in a decade full of them. It was known, even at the time, as a transparently Communist-front organization created to sway literary opinion in the direction of the Party and the Soviet Union, and his reckless visibility as a Stalinist fellow traveler on its behalf — and on behalf of more than a dozen other front organizations of lesser stature and duration — had come close to destroying his career. It had gotten him cashiered from his editorial post at The New Republic and reduced to mere contributor status, with serious financial consequences. Even worse, in 1942 he was very publicly hounded by the Dies Committee, Whittaker Chambers, and the conservative attack-dog columnist Westbrook Pegler, and forced to resign from a government position in the Office of Facts and Figures in Washington. In the postwar years, his bright pink past had a tendency to rear its head whenever he was appointed as a visiting scholar at a state college, and on occasion the job offer would vanish. By the time of Wolfe’s letter Cowley had long since retreated into a defensive crouch where politics was concerned, and he kept his opinions (straightforwardly liberal now) resolutely private.

    Wolfe did not connect with Cowley at that time, but they finally did meet two months later, when he interviewed Cowley for what would become his doctoral thesis, “The League of American Writers: Communist Organizational Activity Among American Writers, 1929-1942.” I discovered this fact in the course of writing a biographical study of Malcolm Cowley, and the connection intrigued me no end. Here was a totally unexpected name to add to Cowley’s vast association with the great and the good of twentieth-century American literature. I had followed Tom Wolfe’s exploits in journalism and latterly in fiction from his earliest days as a pioneering New Journalist in Esquire and New York magazines, usually with delight but sometimes with annoyance and even distaste. (I Am Charlotte Simmons — eeeuuuuwww.) Eager to learn just what it was Cowley and other figures, including James T. Farrell and Archibald MacLeish, told Wolfe, and also to see the kind of writer he had been in his grad student phase, I downloaded and printed out the thesis, all 355 pages of it, and began to read it. You can do this yourself, but I don’t really recommend it. 

    A very famous writer once disparaged the dissertation as “so diligently dull and puritanically objective, it’ll dry up your skin and make your teeth fall out.” That writer was Tom Wolfe. Put less colorfully, his book-length thesis demonstrates all the flaws of graduate student prose, including bland language, repetitiveness and padding that suggests the writer was being paid by the word, a graceless grinding out of core arguments that soon gets tiresome, an utter absence of humor and lightness of touch — in sum, not a single trace of the skywriting prose style and cheekiness that would in the next decade make Wolfe famous. There is a high Cold War approach to the material that feels to the contemporary reader uncomfortably adjacent to Hearst red-baiting and the HUAC hearings (from which much of the underlying evidence derives). Example: “The Expansion of the Communist Writers Apparatus: The Master Plan” is the title of Chapter Four. I could go on in this vein, as Wolfe did. 

    Yet for all its longueurs — well, it’s basically one long longueur — the portrait that Wolfe’s dissertation paints of the political credulity of the American literati and intelligentsia in the 1930s is an important one — cold, mean-spirited at times, devoid of empathy and the effort to understand, but factually accurate. Wolfe’s approach is historical rather than literary — not a single book from the 1930s is mentioned, aside from a brief take on the short-lived vogue for proletarian novels — and also heavily sociological. He discovered Max Weber in New Haven and became enamored of Weber’s theory of the central role of status in the conduct of human affairs — a discovery that would serve him well throughout his writing life. This, in his telling, is the sole motive that drives his dramatis personae, who otherwise have all the agency and internal psychology of the average lemming. The unlovely phrase “manipulable mass” is one that recurs constantly in Wolfe’s pages. And yet there was a genuine aspect of herd behavior in the way that the American literary elites of the time swung left politically after the aestheticized 1920s and put themselves in the orbit and under the influence of the American Communist party, and by extension the Soviet Union. 

      The story that Wolfe tells in his dissertation is narrowly focused on means rather than ends. Marxist ideology and politics in general interested him very little. He announces up front that “this is an organizational study and does not deal with the history or the influence of ideas per se.” Instead he concentrates single-mindedly on “the role of the Communist party (in all countries) as an agency of social control” — one gets the feeling that Wolfe at heart rather admires this role for its theoretical discipline and technical expertise.

    In the first half of the 1930s the approach of the Communists towards writers and intellectuals applied a one-size-fits-all strategy that stressed conformity to, literally, the Party line above all else. In fact, many American writers had in some sense met the Communists more than halfway. A group of prominent writers calling themselves the “League of Professional Groups for Foster and Ford” — including Cowley, Edmund Wilson, Sherwood Anderson, Erskine Caldwell, John Dos Passos, Sidney Hook, and Lincoln Steffens — had organized on behalf of the Communist candidates for president and vice-president in the election of 1932, scanting the pre-New Deal Roosevelt and even the Socialist candidate Norman Thomas. They generated considerably more ink than votes.

    The Party’s main strategy regarding the literati in those years was the establishment of a nationwide network of John Reed Clubs, which gave economically beset younger writers in need of camaraderie places to meet and share tips for advancement and survival and, of course, to strategize, with Communist guidance, about the downfall of capitalism and the fight against fascism. Many of these clubs had their own literary magazines, shoestring operations with names like Leftward, The Hammer, Red Pen, Cauldron, and most resonantly and lastingly, Partisan Review. Their circulations were tiny, but they gave their writers a place to see their work published at a time when they were shut out from the established magazines and publishing houses. The clubs were in one sense farm teams meant to develop a new generation of ideologically-trained writers for the bright revolutionary future just up ahead. 

    The Communists viewed “The Professional Writing Class of the United States as a Manipulable Mass,” as Wolfe ungracefully puts it, because they were by and large concentrated in urban areas, especially in New York, making the goal of taking “effective control of the literary style of life” a practical one. Their putatively anomic and isolated way of living meant that writers were vulnerable, in Wolfe’s account, to the “status structure of the writing craft,” and the prefabricated social and professional life that the Communist orbit offered. Thus, he asserts, the Party was able to apply “the economically non-rational goal of literary prestige” to organize writers of even mildly leftist persuasion into a near-bureaucratic faction of considerable usefulness. In some of the most po-faced sentences Wolfe ever wrote, he zeroes in on the “bizarre side of party-situations prevalent within the New York literary community” and their “extraordinary degrees of intoxication, sexual promiscuity, and exhibitionistic garrulity.” Writers! They weren’t just getting loaded and laid at cocktail parties as normal people do. They were “acting out the value tensions which existed between the literary community and the surrounding spheres of influence and which insured the integrity of the literary style of life.”

    In any case, the word “prestige” is key to the change of Communist strategy in this regard that began in earnest in 1935. The Party had discovered “the utility of sheer prestige” in the literary and cultural realms, which meant a shift in emphasis towards established writers with attention-grabbing names rather than the tyros in the John Reed Clubs, which were swiftly and cold-heartedly dissolved, along with their magazines. In their place Alexander Trachtenberg, the head of the Communist publishing house, acting under orders from Moscow, began to lay the groundwork for something called the League of American Writers, the equivalent in the United States of the Union of Soviet Writers. His first action was to announce an American Writers’ Congress to be held in New York City in April of 1935, a conclave consciously modeled on the Soviet Writers’ Congress held eight months earlier in Moscow.

    Wolfe does not mention this, but the call for the Congress ran in The New Masses in January: “A new renaissance is upon the world… The revolutionary spirit is penetrating the ranks of the creative writers…” It was signed by sixty-two writers of considerable clout, including Theodore Dreiser, Dos Passos, Farrell, and Cowley, as well as Trachtenberg and Earl Browder, then secretary of the Communist Party. The Congress took place at the Mecca Temple on West 55th Street, today called New York City Center, and was a great success, attracting more than 4,000 attendees who heard stirring speeches by the assembled literary celebrities (and Earl Browder). Wolfe’s claim that some of the non-Communist members of the organizing committee were not fully aware of the Party’s control of the proceedings rests on his interview with Cowley; if true, it demonstrates an exceptional blindness or naivete.

    At the final session of the Congress the climactic event was the formation of the League of American Writers, with an executive committee chaired by the novelist Waldo Frank and including such figures as the Marxist bullyboy Mike Gold, Joseph Freeman, and Granville Hicks (all three of them Party members), as well as Cowley and his close friends Kenneth Burke and Matthew Josephson. The League and the many smaller satellite organizations, sometimes ad hoc and usually short-lived, provided the institutional structure for the Communist Party to begin to rationalize and bureaucratize its use of prestige as a tool of influence. (Wolfe’s organizational language in his thesis is catching, as you can see…) The timing, whether by accident or design, worked to the Party’s advantage, because in the late summer of 1935 the Comintern announced its Popular Front policy, which involved a retreat from revolutionary sectarianism to a new, big-tent, we’re-all-in-this-together strategy against fascism abroad and against hunger and want at home. The cunning new ethos was expressed by Earl Browder’s infamous but effective slogan coined around this time, “Communism is Twentieth-Century Americanism.” The façade of the Popular Front made it far easier for writers who might otherwise have kept a careful distance from Communist causes to sign on to such organizations and activities with a clear conscience — thus becoming, in Wolfe’s view, which was correct, part of an ever-widening “manipulable mass” of the literati cunningly controlled by the Party’s puppet masters. Or, in Wolfe-speak, “the League became an effective instrument for the political mobilization of the New York literary community.”

    The outbreak of the Spanish Civil War in 1936 provided just the right cause with which to energize and accelerate the Popular Front. The Loyalists’ struggle against the Nazi-supported insurgents, and anti-fascism in all its international and domestic manifestations, became the binding agent for a wide array of writers who might otherwise as usual have acted upon the narcissism of small differences. (The derisive Communist term for such people was “stooges.”) Undoubtedly the high point of the League’s visibility occurred in 1937 at the opening session of the second American Writers’ Congress, when Ernest Hemingway, newly arrived from his journalistic exploits in Madrid and terrified of public speaking, gave a speech to a packed Carnegie Hall in support of the Spanish Republic and against Franco’s Fascist insurgency.

    For a while the LAW’s mobilization of the means and the markers of literary status succeeded in keeping a majority of the country’s best-regarded writers “on the left.” Good things could happen in your career if you kept in line; a great deal of punitive abuse and sometimes a diminished status could occur if you did not. Your politics and your social life could align nicely, or as Wolfe puts it in another ludicrous sentence, the intoxication of so-called “cause parties” became a “state which Communist organizers found most advantageous, a half-chemical, half-psychological state in which the conventional cocktail party symbols of literary status would be indistinguishable from the political symbols and ideology embodied in the cause and the active commitment of the money contribution.” Arthur Koestler, meet Robert Benchley. (And, later and infamously, Leonard and Felicia Bernstein.) 

    It was all downhill from there, however. The Party may or may not have been, as Wolfe claims, “on the threshold of creating a monolithic writer-bureaucracy” — that seems badly overstated to me — but the Nazi-Soviet Non-Aggression Pact of August 1939 and the LAW’s hypocritical and incoherent support of it spelled the organization’s doom. The naked fact of its Soviet control could no longer be ignored, and a mass resignation of its most glittering names, including Thomas Mann, Archibald MacLeish, and Malcolm Cowley, followed, leaving in their wake only the Party members, like the League’s president, the screenwriter Donald Ogden Stewart; the most pliant fellow travelers; and a cohort of commercial writers distinctly lacking in literary status. The League of American Writers limped along for the next couple of years until 1942, when it was not disbanded so much as abandoned like a dilapidated house.

    Anyone who might read Wolfe’s arid dissertation without prior knowledge of the human drama of the American literary left in the 1930s and its individual players would come away with no sense whatsoever of the political excitement, hopeful exultation, and moral anguish of those years. Or of the vast catastrophe of the Depression, one so overriding that capitalism seemed not just rocked back on its heels but comprehensively failed. For that you have to turn to books such as Daniel Aaron’s classic and still definitive Writers on the Left and Malcolm Cowley’s own too-delayed memoir, The Dream of the Golden Mountains. (It did not appear until 1981.) The Communist Party (and Tom Wolfe) might have seen the American writers caught up in all this turmoil as a manipulable and faceless mass, but each one of them had to decide in the privacy of their conscience, as the language of the time had it, which side they were on. There is a rich and somewhat tortured literature on this subject, but Wolfe’s dissertation fails to cite a single instance of it, nor does it even mention any other highly pertinent works of literature published in those years — not The Grapes of Wrath, not “September 1, 1939,” not Waiting for Lefty, not U.S.A. He must have been aware of such works; twenty years later he would write that the Depression “stimulated the great phase of social realism in the American novel.” 

    Instead Wolfe’s relentless focus is on the sociological and even Pavlovian uses of status hunger as a tool for the control of even — or maybe I mean especially — the most culturally and intellectually developed members of a society. In this aspect alone does the dissertation foreshadow the role of social philosopher of status in human affairs to which Wolfe would later appoint himself — the village explainer of the peculiar folkways of the island of Manhattan. See his relatively early and still amusingly on-target “The Big-League Complex,” published in the New York Herald Tribune in 1964, on “the pleasures of status, the impulse of status striving in the status capital of the United States, if not the whole hulking world,” for the details.

     Although, come to think of it, the contempt for writers and intellectuals that runs just below the surface of the thesis also foreshadows the goad and scold that he would become and his conservative — even reactionary — political bent. Decades before the phrase was coined, Wolfe made a specialty of “owning the libs,” most famously in his viciously effective takedown “Radical Chic.” His receiver was tuned very finely to the contradictions and hypocrisies of the left, and he delighted in skewering its most visible figures. 

    So is Wolfe’s thesis just another dreary Ph.D. dissertation that deserves to be consigned to an intellectual landfill with many thousands of others? What nagged at me was that I could not quell my curiosity about how a writer who would become such a keen observer of the human spectacle managed, early in his career, to notice not a bit of it while writing about the 1930s. So I dug deeper into just what kind of person Tom Wolfe was during his five years in New Haven, including a sojourn in his archives at the New York Public Library. And the more I gleaned about the circumstances under which his Ph.D. thesis was produced, and the more I looked into the personality, the writer, and, yes, the thinker he was, the more telling the whole episode became. I came away with a richer, more complex, and altogether more proleptic sense of who Tom Wolfe, Graduate Student, was.

                   

    II.

    In his later years Tom Wolfe did not look back fondly on his time in graduate school. “I’m not sure I can give you the remotest idea of what graduate school is like,” he wrote in 1973. “Nobody ever has.” Still, he tried: “Try to imagine the worst part of an Antonioni movie you ever saw, or reading Mr. Sammler’s Planet at one sitting, or just reading it. . . . That will give you the general idea.” There is the telltale Wolfean anti-intellectualism. Paeans to the joys of graduate school are nonexistent, but still, something in it triggered an impulse towards impressive, even mildly astonishing productivity, judging from the considerable handwritten and typed record that he left behind. If he really was that miserable, he was the hardest-working miserabilist in the history of graduate education. A good guess as to his motivation might be an impulse to show those effete, snobby Northeasterners just what he could do.

    Wolfe came from Richmond, Virginia, and had deep Southern roots. His grandfather fought for the Confederacy, and his father, an agronomist and the editor of The Southern Planter, was an intellectual rooted in the soil. Wolfe retained the gentle manners of his class for his entire life; pretty much every profile of him ever written begins by noting the contrast between his wildly idiosyncratic prose style and bomb-throwing persona and his soft-spoken demeanor. This was a handicap in New Haven: “It was the one time in my life I was really stuck. I couldn’t stand out because everybody was eccentric in graduate school. They had everything from genuine dirty-neck Bohemians to true British fops.” An Eisenhower supporter in the land of Adlai Stevenson, he was reduced to wearing a black turtleneck and growing his hair long (for 1956, that is) to make some sort of impression. The bespoke white suits came much later.

    Wolfe attended Washington and Lee University in Lexington, Virginia, where he studied English and was a good enough baseball pitcher (big hands) to have had a cup of coffee — a tryout — with the New York Giants. But while he had a slider, a screwball, a sinker, and a variety of curves, he lacked a fastball (“my tragic flaw,” he remembered), so he was cut after three days. Instead, under the influence of his American culture professor Marshall Fishwick, he applied to the American Studies program at Yale and was accepted. 

    In 1951 American Studies was a fairly new interdisciplinary program in academia, but it was aligned in the highest degree with the intellectual spirit of a country newly aware of itself as a superpower. It came of age in the 1940s and 1950s as the product of an emerging Cold War consensus that sought to study and to bolster “‘the American way of life,’ which was rooted in liberal individualism, pragmatism, suspicion of rigid ideologies, democratic self-governance, and capitalism,” as the historian Greg Barnhisel has written. Unlike the New Criticism-dominated English departments, which focused single-mindedly on the text, and the text alone, of a work of literature, American Studies scholars brought the perspectives of history, psychology, sociology, anthropology, folklore studies, political science, and other disciplines to bear in studying American civilization, which, by and large, the field viewed as a highly positive thing, for the country itself and for the world. It was not a discipline congenial to scholars of an anti-American bent. It was designed to bring close attention and, by extension, stature to the artifacts of American culture, and to our literature above all.

    Perhaps the most energetic figure in this nexus of academic study and cultural warfare was the enigmatic and altogether fascinating Yale professor Norman Holmes Pearson. According to Barnhisel’s revelatory biography Code Name Puritan, Pearson earned one of the first doctorates in American Studies and was the program’s first undergraduate director at Yale. But just as importantly he was a man of parts, some of which do not fit together very comfortably. He was deeply involved in modernist poetry, becoming a close friend, career advisor, and something like an unpaid literary agent for both Hilda Doolittle and Bryher. He moved adeptly in the highest altitudes of the American poetry establishment, corresponding and collaborating with T. S. Eliot, Ezra Pound, and William Carlos Williams. He was, as one of Barnhisel’s chapter titles has it, “a Networker,” and a very skilled one. 

    There was something else of compelling interest about Pearson. He had been one of the most effective American spymasters of World War II. A serious hip infection of long standing had rendered him semi-crippled and unfit for military service, but in the Office of Strategic Services in London he applied his scholarly habits of mind to gleaning counterintelligence about sleeper agents left behind in the wake of the German retreat in Europe and rolling up their networks. His main tools in this endeavor were index cards on suspected spies, which he fanatically kept accurate and up to date. He was awarded the Medal of Freedom for his exploits. After the war he was briefly involved in planning for the new Central Intelligence Agency that would replace the OSS, but he then returned to his teaching post at Yale. 

    But his services to his country were not over. Yale in the 1950s was infamously a feeder school of high-born, highbrow recruits for the Company — William F. Buckley, Jr., and Peter Matthiessen, for example — and Pearson was instrumental for some years in this effort. It was hardly coincidental that James Jesus Angleton, the ultra-literary head of counterintelligence for the CIA and an intimate of T.S. Eliot, had been Pearson’s student at Yale before the war. And Tom Wolfe was another one of Pearson’s students. There is no evidence, alas, that Pearson tried to recruit him for the Company, but his influence on the young graduate student was profound. 

    Pearson was a riveting showman in the classroom; none of his students ever nodded off. In a warm memorial tribute to him in 1975, Wolfe wrote that “he was the most superbly theatrical teacher I have ever seen,” going on to an extended description of a mesmerizing piece of business with a cigarette Pearson pulled off while describing how a police reporter for the Kansas City Star turned himself into Ernest Hemingway, revolutionizer of American prose. This eulogy is, as far as I know, unique in the Wolfean canon, in that it is completely free of the least trace of irony or sarcasm or hyperbole or stylistic capers of any kind, let alone the snide tone that was his default register. Here he was transparent and sincere. He clearly meant every word of it, and the words were deeply affectionate. 

    Wolfe spent so much time and energy mocking and attacking other writers, particularly his contemporaries in American fiction — his nadir in this department was his broadside in 2000 against John Updike, Norman Mailer, and John Irving, which he charmingly called “My Three Stooges” — that it is hard to remember how hyper-literary he really was. His manifesto “The New Journalism,” from 1973, name-drops everyone from Balzac, Beckett, and Borges to Zamyatin and Zola, stopping off at Cervantes, Dickens, Defoe, Flaubert, Rabelais, Smollett, Sterne, and a dozen or so other big shots along the way. He pretty clearly had read them all, and he was highly aware of, in Mailer’s phrase, “the talent in the room.” 

    Wolfe’s papers from his graduate school days show that this hyper-literary bent arrived very early. He enrolled in 1952 in Pearson’s popular graduate seminar on Ezra Pound’s Cantos, one of the densest and most arcane long poems in the modern canon. As a service to Pound and his publisher, each student’s term project was to annotate all of the references in a single canto. Wolfe got the bit between his teeth and produced 117 dense pages of glosses for Canto XLVII’s 109 lines, tracking down allusions to the Odyssey, the Metamorphoses, Works and Days, Virgil, Theocritus, figures such as Isis and Osiris from Egyptian mythology, and much, much more. In those pre-search-engine days the paper must have required interminable hours in the stacks, and reference books stacked high on either side of the young man’s carrel. In comparison, two other students produced annotations of other Cantos in two pages each. 

    This paper was by no means an outlier. Wolfe’s original idea for a thesis subject was “a study of Thoreau’s prose style as it relates to the formal values assumed by the Concord group as an esoteric literary school.” In his archives is an entire folder of a hand-drawn Excel Sheet-style table of Thoreau’s rhetorical figures and strategies in Walden and other works — the compilation is of almost alarming density and obsessiveness. He not only read Melville’s virtually unreadable Mardi (try it some time), he also produced a long paper titled “An Annotation of the Quest Allegory in Mardi” that hunted down the book’s numerous allusions and borrowings from, among other sources, Rabelais’ Gargantua and Pantagruel. Who knew? There are papers on “Hemingway’s Use of the Mystic And in The Old Man and the Sea” (864 times in its 193 pages!), “On Allusions in Eliot’s Marina” (“If we may regard Marina as a beatitude of the irrational…”) and, tellingly, one on H. L. Mencken, clearly one of his later career models, which predicted that he “will be considered the top American humorist since Mark Twain.” 

    There are also papers produced for his sociology classes that mimic the dreadful vocabulary and style of so much of the discipline without a trace of irony. There are notes from those sociology classes so heroically detailed and organized — his note-taking everywhere in his archives is nothing short of astounding — that to read them is to experience the class as if on fast forward. These written artifacts are the products of an ambitious overachiever with Stakhanovite energies more than comfortable in the world of literary arcana and disciplinary jargon, not a grad school drudge in the slough of despond. 

    The youngish Tom Wolfe also tried on other forms and styles outside of his academic work. He made a stab at short fiction with a stylistically unusual if less than fully achieved story about a bunch of American GIs on a tear in Seoul during the Korean War, titled “Goddamn Frozen Chosen.” There is a cycle of brief poems from 1956 under the rubric “Jocko Thor’s Keyhole Autobiography” that some observers think was a parody of Beat poetry; they read to me rather as a mash-up of E.E. Cummings and William Carlos Williams. Most intriguing to me was a small Champion notebook: the notes inside were indecipherable, but on the cover in Wolfe’s baroque and instantly recognizable hand is written “Santa Barranza,” one of his signature apostrophes. How he latched on to the phrase is unknown, but he liked it enough to deploy it numerous times. 

    Wolfe’s adviser for his thesis was the distinguished American historian Ralph Gabriel, who was one of the founders of Yale’s American Studies program and had been its chairman. It is more than a little suggestive that in 1950 Gabriel resigned that position when Yale accepted a donation of half a million dollars from the business magnate and fervent anti-Communist William Robertson Coe. The gift came with the proviso that “the Professor to head the Program of American Studies shall always be one who firmly believes in the preservation of our system of Free Enterprise and is opposed to the system of State Socialism, Communism and Totalitarianism.” (A year later Gabriel was twitted for his insufficient fervor for American capitalism in William F. Buckley’s career-launching God and Man at Yale.) 

     In any case, Wolfe’s original 1953 paper for Gabriel is in the archives and perusing it is a revelation. Unlike his eventual dissertation, this paper is a genuine, if modified, pleasure to read. The writer who would become Tom Wolfe is already in there, scratching to get out. An example: “In fact, it might well be asked to just what extent did the votaries of the Communist Party’s three cultural renaissances, the ex-aesthetes, the willowy liberals, the mooncalves [!] with their bright images of a Soviet Russia, their sullied image of a tired capitalism, fascism, war — to just what degree did they differ from the ‘Lost Generation’?” Wolfe’s conclusion was that “the two generations could not have been more alike.” Well, excuse me, but Santa Barranza! Where had this stylish person been hiding? Wolfe throws off nifty little phrases such as “the hour of liberal frolic was fun, but short” (in reference to The New Masses) and “that awfully serious, rather preposterously earnest pose which one naturally acquired in the 1930s.” 

    Professor Gabriel must have liked what he read since he approved Wolfe’s thesis subject. Wolfe then proceeded to write a draft of this dissertation that, like his initial paper, differed considerably, even radically, from what he eventually produced. This first draft is present only in snatches in the archives, but what is there shows that he proceeded to address his subject in the same cheeky, somewhat personal (as opposed to coldly objective) and distinctly un-grad-student-like manner. In one folder I discovered and became unreasonably excited by ten pages of typescript titled “Beaux Arts on the Barricades,” probably a draft chapter on the socially conscious artists of the 1930s such as David Siqueiros, Rockwell Kent, Stuart Davis, and Diego Rivera. It is quite a bravura performance: “Legions of the less famous artists were clogging the stale corridors and second-story flats of lower Manhattan, where, grimly, vicariously, with veins popping out of their necks, they spent the decade arguing those issues so intimate to them all: Communism, capitalism (and its overthrow), fascism (and how to fight it), the United Front, the Hearst papers, the Scottsboro Boys, the Lincoln Battalion, Father Coughlin, Trotskyism, Kulaks, breadlines, and the Chinese Kuomintang.” I thought, here are the birth pangs of The Painted Word and From Bauhaus to Our House, and it needs to be published! Sadly, the piece breaks off mid-sentence and was either lost or never finished. But this was the vein that Wolfe was writing in as he made his first stab at his dissertation, and it was blazingly premonitory of what was to come. 

    When he handed the work in, however, it turned out to be an academic disaster. The three anonymous examiners were not pleased, and the comments in their reports were scathing: “He has written a piece of polemical journalism in which he offers too many assertions that are not supported by the evidence.” “He has however oversimplified the issues in reference to those who were decoyed; and weakened the persuasion of his arguments.” “Wolfe’s polemical rhetoric is . . . a chief consideration of my decision to fail the dissertation in its present form. The consequences color every page.” Another premonition of the Wolfe to come! This last examiner concluded that “Wolfe has brought together a synthesis of evidence which might have been a most valuable contribution (and still could) had he not yielded to polemics and a manner quite unsuited to the requirements of a scholarly dissertation.” 

    Of equal or perhaps even greater seriousness than these objections about style was evidence that Wolfe had been less than scrupulous — sloppy, really — with his sources. As David Potter, the American Studies graduate supervisor, wrote in a summary, all three of the examiners lodged “the criticism that you misused your sources, giving incorrect quotations, misstating evidence, etc. All three readers checked various sources (a routine duty of readers) and all three made this criticism.” In a file, there is a long and detailed list of specific instances of Wolfe’s scholarly lapses in this regard, and they prove without a doubt that this charge was both true and extensive. 

     Wolfe-watchers will immediately recognize that the above criticisms echo many of the attacks that were lodged against him and his New Journalistic practices by his adversaries over the years. In any case, Potter wrote Wolfe a more-in-sorrow letter sharing the bad news about the reports and telling him that he was not going to be recommended for a degree. This would have been devastating to the ambitious and still youngish writer, of course. But at the end he was thrown a bone: there were still enough positive aspects to his first attempt, specifically “that you write very skillfully, that you organize well, and that you have an excellent case that the literati were indeed manipulated by the communists.” He would still have the opportunity to rewrite his thesis in light of these responses, and it might be accepted in a new and improved form. 

     This was even worse than being cut from the Giants. A different writer might have given the whole business up as a bad job and found another avenue of endeavor, but not the dogged young Wolfe. He wrote, angrily and more than a little self-servingly, to a friend that “these stupid fucks have turned down my dissertation, meaning I will have to stay here about a month to delete all the offensive passages and retype the sumbitch. They called my brilliant manuscript ‘journalistic’ and ‘reactionary,’ which means I must go through with a blue pencil and strike out all the laughs and anti-Red passages and slip in a little liberal merde, so to speak, just to sweeten it.” None of the readers’ reports get anywhere near the word “reactionary,” and no one can reasonably point to anything in the revised dissertation resembling “liberal merde.” Wolfe is imputing a political bias to his academic naysayers that simply did not exist. 

    Wolfe would find the succor and assistance he needed in the person of Norman Holmes Pearson, whom Barnhisel calls “the misfit-collecting” professor. Pearson was more than familiar with the figures Wolfe had interviewed, including Cowley; he was fully simpatico with Wolfe’s anti-Communist stance; and as one of the biggest dogs in the American Studies department, his voice would be crucial when the time came for Wolfe’s new dissertation to be judged anew. Within a year Wolfe would be awarded his Ph.D., and after a six-month period of literary-minded slumming he began his journalistic career as a reporter for the Springfield Republican. He stayed in contact with Pearson until the older man’s death in 1975; Pearson’s lifelong scholarly project, an edition of his fellow Puritan Nathaniel Hawthorne’s complete correspondence, remained uncompleted. 

    The whole extended episode of Tom Wolfe’s time at Yale and the painful rejection of his initial dissertation maps, however imperfectly, onto his subsequent career and especially on to his — for me dubious — reputation as a conservative firebrand and hit man. There is no question that Wolfe’s resentments ran deep and were permanent. In context we can see that his revision process was really a calculated act of self-erasure. He cold-bloodedly expunged every trace of his unmistakable individual style — the wit and the high spirits and the verbal virtuosity — and substituted a cold and distant objectivity that he thought would please his faculty critics. It must have been a galling process, accomplished in anger and with gritted teeth. 

    He never forgave having had to do it, and his gentlemanly demeanor obscured a considerable chip on his shoulder. As Niall Ferguson says in the documentary film Radical Wolfe, “That is a kind of searing experience, and I think it explains the vendetta that Wolfe pursues against the intellectual left.” This is certainly true. Wolfe’s favorite targets were the people who used to be called “limousine liberals,” the well-heeled, highly credentialed, bien pensant denizens of elite universities and think tanks and publishing houses and urban salons whose loudly expressed sympathies for the downtrodden sailed right over the ordinary people Wolfe used to call the “raw vital proles.” There had to have been a lot of payback in that. 

    And yet, while Tom Wolfe may have written about the most colorful of those raw vital proles, he certainly never consorted or socialized with them. Did he really have anything that can accurately be called a politics? He may have had a perfectly reasonable distaste for Communism, but if he ever took a position in favor of, say, the Vietnam War, it has escaped me. He certainly never endorsed a candidate of either party. He preferred, rather, Ali-like, to float like a butterfly above the human comedy, politics included, and from time to time to sting like a particularly lethal bee when something struck him as irresistibly amusing, hypocritical, ridiculous, or all three at once. He was at heart an eighteenth-century satirical moralist of a Southern cultural bent, but never the Manchurian Candidate of journalism launched from Cold War-era Yale at the heart of liberalism that some people accused him of being. 

    There is no question, though, that the form of American Studies in which he engaged at Yale profoundly influenced the other kind of American Studies that his entire career in nonfiction and fiction alike represented. You don’t have to love Tom Wolfe to recognize that he was one of the major figures in American writing in the postwar years — and an uncommonly interesting one at that. As far as I know, though, no one seems to be working at the moment on the big biography that he requires and deserves — and we live in a world more than fully supplied with biographers. Time for someone to get cracking. The man cast floods of light on his time.

    “Orthodox in Nothing”: The Saga of Bernard Lazare

    In Notre Jeunesse, or Our Youth, his memoir of the Dreyfus Affair, Charles Péguy, the poet of Catholic France’s salvific mission, wrote that “the prophet in this great crisis of Israel and the world was Bernard Lazare. Let us salute here one of the greatest names of modern times and . . . one of the greatest among the prophets of Israel.” If, for Péguy, the Dreyfus Affair was proof that in the republic “everything begins in the mystical and ends in the political,” Lazare was one of the few to withstand this degradation of a noble cause. He said of Lazare that “he undeniably had elements of a saint, of saintliness. And when I speak of a saint, I shouldn’t be suspected of speaking metaphorically. . . . He was a prophet. It was thus only right that he was buried prematurely in silence and oblivion.” 

    Lazare earned these exalted words of praise from his fellow Dreyfusard by being the first person outside Alfred Dreyfus’ family to come to his defense after he was found guilty of treason in 1894. If other names, more famous ones such as that of Émile Zola, come to mind when we think of Dreyfus’ defenders, that is more a reflection of the vagaries of history and memory than of the truth of events. And Lazare was more than just a noble controversialist. He was also a Jewish thinker who, after perversely explaining antisemitism as the fault of Jews, created a form of radicalism that viewed the Torah as a foundational revolutionary document. A saint with a strange trajectory.

     

    Lazare was born Lazare Marcus Manassé Bernard in Nîmes on June 14, 1865. On his mother’s side his family had resided in that city for several centuries; one of his maternal ancestors was said to have been a silversmith in service to the pope in Avignon in the fourteenth century. His father, a successful tailor, came from a family that settled in the South of France in the late eighteenth century after leaving its native Lorraine region. Lazare, after completing his high school studies in 1882, chose to educate himself in literature. After leading a literary society in Nîmes, he “ascended,” as they say in French, to Paris in 1886.

    Writing as Bernard-Lazare, he published symbolist tales, a collection of which was reviewed by Anatole France. France wrote floridly of Lazare that “there is in him the Asiatic and the African. His style has the thorny richness, the thick and heavy abundance, the enormous flower, the bitter juice of the vegetation of the torrid zone.” France concluded on a note that never had the chance to be proven true: “He is precise and firm in his thinking, and he expresses himself with verve. He will be totally to my taste when age will have given him gentleness and a tad of indulgence.” Lazare’s early death saved him from gentleness and indulgence, and the fire France saw in him never dimmed. 

    Lazare was militantly apolitical at the beginning of his writing life, but he soon became an outspoken anarchist. He explained his commitment in a book published in 1895 by his friend, the anarchist writer Augustin Hamon, called Psychologie de l’anarchiste-socialiste. Lazare’s contribution to the volume was a veritable profession of faith. “The value of authority is something I could never understand . . . . For me, that a man could arrogate to himself the right to in any way dominate his kind was, and still is, something inconceivable. In my eyes, nothing was and is as dishonoring as obedience, that is, the partial annihilation of the individual.” He was, by his nature, a purist, a dissenter, an agitator, an almost naive sort of moralist. He continued to write literary fables, but now with a political point. Lazare would publish articles in a variety of anarchist journals, coming in contact with leading figures of the movement such as Sébastien Faure and Zo d’Axa. He also established a number of journals of his own, editing the impressive though little-read Entretiens politiques et littéraires, which had a total of eighty paid subscribers. (But here we are, remembering it!) 

    Though some have questioned the sincerity of Lazare’s anarchism, he was serious enough in his beliefs to have the police trailing him and reporting on his doings. A police report on one of his speeches quoted Lazare proclaiming, “Yes, we want no more armies, no more fatherlands, and our acts of revolt are acts that are just.” Lazare found much to admire in the act of Auguste Vaillant, who threw a bomb into the Chamber of Deputies on December 9, 1893. This violence became known as the “propaganda of the deed,” a concept with a long and highly destructive career in modern politics. Writing to the Dutch socialist Ferdinand Domela Nieuwenhuis, he praised Vaillant for his courage and spoke of how “the blood of sacrifices is a fertile rain; it causes the good wheat to germinate even in the souls of the indifferent; it prepares the harvests of the future.” The panic that accompanied the wave of bombings in the early 1890s resulted in a series of laws known to history as the lois scélérates, which effectively banned anarchist propaganda. Fearing for his freedom, Lazare briefly fled to Brussels. He returned home and defended in the press his friend Félix Fénéon, an art critic accused of planting a bomb in a Paris café that cost its one victim an eye. Fénéon was found not guilty. Lazare’s activities during this period earned him a reputation as a man willing to defend the unjustly accused, a reputation he would soon confirm.

     

    In 1892 Lazare took to the pages of the journal he edited, Entretiens politiques et littéraires, and published the first of a series of articles that he would later expand into what remains his most controversial book, L’Antisémitisme, son histoire et ses causes (Antisemitism, Its History and Causes). The book marks the confluence of two currents that were on the rise in France at the time, antisemitism and anti-capitalism. Lazare’s book is deeply and unapologetically anti-capitalist. This is hardly surprising in the work of an anarchist. But anti-capitalism had many faces in fin-de-siècle France, and not only the radical left embraced it. Opponents of modernity hated capitalism for destroying traditional ways. And hatred of the Jew as the beneficiary and the face of capital was where left and right met. 

    What, for Lazare, was the explanation for antisemitism’s ubiquity, its seemingly eternal nature? “If this hostility,” Lazare wrote, “this repugnance had been shown towards the Jews at one time or in one country only, it would be easy to account for the local causes of this sentiment.” But this is not the case. For the Jews, or, as he calls them, resorting to the concepts of the time, “this race” (though elsewhere in the book he calls the notion of race “a fiction”), “has been the object of hatred with all the nations amidst whom it ever settled.” Those who have hated the Jews over the millennia have come from every class, every nation, virtually every corner of the world, “ruled by different laws and governed by opposing principles.” Their own ways of viewing the world are radically opposed, they worship God in different ways, and yet they all hate the Jews. There can only be one explanation according to Lazare: “It must be that the general causes of antisemitism have always resided in Israel itself, and not in those who have antagonized it.” 

    The outrageousness of Lazare’s assertion is hardly mitigated by what follows this disturbing diagnosis. “This does not mean that justice was always on the side of Israel’s persecutors, or that they did not indulge in all the extremes born of hatred; it is merely asserted that the Jews were themselves, in part, at least, the cause of their own ills.” Lazare’s lapidary explanation for how they are the cause of their own ills was a simple one: “Until today and everywhere, the Jew has been an asocial being.” He is asocial because unlike other conquered peoples, the Jew has refused to accept any master but the Law: the Torah. 

    The Jews’ separateness was religious and political, for wherever they settled their sole desire was to be able to follow their own religion and be ruled by their own laws. They thus always formed a state within the state. (The phrase became a platitude of modern antisemitism.) In Lazare’s telling, the results of this were rights and privileges that differentiated them from the rest of the nations in which they lived and which excited the jealousy of those around them. “Thus, Israel’s attachment to its law was one of the first causes of its unpopularity, whether because it derived benefits and advantages from that law which were apt to excite envy, or because it prided itself upon the excellence of its Torah and considered itself above and beyond other peoples.” 

    Antisemitism, in Lazare’s account, will not last forever, and in fact, thanks to its condemnation of capitalism, antisemitism bears within it the seeds of its own destruction. Lazare bizarrely saw something positive in Édouard Drumont, France’s leading antisemite, the editor of the daily La Libre Parole and the author of the two-volume best-seller La France juive. Lazare wrote that in attacking Jewish capitalists, the antisemites are actually opening the way for socialism! Antisemites such as Drumont “stir up the middle class, the petit-bourgeoisie, and sometimes the peasantry against Jewish capitalists, but in this way gently [lead] them to socialism, [prepare] them for anarchy, [lead] them to the hatred of capitalists and above all of capital.” The revolution that will inevitably arrive will be carried out by forces that will include Jews, for the Jews are, by their very nature, a revolutionary people. Jews naturally “seek justice” and their revolutionism has its basis “in their concept of life and death,” their insistence on the here and now and not the afterlife. The coming revolution will see the Jews as actors, for they are “a revolutionary ferment in society.” Antisemitism “works at eliminating not only the economic causes [of antisemitism], but also the religious and national causes that engendered it and which will disappear with the society of today of which they are the products.” Antisemitism, in the view of the pre-Dreyfus Lazare, is an ally — an unpleasant ally, but an ally nonetheless — in the revolutionary struggle. 

    Antisemitism will end not only with the end of the antisemitic movement, Lazare continues, but also with the fading of Jewish religious belief and practice. With general social trends moving in a secular direction, “the prejudice against the Jew, a prejudice as persistent as the Catholic prejudice against the Protestant, and the Jew against the Christian, cannot be the only one to continue. It is diminishing in intensity and they will doubtless soon no longer hold the Israelite responsible for Jesus’ suffering.” And this is not all. National prejudices, built on religious foundations, are also headed for extinction. “National particularism and egoism, however strong they might still be, are showing signs of decay.” (This, of course, was the standard Marxist assurance about the inevitable disappearance of nationalism, which was a supremely bourgeois distraction from the class struggle.) Foreigners are no longer hated, humanity is tearing down barriers, and antisemitism is doomed. It will perish “because the Jew is being transformed, because the religious, political, social and economic conditions are changing, but above all, it will perish because it is one of the final and persistent manifestations of the old spirit of reaction and narrow conservatism which vainly attempts to halt the revolutionary evolution.”

    Lazare had no kind words for rabbis, those voices of reaction, but the Jewish religion itself possessed a “revolutionary spirit” that, given the antisemitic conditions in which Jews lived, also “constituted the Jewish fatherland.” Limned in Antisemitism, Its History and Causes is a historiosophical vision, a version of religion that was not just a source of sustenance for Jews as a people but also for Jews as a revolutionary force. The Torah served as a revolutionary manifesto for Lazare. Drawing from God’s words that “For unto me the children of Israel are servants,” Lazare asks “what authority can, then, prevail by the side of the divine authority?” Here is the revolutionary core of Judaism: “All government, whatever it may be, is evil, since it tends to take the place of the government of God.” Writing at the time of the propagandists of the deed, the anarchist Lazare declared that earthly powers “must be fought, because Yahweh is the only head of the Jewish commonwealth, the only one to whom the Israelite owes obedience.” Through the centuries of oppression, this was the essence of Judaism that kept the Jews going. “The funeral pyres, the massacres, the carnage, the insults, everything contributed to make dearer to them the justice, the equality, and the liberty which during many long years were for them the emptiest words.” 

    There is more than a hint of Karaism in Lazare. For Karaites, a medieval fundamentalist Jewish sect that survived into the modern era, the Torah, the “written Torah” that was given to Moses at Sinai, is the only valid document of their religion: the Talmud, or the “oral Torah,” the dicta of the rabbis, has no validity, and all “Rabbanite” authority is rejected. Lazare’s references to the Talmud are, in true Karaite fashion, all negative. Those who created the Talmud are dismissed as “casuists.” Marx is called a Talmudist owing to his mind “which does not falter at the petty difficulties of fact.” The dreadful state of the Jews of Eastern Europe, “where the Talmud still dominates,” is caused by the dominance of rabbis, “where he lives under the guidance of his rabbis, who unite with the powers in authority to prevent him from attaining light.” (Two decades before Lazare wrote these words, the German Catholic theologian August Rohling published Der Talmudjude, or The Talmud Jew, which became an instant classic of modern Jew-hatred.) But even though rabbinical power via the Talmud “has put to sleep their instincts of revolt,” Lazare went on, it “did not debase all Jews.” There remain “some who persisted in the belief that justice, liberty, and equality were to come to this world; there were many of them who believed that the people of Yahweh was [sic] charged with working for this coming.” Lazare considered himself to be one of them. He did not stand outside the Jewish people, and his immersion as a Jew in the struggle of the Jews was about to begin.

    On October 15, 1894 — the same year that Lazare’s tome on antisemitism was published — Captain Alfred Dreyfus, an officer on the French general staff, was arrested for espionage on behalf of Germany. Three pieces of evidence served as the basis for his condemnation: a secret dossier that was never shown to Dreyfus’ defense lawyers; a document called the bordereau containing information on French military matters fished from the garbage at the German embassy by a cleaning lady working for French intelligence; and a note referring to “ce canaille de D.” — “that scoundrel D.” — alleged by the government to refer to Dreyfus, though it was later learned that it referred to an officer named Dubois. Dreyfus was found guilty and publicly degraded, with his epaulets and the stripes on his pants torn from his uniform as he continued to proclaim his innocence. 

    This was not yet the Dreyfus Affair, but simply the case of an officer found guilty of treason. The headline of Édouard Drumont’s antisemitic newspaper screamed “The Traitor Condemned. Ten Years of Prison and Degradation. Down with the Jews!” There was no public outcry, but Dreyfus’ brother Mathieu believed in his brother’s innocence. He regularly visited the condemned man in La Santé Prison, where he was being held before being sent to Devil’s Island for life. Mathieu wanted to publicize his brother’s case, and he had an unexpected ally: the governor of the prison, who had been impressed with the supposedly treasonous officer. He told Mathieu that there were only two men who could help him: the antisemite Édouard Drumont and Bernard Lazare. How and why he would have suggested Drumont is an unsolvable mystery. As for turning to the Jewish anarchist, Lazare later wrote in an unpublished memoir of his involvement in the affair that “I suppose that during the preceding years he had held at La Santé revolutionaries and anarchists who had spoken of me.”

    Even before meeting the prisoner’s brother, Lazare was convinced of Dreyfus’ innocence, and when he met Mathieu in February 1895, while the captain was still waiting to be sent to Devil’s Island, “I had decided to do all I could to wrest him from there.” Lazare gathered all the facts he could, with Mathieu Dreyfus providing invaluable information. In the summer of 1895 he began writing a pamphlet outlining the injustice of the captain’s fate and the falsity of the case against him. The pamphlet was quickly written, but the Dreyfus family requested that its publication be delayed until they felt the right moment had arrived. For a year nothing happened, and it was only in September 1896 that the family decided that the time had come to go public. In the end Lazare published three pamphlets defending Dreyfus, in 1896 and 1897 (both titled A Judicial Error) and in 1898 (How to Condemn an Innocent Man). They were masterpieces of forensic literature, detailed examinations of the case, laying bare the weaknesses and falsehoods of the prosecution. Yet however cogent the arguments in Lazare’s pamphlets, they failed to obtain Dreyfus’ release or a new trial. They did, however, bring on board some new defenders of the cause. 

    On the left, the affair was generally viewed as an intra-bourgeois dispute. Why should an anarchist or socialist care about the army’s attack on a wealthy officer? In 1897, at the end of his pamphlet in defense of Dreyfus, Lazare had this to say to his comrades: “There are men for whom freedom and justice are not vain words. I am going to speak to them. They have no right to content themselves with general and generous theories if they refuse to apply them.” He condemned those leftists who, “preoccupied with humanity as a whole, turn away from unfortunate individuals.” If not all of the left lined up behind Lazare, some of the most important anarchists, such as Sébastien Faure, did support him, though with certain reservations, as did the greatest of France’s socialists, Jean Jaurès. It was thanks to Lazare that the editors of La Revue blanche were won over to the cause, bringing the future head of state Léon Blum into the fight.

    The pamphlets that Lazare wrote are important for another reason. Lazare dared to state the obvious: “With such weak evidence, and if the pressure of the antisemitic gang hadn’t forced the hand of a government lacking in character or courage, they would never have dared lead Captain Dreyfus before a court martial.” The Jewish angle was a touchy one, and even among Dreyfus’ defenders there were Jew-haters. Captain Georges Picquart, who would deliver the fatal blow to the prosecution by proving the real traitor was Major Ferdinand Walsin Esterhazy, despised Jews. 

    As support grew for Dreyfus, his defenders accused French Jews of doing nothing so as not to rock the boat. French Jews were throughout his life a particular bugbear of Lazare’s. Feeling that Juif was too harsh a term, one that lent itself to being turned into an epithet, French Jews of a certain class called themselves “Israelites.” (German Jews of the same era had the same problem with the term Jude.) Israelites went along in order to get along. Israelites didn’t raise Jewish issues. Israelites turned their backs on their brothers, both in France and in Eastern Europe. 

    Antisemites spoke of a shadowy group, a “syndicate,” that was working behind the scenes on behalf of Dreyfus, suborning politicians and journalists as well as men and women known by the newly coined term “intellectuals.” According to Lazare’s biographer Philippe Oriol, there in fact was a group working behind the scenes in the Dreyfusard cause, called the Defense Committee, and Lazare was its most active agent. Lazare visited politicians at the highest level, proof in hand of the falsity of the military’s case. He visited the editors of the country’s most important newspapers, sometimes obtaining support, more often not, yet never deterred. Writers and critics received his visits. Among them were men and women of the left, such as Octave Mirbeau and the revolutionary feminist journalist Séverine, who would soon be strong allies. Lazare did not only visit editors; he also started Dreyfusard newspapers from scratch. The list of people he importuned is enormous, and some, such as Lucien Herr, the librarian of the École Normale Supérieure, and the great French Jewish archeologist and historian Salomon Reinach, would become key figures in the Affair. When Lazare first went to visit Émile Zola to convince him to become involved in the case, “I found [him] full of sympathy, but grace only struck him when the full tragedy, with Esterhazy the traitor, and Picquart the good genie, and Dreyfus the martyr seized his novelistic imagination.” Lazare’s specialty as emissary of the “syndicate” was in providing funds to groups on the left. Le Libertaire, an important anarchist newspaper, received funds for the paper’s general use and to publish a pamphlet in support of Dreyfus to be distributed free of charge. He assisted in the setting up of public meetings and of newspapers such as Le Parti ouvrier, L’Antijésuite, and L’Ouvrier des deux mondes. 
    Lazare continued this work for several years, spreading funds even into Belgium, where in 1901 he was the intermediary in forwarding money from the Defense Committee to establish a Committee for Propaganda Against Antisemitism. The organizer of that group wrote in thanks to Lazare: “You’re a magician.”

    In 1899, a year in which two further trials took place, the second before a military tribunal, Mathieu Dreyfus asked Lazare to step back from the case, to have less of a public presence, to “let others act.” Lazare acquiesced, explaining in later years that “I cared more about justice than about myself.” Even so, he continued to work furiously behind the scenes. Yet in the heat of the Affair Lazare felt his role was already being forgotten and he worried about how he would be remembered. In a moving open letter sent to the head of the League for the Rights of Man, Lazare spoke out on behalf of his efforts: “For my family, I want it to be said that I, who was the first to speak out, to stand up for the Jewish martyr, was a Jew.”

    Lazare attended Dreyfus’ military trial in Rennes in 1899, and it was there that for the first time he laid eyes on the man he had been defending. He spoke of his “fear” as he awaited Dreyfus’ appearance, and also of the effect the man had on him. “My fear vanished as soon as he entered. When I saw him as I’d never hoped to see him, after all these years of torture, full of the internal flame of life, firm and stiff, all I then felt was the tranquility of a perfect soul and more than ever the certainty that the innocent man was going to emerge victorious.”

    When Dreyfus finally received a pardon in 1899, after being found guilty with attenuating circumstances of espionage at the Rennes trial, the Dreyfusard movement exploded into opposing factions. Some supported Dreyfus’ acceptance of the pardon, which allowed him to be freed while letting the guilty verdicts stand, while others felt that the pardon had to be refused in the name of truth and justice. Lazare supported the former group. Dreyfus was fully exonerated in 1906, three years after the death of the first Dreyfusard.

     

    Lazare’s activities in support of Dreyfus prepared him for the next and final phase of his political development. Before the Dreyfus Affair, in 1894, Lazare had been contacted by the cultural-Zionist thinker Ahad Ha’am about assisting the latter in having a collection of his writings published in French, though nothing came of it. In 1896, barely a month after the appearance of Theodor Herzl’s Der Judenstaat, Lazare wrote to Herzl asking about a French translation, telling him that “I would be happy to learn of this, and when it occurs to learn of your means of action and your practical projects so as to make them known here.” The two men met in Paris in July of that year, and Herzl described Lazare in a letter to a friend as an “excellent example of a fine, intelligent French Jew.”

    Lazare was ripe for Zionism, and he would become a central member of the movement in France, though, as he wrote in 1897, “Zionism in France does not exist” and had but ten adherents in the entire country. In the same year he published a pamphlet called Jewish Nationalism. It is a typically unorthodox work, for the nationalism in the title is not, or is not just, the construction of a Jewish homeland. Lazare proposed two solutions to the injustices that Jews endured in Christian-dominated countries: a Jewish land, and a Jewish nation within the greater nations in which they live. Like the Jewish Labor Bund, Lazare accepted that Jews were a nation within a nation, a claim he himself had earlier made in explaining the causes of Jew-hatred. This was now viewed as a positive thing. It was, indeed, the road to liberation, and away from the worst outcome of all: assimilation. At the same time Lazare, unlike the Bundists, believed that Jews needed a country of their own, and saw Zionism as the only way to establish it. These contradictory ends were not always clearly melded in Lazare’s writings.

    Lazare first had to establish that the disparate peoples of the Jewish Diaspora were one, that there really was a basis for Jewish nationalism. Jewish history, he wrote in his pamphlet, was made up of “common traditions and customs, traditions and customs that have not all equally persisted, for many of them were religious customs and traditions. Nevertheless, they have left their mark on us; they have given us habits and, even more, a similar attitude, thanks to which, despite the necessary individual divergences that separate us and must separate us, we look upon things from the same angle. How, then, do we translate this fact of a certain number of individuals having a common past, traditions and ideas? We translate it by saying that they belong to the same group, that they have the same nationality. . . . There is a Jewish nation.” 

    This, as he had once observed, was one of the sources of antisemitism, but the hatred of the Jews had a paradoxical effect. In an anticipation of Jean-Paul Sartre’s later assertion that the Jew is created by the antisemite, Lazare wrote that the effect of antisemitism is to render that nationality more tangible to the Jews: “it is to make even stronger their consciousness that they are a people.” This old claim that the origins of Jewish identity were to be found in Jew-hatred does not withstand even perfunctory historical scrutiny, though in the modern period it explains some of the psychological perplexity of the assimilated Jew. 

    Nationalism for the Jews, Lazare proclaimed, means “collective freedom.” The Jew who proclaims himself a nationalist is saying that “I want to be a completely free man; I want to enjoy the sun; I want to have the right to my dignity as a man. I want to escape oppression, escape insults, escape the contempt that they want to bring to bear on me. At certain moments in history, nationalism is for human groups the manifestation of the spirit of freedom.” What Lazare’s Jewish nationalism did not mean at this point was a mass exodus to the Holy Land. It does not mean, as he wrote, that “I am a man who wants to reconstitute a Jewish state in Palestine and dreams of re-conquering Jerusalem.” 

    For Lazare the anarchist — which he remained until his dying day, however eccentric his version of anarchism might have been — the unity of the Jewish people, of the Jewish experience, ended at the point where the classes divided. In this pamphlet, as in all his Jewish writings, Lazare held that the Jewish bourgeoisie desired nothing so much as to be not only like Christians but actually to be Christians, and was as such an enemy. Lazare wrote with no ambiguity that “the Jewish bourgeoisie, deprived of its secular stays, poisons the Jewish nation with its rot. It will poison the other nations as long as it has not decided — and this is something we cannot encourage it strongly enough to do — to adhere to the Christianity of the ruling classes and to leave Judaism behind.” Jews, like all nations, are divided into different camps: “There are Jewish conservatives, Jews of the juste milieu, and socialist and revolutionary Jews.” Lazare’s Jewish nationalism, and later his Zionism, would stand on this Marxist ground of class division and class struggle. It would ultimately drive him to oppose Herzl and leave the Zionist movement.

    Was a retreat into nationalism a denial of the central socialist doctrine of internationalism? Lazare dismissed this possibility out of hand. Abandoning his earlier hopes of Jews being liberated by the revolution, he now felt — given the sufferings of the Jews, as well as the depth of Jew-hatred — that “I don’t think that it would be legitimate to count on an economic and social transformation. In the first place this transformation, which I hope for, and whose coming I will participate in fighting for with all my might, sadly seems to still be far off. And then, it is not proven to me that it will bring Jews better conditions.” Going further, Lazare saw the Jewish fight for its nationhood as consistent with his anarchist goal of a future society being made up of a free federation of groups. “I believe that one day humanity will be a confederation of free groupings and not organized in keeping with the capitalist system; free groupings in which the distribution of wealth and the relations of labor and capital will be completely different from those of today. These groups must be allowed to be constituted, to form themselves. Why wouldn’t Jews form one? I see nothing that opposes this, and it is in the development of Jewish nationalism that I see the solution to the Jewish Question.”

    Lazare was perhaps most clairvoyant in his response to those who denied the right of Jews to self-determination, to nationhood. In words that could be written today, he asks “In any case, are most socialists, even the internationalists, totally consistent? Do they act in conformity with their doctrines? Are they not demanding — and rightly — autonomy for the Cubans, Cretans, and Armenians? Don’t they recognize that all have the right to fight for their freedom, and don’t they join together this freedom with the demand for a nationality? Can someone tell me in what way the Jews are different?”

    He then pivots one last time. Invoking the Passover “chant” of “next year in Jerusalem,” he calls on his fellow Jews to take their fate in their own hands. “From this day forward they should not expect help from heaven, or the assistance of powerful allies. The Jews will only find their salvation in themselves. It is through their own might that they will liberate themselves; that they will re-conquer that dignity that they have been driven to lose. The contemptible and vile portion, without convictions or any other motives than their personal interests, will convert. It won’t have any scruples to overcome in order to do this. What will believers and non-believers do who will never resign themselves to the recantation? They will ever more strongly feel that they will be free as individuals when the collectivity to which they belong is free, when that nation without a territory that is the Jewish nation will have a land and can dispose of itself without any constraints.” His reasoning here brings to mind the “auto-emancipation” of which the Russian Zionist thinker Leo Pinsker had written in 1882.

    Lazare attended the second Zionist Congress in Basel in 1898, where he was greeted as a hero. According to a contemporary account, as he entered the hall he was introduced as “Bernard Lazare, the noble, the bold, the strong,” and the hall rose as one in a standing ovation. Prior to the conference he had given an interview to a British journalist for The Jewish World, in which he seemed to have opted for colonization, saying that “I am in favor of colonization, the establishment of new colonies, and the introduction of industries in Palestine.” But he insisted that his Zionism had little to do with the Jews of Western Europe, specifically the French. He had joined the movement not for the hundred thousand Jews of his native land, but rather as an answer to the question of “What shall we do with millions of our brethren in Eastern Europe who have been ground down by their misery?” All the ambiguity, the eccentricity, of Lazare’s Zionism is summed up when he told the journalist that “As I see a future for them in Zion, I am a Zionist.” 

    The first task before Zionism, he told the interviewer, was to “form our masses into a Jewish people. It is therefore our duty, besides attending to colonization, to attend to the education of our people.” It is worth noting that unlike many early Zionists, Lazare was not Ashkenazi-centric. He spoke regularly of the sufferings and pogroms inflicted on the Jews of Algeria, who had been French citizens since the promulgation of the Crémieux Decree in 1870. In Basel he insisted that the proceedings be published not only in German but also in French — not for the Jews of the metropole but because “the Jews of Morocco and Algeria need us to bring them the consolation of Zionism.” 

    Yet Lazare was not convinced that the Zionist movement as it was constructed could bring that consolation to anyone but the bourgeoisie. Basel was the only Zionist conference he ever attended. (They were annual events until 1901.) He rejected the program developed at the congress as “a program of a faction of the bourgeois Jewish intellectuals, which in no way responds to the reality, to the situation of the Jewish people and its needs.” 

    Lazare’s opposition to the Zionist movement of Herzl began with his opposition to a bank to be established to support colonization. He was opposed less to the bank itself than to its centrality in Herzl’s thinking. The mobilization of the Jewish proletarian masses was Lazare’s aim, not the gathering of funds from wealthy supporters. He wrote to Herzl in February 1899 that “you are bourgeois in thought, sentiments, and ideas; bourgeois in social concepts. Being so, you want to lead a people, our people, a poor people, an unfortunate people, proletarians.” Herzl and the movement wanted to rule the Jewish people from on high, “in an authoritarian fashion.” Lazare accused Herzl of wanting “to lead them where you think is best for them. . . . You want to set a herd on the march.”

    Lazare had been elected to the presidium of the movement as well as to its action committee, but this spell among the leadership was not fated to last. Along with the bank, Herzl’s entourage displeased Lazare, but what he found still more abhorrent was Herzl’s imploring the aid of the world’s autocratic rulers. He resigned his official posts in the movement in early 1899, and when the fifth congress of the movement, in December 1901, paid homage to the Ottoman Sultan Abdul Hamid, it was the last straw. In an article published in the review Pro Armenia, Lazare let loose: “The representatives — or at least those who call themselves such — of the most ancient of persecuted peoples, those whose history can only be written in blood, sent their greetings to the worst of assassins.” Their degradation could not be worse: “This people, covered in the blood of its wounds, is thrown at the feet of this Sultan covered in the blood of others, yet in this assembly not a single protest echoed.” Herzl was clearly the subject of his reproach that “still today the descendants [of the Jews of the past] follow a man and ignore the means employed. . . . Yesterday [the leaders] placed the Jews at the feet of Kaiser Wilhelm; today they make them kneel before the Sultan; tomorrow they will have them lay flat on their bellies before the Tsar, and we will have the beautiful and grand spectacle of slaves licking the whip of the master.” The anarchist Lazare could not accept this. It was not enough to obtain foreign support and land. “For the task, the great task to be accomplished, is that of the intellectual and moral regeneration of the Jewish people.”

    Herzl and his allies, with their realpolitik, were incapable of fulfilling this role. They were wed to “a politics of the ghetto, of serfs, worthy of those who walk hand in hand with the most fanatical rabbis of Galicia, Russia and Poland.” All forms of superstition must be abandoned. “It’s necessary to wrest them from their economic slavery. And the day of their liberation they will be able to freely dispose of their destiny.” In a letter to Chaim Weizmann, Lazare vented his spleen against what he had all his life viewed as a factor inhibiting Jewish liberation, again in tones strongly reminiscent of his peculiar modern Karaism. “I am convinced that the essential task is, above all, that of liberating the Jewish people, but especially liberating it from its internal obstacles. . . . The Jews must be taught to think, they must be wrested from ritualistic and Talmudic superstitions. . . . They must be shown that the very basis of Judaism is rationalist thought. We must extirpate from their minds all false Jewish belief as well as the Christian prejudices which have encumbered those who have escaped depressing rabbinism.” 

    Lazare did not abandon Zionism, though. He remained an independent Zionist, and most of all a defender of his fellow Jews. As he remarked in his final letter to Herzl, announcing his resignation from the established movement, “If I separate myself from you, I don’t separate myself from the Jewish people, from my people of proletarians and beggars, and it is for their liberation that I will continue to work, though by roads that are not yours.” Lazare the anarchist, the internationalist, and the universalist never entirely disappeared. This was made clear in the series of articles that he wrote in 1902 for the newspaper L’Aurore and, more importantly, for the Cahiers de la quinzaine of Charles Péguy, to whom he had been close since the time of the Dreyfus Affair, under the general heading of “The Oppression of the Jews of Eastern Europe: The Jews of Romania.” The series, which also appeared as a pamphlet, was written before Lazare visited the country. When he later did, he was greeted as a hero. 

    If ever there was a situation that would appeal to a Zionist of the time, it was that of the Jews of Romania, treated as foreigners in their own land, left to their fate by other countries, and already emigrating en masse. Yet Lazare did not present Zionism at the end of this grim portrait of Romanian Jewry, which, alongside Russian Jewry, he regarded as the most oppressed. Instead Lazare saw the only hope in a classically left-wing solution. “If tomorrow [the government] prepares to attack the rural workers, it will ignite the flame. Perhaps if it causes the Jews to despair, if it pushes them to the limit, the latter, despite their passivity, despite the counsels of the timorous rich, will unite with the workers of the fields and will assist it in shaking off their yoke. But even if they don’t join, it is the rebellious peasant who will, directly or indirectly, resolve the current Jewish question by freeing itself and liberating the Jew.”

    Until the end of his days Lazare was torn by the tension between a socialism built by Jews where they currently lived and a socialism built in the Jewish homeland. At around the same time as his article on Romania, Lazare, in an article on Russia, wrote that “there will doubtless only be a cure in the midst of a general cure. Jews will only be free when the countries will be [sic] free.” But in 1900 he averred that “I am convinced that Judaism can only free itself from its slavery by emancipating itself as a nation and not by seeking a vain emancipation in the countries in which the Jews live. . . . Judaism must everywhere organize in a national proletarian party.” Lazare’s final work speaks eloquently of the contradictions that animated his radicalism. 

    Left unfinished at the time of his death, Le Fumier de Job, or Job’s Dungheap, incorporates the ideas that Lazare developed across all the periods of his life. (An English translation was published by Hannah Arendt at Schocken Books in 1948.) The short volume is written with poetic flair, harkening back to Lazare’s Symbolist days. It is militantly proletarian, a final nod to his anarchism. But it is also and centrally Jewish in a religious and socially radical way. Lazare said of himself that he was “orthodox in nothing.” This work is proof of that.

    Job’s Dungheap is a thumb in the eye of the Christian world, a case not just for the right of Jews to live like any other people but also an uncompromising argument for the nullity of Christianity. Even more, it is explicit in its expression of the superiority of Judaism over Christianity. For Lazare, Judaism is not a religion, it is a form of superiority. Judaism, he says, “can be considered, not as a religion, not as a nationality, (so many diverse peoples are Jews), but as a discipline of the mind, as a concept of the world superior to the Christian discipline and concept.” He is nothing if not candid. And there is more. “The Jew consoled himself for his social abjection by recognizing himself to be intellectually superior and by inventing a thousand adventures in which his adversaries recognize his intellectual superiority.” Judaism is, in its very essence, modern, owing to the reason for its superiority: “The torch of rationalism in Israel has not gone out for an instant.” (Historically speaking, this is not remotely the whole story.) Judaism is therefore an element of progress, for “what has allowed the Jew to do better, to have a social influence, is that he has maintained the idea of justice with the hope of realizing it in this world, something Jesus put off until the millennium.” In any case, Christianity “is unable to understand justice, for it is founded on a monstrous injustice, since for [the Christian] the death of the Innocent One led God to pardon the guilty, since his justice was in this way satisfied.” His distance from actually existing Judaism notwithstanding, the idea of Judaism loomed very large in Lazare’s eccentric and intensely moral view of the world. 

    Job’s Dungheap is also a brutal examination of antisemitism. Lazare still maintains that “the religious factor is [its] basis;” that “the religious passion has always needed economic reasons as an auxiliary factor. But when these reasons are born, religious passion and prejudices aggravate them.” “Every Christian is profoundly antisemitic,” he declares. He warns his co-religionists, “Jew, you will never know the hatred for you hidden in the soul of Christians, venom reheated for two thousand years in the darkest corners of the mind, poisons distilled for centuries in the spirit, ferments of scorn, leavenings of disgust and horror, all that sleeps and is being awakened.” 

    Lazare’s analysis of antisemitism had been deepened and simplified by the experience of the Dreyfus Affair and his studies of the experience of Eastern European Jewry. He still maintains that “the wealthy provoke antisemitism for which the poor suffer. They are at least the pretext.” But he adds that “antisemitism is a form of Christian hypocrisy. The Christian is cleansed by attributing his vices to the Jews.” This attribution of their own sins to Jews is a key idea in Job’s Dungheap. Any trait attributed by antisemites to Jews is actually a Christian trait: “Filth is Christian, it was never Jewish.” The claim of Jewish ritual murder is simply a Christian way of deflecting Christian thirst for blood as manifested in their art. The Christian spirit is mired in petty subjects, for “a religion where one wonders if the Word was resurrected with or without its foreskin has no right to mock Talmudic subtleties.” This is why the Jew is better able to grasp science, since he does not have to “cleanse his spirit of antirational dogmas, those of the Trinity and the Eucharist for example.”

    The delicacy and saintliness for which Péguy and others admired Lazare appears to have abandoned him here. Filth is human. But Jew-hatred signifies something else for Lazare, and shows him overturning his original view of the universality of antisemitism. This hatred is now evidence of something positive, a cruel honor: “To be hated in this way by all of humankind, one must bear something grand within oneself.” 

    Zion is the solution, but a Zion in keeping with Lazare’s revolutionary vision. “Go to Zion to be exploited by rich Jews? How does this differ from the present situation? That’s what you are proposing to us? The patriotic joy of being exploited by those of the same race?” Despite this, Zion is still the way out. The Jew must “leave his brothers in habits, mores, and customs, and go to those from whom you have been separated for years, whose faces and language are foreign to you. And yet, one must go there to guide them down the path that we, more fortunate, have gone down.”

    Lazare began to feel run down as the twentieth century began. His friends and comrades attributed this to his overwork during the height of the Dreyfus wars, but this wasn’t the case. He was suffering from stomach cancer, and surgery performed by his physician brother was unavailing. He died in Nîmes on September 1, 1903.

    In February 1905, Pierre Quillard, a writer and childhood friend of Lazare, spoke of the need for a monument honoring his late friend. The following year, after Dreyfus’ rehabilitation, a subscription for the monument was opened, and a total of 21,384.85 francs was contributed. The monument, close to nineteen feet tall, featured a bust of Lazare with a woman bearing a torch below. The inauguration was scheduled for October 5, 1908, but anti-Dreyfus forces had not given up the fight. In June 1908, a member of L’Action française fired on Dreyfus at the ceremony of the transference of Zola’s ashes to the Panthéon. The inauguration of the Lazare monument gave them further cause for upset, and the far right held a demonstration in the city against the “statue of dung.”

    On July 14, 1909, the statue was vandalized by a man who swung a hammer at the bust and broke off its nose. The figure remained in this damaged state, and during World War II it was covered in antisemitic graffiti, then taken down and stored in a museum in Nîmes. At the end of the war it was not restored to its original location, and the monument was finally confiscated from the museum and its material reused for a statue honoring the city’s liberation. For a long while, history committed the forgetting that Lazare had feared. But last December an exact replica of the monument was erected in precisely the spot where the original once stood.

    Kyiv War Diary

    January 1, 2026

    It’s 12:56 AM. My family and I have just greeted the New Year together. We heard explosions nearby and a siren wailing over the city: Russian drones. Explosions were also reported in Odessa, Donetsk, and the Sumy region, and drones were detected flying in the direction of Kyiv and its surroundings. And yet I am calm and happy: For New Year’s we had our family’s special roasted duck. Just like my mom used to make. 

    January 2, 2026

    Yesterday was my friend Dania’s birthday. Friends toasted to him tonight. We told stories and celebrated his life. But as I was walking home, the wound of losing him suddenly opened up again, and so here I am sobbing in the middle of downtown Kyiv close to midnight, just before military curfew.

    War keeps taking the best of us.

    January 5, 2026

    1:48 AM 

    Two Russian bombers have just been detected. If the report is accurate, missile launches will begin four hours from now. It will take them about an hour to reach Ukraine post-launch, so they will arrive around 6 AM (if they are coming). Do I feel anything as I write this? No. My brain takes care of itself. I feel only that I have a cup of tea to make and there’s a car chase in Inception (which I just paused) that I want to continue watching.

    2:42 AM

    I am in a hallway and Russia just attacked the city with ballistic missiles. Heard very loud explosions. Doors are shaking, which means that the attack is close enough for the sound of the blast wave to reach us before we can hear the rounds of smaller explosions and the sound of air defense shooting down Russian drones. I am in a puffer jacket because it’s quite cold here in the hallway. It’s warm in my room, but it’s safer out here.

    2:48 AM

    Monitoring channels report that Russian Shahed drones are falling on the southern part of Kyiv and I am trying hard to remember whether I am in the south or the north. 

    3:03 AM 

    I am sitting on a hallway floor feeling the warmth leave my body second by second. I decide that I can’t stand it anymore, that I need a hot cup of tea and I need my warm room, but as I move to open the door I hear what sounds like a Russian Shahed drone buzzing, and adrenaline starts pumping into my blood as the buzzes get louder. I open the door and inside the buzz gets even louder. Just as I am about to panic I see through the window that it’s only a trash truck driving past my building. I exhale. I make my cup of hot tea and take a sip. God, tea never tasted better. 

    6:38 PM

    You don’t know true luxury until you’ve seen your lights stay on for three minutes after a blackout was supposed to have started. Oh what bliss: the heater is still working. My home is warm and well lit. I am so rich. 

    January 6, 2026

    Being adventurous means deciding to wash your hair despite the fact that there is again a blackout in Kyiv: you will have to dry your hair, and it is getting colder inside your apartment by the minute, and you have only an hour and a half until you need to leave home, and outside it is minus one degree. But you have a feeling that it’s all going to work out.

    January 8, 2026

    4:50 AM

    Blackouts got so bad in the city of Dnipro today that its metro stopped. Trains didn’t run. Escalators didn’t work. People were evacuated from the tunnels. This might sound like some thrilling post-apocalyptic movie script, but thanks to Russia it is life in Ukraine now.

    11:39 PM

    Talking to my dad and there is an explosion outside. Russian drones are attacking Kyiv at this minute. 

    11:52 PM

    I hear our air defenses shooting down Russian drones over Kyiv at midnight while having a relentless cough that just won’t stop. A very special way to spend a Thursday night. 

     January 9, 2026 

    12:12 AM

    Russian drones are attacking Kyiv and I have a violent reaction to a loud noise, but it’s just the neighbor upstairs moving a chair.

    1:33 AM

    The explosions continue: now the drones are attacking the thermal power station. It feels like negative eleven degrees outside. Yes, Russia wants to freeze us.

    2:39 AM 

    A couple of blasts that sounded just like thunder. Almost familiar. Oh, here is another one — rocking my building gently.

    9:33 PM

    This evening I learned from my doctor that air raid siren alerts had started in Kyiv. I was in his office at my appointment. We had just finished — and good thing we did, because doctors don’t work in public hospitals during air raid siren alerts. “Not great,” the doctor said, “after Oreshnik yesterday.” “Yeah,” I replied, “though my blocked ear pisses me off much more.” He smiled. Bonding in Ukraine on a Friday night. 

    January 11, 2026

    I don’t have gas here so I am hoping there will be electricity in the morning because otherwise there will be no green tea for me and that will make me mad.

    January 13, 2026

    Since about midnight we haven’t had electricity in my part of Kyiv. It came back on ten minutes ago. That’s more than seventeen hours without light. The longest so far. The streets outside looked pitch dark, I couldn’t see the other bank of the river at all. Have I started to panic? I’d say a bit. 

    It’s inhumane. What Russia is doing to us is inhumane.

    And after a while, however heroic or stoic or resilient you think you are, sooner or later you will learn that resilience has a price. Just as I’m discovering now. It is scary.

    January 16, 2026

    It is midnight and an air raid siren is wailing over the city because Russia is preparing to attack us yet again. A ballistic missile threat. It’s the fifth air raid siren alert today.

    10:00 PM

    I woke up to find the electricity on in my part of Kyiv. How did I know it was on? My sister suddenly started shouting “woo” from the kitchen and singing “I’ve got the power.”

    January 17, 2026

    Hello to all from Kyiv. It’s 9:30 PM here and we have had no light since… I honestly don’t know when it went off. It certainly has been dark for over seven hours. We have no schedules now — it can go out at any moment. But still I’m thankful. You could say I’m privileged: we have heating and gas here, so hot tea is available. Others have it worse: there are buildings with ice on the inside already. And there are reports that Russia could attack our energy systems again — today or tomorrow — and that is… what I’m trying not to think about.

     11:44 PM

    The light was just restored to my part of Kyiv after almost ten hours of blackout. A happy moment.

    January 18, 2026

    12:27 PM

    Oh the luxury of boiling water in an electric kettle after ten hours in darkness! 

    January 19, 2026

    8:03 PM

    Here we go again: the light just went out. And so the usual routine starts: we switch to blackout lighting — turn on the small pre-charged flashlights in the kitchen — check; bathroom flashlight — check; the Christmas lights in the room — check. No talking, just switching everything on. It’s a kind of ritual now. 

    January 20, 2026

    12:41 PM

    Electricity is back in my part of Kyiv and we have heating, but no water supply. “For a change,” my sister said. “Keeps things from getting boring.”

    Not boring at all. 

    January 21, 2026 

    11:50 AM

    The first thing I heard when I woke up was my dad and my sister talking. Turns out he came to visit us and make us breakfast — we have a gas stove and he doesn’t, and the electricity is down in my section of Kyiv. We all gathered in our small kitchen. I made the tea, Dad made the oats, my sister worked on her laptop. And for a second I forgot it was all because of Russia’s bombs, I forgot about the war.

    It was just… nice, being all together like this.

    January 23, 2026

    1:30 PM

    Yesterday the impossible happened: we had electricity for almost the whole day! After days of blackouts. I charged everything. I mean everything. After nothing was left to charge, the electricity was still on. 

    That’s when I started getting nervous.

    3:19 PM

    Woke up to no electricity in my part of Kyiv. Then the water went out too. “Maybe the heat will turn off soon,” my sister mused when I got to the kitchen. We checked to see if there were bottles that we could fill with hot water for warmth that night, and we planned to get a whole pot of water boiling on the gas stove to warm the kitchen. 

    Just a typical morning chat in Kyiv.

    January 24, 2026

    1:25 AM

    The air raid siren is on because Russia is attacking us with drones. I hate the beginning of the siren, that loud first note and the way it starts rising, crawling through your bones and muscles. It’s the most disgusting part of the whole experience. It gets under your skin. I was told once that they made it irritating and disgusting on purpose, so that it would be difficult for people to ignore it. It’s revolting in order to save lives. There’s a certain irony in that.

     1:31 AM

    Detonations again. Doors shaking. Explosions continue. Air defense work.

    1:39 AM

    Drones and missiles. Heard many explosions. As I write this, air defense is trying to shoot down — what? — a Russian drone by the sound of it. We have been warned that more are coming. 

    1:53 AM

    In a hallway, trembling. Not because I’m frightened but because it feels as if part of every explosion gets stuck in my body, and maybe it does. And every subsequent one makes my body shake more.

    1:56 AM

    It is amazing, in a way, how normal air defense work now sounds to me. Like the most mundane thing. Like the sound of a generator. The buzzing. A mundane and precious thing that keeps us alive.

    2:04 AM

    Russian bombers just launched missiles in the direction of Ukraine, reports say.

    2:33 AM

    Fuck it was so loud and sudden! I almost screamed. Now I’m shaking and it’s stuck in my chest.

    2:42 AM

    “Have no idea how we managed to not go fucking insane from all of this,” a monitoring channel that reports on Russian drones and missiles just posted. “Oh we did, a long time ago,” the first commenter replied. “Yeah, we did big time, we just don’t notice it yet,” said the next one. “My neighbor threw beer bottles out the window to shoot down a Russian Shahed drone,” wrote someone called Alexander.

    And I thought to myself, shooting down a Shahed drone with a beer bottle. Huh.

    3:10 AM

    The radiator started making weird sounds just after the explosions. It’s still a bit warm but it is certainly colder than it was before.

    1:07 PM

    Temperature already dropped two degrees in the room.

    Went for a walk to a spot where my parents used to bring me as a kid and saw people sledding and skating. That is what the people of Kyiv did the day after Russia bombed us. We went sledding and skating because fuck you Russia, that’s why.

    6:41 PM

    I managed to warm our rooms a bit with heaters, but the light went out. Still no heat and it’s negative twelve degrees in Kyiv right now. The darkness and the cold are slowly crawling in through the windows with no heating to protect us. Vaguely unnerving.

    7:56 PM

    Just put on my third pair of socks. Already in a hat. Warm sweater and a puffer coat. Blackout in Kyiv, and no heat for seventeen hours now.

    9:22 PM

    Just heard my neighbor shouting “there is hot water!” with excitement and I’m now just so happy for them.

    January 25, 2026

    1:04 PM

    The heat is back! Radiator is still a bit colder than usual, but it’s warm again!

    4:21 PM

    A Russian drone just hit an apartment building in Kharkiv. There are casualties.

    5:08 PM

    I just used a microwave. I forgot we even have one. Right before that, I had switched on my electric kettle. Standing in the kitchen feeling like a billionaire.

    January 26, 2026 

    2:09 PM

    I’ve never felt a Kyiv winter this cold in my entire life.

    2:45 PM

    Light returned but went off after less than two hours. Feeling strangely relieved: I felt guilty having light while so many others in Kyiv did not.

    6:39 PM

    The lights are back again and the street looks festive. I just finished a therapy session. The connection was breaking every ten minutes. Russia bombed all our power stations. I thought I’d be sad but instead I feel light and good for the first time in a while. I somehow stumbled upon the set of a DJ I once liked and here I am dancing in the middle of my room with Christmas lights on and windows in the building opposite shining yellow and white and me shimmying because I am okay. I am okay.

    A Beginner’s Guide to Coming and Going

    One thing I learned from farm people: “Close the gate.” I’d pull up to my sister-in-law’s road gate in wide-open Oklahoma, and my wife would jump out of the truck, fiddle with the chain wrapped around the post and secured by a link wedged into a steel slot. She’d swing the ten-foot span of steel tubing inward until it dragged in the grass beside the drive. I’d pull forward just far enough for the truck to idle clear when she closed the gate behind us. Then she’d wedge a chain link back into the steel slot. 

    I never imagined the many devices that folks come up with to keep a gate shut on a farm. They vary depending on what you are closing out or in. At Classic Tango Stables in Kansas, co-owner Kent welds a steel tube to the top left of a double gate and fixes a sliding bolt to the right side, as many gate latches in the country are dreamed up on the spot by men or women who have an arc welder. That gate secures an area of some dozen pens and corrals, and in this way serves a small herd of humans and horses. This latch sits on the top rail, but you need to wedge a boot toe under the right-side gate and lift slightly with your foot to free the bolt from its slot. There are many slots on gates and fences, and you just know they don’t always perfectly marry. 

    “Rule #1,” one sign on the place asserts: “If the gate is open, leave it open. If it’s closed, close it.” Some of those spring-loaded gate anchors have me shying sideways at first, as do two-way latches with pins on each side, which you lift to open a gate one way or the other. It’s simple once you use them a time or two, but confusing if you are, like me, a visitor helping feed animals in the low light of morning.

    A gate seems to echo Dante’s inscription: through me, you may find hope or its abandonment. Children run into gates from the time they can crawl, blocking stairs, halls, laundries. My son’s preschool had no fence between the play area and parking lot, and when the church announced that it would fence off the area, the preschool director, a woman named Magda, quit. “My children are not animals,” she explained. Fencing human beings in — or out — is a morally complicated proposition.

    When you go through a gate, know what you’re doing. My sister Christine, age seven, and I, ten, who’d never been much outside the city of St. Louis, set out to walk up to my Aunt Rosemary’s barn in Jefferson County, Illinois, where we were staying one Thanksgiving week, when the older man, “Dad” Wedemeyer, my aunt’s father-in-law, whose farm it actually was, began to lead a bull to a pen on the far side of that barn. We were forty or fifty feet down a slight incline from the barn and had just gotten the gate open when the bull clearly noticed us, lowered his head and started to huff. “Get back behind that gate!” yelled Mr. Wedemeyer, in a voice that catches his panic in my ear to this day. “Close the gate!” We were guests, of course, and I didn’t know Mr. Wedemeyer well, but I realized, being the older of us kids, that I had violated a fundamental protocol of the farm, one that had something to do with the responsibility of opening and closing a gate. Know what you’re doing. That gate was wide, as gates sometimes are.

    The incident with Mr. Wedemeyer’s bull suggests that one can go through a gate and not return. Seamus Heaney’s first egress in his poem “The Barn” translates farmyard implements — harness, plough-socks — as an “armoury.” The pastoral bliss of picture books has been transgressed for another, dangerous world. From what bearing on the farm did this threat arrive? The boy of Heaney’s poem lay down on the barn floor to become, he says, actual chaff, “to shun the fear above,” which, in the poem, the reader will experience firsthand. In that way, a poem swings open, asking a reader to walk into another consciousness where injury and sweat, birth and death, are common. This is why an outsider, as I am, takes seriously both the poem and the passage. There’s a backhoe out in the pasture to bury the old gray horse, and the kids watch from a distance, behind the gate.

    A woman my wife and I know had been alone on the farm one night, when her husband worked the late shift in town. A strange man opened their gate and went to the back of the house and began to bash the back door and the woman’s dog, alternately, with a plastic lawn chair, while the woman spent nearly thirty minutes on the phone with 911. A sheriff’s deputy was coming, said the operator, from a long way off. This intruder was one of those fellows, it turns out, who would not return through the gate. By the time he busted open her door, the woman was waiting with a twelve-gauge and blew him out onto the back porch. He was confirmed to have been on drugs — another kind of gate, some say. The intruder’s wife called several days later to apologize and say she had been in fear of her own life from the man. 

    Folks in the country often hang padlocks on their road gates and don’t bother to snap the locks shut; or they sometimes hang a key nearby on the fence, a nod to the neighborly dust that trails each pickup passing on the road. Even now, the padlock on this woman’s gate hangs unlocked, with a sign nearby: Keep closed. Those who choose to honor the wishes of others will find the gate unlocked. “Though we could fool each other,” wrote the poet William Stafford, “we should consider — lest the parade of our mutual life gets lost in the dark.”

    The Coherence of the Obscure

    On the last page of Elena Ferrante’s Neapolitan quartet, Elena Greco, or “Lenù,” ascends in an elevator and closes herself in her apartment. There she examines the two dolls that she and her closest friend, Lila, played with as girls—objects once ablaze with meaning, now smelling of mold and seeming, to her eyes, “cheap and ugly.” Confusion surfaces from the dissolving vividness, the passing solidity, of the past. Ferrante, by way of Lenù, then observes: “Unlike stories, real life, when it has passed, inclines toward obscurity, not clarity.”

    It is true that written stories, by their very nature, are constructed objects, filtering and fashioning life as well as imagined experience. They aspire not only to a netting—and a fixing—of reality, but also to a measure of coherence and order, even if at times they immerse the reader in life’s elusiveness. Whether or not they achieve clarity depends on the tale in question. And yet some degree of narrative integrity, some sense of completeness, remains a virtue. Memories, by contrast—however provisionally coherent—are fickle; we pin them one way and they shift. Over time, too, they fade and change, as we revisit them, and as we ourselves change. We may be left one day utterly puzzled before our old playthings. Ferrante, then, is right in part: The degree of form that fiction confers upon experience is part of the reason why we turn—or at least I turn—to it, because it is the form that tempers the murk. 

    But how can novels render and explore the obscurity so often felt in life? Is obscurity not the antithesis, the sworn enemy, of form? In the representation of lived experience, is not form a distortion and even a lie? John Banville’s acclaimed novel The Sea tests precisely this boundary between obscurity and coherence. Slim in size but rich in sensibility, it is moody and meticulous—a kind of swan song in sentences. 

    The narrator, who goes by “Max Morden,” is a dilettante art historian, attempting to write a book on the French painter Pierre Bonnard. He has retreated to an Irish village by the sea where he spent summers as a child. We know that he has taken a room at The Cedars, but the details of the plot remain tenuous. In this retreat, Max drifts through memories as they threaten to fade, offering in the process what may be a parting account of himself. Of his long-time wife, Anna, who has died, he says: “Already the image of her that I hold in my head is fraying, bits of pigment, flakes of gold leaf, are chipping off. Will the entire canvas be empty one day?”  

    Thankfully, Banville’s account of Anna is intact on the page, preserved with Max’s tone of fond attention, albeit speckled at times with purposeful ignorance and vitriol. At one point he bursts out, as if to her phantom: “You cunt, you fucking cunt, how could you go and leave me like this, floundering in my own foulness, with no one to save me from myself. How could you.”  

    And Max is susceptible to vitriol as well about his and Anna’s daughter, Claire, and her boyfriend Jerome. No, not vitriol exactly—but a vexation and a sort of smug remoteness. Early on, when Max recalls Claire referring to Jerome, he says: “she meant of course the chinless do-gooder—fat lot of good he did her.” (Note the assonance even here: Banville never settles for prose that sloughs off rhythm.)

    Interwoven with Max’s memories of Anna are his recollections of a childhood summer with the Graces, a family he deems “the gods,” beginning in the haunting first sentence of the novel: “They departed, the gods, on the day of the strange tide.” The gods bring real teacups with saucers to the beach, they boast a motor car, they offer delicacies such as mint sauce alongside their meals. Max’s summer with them zaps his molecules, as he falls in love first with Connie Grace, the mother, whose armpits are to him “excitingly stubbled,” and then with Chloe, Myles’s twin, prone to cruelty and giving off a “cheesy tang” he relishes.             

    Further down in the first paragraph Max goes on, as if to promise himself, “I would not swim again after that day.” Then, to cap the paragraph, he adds once more, “I would not swim, no, not ever again.” Already a sense of loss and foreboding has entered the frame. Max is caught drifting through his past, made bitter and wounded—at times baffled—by the grief that has threaded its way into the present.

    The Sea has drawn lavish and well-deserved praise since its publication twenty-one years ago, and was adapted into a film starring Ciarán Hinds and Charlotte Rampling. Re-reading Banville’s novel today, I was struck by how its form becomes a microcosm of one of its central themes: the narrator’s desire for cohesion, and the ways in which reality, even within fiction, resists that cohesion. The image of the sea becomes an emblem of this tension. Meanwhile, the novel’s structure, loose and tidal though it is, insists upon a certain coherence. Together, these elements add shade and nuance to Ferrante’s simpler adage. 

    At the level of the sentence, Banville captures the mind’s tentative motion: reaching out to make a judgement, rejecting it, and reaching out again. Consider the following passage, about three-quarters of the way through the novel, in which the narrator lays bare his relationship with alcohol: 

    Have I spoken already of my drinking? I drink like a fish. No, not like a fish, fishes do not drink, it is only breathing, their kind of breathing. I drink like one recently widowed—widowered?—a person of scant talent and scanter ambition …  

    The movement here goes: “X is like Y,” “No, not Y,” “X is closer to Z.” Countless passages throughout the book take on comparable form—they pivot, that is, upon simile or metaphor and then revision. The latter happens here with the “drink like a fish,” “no, not like a fish,” “like one recently widowed.” (Then there is an echo of it, in passing, with the “widowed—widowered” bit.) This pattern underscores Max’s awareness that his initial judgements are not reliable; again and again, he must gauge and recalibrate both his perceptions and his phrases.

    Consider another example. Early in the novel, describing one of his first encounters with a male Grace, Banville—by way of Max—writes: 

    As he turned back to the house his eye caught mine and he winked. He did not do it in the way that adults usually did, at once arch and ingratiating. No, this was a comradely, a conspiratorial wink, masonic, almost, as if this moment that we, two strangers, adult and boy, had shared, although outwardly without significance, without content, even, nevertheless had meaning.

    In this passage the revision lies in the assessment of the wink’s undertone. Max is not only recalibrating his perception, he is attempting to pinpoint indeterminacy within an ordinary gesture. The effect of this pattern that manifests at the sentence-level is to draw attention to the obscurity of Max’s mind and memory—if not to make the stone stony, then to render the murk murky. The final clause— “although outwardly without significance, without content, even, nevertheless had meaning”—leaves us all the more adrift. 

    Scaling up, the novel in its entirety worries about the narrator’s return to an “empty canvas” —to primal confusion, transformation, indifference. The move from lived experience to fictional terrain cannot and should not fully cloak these forces (even if The Sea lacks the fickleness of life, when looked at in retrospect). There is, then, a mirroring between Banville’s sentence-level patterns and the central thrust of the novel—like a monad in Leibniz, the parts reflect the whole. 

    Banville is neither following Ferrante’s adage—that real life, when it has passed, inclines toward obscurity, and stories toward clarity—nor defying it, but revising it (as he is wont to do), showing how meaning, whenever momentarily achieved, emerges only against a backdrop that threatens to put all at sea.

    That phrase “put all at sea” is not merely idiomatic for Banville; the sea itself becomes the novel’s ruling metaphor—of loss and the limits of the mind’s rage to order. The sea, after all, has long served writers as a site where the desire for clarity founders, where depths, distortions, and dissolution prevail. Indeed, Banville’s use of the watery expanse recalls Iris Murdoch’s in The Sea, The Sea, from 1978. Murdoch’s narrator, the retired theatre director Charles Arrowby, has retreated to a coastal village in England, where his single-minded and self-deluding tendencies manifest as misadventures marked by zeal. In the Postscript, Charles becomes explicit about the sea’s link with uncertainty and the misted, provisional nature of human judgement:  

    Time, like the sea, unties all knots. Judgements on people are never final, they emerge from summings up which at once suggest the need of a reconsideration. Human arrangements are nothing but loose ends and hazy reckoning, whatever art may otherwise pretend in order to console us.

    But where Murdoch’s waters take on a moral (and at times metaphysical) cast, Banville’s concern is more perceptual and epistemic—less about vanity and the ego, more about the instability of perceiving and knowing, especially in the face of loss. I can almost hear Max Morden murmuring, off the page: How difficult it is to interpret this world. How difficult.  

    Which brings us back to the question prompted by Ferrante: insofar as real life, when it has passed, inclines toward obscurity, and stories toward clarity—or, at least to coherence—what kind of coherence can the novel claim, particularly novels that make this very possibility a problem? By way of The Sea, Banville’s answer, like Virginia Woolf’s before him, is not to let obscurity engulf the narrative, but to give it shape, to let form emerge through syntax, voice, and imagery — in other words, through decisions informed by sensibility. This approach yields exquisite descriptions; near the end of the novel, the sea’s “small waves” are depicted as “breaking in a listless line, over and over, like a hem being turned endlessly by a sleepy seamstress.”

    In an interview from 2017, Banville remarked that he trusts “the sentence to create characters, to make a plot, to carry them along.” He then asks, with a characteristic tinge of the incredulous: “Anyway, what’s the point of a plot? Has life got a plot? Not that I know of. So why should novels have a plot? But it has to have some kind of story, it has to have some kind of structure.”

    Though some contemporary reviewers of The Sea found “a lot of lovely language, but not much novel,” as one of them put it, The Sea does have structure. However fragile that structure is, and however self-aware of its fragility, there is a whole that can be spoken of. A rather painterly whole, in fact. It is as if Banville, like Lily Briscoe in To the Lighthouse, has had to think: “A light here required a shadow there.” A dab of this sequence here, a line of that one there.

    Banville’s remarkable novel belongs to a tradition that Woolf helped to define: both writers pursue a realism that seeks to capture, as closely as the written word allows, perception, memory, and the primacy of the subjective – transient inner states. If The Sea has its own luster, it is one that gleams in conversation with To the Lighthouse. The latter is more epiphanic, lit up as by “matches struck unexpectedly in the dark,” but The Sea, too, is distinctive in how its sentences operate in relation to its central thrust. And, while Woolf’s prose adopts a stream of consciousness, Banville’s is more precisely measured, contributing to a mood that is darker and more detached, quietly melancholy and mysterious. In this way, Banville is closer to his beloved Henry James than to Woolf, though I remain intrigued by the latter connection.

    Woolf’s masterpiece consists of three parts—“two blocks joined by a corridor,” as she puts it in her diary. By contrast, Banville’s The Sea consists of two parts, two blocks, with certain corridors emerging therein: passages that link reflection, memory, and loss. The novel opens with the image of “the gods” departing; and near its end, we learn that Chloe and Myles, the Grace twins, have departed not fleetingly but finally—through drowning, whether by accident or choice, we do not know. The two “pale dots” they amounted to, Max recalls, between “pale sky and paler sea” simply disappeared. 

    After that it was all over very quickly, I mean what we could see of it. A splash, a little white water, whiter than that all around, then nothing, the indifferent world closing.

    The threads related to Anna’s death and that of the twins, woven throughout both parts of the novel, swirl together, like converging currents, in the novel’s final pages. When Max recalls the morning Anna died—before dawn—another moment surfaces in his mind: the summer he spent with the Graces, when he had gone swimming alone.  

    As I stood there, suddenly, no, not suddenly, but in a sort of driving heave, the whole sea surged, it was not a wave, but a smooth rolling swell that seemed to come up from the deeps, as if something vast down there had stirred itself, and I was lifted briefly and carried a little way toward the shore and then was set down on my feet as before […]

    And after this, Max drifts back to the memory of that morning:

    A nurse came out to fetch me, and I turned and followed her inside, and it was as if I were walking into the sea.  

    This is why I turn to fiction: to feel buoyed by the page’s fragile order yet awash in another consciousness; to find an order, however imperfect, in all the passings; to have the world, and language, and (it sometimes feels) even my own molecules, zapped, confounded, and renewed.   

    The Eternal Childhood of Egon Schiele

    When he was sixteen years old, Egon Schiele painted Landscape of a Meadow with Houses in oil on a rectangular cut of cardboard. A little above the landscape’s horizontal center a thick band of green, unevenly applied in a gently undulating motion, cuts across the scene from the left, where the paint is thickest, to the right, where it has run thin and dry, though it is still thicker than the marks of rubbed-down greens and browns which fill the lower half of the painting. Above the thick green line — grass hills — three couplings of houses are positioned, surrounded and separated from one another by small swoops of deep and hardly differentiated shades of purples, reds, and greens — trees. The sides of the houses are rendered in rich sheets of whites tinged lightly with blues, pinks, or purples. Cardboard does not absorb oil paint as avidly as canvas or paper (which slurps oils down to a dim, flat mark), and so the walls of the buildings are raised up: the heavily applied paint has dried thick, and its volume contrasts pleasingly with the flat wisps of surrounding trees. Behind this layer of incident, a larger building separates this strip from a band of pale purple markings which fills the canvas from left to right — a body of water, delicately rendered, a whisper lighter than the whitish blues of sky that reach the top of the painting. 

    The scene is complex. The rubbed-down greens at the bottom and the pale whites and blues at the top frame the dark interplay of field, forest, and buildings which Schiele has put in quiet communion with one another at the landscape’s middle. The composition was organized by an artist attuned to the interrelations of complicated machines. 

    Today Egon Schiele is famous for his brazen erotic drawings and paintings. His name has become a shorthand for erotic edginess, and that aura clouds a proper conception of what Schiele was actually like. He was born in 1890 and died twenty-eight years later. Every painting that he produced in the short span of years when he was working was created by a man with the same psychological and emotional temperament: he simply did not have time to earn and then evince the wisdom of a mature man. For the entire time he was working, Schiele’s curious fascination was childlike — not a word that readily comes to mind when looking at his renowned nudes and portraits. But every painting and drawing that he created — whether of landscapes, flowers, or naked women — was the fruit of this fascination, girded and regulated by an astonishingly sophisticated draftsmanship. That combination — sophistication and juvenility (but never puerility!) — is what makes Schiele immortal.

    Schiele’s sophistication was kindled by a fastidious attention to detail, a capacity which he developed growing up in close proximity to powerful and complicated machines, which he studied obsessively as soon as he could walk. Two generations of men on his father’s side had worked for the Austrian railroad. His grandfather, Ludwig Schiele, had directed the construction and then served as the first general inspector of the Imperial Royal Privileged West Railway of Bohemia. Adolf Schiele, Egon’s father, was stationmaster in Tulln, the small town where Egon was born and where he was first fascinated by the movement and the structure of those epochal metal enormities. The family lived over the railway station, and the young boy’s first drawings were detailed illustrations of different kinds of trains. When he was five years old, Egon climbed out the window and onto the roof for a better view of the gigantic locomotives below. Two years later his parents gifted Egon a sketchpad, a single page of which they instructed him to fill each day — but to his mother’s bitter exasperation, before the sun set that evening Egon had filled every sheet with drawings of trains, which he hung throughout the house as if a railroad ran through every room. 

    Schiele’s mother had wanted her only son to become an engineer. His stubborn indifference to every scholastic subject was just one of the many disappointments to which Marie Schiele was subjected. Adolf had married her when Marie was only seventeen years old — he was twenty-eight — against the wishes of both sets of parents. Adolf brought syphilis with him to the marriage, a disease which he passed on to his wife, and which curdled Adolf’s once beautiful body and then killed it. Sex, disease, death, and birth had been bound together in the young Schiele’s mind since before sentience, and he became conscious of them at the same time that he evinced a precocious interest in railroads and their intricate machinations. Schiele was curious about the mechanisms which regulated complex processes in animate and inanimate objects, which were interrelated in his imagination. He was fascinated by how vitality functioned, how a creature became inflamed, sapped, satiated, and extinguished. There is in Schiele’s landscapes a merger of two qualities that for the nineteenth century had been antithetical to each other, almost spiritual enemies: the organic and the mechanical. 

    The landscape that Schiele painted when he was sixteen translated the drama of the fields, hills, houses, sea, and sky into a coherent single structure made up of rhythmically related shapes which were distinct but connected. His mind made order automatically out of things, environments, and human beings. The rigor of his intellectual orderings was matched by Schiele’s sensitivity. He felt deeply, he was easily hurt and slow to recover. Schiele wanted very badly to be understood, loved, and recognized for the genius that he at first suspected and then became certain he was. His childlike desperation for recognition and admiration — even as he created some of the most adult images in the history of art he wanted to be patted on the head — was a source of pain throughout his life.

    Adolf Schiele’s syphilis entered its tertiary stage in Egon’s early teens. Fewer than ten percent of the disease’s victims suffer from neurological degeneration, and Adolf was among the unlucky. Elaborate hallucinations seized the sick man regularly, and his wife and children would have to participate in the madness or risk humiliating the man they loved. Invisible visitors would come for dinner, and Adolf would introduce them and carry on conversation with them while the rest of the Schieles tried to feign comprehension. Adolf died on the last day of 1904, though in order to qualify for a larger pension the family reported the death as having occurred on January 1, 1905. Egon came home from school on that dark December 31st and found his father’s corpse dressed in the uniform of a high-ranking railroad official, though Adolf had long since left the profession, in which he had never risen very high. Egon sat quietly in the corner of the room alone with the dead body of his once-beautiful father, staving off the final moment of separation, determined to prolong a macabre communion. 

    Schiele never recovered from the loss of his father and never stopped looking for a paternal replacement. Shortly after Adolf’s death, Egon took his younger sister, Gerti, then twelve years old, to Trieste — the city where his parents had honeymooned. They spent a night together in a hotel room. Many years later Egon would marry on his parents’ wedding anniversary. And for years into adulthood he had visions of his father’s ghost in his dreams. As an adult he wrote to his friend and roommate Anton Peschka (who went on to marry Gerti): “I don’t know whether there’s anyone else at all who remembers my noble father with sadness. I don’t know who is able to understand why I visit those places where my father used to be and where I can feel the pain. . . . Why do I paint graves and many similar things? — because this continues to live on in me.” And Anton wrote to Gerti while still living with her brother that Egon “always talks in his sleep. Last night he said that papa was with him and he was real, not a dream — he spoke to him a lot.”

    His father was replaced as a guardian by Egon’s uncle, Leopold Czihaczek, a railway station inspector in Vienna, and an uptight, respectable gentleman who disapproved of his nephew’s artistic ambitions and scholastic indifference. Their relationship was fraught, and Schiele was tormented by this new father figure’s disapproval — he painted his uncle more times than he painted any other subject except himself. In 1907 alone he painted eight portraits of Leopold. A year earlier Egon had at last succeeded in persuading his mother to let him present a portfolio of his paintings and drawings to the School of Applied Arts in Vienna. This was the school from which, that very same year, a certain Adolf Hitler received a marking of “unsatisfactory” on his entrance exam. The school’s entry committee was astonished by the works Schiele put before them: first they insisted that he could not possibly have created them himself, and then, when convinced of their authenticity, instructed his mother to enroll Egon in the storied institution the next day. Uncle Leopold, enraged, arrived at the entrance to the academy the following morning and attempted to physically block Egon and his mother from entering the building. 

    A short while later Egon identified, sought out, and attached himself to the only man whose support and instruction could replace the father he had lost.

    In 1897 Gustav Klimt led a group of fellow students enrolled at the Vienna Academy in a revolt which established “The Secession” society, with which Viennese Art Nouveau is to this day synonymous. The goal of this rogue group was to thrust Viennese art onto the world stage and to encourage artists of global renown to exhibit in their city. The Secessionists’ first exhibition, held the next year, was so successful that the society was able to build its own exhibition hall. The Secessionists were distinguished among the nascent international community of modernist art into which they were striving to gain entry in that they considered the applied arts on a par with the fine arts. Thus out of the Secessionist group grew a subsequent institution, the Viennese Craft Guild or the Wiener Werkstätte, which was founded in 1903 by Josef Hoffmann and Koloman Moser. The institution produced metalworks, furniture, graphic arts, bookbindings, ceramics, postcards, and jewelry. A reverence for design — and for the incorporation of design and illustration into fine art — bound the two institutions together and glowed at the center of Viennese modern art. Highly stylized compositions, tight structures, efficient and neat iconography — these characteristics rhymed with Schiele’s innate taste for mechanistic formulations. As a small child he had already united the realms of art and engineering, and he entered the arena of modern art armed with this prior association.

    In 1907, ten years after Klimt led his revolt, Schiele presented himself at the elder artist’s atelier. He found Klimt in his garden, clad in his typical and somewhat Orientalist long blue smock and sandals. The younger man handed Klimt a portfolio thick with drawings and simply asked, “Do I have talent?” The generosity which obliged the elder man to take the portfolio from the seventeen-year-old’s hands turned to awe, and after a long pause he muttered, “Talent? Much too much.” A friendship fueled by mutual respect was forged — the pair often traded drawings rather than paying one another for their works.

    Schiele’s friendship with Klimt changed his life and vaulted him into the maelstrom of modernism in Vienna. Still enrolled in the Academy, he came to his classes drunk with pride. Professor Griepenkerl, the same man whom Klimt had scorned while still at the academy under the professor’s tutelage, was tormented by Schiele’s invigorated arrogance. “Schiele,” he hissed, “the Devil has shat you into my class.” He had forbidden his students to so much as attend the Secessionists’ exhibitions, let alone to join their ranks. But it was clear from Schiele’s work that he had done more than this, that he had been overwhelmed by the Secessionists’ relationship with paint, form, and decoration — that he had found an affinity and chosen a side. For a time Schiele lost himself in Klimt’s gilded universe. The eerie quasi-mechanical, quasi-biological idioms which had ordered Schiele’s early works, and which would be resurrected once the virus of Secessionist loveliness had run through him, dwindled and then, for a whole year, disappeared. Klimt’s supremely decorative taste seeped slowly into Schiele’s work and by 1909 had saturated it. Schiele needed to metabolize the smooth easy flow, the pastels, the limpid limbs, the gentle, ornate, pretty beauty, to master it before he could translate this study into a notation, a conception, a line, that was specific to him and in service to his own peculiar fascinations. 

    Today Klimt and Schiele sit astride Viennese modernism on conjoined thrones, on pedestals of equal height, and so it is useful to remember that for a flicker of time one flame outshone the other. Schiele had to move past Klimt’s brightness, his ripe luxuriousness, his veneration for surfaces, his spirit of shiny extravagance — no easy thing for a young artist too proud to be satisfied by the praise of lesser lips but desperate for the paternal pride that this master readily offered. It was a sign of respect that, in 1909, so many of Schiele’s paintings look like copies of his teacher’s. In that same year a man named Heinrich Benesch visited the international Kunstschau organized by Klimt, in which Schiele had been invited to exhibit. Benesch had noticed Schiele’s work the previous year at a group show — the first show Schiele ever appeared in — and he had made a note of the painter’s startling work. This time, though, he was disappointed in Schiele’s “weak imitation of Klimt.” It is no surprise that this was the same season that Griepenkerl’s tolerance for Schiele wore out. Twelve years after Klimt’s secession and three years after his arrival at the school, Schiele left the academy. At about this same time Schiele moved out of his family home — in search of himself and, what was almost the same thing, in search of sex. 

    Art nouveau and expressionism may both seem like varieties of decadence, but they are antithetical to one another. Cool Klimtian decoration cannot be used as a psychological tool for self-expression, but self-expression was the essence of art for Schiele. As he put it in a letter to a collector shortly after one of the most traumatic experiences of his life: “I have to turn inward and paint pictures that have value only for me. [This painting] was simply produced out of deep feeling.” As Klimt’s stylistic dominance over him retreated, two genres came to consume Schiele: self-portraits and nudes. Often both at the same time, and often doubled. 

    The Double, or the Doppelgänger, was a psychological theme which preoccupied modern artists and writers of the time. Rilke, who replaced Rimbaud as Schiele’s favorite writer, published The Notebooks of Malte Laurids Brigge in 1910, the same year that Schiele embarked on the first of his nude Doppelgänger paintings, which he titled Self Seers. In Rilke’s novel there is a haunting scene in which a man tries to remove a mask he has put on, and fails:

    Hot and furious I dashed in front of the mirror and with some anxiety on account of the mask I saw my hands working away. But that’s just what the mirror had wanted. Its moment of vengeance had come. While I was straining, with an increasing feeling of restriction, to somehow squeeze myself out of my disguise, the mirror compelled me, I don’t know what with, to look up and it dictated to me an image, no, something real, an alien, incomprehensible, monstrous reality that permeated me against my will, for now it was the stronger of us and I was the mirror. I stared at this large, terrifying, unknown something and it seemed incredible that I was alone with it. But at the very moment I thought the worst had happened: my mind emptied, I wasn’t there. For a second I had an indescribable achingly futile longing for myself because there was then only him, there was nothing but him.

    Self Seers could have been conceived while reading this passage. 

    Schiele was assiduous from a young age about developing complicated compositions as sketches before setting brush to canvas. He drew and redrew these sketches in his notebooks, often on graph paper — engineering, again — to better determine the proper proportions and placements. While still under Klimt’s influence, many of the preparatory sketches and subsequent paintings were determined by the structures of similar Klimt paintings — but not Self Seers. Here was a complex narrative which Schiele had himself conceived. In it — or in the black and white photographs of it, which are all that remain of the lost work — both Schieles are naked. One is pressed against the other from behind, caressing his twin’s hair and cheek, while the pair stare at their reflection, at which the frontmost Schiele squints. Both twins’ legs are bent at the knee, and the figure in front is kneeling with his legs pulled apart. His pale, skeletal body is accented by shaded indications of bone tight beneath flesh, and the pale flesh is disrupted by a deep rectangle at the groin, his exposed genitals. A cascade of black ink flows around both bodies like an open cloak and slips from the frontmost figure’s shoulder down to his thigh and then behind his knee. It is an image of confused and tender seduction.

    Schiele was proud of this work, considered it a breakthrough, and instructed his dealer to find a worthy owner for it. When one was identified, Schiele wrote him a letter:

    Without meaning to flatter you, I know of no greater Viennese art connoisseur than you. Therefore I have chosen you to receive this picture from my newest series.—In time you will be completely won over by it, as soon as you begin not to look at it, but to look into it. This is the picture of which G[ustav] Klimt remarked that he was happy to see such faces. It is certainly the best thing that has been painted in Vienna lately. 

    The following year Schiele painted himself masturbating in front of the enormous mirror which he carried with him from apartment to apartment every time he moved, which was frequently. He titled the portrait “Eros.” It is a disturbing image. In watercolor, gouache, and chalk on a yellowish brown paper, a gaunt, greenish man in an open black robe is depicted seated with his legs apart, thrusting the flowing robe back behind them so that the robe covers nothing but the man’s shoulders, much of his arms, and traces of his chest, which, like his face, is rendered in greens, whites, and a few accents of pale watery orange. An enormous erect penis, over half the size of his torso, is rendered in reds and oranges, which contrast grossly with the greenish hands which fondle its base and crown. Three years after this painting was completed, in his study at Berggasse 19, about a half hour’s walk from the art academy Schiele had long since left, Freud wrote his essay “On Narcissism,” in which he defined a narcissist as “a person who treats his own body in the same way as otherwise the body of a sexual object is treated; that is to say, he experiences sexual pleasure in gazing at, caressing and fondling his body, till complete gratification ensues upon these activities . . . Developed to this degree, narcissism has the significance of a perversion.” What would he call a man who watched himself achieve gratification in reflection and then immortalized the act?

    As with all people, Schiele’s own body was the arena in which he first explored sex, but in his case sex was a subject of enthusiastic and studious obsession. His appetite was also his subject. He used his own body as the urtext to which he compared all other nude models, all other objects of sexual interest and autonomy, and he studied himself, his arousal, the way that erotic excitement coursed through him and operated on his flesh, as if it were his job as an artist (which, in fact, it was). Consider the logistics of the masturbatory self-portraits — the extraordinary attention that was required for Schiele to be able to make such a painting. This was no ordinary self-portrait: he could not execute it by standing pen and pad in hand before a mirror, flicking his eyes up to his reflection and back down to the drawing, rapidly glancing from one to the other as self-portraitists often do. He would have had to spend hours sitting with legs pulled apart before his reflection, watching carefully as the colors and shapes changed at his gentle or rough provocations, and then gone to his easel to conjure on the canvas what he had just witnessed in the glass. There was no model for how to do this. No one had ever done it before; it was rare enough at the time for specific, identifiable men to appear naked in paintings at all. He was seized by the idea, borne forward by an internal force that directed him to paint it. It was appetitive: he hungered for something inside himself. As he put it in a letter: “I want to rip into myself, so that I may create again a new thing which I, in spite of myself, have perceived.” 

    Because his ur-nude was his own body, it is not strange that Schiele’s nudes are sexual creatures, not sexual objects. Unlike most other female nudes in art, the eroticism of Schiele’s naked women emanates from them, they are conscious of it, they revel in their sex. There is more curiosity than tawdriness in these startlingly candid depictions. Schiele preferred hiring prostitutes outfitted with corsets and lacy thigh highs because he wanted his women to participate in their own eroticism just as he did in the paintings for which he served himself as model. Schiele’s overt, earnest sexuality is deeply unsettling: in painting, as in every other instance of sexual identity, sex is tamed by etiquette, but not here. Schiele can seem almost sociopathic in his brazen authenticity. He was not depraved. He had a high standard for visual truth and he was sure of himself.

    A person can be overcome by desire without knowing how to satisfy it. The capacity to be aroused into an erotic frenzy, to be swollen with longing, does not always come with the capacity to relieve that urgent need. Desire threatens humiliation to every person because every person is sooner or later forced to ask themselves what do I want?, and is rattled by the discovery that they may have to try to coax satiation from another human being, an autonomous person whose erotic compulsions may not complement or satisfy theirs. What is worse — to have no partner, or to have successfully led a lover to one’s bed only to betray one’s own helpless, inarticulable longing? Or is it worse still to costume that longing in familiar sexual tropes, to invite the lover to a performance of the clichés in the standardized script which conceals oneself from oneself and one’s partner? In the bedroom, etiquette is an awkward guest. Revealing oneself to another is not like choosing the right fork. Even the slick are naked when their clothes are on the floor. On top of the sheets, or under them, is where there occurs the great experiment in abandoning society for yourself, in attempting to shed the mores you carry with you as you shed your clothes; and the irony in this post-sociality is that it is achieved in company. Social currency is useless in the bedroom, where the proud are brought low and an animal aristocracy always reigns. Schiele’s radical paintings were created by a member of this erotic monarchy. 

    Schiele’s nakedness is idiosyncratic, which is what all intelligent eroticism demands. He was rigorously aware of what he wanted. His taste for flesh and bone was pronounced, specific, fully formed even before he started painting erotic imagery. Schiele knew the skull beneath the skin. His eroticism is discomfiting not only for the obvious reason that he painted naked bodies, embarrassingly splayed and hung on museum walls for men in button-downs and khakis to eye on family trips, but because he painted naked bodies that brazenly revealed his peculiar, exacting fetishes. Schiele wanted flesh pulled tight over bone, he demanded a manic emaciation, a body charged with energy like an electric current, and he wanted the viewers looking at his works to recognize in his hand the volatility, the febrility, of his appetite. 

    In Schiele Drawing a Nude Model in Front of a Mirror, from 1910, a seated Schiele stares straight at the mirror in which the model in front of him is eyeing herself. The drawing depicts the model’s reflected back as well as her front. She is doubled, and costumed in a feathered hat and thigh-highs. One hand rests on her right hip, which is slung low, while she juts the left hip up and out in a seductive thrust. The artist is watching the model watch herself; he is a double-spectator, and her experience of her own seductions is the subject of the drawing. She is posing for herself. There are three distinct forms, each overlapping the other in depreciating size: the model’s reflection facing backwards and filling the rightmost side of the drawing, the model facing forward slightly lower on the page close to its center, and the artist seated with his head reaching about the middle of the sheet and his feet at the bottom left corner. The drawing is a simple draft of pencil on paper and its execution is virtuosic. Heinrich Benesch, who often had occasion to watch Schiele work, noticed that “the sureness of his hand was almost infallible. When drawing he most often sat on a low footstool, the drawing-board with its paper on his knee, and the right hand with which he drew propped on a support . . . an eraser was unknown to him.” All three bodies in this work are drawn mostly in contour, which means that he was able to eye the relative sizes of each without marking up the page to remind himself of where to place them. One of Schiele’s shocking strengths was his capacity to hold in his mind the sizing of each subject in relation to the other subjects on the page. The contour drawings in his oeuvre are especially virtuosic in this respect. He is like a mathematician who is able to do extensive and complicated arithmetic in his head so that all that is left on the page is the solution. 

    Benesch reported that, whereas Schiele always drew from life, he would go back and fill in the colors later when no longer looking at the model. Observed in a Dream, which Schiele painted in 1911, was contoured in pencil and then colored afterwards to extravagant effect: a woman lies across the center left of the picture, naked except for a skirt pulled all the way to her waist and hidden behind her arms and hands. Her head careens towards the left of the sheet and is half-covered by a black blanket which frames her body and outlines the skirt’s white slip. The upper parts of the woman’s arms are close to her sides and bent at the elbow; her hands are raised above her wrists and rest across her hip bones, over which her fingers spread. The fingers of each hand pull back the corresponding bright red flap of the woman’s labia so that her vagina is held open. The curves of her breasts, hips, and thighs imply the maturity of a grown woman, an impression confirmed by the dark wisps of pubic hair which ring the red from the top and bottom. The labia’s bright red is the same shade as the woman’s lips and the lace garter which lines one thigh-high stocking. The accoutrements of erotic expression — the garter, a red lace ribbon on her skirt slip, pinkish-purple stripes on her stockings — communicate to the viewer that this woman has styled herself, that Schiele selected a model with a developed relationship with her own sex, and this inward eroticism, the interior life of her sexuality, is what he wanted to witness and capture.

    This is quite different from another style of nude for which Schiele became well known and reviled. In 1911 he executed a drawing of a child, Nude Girl with Arms Outstretched, in pencil on paper. A young girl is positioned down the center-right of the sheet. Her head is at the center-right of the top of the drawing and the upper part of her calves meet the bottom of the page. She is lying with her head bent forward so that her neck is not visible. Watery black watercolors and light pencil marks communicate the bones of her chest. A dark black curve cuts in a semi-circle to indicate the bottom of her ribcage, stark against her pale skin. She appears fragile. Enormous and wide-open blue eyes communicate the child’s innocent curiosity. Pencil strokes barely curve around her nipples, which are rendered in red watercolor, slightly more translucent than the opaque red of her lips. Her belly is flat. Near-straight lines of pencil run from the bottom of her curved ribcage down to her hips and one hip bone juts out from beneath her skin, cutting off the torso. One third of the way up the sheet from the bottom, the girl’s bare, hairless vulva is drawn in simple pencil marks. A black blanket with red, white, and blue stripes frames her legs which are so thin that they curve out away from one another and meet again at her knees, the sockets of which bulge together. Black socks reach the upper part of her calves and blend into the black of the blanket.

    It is a characteristic Schiele nude: delicate intimations of bone and pulled flesh, light washes of color which imply bone, muscle, the torsion in a tense or twisted or thin body. The motion of the brush is sparse and its sparsity is a cutting whisper. In this painting, as in so many of his best drawings and watercolors, the negative space is in vital communication with the subject cutting into it. The skillful placement looks easy, natural, and belies the immense talent required to put so little on a page without allowing the drawing to feel unfinished, though about two-thirds of the paper are left blank. There are regulations in Schiele’s forms. Flat, sharp, vibrating with a complicated energy that chugs from one limb to another like a train on its tracks. The machinery is complex, and the mechanics obey an intricate engineering which he developed in his imagination and then obeyed. It is eerie, it is unnerving, when a person obeys laws that the rest of us cannot discern. That is one definition of insanity and of genius. Schiele once described himself in a diary: “I, by nature, one of the freest, bound only to the law which is not that of the masses.” This portrait is the fruit of such a self-conception.

    Schiele’s law determines the structure and the vigor of his figures, and it also situates those figures and regulates the field onto which his subjects have been flattened. Fidelity is the force which vibrates in Schiele’s painting. Truth churns across the canvas laying waste to all vers de société. For this reason admiration was not all that Schiele’s truthfulness inspired. Many good people were also outraged, and not on aesthetic grounds. Schiele did not flout convention, he was oblivious to it. He did not seek out and objectify children because he wanted to shock: he sought them out because their bodies fascinated him and because he did not respect etiquette enough to consider why that might be offensive to other people. He was captivated by their innocence, by the pure and firm fragility of their bodies which required no exaggeration to convey vital vulnerability. Otto Benesch, the son of Schiele’s attentive collector, once said that Schiele possessed “the quiet seriousness of a man absorbed by a spiritual mission . . . Schiele’s nature was childlike, not childish.” His earnestness and his unaffected confusion at other people’s horrified responses to his work were indeed childlike. 

    Only a few years earlier, back at Berggasse 19, Freud had produced his essays on infantile sexuality, the most provocative of all his provocative ideas, and brought down upon his head the wrath of bourgeois Vienna. Even in Viennese society, in which illicit sex was a subject of constant whispering titillation, no other contemporary painted such explicitly sexual nudes, and certainly not nude children. The storm that ensued when Schiele’s nude children were exhibited may remind some Americans of the controversies that swirled around the photographs of Sally Mann (whose models were her own children) and Jock Sturges some decades ago. In painting, the only other master who forced viewers to ponder the sexual power of children was Balthus, whose painting Thérèse Dreaming from 1938 recently provoked a petition for its removal from the walls of the Metropolitan Museum of Art. It is almost prudish by Schiele’s standards: even Balthus’ exercise in fetishization never permitted him to splay the naked legs of young girls and render their sex in bright red paint as Schiele often did. (Though Balthus also painted The Guitar Lesson, from 1934, to which I refer students of the delicate subject.) Balthus’ boldness was prurient: it explored sexual longing whetted by social shame, which is what prurience is. But Schiele was beyond prurience, beyond shame. He was dangerous.

     

    In August 1911, a woman named Wally Neuzil, a model whom Klimt had sent to Schiele, threw herself into Schiele’s life and his work. Shortly after they met, the two left Vienna for the town of Neulengbach, about which Schiele wrote:

    I have come to Neulengbach in order to remain here forever. My intentions are to bring great works to completion, and for this I must work in peace — that was impossible in Vienna. Up to now I have given, and now, because of this, I am so rich that I must give myself away.

    The petty dramas and internal politics of the art world vexed Schiele and distracted him. He worked better away from the red-hot center. Unfortunately, “away” was home to the people of Neulengbach, who were precisely the kinds of neighbors who were shocked by the painter’s brazen obliviousness to social mores. He spent several happy months in their town, but his happiness was terminated horribly on the morning of April 13, 1912. On that day, two police officers arrived at the small garden home in which Schiele and Wally lived together, confiscated Schiele’s drawings, and imprisoned him in a basement cell of the Neulengbach district court house. 

     For over a week he languished in his cell without being given any justification for the arrest. He would find out later that he had been charged with “immorality” — the court claimed that he had shown pornographic images to children — and “seduction of a minor.” He was held for twenty-four days. Schiele managed to keep a diary for much of his time behind bars, in which he chronicled a childlike disbelief at the townspeople’s intolerance. When he was at last notified about the circumstances of his arrest, he wrote:

    My arrest is no misunderstanding! [I was arrested] because of a suspicion of lewdness with children, little girls, because of the production of erotic — that is, obscene — drawings which I am supposed to have left lying about! Now at last I know why I am “sitting” here! It is a scandal! An almost unbelievable crudity! Vulgarity! And a great, great stupidity! It is a cultural blasphemy, a shame for Austria that such a thing can happen to an artist in his own country. I do not deny it: I have made drawings and watercolors that are erotic. But they are still always works of art — that I can attest, and people who understand something of this will gladly affirm it. . . . But one has never imprisoned an artist for this. No erotic work of art is filth if it is artistically significant; it is only turned into filth through the beholder if he is filthy. . . . I do declare as untrue that I showed such drawings intentionally to children, that I corrupted children. That is untrue! Nevertheless I know that there are many children who are corrupted. [Did he concur with Freud after all?] But then, what does it actually mean: corrupted? Have adults forgotten how they themselves were as children? Have they forgotten how the frightful passion burned and tortured them while they were still children? I have not forgotten, for I suffered terribly under it. 

     And I believe that man must suffer from sexual torture as long as he is capable of sexual feelings.

     . . . I am not an evil man! I have not ravished, stolen, murdered, set fires — not in any other way offended the sensitive human “society” — except by my existence.

    At his trial the judge selected one of Schiele’s drawings that had been confiscated at his arrest and burned it in the courtroom. This ritual murder of a work of art traumatized Schiele as much as the imprisonment itself. He returned to Vienna from Neulengbach as a recluse in society and also in his work. He created several self-portraits in which he appeared as a hermit and a monk, in rebellion against society and in religious service to an art that he believed was holy. He died of the Spanish Flu just six years after his release. He died a married man, a father, and a survivor of World War I. So many boys came home old men from the same hell that he survived, but Schiele somehow protected his shocking youthfulness from the necrotic brutality of the trenches. 

     

    Shortly before his death, Schiele penned this poem:

    I, eternal child — 

    I sacrificed myself for others

    Who looked and did not see me…

    Everything was dear to me — 

    I wanted to look at the angry people 

    With loving eyes,

    To make their eyes do likewise…

    And the children

    Who looked at me with big eyes

    And rewarded my looking back with

    Caresses.

    Thy Tents, O Jacob

    for Thea Wieseltier

     

    I

    In the spring of 1866, on the front page of Ha’Carmel, a Hebrew literary weekly that appeared in Vilna for a few decades in the latter half of the nineteenth century and served as an important organ of the Haskalah, the Jewish Enlightenment, which was beginning its exhilarating and damaging march through the traditionalist Jewries of Russia and Eastern Europe, a poem was published. It was a manifesto-like summons to a more cordial relationship, even to a deep bond, between the Jews and Western modernity. It was called “Awake, My People,” and its author was Judah Leib Gordon, who had composed it three years earlier. Gordon was an extraordinary man of letters and one of the most controversial writers in modern Jewish history. (As is often the case, the controversies included injustices.) Though twentieth-century Hebrew poets came to despise Gordon for reasons of political or aesthetic doctrine, and though his poems now read with a quaintness that often crosses the line into archaism, Gordon was a founding father of modern Hebrew culture, which in some ways was an even more breathtaking creation than the modern Jewish state. There is an old story, se non è vero è ben trovato, that Gordon once called on one of the masters of the newly founded “science” of critical Jewish scholarship in Germany, another founding father, and when the old sage asked the young firebrand to identify himself, he declared “I am a Hebrew poet!” “Oh really?” came the reply. “When did you live?”

    Having exhorted its readers to participate in the societies in which they live, to speak their languages and learn their ways of thinking, to participate in the cultures that they proudly shunned, Gordon’s poem comes to its climax in its penultimate stanza, whose infamous third line continues to reverberate in the interminable struggle of the Jews to find an honorable definition of who and what they are. “Become a man as you leave and a Jew in your tent” — from Deuteronomy 33:18, “rejoice Zebulun in thy going out, and Issachar in thy tents.” As I say, archaizing. By a “man,” of course, Gordon means a human being, and more specifically, an agent of universal principles that are shared across the society in which the Jewish community finds itself. (The idea that Western principles, or Eastern principles, or any principles, may call themselves universal is offensive only to those for whom the provenance of an idea is its most salient feature.) In modern parlance, Gordon’s line is most familiar as “Be a man in the streets and a Jew at home.” A great psychological and cultural bifurcation, though it should be added that Gordon was not calling for the abolition of either term and certainly not for an erasure of Jewish culture in favor of European culture: the development of Jewish culture was his holy if secularizing cause. In his hostility to Jewish religion, in fact, and in the hostility of other maskilim, or enlighteners, one witnesses the birth of the very notion of Jewish culture: the culture is what one continues to adore when the religion is gone. Gordon regarded his prescription as a meliorist formula for a respectful integration. The irony seems to have been lost on him that he was insisting upon the revival of precisely the punishing spiritual structure wrought by the Iberian catastrophe centuries earlier, the eternally anxious double life of the crypto-Jew.

    A double life was anathema as much to traditionalists as to assimilationists; the warring camps both sought the same thing, which was a single life, a ringing consistency, one thing and one thing only. Gordon’s prescription at least had the merit of recognizing that there is no such thing as one thing. Who else would dream about uniformity, if not creatures who are multiple? The furiously (and learnedly) anticlerical poet was correct: a person’s identity is finally determined by the internal relations of the parts, all of which are real regardless of their respective ideological reputations — by the negotiations and the accommodations between the many elements of which every person is always comprised. There are no hollow humans. Radical exercises in self-amputation are invitations to psychological misery. Before one even arrives at the matter of one’s presentation to others, at home and in the street, there is the pressing question of one’s presentation to oneself: the inward arrangement of one’s influences and ideals, and how one’s inner tent shall comport with one’s inner street. The burning question of the seam.

    Gordon’s proposal was roundly scorned as a kind of soft treason, even though there is not a page in his writings that is not animated by the love of his people. In 1885, in Odessa, HaMelits, a Hebrew weekly that in the following year became a Hebrew daily, the first one in Russia, published an influential response to Gordon’s maxim by Moshe Leib Lilienblum, an early maskil who became an early Zionist.

    Be a man in the street and a Jew at home. The poet is not the first to give us such a lesson. Many are the writers who urged us to hide the “Jew” in us (that is, our being Jewish), this contraband stuff, in the secrecy of our tent, as if it were a disgrace for a man in the nineteenth century to be known as a Jew. Many have heard this cry and the results are evident in our children.

    It was Gordon’s luck, I suppose, that the notion of the “self-hating Jew” was still half a century away. (This call to order, this insinuation that development is defection, has been the stock-in-trade of American Jewish reactionaries, of “proud Jews,” for a long time. I first heard it from the lips of Meir Kahane.) (Already I regret mentioning Kahane, and certainly his lips, in the context of Lilienblum and these other Jewish nobles.) Other contemporary critics of Gordon were a little kinder. The beloved Hebrew writer Yosef Hayim Brenner complained about the “spiritual obsequiousness” of Gordon’s formula and called it “terrible,” but he instantly qualified his polemic with an affectionate encomium to Gordon’s lifetime of devoted membership in his nation, adding that “a life is greater than any formulation.” He also noted graciously that Gordon had composed his call for other-directedness in Hebrew, the Jewish language. (Which is to say, most of the Jews in America’s tents and streets would not know what to make of it.)

    In the spring of 1892, also in Odessa, in a short-lived Hebrew periodical called Pardes — the kind that used to be called a “miscellany,” the one in which Bialik’s first poem was published: for some of us these forgotten Hebrew publications are like stars in the night sky — there appeared a highly original response to Gordon’s slogan and its scandal. Its author was Ahad Ha’am, born Asher Ginsburg, whose nom de plume is a Biblical phrase that means “one of the people.” (An unimaginable appellation in the age of YouTube channels and naming opportunities.) In the form of an open letter to the editor of the journal, who was himself one of the most formidable Hebraists of this Hebraizing era, Ahad Ha’am did not denounce Gordon for his suggestion that Jews, too, must engage with universal notions of truth, goodness, and beauty. Not at all: Ahad Ha’am was an anti-parochial nationalist. He contended, instead, that Gordon had misconstrued the relations between the tent and the street, between the particular and the universal, by banishing the universal to the street. In his view, this amounted to a false and paltry conception of the Jewish tent.

    His essay — he was a significant thinker who never wrote a book, who did it all in essays — was called “A Man in His Tent.” Like some of the other critics, he worried that Enlightenment would weaken group feeling and “nationalism.” (Earlier he had authored one of the most sterling Jewish sentences I have ever read: “I for one know ‘why I will remain a Jew,’ or more properly, I do not understand the question, just as I would not understand the question of why I will remain my father’s son; I can make any comment that my heart desires about the beliefs and the ideas that my ancestors bequeathed to me without any fear that I will thereby sever the connection between myself and my people;” and these heterodox thoughts included, he added, his support for “the scientific heresy that carries the name Darwin.”) He proceeded to demolish the distinction between universalism and particularism, arguing that the abstraction “humanity” is real even if it is not available to the senses, for “there is no sensuous existence for universal concepts, since every individual, for example, is of necessity either Reuben or Simeon and so on.” The compound character of human life therefore requires of us that we double up and make ourselves responsible for both “the reform of ‘humanity’ and the reform of ‘nationalism.’”

    Ahad Ha’am’s splendid objection to Gordon’s construction of the relation of the universal and the particular in the Jew was not merely that Gordon sought to confine the particular to the private sphere — such a mistake was obvious to a Zionist who championed the cause of Jewish nationalism publicly. Of course we will be Jews in the street! No, Gordon’s error was to expel the universal from the private sphere of Jewish life. Here was a Jewish nationalist instructing that Jewish nationalism was not enough, not the whole Jewish story. In his own time, when the Jews “turned their hearts to nationalism,” they “forgot humanism, the inner spirit on which everything depends.” And so Ahad Ha’am coined a retort to Gordon: Be a man in your tent. This, he explained, is “the great teaching [he uses the word “Torah”] that has been lacking in us.” The man in his tent is “the great cause whose wisdom and power will be apparent in all our external actions.” He insisted that this become “the strong foundation of the new edifice” of Zionism.

    He concluded with a counterintuitive dictum that most present-day Jews in America would find confusing or insulting: “A day of adversity is a day of castigation.” By castigation, he means self-castigation. Are we, then, to blame ourselves as we mourn? It seems like an unfair demand, but the Zionist philosopher demands it. (As does the ancient and medieval rabbinical tradition, for which the ultimate objective of sorrow is introspection. The mourner’s repentance!) Ahad Ha’am asserts that times of grief and peril are appropriate not for self-celebration — for dwelling on “Israel and its nationality, on its beauty and its wisdom and its righteousness” — but for the discovery of our failings, when “the eyes of the nation will be opened to its own shortcomings,” all of it with “a warm heart, a loving and hurting heart.”

    It has been a long day of adversity for Jews, but castigation has not been welcome. Even a loving and hurting heart finds little hospitality in the community if it deviates from the emotional and intellectual and political consensus; if it proposes to think critically about trauma and the methods for mastering it; if it disputes the bleakest reading of events and dissents from the climate of despair and makes no time for easy pessimism at the Shabbat table (people love dead Jews, pass the challah); if it suggests that calm analysis will serve us better in the crisis than rampant feeling, and that hysteria never made anybody smarter; if it has the insolence to insist that Jews have been, in our other country, in Israel, not only the victims of adversity but also the agents of adversity, and that we have betrayed our Israeli brothers and sisters by inhibiting and even forbidding the expression of the misgivings, or the horror, that we may harbor about some of their actions.

    Introspection, even under duress? Self-criticism, even in the midst of self-defense? I’m afraid so.

     

    II

    About the scale of the trauma that October 7 inflicted upon Jews in Israel and in America there can be no doubt; and this crucible occurred at the same time that American Jews were confronting a revival of antisemitism — and not only a revival: some of the anti-Jewish violence of recent years was unprecedented on these shores. All this misery seemed to rattle and even to refute the hallowed American Jewish theory of the United States as a place where such a fate, the master plot of the other refuges in exile, cannot happen. Suddenly American Jews could feel the sort of natural connection to their ancestors that the American dispensation had blessedly disrupted: after all, when American Jews allude in their prayers to the sufferings that “we” have endured, the first-person plural requires an exertion of the imagination. Except in collective memory, which is not like personal memory, we ourselves endured none of them. We were the lucky children of our ancestors who had escaped the sordid and unjust conditions of their lives, so much so that the United States, which certainly represented a revolution in world history, also represented a revolution in Jewish history. But now a dark continuity with the time-honored typological interpretation of Jewish history — the view that Jewish history is an eternal repetition of a single wretched story in which, as the Passover Haggadah puts it, “more than once has an enemy risen to annihilate us” — seemed plausible here, too. The community put on a shroud. American Jewish exceptionalism seemed to be going the way of American exceptionalism itself, as the United States surrendered to meanness and chaos, to ethnonationalism and a war on difference, and suddenly there seemed to be a basis in American life for the fear that the philosophical and political protections of the American republic, the ideas of the Declaration of Independence and the laws of the Constitution, American liberalism, would not hold.

    It has not been easy to assess the dimensions of the danger or the historical implications of the unfamiliar circumstances. Our default setting can be very dark. It has been clear, at least to anybody interested in the whole sobering truth, that we have enemies to the right of us and enemies to the left of us, just as it has been clear that some American Jews have found a home in both fevers. A certain tiresome game of whose-side-is-worse has characterized a lot of the internecine communal debate — internecine debating does not exclude shouting on cable television — and wasted a lot of our time. It should be uncontroversial, for example, to assert that what has happened in the United States in recent years is a mainstreaming of antisemitism in American politics, and that the perpetrator of this poisoning, the man who switched on the green light, was Donald Trump — uncontroversial even if the American left, unsurprisingly, also has a lot to answer for in the contemporary fragility of the Jews. Trump and the MAGA right have introduced a new creature into the American political bestiary — the pro-Israeli antisemite; and too many American Jews, even when they are guilty of no more than anti-anti-Trumpism, have fallen for it. (So, too, has the prime minister of Israel, who recently, as he welcomed a leading official of the reactionary antisemitic Freedom Party of Austria, affiliated himself with the so-called Patriots for Europe, a grouping of populist anti-Jewish parties presided over by Marine Le Pen’s marionette Jordan Bardella, including Fidesz from Hungary, ANO from the Czech Republic, the AfD from Germany, the Party for Freedom from the Netherlands, VOX from Spain, and other gargoyles of post-liberal Europe.)

    The most important requirement of threat assessment, as military planners call it, is that it be nothing if not empirical. Generalizations from lurid public events are not enough to go on. In evaluating the dimensions of antisemitism in America, what is needed is not ethnic pride but social science, data and the study of it, numbers, responsible inferences and projections; and in combating antisemitism what is needed is not soaring eloquence but a tireless exercise of local and national politics and political coalition-building, and the wise exertion of political and economic muscle. The media, without which we would not know what we need to know, always provides much more than we need to know, and so, once the facts have been established, as they still can be for those who care, the less media the better; the impact of the media upon the formation of opinion in America is one of the gravest challenges to our democracy and our good sense. I am myself unqualified to do the empirical work that is necessary to arrive at a trustworthy judgment about the trajectory of antisemitism in America. Since we must always err on the side of caution, I can live with a certain amount of prudent exaggeration. What I find it hard to live with is the sort of exaggeration that is designed to inculcate in the Jewish community a permanent state of apocalyptic excitation, when the apocalypse — this is as much a philosophical issue as an empirical one — is nowhere near.

    Let us begin with the Holocaust, which hovers inevitably over every discussion of Jewish danger, and rightly. I say rightly, because the European catastrophe permanently planted in contemporary Jewry a suspiciousness about the world that we are powerless to exorcise, and it would be a deformation of our character to wish it otherwise. It actually happened, and not so long ago. (In this instance no amount of italicization will suffice.) Refresh the shock diligently and bequeath it to your children. Jewish honor requires us to live partially in its shadow, and in that semi-darkness to devise ways not to hate the world. Yet the invocation of the Holocaust must never be designed to shut down clear thinking, or the power to make distinctions. We do not properly respect the memory of the Holocaust by seeing it everywhere. Consider the most sensitive example of all: it took only a few hours after the pogrom of October 7 for a platitude to form, not only among Jews but everywhere. Owing to the scale of the slaughter, October 7, it is everywhere said, was the worst day in Jewish history since the Holocaust. Quantitatively, on a per diem basis, if you will pardon the expression, this is true. But the analogy never sat well with me, because these things do not, as we like to say, scale smoothly up or down. Every atrocity is as large as it is and no larger. (And no smaller.) The genocidal intention of the Hamas invaders was clear, and in this respect the association was justified; and twelve hundred Israeli Jews — may God avenge their blood, the traditional Jew in me almost said, except that He has, and in Biblical proportions — were murdered on October 7 because they were Jews. On any given day in Eastern Europe between 1942 and 1945, however, more Jews were murdered, by many orders of obscene magnitude.

    The platitude makes me uneasy. Sometimes precision matters most where it seems to matter least, not only for the sake of knowledge but also for the sake of compassion: I do not want my people to be more tormented than reality asks them to be. The platitude certainly did not clarify anything for me about the crime that was committed on Jewish soil. I already knew that Hamas is a wildly antisemitic organization consecrated to the destruction of as many Jews as possible and to the total extirpation of the Jewish state. (That is why I wondered for years about Benjamin Netanyahu’s bizarre insistence that Hamas had changed for the better, a cunning bit of misdirection whose aim was to shift the national focus to the cause of “Judea and Samaria,” where his bread is buttered.) I rejoiced in the assassination of Yahya Sinwar, but not because he was Adolf Hitler; it was enough that he was Yahya Sinwar. The platitude is more comprehensible as an expression of emotion, of anger, of sorrow, about a defeat without precedent in Israel, and I certainly lack the temerity to tell my brethren in Israel that they should not feel what they feel. The problem is that feelings are sometimes elevated into something grander, into an illusion of objectivity, into a worldview or a philosophy of history, which then plays a decisive role in the analysis of particular cases. The platitude struck me as an obfuscation that was an early form of incitement.

    What is gained by imprecise or excessive analogies, except to frighten — and in almost every conceivable circumstance in which the Holocaust is invoked, to delude — those whom one seeks to fortify? In a heartfelt essay called “Despair Not!,” my old friend Ruth Wisse, ma soeur-ennemie, attempted to stiffen the spirits of her Jewish readers with the battle cry of the Jewish warriors in the final days of the Warsaw Ghetto. Gvald, Yidn, zayt zikh nisht m’yaesh! “Oh no, Jews, do not despair!” (Gevald or Gevalt: how to translate that common cry of alarm? Wisse leaves it as untranslatable, but I must note a macabre linguistic coincidence that I owe to Hillel Halkin, who many years ago pointed out that the canonical Yiddish translation of the Bible renders as ot shray ikh gevalt Job’s lament hen ets’ak hamas.) Wisse admits that she never expected to begin an essay addressed to the present day with those words. And she admits also that “we are a century removed from the existential and intellectual threats that Jews faced in the nineteenth and twentieth centuries, and it is unwise and even dangerous to exaggerate urgency if there is no need for it,” and also that “we are now a sovereign people and in America too Jews have agency as never before,” and also that “the many fellow American lovers of Israel among us prove that everyone does not hate the Jews,” and also that Jews were far from perfect and often fell short of their own moral standards. Wise words, all of them. (But is proof really required for the view that everyone does not hate the Jews? And must you be a lover of Israel to not be a hater of the Jews?)

    Wisse then proceeds to a review of her customary opinions that “the nations [the Jews] lived among were constituted very differently,” that the Arabs will never agree to coexistence with Israel, that “Islamist anti-Jewism [is] the regnant ideology of a new world order,” and so on. None of those propositions seem correct to me as a description of the world in which we live and for which we must plan, and some of them are disagreeable historiosophical speculation. And when she writes that “just as the Germans were considered the most cultured nation in Europe, Palestinians were considered the most intelligent among their fellow Arabs,” and that “the worst Israel does is far better than the best that the Palestinians do, to their own people and to their neighbors,” she is discarding the moral probity and historical complexity for which I was just now admiring her, and turning nasty. Wisse worries about the “demoralization” of the Jews, which she has always located in Jewish politics that differ from her own, but it is not a form of “demoralization,” or self-loathing, or indeed any failing at all, to reject the discourse of superiority. Must we really be the best of all peoples? Is it not good fortune enough to be what we are and revel uncomparatively in our richness? It is more than a little comic to encounter the belief, stated without a trace of irony, that my ethnonationalism is better than your ethnonationalism.

    So why, then, mention the Warsaw Ghetto? Are there Nazi stormtroopers, or jihadi terrorists, on the Upper East Side, or any likelihood of them? And why omit the central difference between the Warsaw Ghetto and us, which is the astounding and historically anomalous fact of Jewish power — political and economic power in America, every variety of power in Israel? If only Mordechai Anielewicz had commanded an army! Whereas the spectacle of what Israel’s army has recently accomplished, and recently wrought, is plain to all. In the aftermath of October 7 and the virulent eruptions of domestic antisemitism, the American Jewish community almost completely surrendered to its proclivities for morbidity, which were already quite considerable. Just when we needed clear heads, we could not pry our heads away from commemorations. We lost our élan. We quaked. We were in mourning, of course, and lucidity is not to be expected from mourners, but that is precisely why shiva must last no more than a week, and why avelut, or mourning, must last no more than a year. The rabbinical tradition warns sternly against those who mourn overmuch, and in the medieval centuries even the mourning for the destruction of Zion devolved into a sect when it became permanently active and the primary coloration of the faith of certain unrelievedly grim believers. We are enjoined instead to return to the world, to the reality from which our grief severed us, and to operate smartly and decently within it, injured and intact, for the sake of our values and our interests.

    Let us consider another sensitive example. I watched with dismay as the Jewish community of New York appeared to take leave of its senses about Zohran Mamdani. I have little admiration for the man: he has a lovely smile but he is a doctrinaire socialist who began his victory address with the words of Eugene Debs, so as to compare his own election to nothing less than “the dawn of a better day for humanity.” How can we know everything that we know about the history of socialism and not almost completely detest it? Mamdani also adheres to the odious belief system that is the postcolonialist picture of the world, with its selective preferences among the world’s victims and its high-minded scorn of the West and its pious disguise of its political agenda as nothing more than a search for justice, though Mamdani is correct to insist on the imperative of affordability and on the outrage of economic inequality in present-day America. It is obvious that there is no room in the mayor’s progressive heart for a fair view of Israel, and that Israel’s policies toward the Palestinians are just about all that he has heretofore cared to know about the Jewish state.

    When there are so many legitimate grounds for criticism of Netanyahu’s Israel, why go rabid? The zeal is what perturbs me. Too much of the progressive opposition to Israel has curdled into something more, and less, than disagreement. Hatred is more, and less, than criticism. A prejudice has no place in a battle of ideas. (So many American battles of ideas have decayed into battles of prejudices.) While I have no expectation that Mayor Mamdani will treat the Jewish citizens of his city unfairly or unkindly, he owes it to them to seek to understand what Israel means for them, just as they owe other groups with whom they live an effort to comprehend their deepest motivations. Still, Mamdani was not running for the presidency, and he will not determine in any way the foreign policy of the United States. Moreover, not all the citizens of the city are Jewish, and the Jewish citizens who supported him were not all traitors to their people, and every mayor of New York must find a modus vivendi with its Jewish community and, well, times are changing. Brothers and sisters, they are. The response of the Jewish community of New York to the Mamdani movement exposed a complacent lack of preparedness for new challenges and new generations.

    It was discouraging to learn that the central body of the Jewish community of New York had instructed the rabbis of the city to devote their sermons on the Sabbath before Election Day to denunciations of Mamdani, and then to learn that many of them had complied. (And a few days before Mamdani’s expected landslide! Pretty shrewd.) Their compliance was owed, I expect, to their need to pander to the panicked Jews in their pews. Leave aside the matter of whether political endorsements should count as teachings from the pulpit. The donors wanted to hear it. Yet the responsibility of the rabbis, the wisdom with which they could have strengthened their shaken parishioners, was, I think, to explain to them why panic was not warranted. For a start, sometimes in political contests we will lose. When we lose, does our existence hang in the balance? Is Amalek now the mayor of New York? Has the day of the suitcases arrived? I think not. Our insecurities have disarmed us for the vicissitudes. All my life I have listened to American Jews compulsively assuring themselves that the current occupant of the White House, from Kennedy to Trump, is “the best friend that Israel ever had.” (The exception was Carter, who certainly became a friend to the Arabs but also established the peace between Israel and Egypt that persists as a pillar of Israeli security.) Anxiety — what psychologists in recent decades have called, in less grandiose contexts, “extinction anxiety” — was always given the last word in our accounting of our weaknesses and our strengths. The psychological is one of the most effective obstacles to the empirical.

    In a democratic society, one must be ready for disappointment and ready to go into opposition. The rules allow you, they encourage you, to try again and to try harder. In a democratic city, moreover, one should be pleased that the future of City Hall has not been determined by a bunch of bullying plutocrats in their Escalades who flee the city at the first sign of a pandemic, even if some of them are to be found in our synagogues. A democratic city is not a macherocracy. And American Jews are proud democrats, aren’t we? But there, on the night of Mamdani’s victory, was a prominent New York rabbi — a man who built a shul in the Hamptons and, more to the point, created a Foundation for Ethnic Understanding — tweeting in a tone of emergency that “with the news of Zohran Mamdani’s mayoral victory, I am announcing plans for the building of the first Jewish day school in the Hamptons. This is in anticipation of the thousands of Jewish families that will flock to the Hamptons and greater Suffolk County to escape the anti-Semitic climate of Mamdani’s New York City.” Maybe the jitney can get them across the border under cover of night. I always suspected that they regarded the Hamptons salvifically.

    The extinction anxiety of American Jewry has recently taken a new form. It is the idea that, to quote a popular cover story in The Atlantic a few years ago, “the golden age of American Jews is ending.” “For several generations,” Franklin Foer ruefully wrote, “liberalism helped unleash a Golden Age of American Jewry, an unprecedented period of safety, prosperity, and political influence . . . without [Jews] having to abandon their identity.” He added: “Their anxieties became American anxieties. Their dreams became American dreams.” The evidence that he adduced for this Golden Age, this historical apotheosis, included Jerry Seinfeld, Bob Dylan, Susan Sontag, Leonard Nimoy, Henry Winkler, Ralph Lauren, Barbra Streisand, Lenny Bruce, Woody Allen, Betty Friedan, Sid Caesar, Ruth Bader Ginsburg, Adam Sandler, Gilda Radner, Steven Spielberg, Art Garfunkel, Paul Simon, Norman Mailer, David Selznick, Louis B. Mayer, Jack Warner, Irving Berlin, Sammy Cahn, Bella Abzug, Barry Manilow, Edward G. Robinson, and Kirk Douglas, along with the old chestnut — I was not aware that you could still cite such a statistic with a straight face — that “approximately fifteen percent of all Nobel Prize winners are American Jews.” Where in Yahweh’s name was Sandy Koufax? The only rabbi in the hallowed bunch was Abraham Joshua Heschel, but that was owed to his solidarity in Selma. The American Jewish veneration of entertainment has always baffled me. Comedians have become our pastors.

    Yet Jewish success in American society is not the same as a golden age of American Jewry. A golden age is not defined by what our hosts do for us or by what we do for our hosts. It is defined by how we utilize the political and social blandishments that we enjoy, all this stupendous liberty, to cultivate ourselves not only as Americans but also, and sometimes mainly, as Jews. None of the stars and the luminaries listed above contributed anything of significance to Jewish culture and Jewish religion, except perhaps to solidify certain ethnic stereotypes about us, for which a thousand thanks. In a golden age one aims higher than ethnicity. Foer appositely celebrates Horace Kallen, the philosopher of American pluralism many decades before the melting pot fell into disrepute, but Kallen’s epochal contribution was to the American self-understanding, rather like the Hollywood moguls who were his contemporaries. Some of the American Jewish writers whom Foer included in his ode to the glitter gone by — Saul Bellow, Philip Roth, Alfred Kazin, Irving Howe, Bernard Malamud, Clement Greenberg, Cynthia Ozick — are intermediate cases: Howe and Ozick lived (and Ozick still lives!) also in Yiddish, which means that, unlike the author of the article and no doubt his editors too, they could actually read the Yiddish fortune-cookies with which the magazine’s cover was sprinkled; and Bellow, who, stubbornly and with a subtle understanding of what it means to be a Jewish writer, insisted on not being regarded as a Jewish writer, was gleefully immersed in Yiddish culture and could mischievously include a Hebrew verse from Psalms in a hilarious limerick in Herzog. (He told Yiddish jokes in their native tongue, which was part of their delight for him.) Malamud was the strange and magical case of a Yiddish writer writing in English.
Roth, who began his career as the enfant terrible of the Jewish middle class, making locker-room fun of Jewish daughters and Jewish mothers, and eventually decided that he was the community’s exilarch, boasted in Operation Shylock that Hebrew letters may as well have been Chinese to him. He made ethnicity brilliant, nothing more. There were so many other Jewish intellectuals — the ones whom Carole Kessner once identified as the “other Jewish intellectuals,” Hayim Greenberg (the Jewish Orwell, if you ask me), Ludwig Lewisohn, Marie Syrkin, Trude Weiss-Rosmarin, and many others — who did not make the cover of The Atlantic because they chose to deploy their talents internally, to the substance of the culture of which they veritably were the heirs and the stewards.

    I do not mean to give out grades or to check the tzitzis of other Jews. None of us has done enough in America. I mean only to suggest that the “Golden Age” was not all that it is now cracked up to be, and that if we wish to satisfy ourselves with that quality of gold we will certainly have more of it again. (Chalamet, Madison, Steinfeld, Eisenberg, Garfield, Hill, Safdie, Baumbach, and many other fledgling proofs of the spirituality of show business for American Jews are waiting in the wings.) We did a lot for America but we did too little for ourselves. There was not enough integrity in this. Moreover, during “the Golden Age of American Jewry” many of its heroes had to contend with ferocious American antisemitism. Indeed, the real golden ages of Jewish history — in Spain, for example, and in Eastern Europe — were attended by anti-Jewish discrimination and anti-Jewish persecution. What made them golden was what the Jews did in those circumstances with their own material. Physically they were forced inside, but spiritually they preferred to be inside, where their treasures could be developed. And when they were forced outside, amid all the expulsions, they persisted inside. They envisaged an informed and cordial apartness, with various levels of vigilant participation and porousness, depending on the circumstances, but such a benign segregation was almost impossible; and when a convivencia was achieved — to borrow the idealizing phrase of Américo Castro, the remarkable Spanish historian who argued tirelessly for the heterogeneous character of Spanish society and included the Jews, in their presence and in their absence, as one of Spain’s constitutive and ineradicable elements — it always turned out to be only a beneficent parenthesis in a longer saga of aggressions against our alterity. (More con than vivencia, as a friend of mine once said.)

    The primary blessing of tolerance and emancipation and freedom is not the outer-directness that it makes possible, though episodes of acculturation can have a quickening effect on our own traditions; but too many modern Jews felt differently, and construed the new possibilities for complexity and adventure as opportunities for promiscuity and flight. In freedom we are free not least to grow ourselves. Let us live multiply in our tents — but two commitments means two commitments; and how is it that more of our traditions have slipped through our fingers in conditions of security and prosperity, here in America, where we are welcome, than ever fell away in the cursed times and the cursed places?

    How close should we be to ourselves, as individuals and as groups? What is the scope, and the proper order, of our obligations? Since we are beings with many fidelities, our ethical probity requires that we arrive at a justifiable account of our priorities among them — at a hierarchy of our homes. And it is entirely natural — there is no selfishness or solipsism involved — to begin with our own. What defines the distinction of our moral life, rather, is what we do after we have seen to our own. The Jewish tradition has fruitfully wrestled with this quandary. There is a verse in Exodus about the moral requirements for moneylending (preferably without interest) that begins, “If you should lend money to My people and to the pauper among you,” and in his gloss on these words Rashi, citing a passage in the Babylonian Talmud, deals with the problem of precedence: “A Jew and a gentile — a Jew comes first; a poor man and a rich man — a poor man comes first; the poor of your town and the poor of another town — the poor of your town come first.” Indeed, the fact that the poor man is one of your own, Rashi exquisitely continues, will enhance you ethically, because it will inhibit you from deriding him for his poverty. And then Rashi adds — in his own voice, with no Talmudic prooftext — a few far-reaching words on the Biblical phrase “and to the pauper among you”: “You are obligated to see yourself as if you are yourself poor.”

    In other words, this ethical exercise, like most ethical exercises, depends upon a certain detachment from self, upon a correction of personal experience by means of an imaginative enlargement, so that for the fulfillment of our human purposes we are never completely imprisoned in who and what we are. A cognitive incarceration is where we always begin, but many of the conflicts between individuals and groups are owed to their willingness to end there, too — to what we might call their epistemological self-satisfaction. It is always surprising how little about the world and its meanings may be understood from the standpoint of personal experience, which we must beware of granting more authority than it deserves. Cosmopolitanism represents an inversion of a natural order of care, an insult to the suffering that is before one’s eyes, but its extension of perspective is an admirable objective, even if no woman ever gave birth to a universal and all citizens of the world have local addresses.

    We may place ourselves in the center, then, but only if we recognize how small a place the center can be. It is a dot; a hot dot. One of the afflictions of our time is a resurgence of Judaeocentric pictures of the world. Antisemitism is Judaeocentric at its core; without such a cosmology it could not attribute to the Jews, or rather imbue them with, the superpowers by means of which they dominate all things. Why fear the powerless? An omnipotent enemy, on the other hand, gives the gift of an entire world-picture. (Such hyperbole, we now know, marred the postwar literature on totalitarianism, Arendt and Friedrich and some others, who painted a picture of the Soviet Union as so unprecedentedly strong, as such a juggernaut, that it dissolved any hope that it could change or collapse.) I confess that I have sometimes enjoyed a tenebrous chuckle at the sheer stupidity of antisemitism, especially since all we have ever asked of our host societies in two thousand years is to be left alone to practice our ways and to live peaceably with our neighbors. We never sought to change minds, at least not non-Jewish minds. It was not even that we dissented; we preferred not to engage. Many of our thinkers welcomed new ideas from their surroundings, but they brought them home.

    For a minority, you might say, this was a practical matter, but keeping our heads largely to ourselves was not merely the cunning of exiles. It was also the temper of our faith. We had no desire for the world to be like us. Our sense of chosenness played out chiefly as a magnificent sense of self-esteem, unbreakable even when we were laid low, and if the idea of chosenness seems crazy or repugnant, so, too, were the circumstances in which we maintained our pride. We did not need power to validate our beliefs. And this turned out to be precisely the uncoercive religious sensibility that Christians (and to a lesser extent Muslims) could not abide. They simply could not live with people around the corner or on the other side of the wall who woke up every morning not in the conviction that Jesus was the son of God. Their hysteria about difference was pathetic, and it had the dangerous consequence of promoting the Jews into a historical and religious force so formidable that they qualified as a mortal enemy worthy of persecution and even destruction. Instead of putting us in the center they should have left us alone — which eventually was what the Zionists wished to accomplish for us, even as they themselves failed to leave others alone.

    Along with their Judaeocentrism, however, there is our Judaeocentrism — our absolute centrality, our unreconstructed immediacy, to ourselves. The view of history according to which the fate of the world is in the hands of the Jews, which is a terrible thing, is currently being fought on the Jewish right by the view of history according to which the fate of the world is in the hands of the Jews, which is a wonderful thing. Sometimes the self-love becomes an instrument for interpreting events that have little or nothing to do with the Jews, as if we are the key to all mysteries. During the most recent rebellion in Iran, for example, Bret Stephens published a column in the New York Times with the rather unexpected title “The Ayatollah’s Antisemitism Has Undone Iran.” I had been following the tumultuous events in the Islamic Republic closely, hoping against hope as I had hoped against it before, and I was in touch with various Iranian friends and scholars so as to better understand the rebellion, and not once did I hear, or did it cross my own mind, that many thousands of Iranians were marching and dying in the streets for the sake of the Jews, for my sake, or as a protest against what the Supreme Leader had done to anybody but themselves, the perdurable people of Iran. Am I falling off the path of righteousness when I suggest that a Judaeocentric explanation for democratic unrest in Iran is a narrow focus?

     

    Stephens was enamored of his group-narcissistic reading, and he developed it further: “Societies that have expelled or persecuted their Jewish communities, from Spain to Russia to the Arab world, were all destined for long-term decline. The same has been true for modern-day Iran.” This is risibly crude, though it is not unlike the view of world history that prevailed among the Jews in the medieval centuries, prior to the development of rigorous historiography, when the history of the world was the history of the Jews in the world. So let us begin at the beginning, with a primer on historical causality, and stipulate that many things happened in countries that expelled the Jews after they expelled the Jews and that not all those things were caused by the absence of the Jews, because there is no monocausal explanation for the fates of nations. Countries decline and fall for a host of reasons. It is possible to decline and fall, as it is to grow and prosper, without ever having enjoyed the company of Jews on your soil. Examples are beside the point; the point is the size of the world and the lasting inadequacy of simplifications about it. Stephens’ eccentric causality had a certain internetine quality. He laudably concluded that “a regime that sought to project on Jews its own malevolence may soon have its long overdue comeuppance,” but when the happy day arrives it will not be because the Jews are the pivot of Iranian history, because Murray finally persuaded Mordy at the Elders of Zion brunch at Barney Greengrass to settle the Persian account. That we are central to our own experience does not mean that we are central to the experience of others.

     

    The Judeocentrism of the Jewish right, the myopia of intense belonging, takes also other forms. There is the impassioned identification of Judaism and Americanism, for instance, which is always accompanied by the strikingly inaccurate idea that the American republic was founded on the values of the Hebrew Bible. Recent decades of scholarship have added certain Hebrew influences to the tale of the American founding, but looming large over the Biblical verse on the Liberty Bell is, well, the glory that was Rome. The authors of The Federalist Papers called themselves Publius, not Yankel. Yet the psychological satisfaction in the perfect equivalence of country and religion is clear: everything that one loves becomes the same as everything else that one loves. What enviable harmony! If not quite a single life, the myth of such a confluence promises at least a double life without dissonance. There are many ways to purchase untroubled sleep. Never mind that this patriotism — or these patriotisms, which I share enthusiastically up to the point of their putative merger into a single patriotism — flies in the face of what we know about history and about religion.

     

    The mistake in the modern German-Jewish identification of Deutschtum and Judentum was not that they picked the wrong tum to make synonymous with Judaism. The procedure itself was at fault. Judaism is like nothing but itself. It is a singular and autonomous entity that in its long progress through time has accepted some external influences and rejected some external influences without ever being reducible to any external influence, and never lost its quiddity, its essence, despite the many forces that were arrayed against it. (Except later at the hands of Jews, but that is for another day.) It is infinitely larger and deeper than any political ideology and any political constitution. It is neither liberal nor conservative, neither Jeffersonian nor Hamiltonian. It shares principles with other traditions, but not because it is congruent with them. There never was and never will be such a coalescence. The Judeo-American tradition is as much of a fiction, a politically cheering fiction, as the Judeo-Christian tradition. The coziness is a fantasy. (Meir Soloveichik recently published a piece called “The Christian-Jewish Alliance and Its Enemies,” which bore this imperishable subtitle: “For Hashem, for country, and for Yale.” What do you give the man who has everything?) All good things, when they are good things, are not the same. Dissonance, friends, is our destiny.

     

    After the outcome of the experiment in Deutschtum und Judentum — I refer not to Auschwitz, which really I do not expect to see, but to the intellectual and political confusion that followed from the myth of German-Jewish communion — it takes a certain foolhardiness, and a certain arrogance, to play this game, especially now. The purpose of religion is not to justify states or sacralize parties, and not only because the First Amendment frowns on the politicization of the holy. The distance that religion should keep from its society is warranted, rather, by its essentially countercultural nature. How can a religious individual call a society Christian or Jewish, how can he identify a moral code and a spiritual telos with any particular society, when every society sins and every society corrupts? There has never existed a country so morally impeccable that it would make an equivalence with an ideal version of religion even remotely plausible. Indeed, religion has sometimes demonstrated its usefulness to society by providing an independent standpoint from which to criticize it. The prophets did not counsel complicity with the majority, and neither did the man from Nazareth. Anyway, all is not well in the actually existing estates of religion, to which politics has been laying waste for many decades; and these optimistic fairy tales about incandescent convergences have a way of blinding their votaries to certain crises in their own midst, or worse, of simply hating those crises, which they regard as somebody else’s fault and somebody else’s problem. (I have a lot of Orthodox Judaism in mind.) Of course it makes no sense to hate a crisis. It is only a way of battening down a bubble.

     

    There is an even more absurd claim in the attempt to elevate the Jews historically, to install us at the epicenter. It is the idea that we will save Western civilization. This breaks new ground in Jewish self-importance. It is a somewhat more complicated endeavor, since there are not many people, however excitable they are, who could posit a Judeo-Greek or Judeo-Roman tradition. Yet there is no denying that Western civilization or the humanities needs saving. (This journal was created as a modest contribution to that effort.) Jews in the West, living doubly as we do, have been among the greatest beneficiaries of the West’s better angels, and we should certainly lend a hand. But as Jews? I’m not so sure. We have experienced also the West’s worst demons. (Its two very worst demons, in fact, and less than a century ago.) Western civilization is both our civilization and not our civilization, our heritage and not our heritage. Its preservation is only partly our responsibility, though I will remain indefatigable in the struggle. The notion that Plato will survive because he will be taught in some Jewish schools is ridiculous. (It also suffers from a certain Straussian preciosity.) I ardently support the teaching of Plato in Jewish schools, not least so that the students may gain a sense of the contradictions beyond the civilizational pieties and learn to see through the empty promise of frictionlessness. When I was in a yeshiva day school, we used to call this “secular studies.” “Secular studies” were fully the other half of our education, all that stood between Athens and Jerusalem was lunch, and long may such abundance flourish, though I gather that in many religious Jewish schools today it is far from flourishing. We never had the sense that we were coming to their rescue, or that they — Shakespeare, Keats, even Ferlinghetti — depended on us for their survival. We felt, rather, that we were fortunate to have been born in such a diaspora.

     

    I will confess that the dimensions of Jewish ignorance and Jewish illiteracy in the American Jewish community — even now, when many educational initiatives are taking place — incline me to prefer a rescue of Maimonides to a rescue of Plato. We have the resources with which to restore Maimonides — whose name I am using here as a shorthand for the Jewish heritage, religious and secular, in its own languages — to his rightful place in the minds of his descendants; but in the fight for Plato we will need allies, Jews and non-Jews, mainly non-Jews I expect, who are already on the front lines at many of the universities that the American Jewish right is so gladly giving up on. The poor of your town and the poor of the other town: eventually they will both receive charity, eventually we may bring relief to them all — but we will not save Western civilization, though the goal is noble and we must do what we can as liberals who are Jews or as conservatives who are Jews. It will have to save itself. The Jewish humanist’s work is never done. Will our Shabbes ever come?

     

     

    III

    The most egregious failing of the American Jewish right, its most lasting delinquency toward our children, is its refusal to discuss, or even to acknowledge, that the State of Israel committed war crimes in Gaza. After 1,200 of our own were massacred, 73,000 of their own were massacred, and it is estimated that it will take three years to clear the rubble and reveal what is beneath it. I do not see how the murder of 73,000 people falls comfortably under the rubric of self-defense. There were thousands of Hamas fighters among them, but still. And the death of their own civilians was an important element of Hamas’ savage strategy, but still. In the course of defending itself, the state that many of us love (though in smaller and smaller numbers) performed acts of evil. This is not the first time in history that a state has broken the hearts of its supporters in this way. A just war was turned into an unjust war. (My spirits sank when I saw that a certain apologist for the Netanyahu government announced a lecture on “the Jewish morality of war.” Get it?) The campaign in Gaza, whose outcome was never in doubt, was waged by Israeli leaders who disregarded all considerations of ends and means, for whom the malice of the campaign mattered more than its justice. It turns out that Jews, when they are threatened, are no saintlier with power than many other nations; and this too, alas, is a demonstration of Zionism’s rudest ideal, which is the normalization of the Jewish people. We are, in many senses, human beings in our tents. Worse, our children saw the images with their own eyes, and all the media criticism in the world will not erase from their memories the suggestion that the Jewish state is a cruel state. The depredations of intersectionality were not responsible for the carnage in Gaza.

    It needs to be added that in the West Bank the Jews are not threatened. In the West Bank they are the threat. What exactly are the government-approved and army-protected pogroms against Palestinians in the West Bank a retaliation for? The opposition of the people of Hebron and Jenin and Nablus to Israel? But opposition to Israel is surely not a warrant for Jewish atrocities; and it should not be beyond our empathetic abilities to comprehend why Palestinians might hold a rather negative view of Israel. The Jewish violence in the West Bank is premised on a number of assumptions that desperately need to be examined and challenged: that most, or all, Palestinians are terrorists, or potential terrorists; that the interests of the Jewish population in Israel are the same as the interests of the Jewish population in the West Bank; that the Bible should play a significant role in determining political sovereignty in the territory, and that Judaism (which anyway teaches us to be pursuers of peace) has a proper place in the discussion of Israeli safety; that the apartheid reality on the West Bank does not impair Israel’s cherished self-image as a democracy; that a state founded axiomatically on the doctrine of equal rights, as Israel was, has the authority to take away rights and behave as capriciously as monarchs and dictators; that Israel has no reason to fear the future effects of decades of the systematic humiliation of the millions of people with whom it will always, like it or not, share the land; that our feelings for the land preempt their feelings for the land because they are our feelings for the land. And if we still propose to include democratic pride about Israel in our instruction to our children, we must reckon also with the democratic disfigurements of a government that defines full citizenship ethnically and seeks to destroy the high court’s power of judicial review and attempts to ban a newspaper that has the impudence to criticize it.
Jews who are concerned that the love of Israel — the critical love, the love that is not blind, which is the only valuable love there is — should not end with us must attend to what is happening.

    Yet the American left, and the American Jewish left, will not help us. They are generally the sworn enemies of such an effort. On Israel, the left is venomous. While it is true that not all anti-Zionism is antisemitism, a lot of anti-Zionism assuredly is antisemitism — and more, anti-Zionism represents a denial of, a slander against, one of the fundamental pillars of Jewish identity, which is Jewish peoplehood. If you deny Jewish peoplehood, you do not deny Jews but you do deny the Jews; and as Edward Said taught, you have no right to correct or distort our understanding of ourselves, to tell us who we are. In the vocabulary of modern politics, in which peoples came to be called nations, nations have been deemed to be worthy of states; and I fail to see how you can be against Zionism and not against nationalism and nation-states tout court. If the Jewish right to a state does not negate the Palestinian right to a state, neither does the Palestinian right to a state negate the Jewish right to a state. Unless, of course, you believe that the Jews are different, a special case of unparalleled iniquity, that they alone among the nations do not deserve a state because Israel’s culpability against the Palestinians dwarfs the culpability of all the other nation-states in all their wars against all their minorities and all their neighbors — that, with their hands more bloodied than the hands of any other people ever were, the Jews and only the Jews have forfeited their claim to sovereignty; but only imbeciles and villains can believe that.

    The indifference of the American Jewish left to intellectual honesty in the matter of the historical comparisons upon which moral judgments are made is despicable. And in these times the left is a veritable gusher of indifference. I do not recall it marching for Bosnia and Kosovo and Rwanda and Syria and the Uyghurs and the Iranian protestors (from 2009 to 2026) and Ukraine. Even before Trump’s fraudulent action against the Venezuelan dictator, progressives were loudly defending Maduro and denouncing María Corina Machado, the running dog of American imperialism, because her democratic revolutionism made her an ally of the dwindling numbers of Americans who still support the proliferation of democracy — in whose company the marauding Trump is nowhere to be found; he may as well be the United Fruit Company.

    The astonishing range of the progressives’ indifference to les damnés de la terre — except those in une certaine terre — encompasses their own people. I do not recall articles in progressive journals, Jewish and otherwise, on problems of Israeli security and Israeli vulnerability: they have no time for such disquiet. More, I do not recall many progressives marching against antisemitism. Quite the contrary. Many on the left, and the American Jewish faction of it, regard the campaign against antisemitism cynically, as a community-wide maneuver to disguise the ugly designs of the “Zionist settler-colonial state” with the critic-proof discourse of victimization. Or more accurately, they denounce the antisemitism of the right, which is more ideologically convenient for them, and then pretend that reactionary antisemitism, Richard Spencer and Tucker Carlson, is all the antisemitism there is. There are many prooftexts for this mendacious selectivity to cite, but here are some dismal sentences from Foer’s essay: “Like many American Jews, I once considered antisemitism a threat largely emanating from the right. . . . Part of the reason I failed to appreciate the extent of antisemitism on the left is that I assumed its criticisms of the Israeli government were, at bottom, a harsher version of my own.” No! It is precisely this failure to see the difference between progressivism and liberalism that is making the political renovation of America much harder. Foer’s piece is a lame document of a late awakening. After all, the career of the left wherever it has flourished has been thoroughly riddled with antisemitism. That is Jew-Hatred 101. There are libraries replete with the documentation, and nobody who calls himself a liberal, let alone a progressive, should be ignorant of them. May we agree, please, that in the struggle against those who hate us we are on our own, with no presumption of solidarity about any of the sides, catching allies as catch can?

    Of late a new strain of Jewish anti-Zionism (which is a tale as old as Zionism, by the way, even though contemporary Jewish anti-Zionists regard themselves as valiant freethinkers, a sizzling vanguard) has appeared in America. It consists in an infatuation with diaspora. The new Jewish diasporism is a way of abandoning Zionism without seeming alienated from, or unfaithful to, your origins. It holds that there is glamor in an anomalous existence, in extraterritoriality, and it prefers the costs of powerlessness to the moral wrestling with power. Are these anti-statists aware that there is no predicament in the world more dire than statelessness? For these anti-Zionists, however, the “subaltern status” of the Jews was a sign of their virtue, of their cultural force. (George Steiner used to gloat that no Kafka appeared in the Jewish state, but of course no Kafka appeared in the Belgian state either.) The Jewish identity that diasporism recommends is entirely a cultural identity, as if the Jewish people will ever enjoy the luxury of an existence without politics. There is something decidedly not charming about tenured professors in Berkeley and Cambridge and Manhattan choosing to overlook entirely the subject of Jewish security. I wonder how their blithe contentment with the exile would fare among the Jews of Argentina and Iran and Russia and France.

    Anyway, the historical joke is on them. Of all the principles of classical Zionism, the “negation of the diaspora” was the first to go. Conceived by an early-twentieth-century thinker named Jacob Klatzkin (a philosopher of Jewish particularism who joined the outcry against Judah Leib Gordon) and championed robustly by David Ben-Gurion, the expectation was that the founding of the Jewish state would empty the exile of its Jews, since Jewish life everywhere else had become illegitimate; but the expectation did not survive the early years of statehood, when it became clear that, as with the ancient return from the Babylonian exile, most of the Jews were not coming back. One of the most obvious facts about the history of the Jews is that Jewish civilization was created overwhelmingly in their dispersion. For some Zionists this persistence of the exile (which for many Jews, and certainly for all the Jews of the United States, is a voluntary condition) was a crushing national failure, though it turned out that the Jewish state needed the Jewish diaspora, and especially the American diaspora, to survive, and also the mass emigration to Israel of Russian Jews in the 1970s and again in the 1990s somewhat vindicated the earlier hope. In the presence of a homeland, moreover, the diaspora came to seem less “pathological” and more like another aspect of the common fate of peoples. Jews will live where they want to live unless they are no longer wanted where they live. Progressives all, the new diasporists may enjoy their non-Levantine cocoon for as long as it is enjoyable, but in the coming struggle for the reconstruction of Israel after Netanyahu, in the epic project of re-liberalization, of the restoration of decency, that sooner or later will be undertaken in Israel and the Jewish world, we will be sure not to count on them. Et qui vivra verra.

     

    IV

    In the Jewish tradition there are two canonical approaches to the question of self-criticism, of telling the truth to ourselves within the confines of our fidelity to each other.

    One approach advises going easy. It originates with Hillel, the sage who moved from Babylonia to Judea in 70 BCE and restored the study of Torah there, founding the intellectual lineage of rabbinical Judaism. The Babylonian Talmud relates an incident in which the Jewish authorities who preceded him there were unable to produce a conclusive ruling in a matter of ritual law. It pertained to the question of whether Passover overrides the Sabbath in the sacrificial obligations at the Temple. (The question was not yet moot.) They had lost the thread of tradition on this question. When Hillel demonstrated, rationally and hermeneutically, that he had the answer, the defeated authorities abdicated and appointed him in their place, at which point he began to scold them for their “laziness” in the perpetuation of their patrimony. Then they asked him a related question, and he replied, perhaps a bit sheepishly, “I once heard the correct account of this law, but I have forgotten it.” And then he added this momentous observation: “But let the Jews be, for if they are not themselves prophets they are the sons of prophets.” In other words, let them do what they believe it is right for them to do. Or as a medieval commentator paraphrased it, “we can rely on what they do on their own.”

    Hillel’s words (hanah lahem, “let them be”) acquired a meaning beyond the realm of law and became the motto for an inclination to self-forgiveness, to a prior generosity in our evaluation of ourselves. Let them be, or trust them, or relent in your rigor about them, or cut them a break. Hillel’s reference to their descent from prophets might signal a certain snobbery about the Jews, that they know better because they are spiritual aristocrats; but I detect in it also a measure of compassion, and also a measure of love, which is consistent with Hillel’s spirit in his other pronouncements. I find it hard, when I come across those words in later Jewish sources, to keep history out of the picture, and not to reflect that the hardships that the Jewish people have endured perhaps qualify them for an extra parcel of kindness from the world. I do not expect the world to comply with my tender thought; and more importantly, my tender thought can be easily abused, as it has been by contemporary Jews who have angrily maintained that after the Holocaust the world has no license to lecture the Jews on their conduct. I was raised among them. There are many responses to victimization, and one of them is to imagine, and even to demand, a degree of exemption from universal standards of truth and goodness, which certainly did not restrain the people and the peoples who made us victims. But that is not what Hillel had in mind.

    Their acquaintance with injustice may also have a refining effect on the consciences of victims, since they have been deported, you might say, from evil as a concept to evil as an experience, and will never unsee what the rest of us may never see; and this evolution to vividness may become particularly important when victims become victimizers, which has been known to happen. Is it harder or easier for the oppressed to oppress? The answer is excruciatingly unclear. And so there is the other Jewish approach to the challenge of self-criticism. It is even older than Hillel and may be found in Leviticus: “Reproach and reprove your neighbor.” Hocheakh tochiakh: the verse uses the verb for “criticize” twice, emphatically, “criticize criticize,” from which the rabbis in the Talmud inferred that this criticism of one’s fellow must be delivered “even a hundred times” and “in all circumstances.” While the verse explicitly gives as a reason for the injunction “that you should not hate your brother in your heart,” a fine instance of moral education by means of law, there is nothing lenient about the injunction. We owe each other an honest reckoning. 
    The rabbis derived a protocol for the rebuke from the interpretation of this and other verses: that it should be uttered privately to avoid the humiliation of the wrongdoer (“anyone who whitens the face of his fellow with public mortification has no place in the world to come”), though in medieval Jewish communities there were occasions and allowances for the public airing of criticisms, and there was plainly nothing private about the classical chastisements of the prophets; that it should be delivered patiently and gently; that it must be made clear to the wrongdoer that the critical words are uttered for the sake of his improvement and his well-being (the verse after this one strictly forbids vengeance and grudges); and that if it is not taken to heart by the wrongdoer on its first hearing, the rebuke must be repeated to the point of obnoxiousness, until the wrongdoer can no longer stand the hectoring and strikes the critic and says “I am not listening to you.” Reformers can sometimes be an effective bulwark against reform.

    Let them be, and rebuke and reproach them. All that one can conclude from this thicket of ethical regulations is that the Jewish tradition was not designed for angels, or for a unanimous community. Also that there are no situations that unburden us from scrupulousness about our own behavior. Also that where there is love, there must be reason.

    END

    Grids, Glass, and More Glass

    I have started thinking of them as spaceships to nowhere. In my city, another one is always on the way; the latest touches down at 213 Bowery this fall. The last to arrive at that address, the SANAA-designed New Museum, was finished in 2007, the year of Obamamania and the iPhone and the first gentle rustling of the Great Recession. Like its predecessor, the new annex is a sleek, politely glowy object. It disrupts the skyline but not too much, making a statement but not too loudly. Designed by Rem Koolhaas and Shohei Shigematsu, it took three years to build, doubles the original floorspace, and cost something like sixty million dollars, or what Manhattan developers now call a “bargain.” The style is the kind still optimistically described as futuristic, though the reopening of a new New in 2025 is a reminder that this future is close to twenty years old, a throwback to a time when smart design and calm authority were still widely believed to be capable of saving the world. The first exhibition on the calendar is titled, too perfectly, “Memories of the Future.” 

    The timing is always strange with these things. 2025 has been a terrible year for most American museums, the year rivers of cash went dry: millions lost for the Clyfford Still, the Berkeley, the Pennsylvania Academy of the Fine Arts, and dozens of others, because the National Endowment for the Arts would no longer provide them; and millions more in international tourism, because suddenly nobody wants to fly here and take a Nighthawks selfie. It was the year the National Endowment for the Humanities cut major funding to museums. 

    This was also the year the White House went snarling after the Smithsonian, and when suited Beltway goons paid the National Gallery a visit to discuss its “legal status.” It was the year an executive order blasted the National Museum of African American History and Culture for encouraging its visitors to learn something about racism, and the year Trump squeezed out the National Portrait Gallery’s director for daring to think that diversity was on the whole not such a bad thing. It was the year hundreds gathered in Washington, D.C., to defend the Smithsonian, chanting, “Hands off our history,” though it was also the year protesters booed the Brooklyn Museum’s latest round of layoffs, and others crowded the lobby of the Whitney to condemn the cancellation of a pro-Palestinian performance, and others danced outside the Museum of Natural History to condemn board members who profit from oil. It was the year climate activists stayed in jail for throwing soup on Van Gogh’s Sunflowers and the name “Warren Kanders” stayed in the Whitney lobby. 

    Depending on who you talk to, in sum, 2025 was the year museums were the victims or the problem or part of the problem. What is beyond dispute is that 2025 was the year America’s big museums got bigger. Every year is. 

    The New Museum; the Studio Museum, which reopens this fall in a new three-hundred-million-dollar package by Adjaye Associates and Cooper Robertson; the Frick Collection, recalled to life after a two-hundred-million-dollar renovation by Selldorf Architects; the Met, seventy million dollars poorer but one twinkly Michael C. Rockefeller wing richer. 2025 is no anomaly. In 2015, the Whitney moved to a new four-hundred-million-dollar Renzo Piano building; in 2019, MoMA reopened after a years-long, block-darkening, four-hundred-fifty-million-dollar expansion, its third in as many decades. COVID paused the growth for a while but did nothing to challenge the trustees’ confidence that growth is good. The rest of the 2020s will add fifty thousand square feet of waterfront property to the Tampa Museum of Art (a little ominous, given the state of the Atlantic, but hopefully Florida knows what it’s doing), a hundred thousand square feet to the Portland Art Museum in Oregon, sixty thousand to the Portland Museum of Art in Maine. Upward and outward they swell: palaces of art covered in endless pricy lifts and implants and transplants, not so different from the kind the sponsors lavish on their own bodies.

    If you want a clear x-ray of an era, every triumph and delusion crisply rendered, you can always study its art. In the case of the United States in 2025, however, it might be more revealing to study its art museums. Such anxious, blustery things! By the time a new renovation has hatched, a successor is already pecking through the shell. The final products slant and shift their weight as though aware that there is nothing final about them: not a chance in a society that relishes moving fast and breaking things, including itself. 

    Is it ungrateful, in Trump Part II Year One, to be skeptical of the art museums that have managed to keep expanding, thanks to billionaire largesse? The American system of private cultural philanthropy has a lot to answer for, but at least it provides some cushion from POTUS 47’s whims. The better question might be: given the rain-or-shine ballooning of these buildings, and the municipal taxes that help make them possible, and the unaffordable restaurants, and the thirty-dollar tickets, and the shady land rights deals, and the write-offs, and the walls covered in donor names so that your eyes start to burn well before you reach the paintings, and the gift shops of deluxe crud, and the gentrifying neighborhoods that make the restaurants look affordable, and the galas — given all this, what, exactly, does museum expansion have to do with art? 

    The concept of growth, I am not the only one to notice, is having a rough twenty-first century. Blame the housing bubble, the overextended American empire, the mallification of urban centers, the net worth of the plutocracy, the greenhouse gas emissions, or all of them, since they may be symptoms of the same sickness. At least among people without summer houses, growth is reckless, boorish, decadent, cancerous, inherently suspicious; growth is the needle tower that could wipe out homelessness but stays empty fifty-one weeks of the year; growth is the rising tide that exclusively lifts yachts. Even its cooler friend, sustainable development, may only be growth with a better PR agent. 

    In the midst of this, museum growth seems to enjoy something like the benefit of the clergy. Not always, and not all museums — many journalists have wondered how much of 53rd Street MoMA will swallow before its stomach stops growling — but an expanding American art museum is still innocent until proven guilty, as an Amazon headquarters or a McMansion is not. In the clash between love of art and skepticism of growth, love triumphs. Art museums are sacred spaces where many visitors have the closest thing to a religious experience they will ever feel. What could be wrong with making room for more worship? 

    Start with the simplest justification for museum expansion: structural necessity. Some of these places are a century old, and nobody can worship if the walls crumble. All big buildings require repairs: ceilings blotch, plumbing and heating fritz. They are the kinds of problems that irk every museum, and, traditionally, they are the kind trustees have no interest in paying to fix. Upkeep is as important as it is unglamorous — no ribbon-snipping ceremony welcomes the new roof tiling, and nobody wants their name on a radiator, though to be fair I did see a named fire escape on a recent visit to MASS MoCA. It is a curious side-effect of tycoon psychology that a museum director may have an easier time scraping together fifty million for a new building than half a million for new toilets: vanity being vanity, unnecessary expansion is one of the shrewdest ways of funding necessary repair. 

    Who cares about necessity, as long as the results are beautiful? Some of the recent museum growth in New York, where I live, certainly is lovely. The cantilevered staircase that connects the two floors of the Frick has an elegance that doesn’t overpower; the Breccia Aurora marble somehow splits the difference between the flowery Bouchers upstairs and the pale chill of the Reception Hall below, so that transition rivals either destination. I have heard sniggers about the Gilder Center at the Museum of Natural History, but I think its oozy granite interiors are built to last in the most important sense: while other, more self-consciously tasteful buildings are doomed to look more like the 2020s with each passing year, the Gilder will go on looking like itself. 

    For every triumph, though, there are multiple museum makeovers that inflame my inner Peggy Lee. Is that all there is? I thought this when I visited the new Whitney a decade ago. The design was far from terrible; an outright terrible building would have been so much less perplexing. The eastern façade resembled four or five façades stacked together while their architect, recently dubbed “our Brunelleschi” by one of America’s leading magazines, decided where to put them. The western side, facing the Hudson, resembled a ship with a white sail, if the sail was a hunk of Styrofoam and the ship was sinking. North had lots of exposed pipes that somebody must have found pleasant to look at, and south was so utterly, breathtakingly okay it could only have been the work of a renowned architect dozens of museums deep in his career. 

    And for everything that goes up in a city this snug, something else must be knocked down. The Frick’s Music Room was one of the most ravishing places in New York before Selldorf’s renovation scrapped it to make way for temporary exhibitions (some excused the act by saying the venue was too small, as though this wasn’t half the charm of the Music Room, not to mention the rest of the building — architectural victim-blaming). In 2014, as an amuse-bouche before its next meal, MoMA chewed up the American Folk Art Museum, having bought the building and decided that Williams and Tsien’s bronze façade disagreed with its house style of grids, glass, and more glass. That the world’s most influential modern art museum has taken to junking work that stands too far outside the aesthetic norm is a sick joke I will leave hanging.

    Who cares about beauty, as long as the results are bigger? At least in press releases, the rationale for museum expansion is fiercely utilitarian: more space equals more wall area, which allows for the display of more art and a greater bang-for-buck for the common museumgoer’s eyeballs; more space also means more floor area and elevators and stairwells, which work together to relieve congestion. All very sensible on paper — funny, though, how the tiny museums that would benefit most from additional acreage cannot afford any and the museums that already own football fields of it seem to get more congested with growth, as any MoMA visitor knows. But only a fraction of the new layout goes to art, and the total amount may go down — at the moment LACMA is wrapping up a new six-hundred-fifty-million-dollar building by Peter Zumthor with ten thousand fewer square feet of galleries. 

    Even when the new space is bigger and one hundred percent art-devoted, it is unclear why a mega-museum gets intrinsically better with more stuff on the walls. “One cannot enjoy a pure aesthetic sensation,” Kenneth Clark terrifyingly put it, “for longer than one can enjoy the smell of an orange.” The purpose of going to the Met should not be to huff every orange on the tree, nor should it be the Met’s duty to pelt visitors with as much citrus as possible. In point of fact, large museums never come close to displaying everything they own and instead rotate their permanent collections in and out of storage. The Met’s collection includes close to two million works, only about five percent of which fit in the building; for MoMA, the number is somewhere around ten percent; for the Guggenheim, three. Making room for everything will always be a quixotic cause — besides, if given the choice between a shinier museum with marginally more on the walls or an already massive museum that doesn’t cost a family a hundred dollars to visit, which one would our mythic common museumgoer choose? 

    The question is theoretical, needless to say. One of the tartest ironies of this era of nine-figure art museum philanthropy must be how little of the money reaches the consumer: across America, pay-what-you-can entry has been scaled back to free weekends, free weekends to seasonal free weekdays, and seasonal free weekdays to free parking. With every hundred-million-dollar expansion, free museum admission looks more like a weird twentieth-century fossil. Administrators cite study after study proving that cheaper tickets have no measurable effect on the size of museum audiences, and so — ah, terrible shame! — they might as well charge twenty or thirty dollars. I am less interested in the studies than I am in why free admission is now posed as a question instead of a right, economics be damned. Had the same stern, penny-pinching scrutiny been applied to the hundreds of new wings and annexes of recent years, I wonder how many of them would exist. 

    Fear not, though: museum expansion is of debatable value to the visitor but of enormous value to somebody else. If there is a central reason why museums keep growing, it may be that the donors like a guarantee that their gifts will remain on permanent display and not be buried in storage — more space, more guarantees. (Philip Guston’s daughter Musa Mayer, for example, made the Met a present of 220 of her father’s works on the condition that at least half be on display at all times.) When done right, philanthropy pets the ego and pads the wallet. There is the psychological reward of knowing your art collection will be ogled long after you are dead, plus the charitable tax deduction, plus all the ancillary ways of pocket-lining. A high-end museum drives up property values, attracts tourists, fills up hotels and department stores, and generally enriches the sort of people who populate museum boards to begin with. Conflicts of interest are, of course, discouraged — why else would it say so in the code of ethics? Gentlemen, I’m shocked, shocked to find that profiteering is going on in here! 

    Still, reasons only take you so far. Follow utilitarian logic through to the end, usually, and you arrive at some humorless “well . . . because.” Few multimillionaires and billionaires are famous for the practicality of their wants; probably not even they know why a rising tide delights them so much. Why have America’s museums kept getting bigger, then? To fit a few thousand more artworks next to fifty thousand others, certainly. Fundamentally, though, museums expand because expansion does not need a reason: to the people who make the decisions, it justifies itself, like life or happiness or, for a few hopeless fogeys, art. For a long time now, the signature style of the contemporary art world has been something like real estate aestheticism — growth for growth’s sake.

    Even though I’m the other kind of aesthete, my first instinct is to say, let the tycoons do what they want. There are worse things to do with money than burn it, and every million dollars spent on a modern art wing that nobody likes is a million dollars not spent on predatory loan marketing or the reelection of some moussed, drooling climate change denier. 

    If museum expansion warped buildings and buildings alone, I could laugh it off, but it has a way of warping what’s inside them, too. Make a quick list of the glitteriest art careers of the past twenty years or so, and you find a few genuine talents and a rollcall of mediocrities with a gift for ritzing up the vast gray interiors in which museums increasingly abound: Ai Weiwei, scatterer of porcelain seeds at the Tate Modern; Yayoi Kusama, wallpaperer of the same institution and dozens of others; KAWS, whose giant brown dolls I am doomed to pass every time I find myself in the lobby of the Brooklyn Museum. The first task for these people is to fill up lots and lots of space, and at this they succeed brilliantly, since their work consists of a few simple elements (seeds, dots, dolls) that can babble on to whatever degree is required of them. When museumgoers walk in and ooh at the dots disappearing into the distance, they are oohing at the giant space that hosts them, handsome in its bright new costume. Space is boss, and art does what it says. 

    The premier filler-upper artist of the decade so far must be Jeffrey Gibson, the MacArthur genius and proud occupant of the American pavilion at last year’s Venice Biennale. With the help of a stable of assistants, he assembles hundreds of thousands of rainbow beads into sculptures, paintings, and costumes, none of which exhibit the slightest grace or facility with color, unless turning on the entire spectrum full blast is your idea of chromatic wizardry. Glance one-eyed at a Gibson and you absorb the whole thing along with most of the others. The best test of this is the impressive forgettability of his work — not long ago I spent a while in POWER FULL BECAUSE WE’RE DIFFERENT, his installation at MASS MoCA. I am still pondering that fire escape, but today I would be hard-pressed to say if this dress was bright yellow or bright blue, if that bit of wall was bright orange or bright pink, or much else beyond the fact that the room was big and everything was bright. But this would seem to be part of what museums love about Gibson, and why at the time of this writing his beads are being slobbered over from sea to shining sea: they dress up anywhere because they don’t say too much of anything. Like Kusama’s dots, there is something superficially innocent about them that gives the most bloated museum hangars a sweet glaze of populism. 

    Bloat and populism are having spectacular twenty-first centuries. It seems strange that both should be doing so well simultaneously, but here we are. Bloat won economics, while populism seems to have won aesthetics some time ago. (Politics is the usual tug o’ war between them.) Peacocking displays of wealth are so common that our senses have numbed to them, but “elitist” has been one of the filthiest words in the English language for as long as I have spoken it. Nobody gets in trouble for selling out anymore, but the idea of making art that might alienate some of its audience has become vaguely impolite, to the delight of some and the horror of others. (The inevitable Mark Fisher quotation: “The assault on cultural elitism has gone alongside the aggressive restoration of a material elite.”) Nowhere do bloat and populism clash with such matter-antimatter explosiveness as they do in museums: the new spaceships to nowhere are for everybody, and they are toys and tools for billionaires. The harder they strain to seem down-to-earth, the more bloat they hide.

    The art historian András Szántó bottles the bloat and the down-to-earth-ness and the rest of contemporary arts administration culture in The Future of the Museum. A collection of twenty-eight interviews with museum directors, all conducted in the early months of COVID, the book is a quietly amazing compendium of the ways in which art people — but not artists — think about art. I read and reread it like a novel. There are twenty-eight main characters, half men and half women and all fluent in their regional dialects of bureaucratese. (The fungal creep of the word “immersive” in the last decade or so has spared few museum directors.) Their institutions are scattered across fourteen countries in every continent but Antarctica. Together they preside over some seven million objects and an annual budget of nearly a billion dollars. Their average age is forty-nine. Many studied art history in college, though one is an ex-Louis Vuitton executive and another is an ex-child star. Part of the pathos and the comedy of this novel is that nobody is allowed to say what’s really on their mind, but sometimes they are so determined not to say X it is clearly X and nothing else that they are thinking. 

    The first thing I noticed, reading The Future of the Museum in this bumper year of buildings, was that nobody fesses up to wanting a bigger museum. In hindsight, at least some of these people were speaking to Szántó in between frantically rescheduling the new sculpture wing, yet the subject of expansion goes all but unmentioned for three hundred and seventeen pages — like the dog in the Sherlock Holmes story, it doesn’t bark because it recognizes its master. Instead of growth, museum directors would like to talk about community. It would be impossible for me to overstate how badly they would like to do this. The executive director of the M+ Museum in Hong Kong believes in the importance of community. So does the director of the Garage Museum of Contemporary Art in Moscow. So does everybody — “the term ‘community’ is bandied about too much,” says the director of the Brooklyn Museum after bandying it about too much. Not that anyone can really be anti-community, but the tic-like repetitions suggest a guilty conscience. The more I read the word “community,” the more vividly I pictured a big concrete slab named for a Sackler. And it is strange to see communities praised on page after page with so few mentions of what they are communities of.

    To put it another way: this is a book of conversations with twenty-eight of the world’s most educated and powerful arts administrators in which almost nobody speaks with passion, or even much warmth, about art; in which everybody remembers to praise community but nobody rhapsodizes about a painting or a sculpture or a film or a tapestry or a drawing. At times, some of these people seem almost sheepish about managing such flimsy things. One director does speak at length about the value of his museum’s collection, but he is talking about its cash value, which apparently is five billion dollars. (“The conversation about how the liquidity trapped in artworks can be used has been a very unnuanced one.”) In the book’s most touching and depressing moment, the Brooklyn Museum’s director confesses that she wonders if she should have gone into politics instead. 

    These interviews were conducted in a pandemic year, to be fair, and perhaps it struck Szántó’s subjects as insensitive to extol Rembrandt in dark times, though it might have struck them that in 2020 some of us needed Rembrandt, who lost his lover to plague, more than ever. Americans consider art “a luxury rather than a necessity,” as the poet and one-time NEA chair Dana Gioia wrote in 1991, well before the pandemic or the dismembering of the NEA. By “Americans,” Gioia meant people who have not chosen to devote their lives to art, but this book made me wonder if some art bureaucrats are hiding the same sneer. If you didn’t believe paintings to be of vital importance during the COVID-19 pandemic, you don’t really believe them to be of vital importance at all. 

    To understand a tribe, anthropologists say, it is not enough to pay attention to what the tribespeople talk about. Truth lies also in what they are not talking about: all the thoughts they consider too self-evidently absurd to mention. The big unspoken subject in The Future of the Museum, even bigger than museum expansion, is pleasure. Museum directors differ in their attitudes toward retail or political neutrality, but on pleasure, and the possibility that a museum might afford its visitors some, they sing the same silent song. Twenty-eight times Szántó asks what a museum is for, and almost every interviewee replies with something about education or activism or building community — all admirable goals, but lifeless when the central one goes missing. The director of the de Young in San Francisco says in four words what everyone else in this book says in zero: “We are not entertainment.” 

    His grimness would have amused Alfred Barr, MoMA’s first director, who felt his museum’s purpose was to help people “enjoy, understand, and use the visual arts of our time.” We can imagine what would happen if the art museum directors of the early twenty-first century had to agree on their own definition, though actually we don’t need to imagine anything: in September 2019, the International Council of Museums proposed a definition declaring that museums “are participatory and transparent, and work in active partnership with and for diverse communities to collect, preserve, research, interpret, exhibit, and enhance understandings of the world, aiming to contribute to human dignity and social justice, global equity and planetary well-being.” A revised version appeared three years later, with “enjoyment” tossed in at the end like a pack of gum in the checkout line. I have nothing against planetary well-being, and you may quote me as saying so. How bizarre, though, to hold entertainment, one of a handful of things that make this planet bearable, in such low esteem, like an opera house that proclaims its commitment to justice but forgets to mention music. 

    The final twist of The Future of the Museum is that it is full of odes to the power of the image, just not the kind of image you would expect museum workers to praise. There are millions of people who will forgo sleep, sex, sun, water, and food to keep staring at screens, and clearly arts administrators have taken envious notice. Things go from bizarre to sinister here: the museum directors of the early twenty-first century look at the red-eyed consumers and the companies selling their own dopamine back to them and think, “How can we be more like that?” “How can we better understand the motivations and intentions of the kinds of experiences that people are seeking through online games,” wonders the director of Singapore Art Museum, “so that we may use these as a way to steer them toward, as well as complement and enhance, the experiences museums can offer?” “This idea of the museum as photo backdrop arrived here early,” adds another director. “We spend a lot of time thinking about how to turn this inexorable urge into something productive.” (Notice he doesn’t say, “Something pleasurable.”) “People love serialized content,” the nuanced liquidity guy opines. “Imagine if museums found a way to have each program build off the previous one, and if we figured out a way to distribute that through digital media in a way that was binge-worthy. That is a digital future I would like to imagine.” 

    Binges and gamifications and inexorable urges — behold the museum directors’ glorious dream! What disturbs me more is that they claim to be dreaming in our names. 

    Pleasure, you have surely noticed, is having a spectacular time of late, and a terrible one. Some audiovisual thrill is always available, provided your devices stay charged, but if you have never felt the ache of all this bottomless fun, bully for you. There are whole clinics clotted with people who got such a kick out of online games or porn or other pixelated delights that they no longer feel much of anything; and their undiagnosed kin absent-mindedly run the world. It is telling that the concept of the guilty pleasure has almost disappeared from the culture — now there are only different pleasures for different folks. Neuroscience concurs, cheapening the feeling to a chemical squirt. 

    Some dour economic principle seems to be at work: mint too much pleasure too fast and it is devalued into the merest itch. As though to keep its stock trading high, meanwhile, fine art gets cashed in the stabler currency of community or duty or self-improvement or self-advancement — “something productive,” as that wise museum director might put it. Hence all the books insisting that the function of great literature is to make us nicer (Céline? Hamsun?); hence all the op-eds my grandfather used to mail me about how a humanities degree could help me get a job at McKinsey. 

    There is something I have not yet mentioned but should. I get enormous pleasure from art museums, not only the underfunded ones, but also the gray lugs I have been complaining about. And not only the art that hangs in them; I mean also the lines, the selfies, the gross kid-friendly installations in the lobby, the humid elevators of tourists, the tour groups, the wall texts written in something that somewhat resembles English. Of course I also get sick of every one of these things, but I believe that any real love for the exhilarating, exhausting art museum involves some irritation, a healthy mix of because and in spite of that is stronger than because alone. The kind of pleasure I get from museums, I suppose, is the kind I get from communities (we shouldn’t let the art bureaucrats ruin the word), and from art, and from almost anything else that is intrinsically worthwhile. 

    The question with which I began was not all rhetorical: what does museum expansion have to do with art? Very little, but also everything. Unless you happen to be wealthy enough to buy masterpieces yourself, to experience art means to experience it with a pack of strangers in a shiny new room named after people you couldn’t stand much more than they could stand you. The people who make such places happen seem to think of museumgoers as dopamine junkies, utility maximizers who will of course want to see more things since more things equal more dopamine. You can, if you like, play along with this and try to binge as you might binge on serialized content. You can also slow down, choose a handful of works, and go swimming in them. I recommend option two, not because I have any illusions that it measurably alters the world but because I believe that real pleasure exists outside measurement, and because I believe that real pleasure needs no reason to exist. If you require one, though, might I suggest the satisfaction of not acting like the sheep that art bureaucrats would like us to be? If we need help, we can always consult artists. 

    The artist Johann Zoffany has been of some help to me, even though I can’t always convince myself that he really existed. His work hangs in the Tate, and the ZOFFANY, JOHANN (1734/5–1810) entry in my edition of The Oxford Companion to Art is respectably long. Still, ask yourself, does this sound like a person or a literary character? Born Johannus Josephus Zauffaly near Frankfurt, he moved to Rome at seventeen and reinvented himself as Zoffani. In his twenties he got himself a court painter gig in Würzburg, but three years later he ran off to London, leaving his wife behind. In Georgian London, he changed his name to Zoffany, took a mistress whom he passed off as his wife, and befriended the greatest actor of the era, David Garrick. For a while he had the favor of Queen Charlotte, but by the 1780s he was cash-strapped and resolved to sail to India to start again yet again. On the voyage back to England he was shipwrecked on the Andamans and, he claimed, ate a sailor to survive. I have no idea why someone would say this if it weren’t true. I have no idea why someone would say it if it were. Start self-mythologizing as a teenager, I suppose, and you never stop. 

    “Indifferent artistic merit” is what the Oxford Companion has to say about Zoffany’s work. I’m not so sure myself, and neither, for that matter, is the Oxford Companion, which gives Zoffany’s Tribuna of the Uffizi pride of place on the front cover. I have an odd relationship with this image, having never seen the original at Windsor Castle but glancing at the little ink reproduction most days for the last five or six years. Ordinarily, I would not write about a work of art I had never seen with my own eyes, but given that the Queen instructed Zoffany to travel to Florence in order to paint the most important room in the city’s most important museum, cramming his canvas full of tiny reproductions of works she had never seen with her own eyes, it seems forgivable somehow. 

    Zoffany spent six years doing the cramming, and it shows: every kind of image and sculpture can be found floating somewhere in this swollen gut of a painting. It is true that museums before the twentieth century displayed art frame-to-frame, but even by this standard the Uffizi that we are shown by Zoffany is a mess — compare it with a calmer gallery interior like Samuel Morse’s Gallery of the Louvre, completed half a century later, and you see how far Zoffany is overstepping the curatorial rules of his own era, not just ours. More is more, and still not enough. Works that ordinarily hung elsewhere in the Uffizi were rushed into the Tribuna for the Queen’s delight. So were works that ordinarily hung in other museums. There are so many things here that some cannot fit on the walls and need to be carried or dumped on the floor: Rubens’ Consequences of War, a sculpture of baby Hercules, Titian’s Venus of Urbino, an Etruscan urn, a Holbein, a Correggio, a few Raphaels . . . 

    Charlotte hated it. She had expected a Tribuna overflowing with paintings and sculptures; instead she got one overflowing with paintings, sculptures, and people. These are tourists, in the original sense of the word: educated and wealthy men hitting the last stop on their Grand Tour of the European continent. We can imagine the Queen’s anger at this unsolicited reminder that mere gentlemen had been to the Uffizi and she had to be content with copies. This is a particularly bovine bunch, too — “a flock of traveling boys,” Horace Walpole thought, “and one does not know nor care whom.” Look how they swarm and gawk, sticking their noses and fingers where neither belong. The painter Thomas Patch pokes Titian’s Venus but doesn’t look at it — he is too taken with the homoerotic The Two Wrestlers. Zoffany himself makes an appearance on the painting’s left: he is the one grinning too widely as he holds up Raphael’s Niccolini-Cowper Madonna, to the fascination of everyone around him. Even Pietro Bastianelli, the Uffizi’s curator, seems unenlightened by his daily exposure to the sublime: he’s got his greasy digits on the Titian, too.

    But to look at The Tribuna of the Uffizi a quarter of a millennium later is to breathe easier and, dare I say, to believe in art slightly more. If he was anything at all, Zoffany was a skilled copyist. His miniature Rubens preserves the meaty writhe of the original, and, adjusting for superficial things like clothes, he more or less copied the feel of any big museum in the twenty-first century, too. There are few problems with the contemporary art world that were not also problems in the 1770s. Hopeless commercialism? Zoffany added paintings to The Tribuna of the Uffizi because his friend was trying to sell them to George III. Congestion? You can barely scratch your cheek in this room. Distractible tourists? Mr. Patch cannot keep his eyes on a Titian. The cheapening of artworks into lifestyle props? The only reason most of these posh yahoos are here is that the Grand Tour is an experience they are supposed to collect — a pretty accessory for a life of foxhunting and gout. Art only matters because someone looks. The more renowned the art, the greater the number of clueless lookers, joyless collectors, donors in search of tax breaks, and steroidal museums. It’s the muck that clings to most worthwhile culture. It is not going anywhere, and neither is art. 

    We are all in the muck, to slightly paraphrase a writer who was serious about pleasure, but some of us are looking at the stars. One of the few figures in The Tribuna of the Uffizi who shows some glimmer of life in his eyes is a young man toward the painting’s left side whom the professors identify as the painter and politician Charles Loraine Smith. He is one of the few people in the scene who is seated, which would seem to mean he intends to be there a while, and he is the only one who is making something — sketching on a little pad — instead of gulping things down. Not his face but his whole body points at an ancient sculpture of Cupid and Psyche, and one imagines him taut with his own fervid staring. A cloud of contagious distractions hangs over his right shoulder, but somehow he is immune: Zoffany and his friends could walk away, but Charles would keep sketching. A grenade could go off and he wouldn’t wince. But the bigger miracle is the boy hunched behind Charles: given the choice between the loud, louche circle and the artist quietly sketching, the boy chooses the artist. He squats, trying to feel whatever pleasure keeps Charles seated — the grenade might not startle him, either. Under the right conditions, attention can be more contagious than distraction.

    The Nonsense of ‘Neoliberalism’

    A Conceptual Trash Heap

    Toni Morrison was wrong when she intoned that language is violence. But let’s give her this: the reckless use of words can do violence, idiomatically speaking, to clear thinking and therefore to political analysis. Slinging about words whose meaning is muddled, misleading, or tendentious — or whose usage is meant to oversimplify or to inflame — makes it impossible to think rationally, coherently, and productively.

    It is a tall order in this age of slogans and shibboleths to select one word to expunge from our political vocabulary, but if asked to do so I would nominate “neoliberalism.” A coinage of the late 1970s and early 1980s, the term remained fairly limited in its use for two decades, gaining currency at first in academic circles and then exploding in popularity after the financial crash in 2008 and Bernie Sanders’ rise to celebrity. Then, just when it was fading from overexposure, it surged back into fashion. Critics, scholars, consultants, and commentators now finger neoliberalism as the reason for practically all our political problems, especially the Democrats’ failure to keep the presidency out of the hands of Donald Trump. 

    “What Trump is attacking is neoliberalism. Economic neoliberalism underpins the past seventy years of Western economic and cultural order,” declares America’s most overrated senator, Chris Murphy, who alleges that neoliberalism has bequeathed a “very real epidemic of American unhappiness.” (Struggling with his cognitive dissonance over a concept he doesn’t quite understand, Murphy added: “Though it contains the word liberal, neoliberalism was devised by libertarian-conservative economists.”) Ro Khanna, another ambitious, out-of-his-depth operator, calls for “the rejection of neoliberalism. For forty years, we made a mistake. Frankly, it was both parties.” (Forty? Wasn’t it seventy? But what are a few decades among friends?) The Hewlett Foundation, which bankrolls efforts to replace neoliberalism with something else — the left hates billionaires except when they fund the left — defines neoliberalism as “free-market fundamentalism” and “the free-market, anti-government, growth-at-all-costs approach to economic and social policy.” Search the horizonless steppes of the internet and you will find countless pundits, politicians, and even ostensibly knowledgeable policymakers invoking the bogeyman of neoliberalism to explain where the Democrats and America went wrong. 

    The promiscuous use of the word “neoliberalism” has plagued our discourse since well before Trump. Over the years several intrepid explicators have pointed up its semiotic bankruptcy. Back in 2009, in an academic article titled “Neoliberalism: From New Liberal Philosophy to Anti-Liberal Slogan,” the political scientists Taylor Boas and Jordan Gans-Morse concluded that “neoliberalism has become a conceptual trash heap capable of accommodating multiple distasteful phenomena without much argument as to whether one or the other component really belongs.” A decade later, the fine intellectual historian Daniel Rodgers warned that “the success of ‘neoliberalism’ is a measure of its substantive hollowness” and noted “four distinctly different phenomena” that fly under its banner: an economic theory; a set of economic policies; the capitalist economy itself; and — take a breath — “the hegemonic force of the culture that surrounds and entraps us.” The journalist Jonathan Chait meanwhile traced how “neoliberal” morphed into an off-the-shelf slur used to denigrate regular Democrats. “The ubiquitous epithet is intended to separate its target — liberals — from the values they claim to espouse,” he shrewdly observed. “By relabeling self-identified liberals as ‘neoliberals,’ their critics on the left accuse them of betraying the historic liberal cause.” In his Substack newsletter, Matthew Yglesias continues valiantly to puncture what he calls “anti-neoliberal” thinking. Yet for all these debunkings, the term has only gotten more popular, leaping out of academic tracts and leftist polemics and into the vernacular. 

    As it is used today, “neoliberalism” contains at least three assumptions that its users hope to promulgate but which are, in fact, wrong. The first concerns what historians call periodization: reliance on neoliberalism as a historical framework depends on the flawed premise that in or about 1980, with the election of Ronald Reagan, the American ethos changed. Second, the invocation of neoliberalism incorporates a critique of liberals and Democrats, who, it is insinuated, supinely acquiesced in Reaganism, creating a “Washington consensus” by jettisoning the party’s historic commitment to using government to better people’s lives. Third, the neoliberal mantra implies that the economic policies pursued by Democrats when they had power were an economic, political, and even moral failure. 

    Each of these ideas may contain kernels of truth. But none holds up as an overarching and empirically demonstrable proposition. If we want to understand liberalism and liberal governance over the last half century — and there is no denying that it is now facing a crisis — we should start by euthanizing this unenlightening word. The sooner we clarify our thinking about our recent economic and political history, the more intelligently we can debate what should come next. 

    The Origins of Neoliberalism 

    To understand where “neoliberalism” came from, we must return to the 1970s, when American voters were repudiating liberalism — known ominously in those days as “the L word” — in droves. 

    By the late 1970s, the enormous achievements of Lyndon Johnson’s Great Society had become clear — but it was no less clear that they had failed to stanch the spread of social maladies such as divorce, out-of-wedlock births, drug use, and violent crime. The civil rights movement had secured formal equality for black Americans and invigorated efforts to do likewise for women, gays, and other groups, but liberals suffered when they counseled more intrusive governmental measures to guarantee not only political and legal equality but also economic and social equality. Keynesian policies that had fueled prosperity since World War II proved powerless to combat the beast of stagflation, and the rise of a post-industrial economy — which had shifted away from heavy manufacturing and toward white-collar jobs that demanded a college education for the new hordes of “symbolic analysts” — triggered a long series of painful geographic and professional dislocations. In foreign policy, the Vietnam War stained the luster of liberal internationalism, leaving many Americans leery of wielding power abroad and voters leery of trusting the Democrats as a younger generation of leaders slouched toward isolationism. 

    The political wreckage was immense. During the presidency of Richard Nixon — who, though loathed by liberals and already tainted by Watergate, cruised to reelection in 1972 — the Democratic Party hemorrhaged support from key constituencies, including white Southerners, blue-collar workers, Catholics, and the intellectuals soon to be known as neoconservatives. Watergate allowed the Democrats a brief reprieve, but Jimmy Carter’s hapless White House sojourn propelled more voters rightward. In 1980 and 1984, Ronald Reagan twice routed the Democrats, while the Republicans also seized the Senate for the first time since the 1950s. Between 1968 and 1988, Democrats lost every presidential election but one, almost all in landslides. “Unless they recover their partisan energies and intellectual vigor, the Democrats could enter a long historical passage of declining influence and relevance,” warned Lance Morrow of Time magazine in 1980, “becoming the political equivalent of some of the decaying cities of the Northeast, once flourishingly productive, the exuberant places where the modern Democratic Party originated.” 

    Projects arose to ask where the Democratic Party had gone astray. Politicians and analysts drew up new strategies and policies that they hoped could restore confidence in an affirmative if more realistic vision of government’s capacities. These efforts are commonly described as designed to steer the Democratic Party to the political center. Exhibit A is the founding in 1985 of the Democratic Leadership Council, a group led by Southern moderates aiming to win back Reagan Democrats by stressing values such as patriotism, religion, work, discipline, and responsibility. But the call for internal reform did not come only from centrists; it was heard across the center-left spectrum, urged by liberal stalwarts as well as middle-of-the-roaders. In the late 1970s, Edward Kennedy, the liberal lion, took up airline deregulation and criminal sentencing reform, breaking with recently enshrined left-wing orthodoxies. In the 1980s, his aide Paul Kirk, as Democratic Party chairman, promoted a platform emphasizing “traditional values.” Barney Frank, another quintessential liberal, wrote a book called Speaking Frankly urging Democrats to swallow their unease about brandishing their patriotism or condemning criminals. The civil rights hero John Lewis, elected to Congress in 1986, prodded his fellow Georgian Sam Nunn — maybe the most conservative Democrat in the Senate — to run for president in 1988. Lewis also attended DLC events, offering the insurgent group advice on forging biracial coalitions. Notwithstanding its sobriquet as the “Southern White-Boys Caucus,” the DLC included many pragmatic dyed-in-the-wool liberals who, like Lewis, wanted to build a big tent in order to win again — including prominent African Americans such as Tom Bradley, Maynard Jackson, Kurt Schmoke, Andrew Young, Mike Espy, Floyd Flake, Bill Gray, Doug Wilder, and Ron Brown. Refashioning the party’s public philosophy, in other words, was a goal pushed by Democrats of all stripes. 

    This crisis was what gave rise to the impulses that came to be known as “neoliberalism.” Apart from the DLC, the most prominent group of reformers in these years were those who hoisted the neoliberal flag. (The DLC included some neoliberals, such as Al Gore and Dick Gephardt, but the two groups were not identical.) The word itself was invented around 1979 by Charlie Peters, majordomo of the Washington Monthly, a scrappy little policy magazine, and popularized by Peters and Randall Rothenberg, who wrote a defining article and book on the topic. (Peters’ and Phillip Keisling’s A New Road for America: The Neoliberal Movement and Rothenberg’s The Neoliberals: Creating the New American Politics are the ur-texts for understanding the phenomenon.) Even then, the meaning was vague. No hard-and-fast set of doctrines united neoliberals. “There are no meetings, no dues, no constitution,” said Gephardt, a Missouri congressman who was among those tagged with the label. Voting patterns in Congress revealed neoliberals to be no more conservative than other Democrats. 

    Despite the lack of a membership roster, the same people typically appeared in discussions of the movement: officials such as Gephardt, Gore, Gary Hart, Bill Bradley, Paul Tsongas, and Bruce Babbitt; academics such as Lester Thurow, Robert Reich, and Amitai Etzioni; and the journalists trained by Peters at the Washington Monthly, including James Fallows, Nicholas Lemann, and Michael Kinsley. The New Republic, then a weekly magazine at the center of Washington debates, published neoliberal policy proposals alongside critiques of the movement. Of course these people often disagreed about policies, candidates, and even principles. But a few commonalities among the neoliberals could be discerned. 

    For the most part, neoliberals focused not on cultural issues or foreign policy or judicial fights but on economics. Reacting to the crises of the 1970s, they called for policies suited for the emerging post-industrial landscape centered on technology and information. In the 1970s, many on the left had hailed an “age of limits” and called for relinquishing the hope of ever-rising living standards. Neoliberals, without forsaking the goal of economic fairness, reemphasized growth as a cornerstone of their agenda and message. 

    Neoliberals also extolled efficiency. They excoriated bureaucracy, public and private, and allowed themselves to defy their allied interest groups such as government workers, unions, public-interest lawyers, and pro-regulation lobbyists. They favored investments in education and research and development. Many championed what was clunkily called “industrial policy,” or having the government select up-and-coming sectors of the economy for support. Technology captivated them, giving rise to the phrase “Atari Democrats.” They foresaw that high-tech innovation could help maintain America’s global competitiveness. They were far-sighted, too, in acknowledging the tightening interdependence of nations — a condition that spawned the word “globalization,” a close cousin of neoliberalism — and the need to adapt. In the 1970s, in deference to the unions, congressional Democrats had begun discarding liberalism’s traditional commitment to free trade; but most neoliberals, underscoring the folly of protectionism, countered that lowering trade barriers and opening markets would help both the United States and its international partners. 

    The philosophy just described bears scant resemblance to the caricatures proffered by Chris Murphy, Ro Khanna, and their ilk. Contrary to current mythology, the neoliberals were not libertarians, conservatives, free-marketeers, supply-siders, rampant deregulators, Reaganites, Thatcherites, Friedmanites, Hayekians, or enemies of the New Deal or the welfare state. More than other liberals, they saw a role for markets in their new policies, but they rejected the axiom that the market was all-wise. “First of all — and most important of all — we are liberals,” Peters explained, noting “large areas” of policy in which neoliberals scarcely differed from other liberals. “We criticize liberalism not to destroy it but to renew it.” Babbitt defended the “welfare state” from the Republicans who would gut it, calling for sustaining “an activist federal government in areas such as environmental matters, health, and entitlements.” Morton Kondracke of The New Republic in 1980 called neoliberalism “an attempt to combine the traditional Democratic compassion for the downtrodden and outcast elements of society with different vehicles than categorical aid programs . . . or new federal bureaucracies.” Neoliberals sometimes derided their liberal forebears: “We are not a bunch of little Hubert Humphreys,” Gary Hart famously railed. But more often they affirmed the values and the principles that had animated twentieth-century liberalism — coupled with a desire to devise new ways to meet the demands of a new economic reality. Far from Reaganites, neoliberals were practical-minded welfare-state anti-Reagan liberals seeking to adjust their means to meet their traditional ends. Neoliberalism was a revision that took place within the liberal tradition. This may be why many of the detractors of neoliberalism on the left and the right are really just old-fashioned enemies of liberalism. 

    So why do so many people misunderstand neoliberalism? Why is it now equated with what we normally call economic conservatism? For that, as we shall see, the fault lies, at least partly, with Michel Foucault. 

    A Little Knowledge Is a Dangerous Thing 

    By the early 1990s, as Bill Clinton emerged as the Democrats’ standard-bearer, the word “neoliberalism” took a backseat to a more capacious label: “New Democrat.” Clinton had not often been listed among the neoliberals and didn’t quite fit the bill. He appears in neither Rothenberg’s nor Peters’ book. In the Democratic primaries in 1992, on economic issues Clinton ran to the left of his closest rival, the card-carrying neoliberal Paul Tsongas, contrasting his own pledge to protect Social Security with Tsongas’ dour fixation on trimming entitlements. Some Clinton aides, such as Robert Reich and Ira Magaziner, were called neoliberals, and as a governor and a presidential candidate Clinton had found promise in neoliberal ideas about growth, high-tech investment, government reform, and globalization. But he balanced his technocratic side with a visceral economic populism and a critique of Reaganomics for catering to corporations and the rich. Clintonism was a synthesis of several strands of liberal reformism, of which neoliberalism was only one. A chairman of the DLC, Clinton stressed the values of community, opportunity, and responsibility. He also captured the loyalty of a diverse mix of other groups: the black community, the nation’s governors, assorted academics and intellectuals. During his presidency, Clinton’s program was described not as neoliberal but as that of a New Democrat or, starting in his second term, as a “Third Way” — a label also used by center-left leaders in Britain and Germany. 

    In short, Clinton’s ascent rendered “neoliberalism” obsolete as a taxonomic category. A different strain of updated liberalism — call it Clintonism — now held sway. Yet just as Washington journalists were retiring “neoliberalism,” it got picked up, by sheer coincidence, by European leftists — people who had no familiarity with the legislation once bandied about by Bradley, Gephardt, Hart, and the others; who were not well-versed in American policy debates about military reform or education reform or “reinventing government”; who hadn’t read the neoliberal books and journals. Some of them probably had not even kept up with the decades-old shift in the meaning of the word “liberalism” itself, which in the nineteenth century had meant an assertion of individual rights, including economic rights, against the state, but in the Progressive Era had also come to encompass a belief in an active governmental role in the economy. Tethering liberalism to its former and now-antiquated meaning, these left-wing European academics thus felt none of Chris Murphy’s addlement in applying a word rooted in liberalism to a non-liberal philosophy. 

    These European academics glommed onto “neoliberalism” to name a school of conservative or right-of-center economic thought that they traced back to the 1930s. It turned out that neoliberalism — or more precisely the French néo-libéralisme — had been fleetingly applied in 1938 to a group of intellectuals who attended a conference in Paris called, charmingly, the “Colloque Walter Lippmann,” which debated the ideas in the American journalist’s book The Good Society. The convener of the conference, a French philosopher named Louis Rougier, hoped, like Lippmann, to develop an “essentially progressive” alternative both to rigid nineteenth-century laissez-faire doctrines and to socialism. To this end, Rougier invited twenty-six thinkers, ranging from the liberal humanist Raymond Aron to the free-market economists Friedrich Hayek and Ludwig von Mises, for a long weekend in Paris in late August. But Rougier’s dreams went unrealized. The discussion in Paris “remained vague and broad,” according to the historian Angus Burgin’s well-researched account, “because of both the relative brevity of the individual contributions and a general sense of uncertainty about whether . . . [to] focus on a reexamination of foundational principles or . . . practical policies.” A follow-up symposium the next year was canceled after Hitler and Stalin invaded Poland. So much for néo-libéralisme. 

    Enter Foucault, four decades later, who appears to have been the first European to misapply the resurrected term “neoliberal” not simply to the Paris conferees of 1938 but specifically, and inaccurately, to the conservatives in attendance — Hayek, von Mises, and their intellectual allies. Foucault did so in a series of lectures from 1979, published in 2004 as The Birth of Biopolitics, which included an account of the Colloque Walter Lippmann. (Biopolitics describes the — inevitably sinister — workings of political and governmental power upon the body and organic life more generally, as states manage their populations through policies relating to reproduction, sexuality, public health, and the like.) Foucault’s core point was a reasonable one: that unlike the nineteenth-century apostles of pure laissez-faire, who had theorized a weak state, these economists of the 1930s believed that governments had to take an active role in underwriting any market-based system. “The problem of neo-liberalism,” Foucault argued in one of his lectures, “was not how to cut out or contrive a free space in the market within an already given political society, as in the liberalism of Adam Smith and the eighteenth century. The problem of neo-liberalism is rather how the overall exercise of political power can be modeled on the principles of a market economy.” His blunder in choosing the label “neoliberal” — which was just then coming into circulation in the United States with a categorically different and indeed nearly opposite meaning — can be understood when we recall that he was resuscitating a forgotten French term and was surely unaware of neoliberalism’s contemporary American meaning. 

    Foucault’s application of this appellation to twentieth-century free-market economists such as Hayek, Ludwig von Mises, and (later) Milton Friedman was historically ignorant — and triply so. First, Foucault seemed not to have known that, as Burgin tells us, neoliberalism as “a formal designation” for the ideas at the Lippmann Colloquium was “raised and rejected” at the time. Second, the Hayekians in fact did not call themselves neoliberals; those who had briefly flirted with that name were those on the center-left, like Lippmann and Rougier, not those on the right. Finally, for most of the century nobody else called these conservatives neoliberal either. Foucault’s was thus a highly peculiar and misleading usage. Yet just as with some of his other dubious theories, he got away with it. 

    It took time for this weird use of “neoliberalism” to catch on and still longer for it to reach American shores. By the 2000s, books by the eccentric British Marxist geographer David Harvey, the barrister Daniel Stedman Jones, and then Angus Burgin, along with a zillion academic articles and conference papers, had fused the idea of “neoliberalism” to market-based economics — and specifically to a genealogical narrative centered on Hayek, von Mises, Friedman, and their kind that ran from the Lippmann colloquium to the Mont Pelerin Society of the 1940s (a Switzerland-based hub of conservative thought) to the University of Chicago in the 1960s and 1970s. Interestingly, there were a few superficial points of overlap between Washington Monthly neoliberalism and Mont Pelerin pseudo-neoliberalism. As the historian Kevin Schultz remarks in his new book, Why Everyone Hates White Liberals (Including White Liberals): “Both prioritized economic growth. Both hated excessive government intrusion. Both were attempts, in rhetoric at least, to expand individual freedoms. But the Democratic ‘neo-liberals’ were more welcoming to social welfare programs, national allegiance, and government intervention to assist people.” More importantly, the Foucault/Harvey/Stedman Jones/Burgin conception of neoliberalism had no actual historical or intellectual connection to the standard meaning of neoliberalism in American political analysis. That the same word was used for both was a deeply confusing coincidence. 

    Yet perhaps unavoidably, the two were confused, conflated, and commingled. That commingling created a conceptual error that has since warped our discourse. Imagine dusting off the old meaning of “filibuster” — originally from the Dutch word for “freebooter,” used to refer to eighteenth-century pirates in the Caribbean — and concluding that today’s speechifying U.S. senators are all sword-swinging buccaneers. Or merging two meanings of “gay,” so that all happy people are deemed homosexual or all homosexual people are deemed happy. A half-knowledgeable Washington observer could see the absurdity of saddling a genuine neoliberal such as Gary Hart with the views of Milton Friedman. But the Europeans and academics bruiting about the label were not knowledgeable, or even half-knowledgeable, about these matters. And after 2000, the political climate made the merging of the two meanings of “neoliberalism” irresistible to some. Cursory understandings of the concept allowed left-wing critics to brand Obama as a neoliberal because he had bailed out the banks. Clinton’s support for the North American Free Trade Agreement (even though it was negotiated by his predecessors) and his repeal of the Glass-Steagall Act (enacted in 1933 to separate commercial banking from investment banking) were cast as pivotal moments when Democrats surrendered to market forces and set us on a path to where we are now. The misnomer stuck. 

    The leftist academics who tossed about “neoliberalism” almost always used it as a pejorative. As it bled into popular usage and the commingling continued, it became a malaprop cocktail to lob at anyone associated with the post-1980s intellectual ferment among Democrats. This meant that Bill Clinton, Al Gore, Robert Rubin, Larry Summers, Gene Sperling, and the rest of the Clinton economic team were not only branded “neoliberals” but cast as the ideological progeny of Hayek, von Mises, and Friedman. Multisyllabic and Latinate, “neoliberalism” posed as a sophisticated idea harboring profound and subtle analyses, but by the 2010s it had hardened into a blunt rhetorical tool, a form of invective, with which anti-capitalist writers could bash anyone they deemed to have betrayed the cause. This tendency reached its delicious reductio ad absurdum in 2017 in an online contretemps between Cornel West and Ta-Nehisi Coates, with the former charging the latter with possessing a “myopic political neoliberalism” and the latter responding by quitting Twitter. 

    Witting or unwitting, the wrongheaded conflation of neoliberalism with free-market conservatism has continued to flourish. The practice yokes together two groups who are clear ideological enemies. A category that embraces such stark opposites as Ronald Reagan and Bill Clinton, or Hillary Clinton and Donald Trump, can only obfuscate. And, besides, good names already exist for market-friendly economics: free-market conservatism, economic libertarianism, classical liberalism, laissez-faire. But leftists prefer “neoliberal” because it enfolds liberal Democrats in their blunderbuss critique. If to a hammer everything looks like a nail, then to a Marxist every non-Marxist looks like a neoliberal. One suspects, as Jonathan Chait has written, that “the whole trick is to bracket the center-left together with the right as ‘neoliberal,’ and then force progressives to choose between that and socialism.” 

    The Periodization Problem 

    If neoliberalism is a hot mess as a category of economic thought and political classification, it also flops as a tool of historical analysis. Here we come to the question of periodization, the way in which historians segment the past into units. Those who hold up neoliberalism as a tool for organizing recent events insist that it has hegemonically governed our era. But, as we can see from Chris Murphy’s and Ro Khanna’s failure to get their stories straight, there’s no consensus on when this supposed hegemony began or ended, or indeed if it has ended at all. Some place the beginning in the Clinton ’90s. Others point to the late 1970s. Most will say Reagan’s election in 1980 was the turning point, since he came into office preaching lower taxes and smaller government (even though he followed through much less than is supposed), and people talked about a Reagan Revolution as if seismic changes were underway. 

    But was Reagan’s ascendancy really the major break point of the recent past? To organize our recent history around Reagan’s rise — that is, around Reaganomics — enshrines a crude economicist mentality. It subordinates historical events of manifestly greater magnitude to trends in economic thought and policymaking. (There are people who believe that Watergate was a less significant episode in Nixon’s presidency than the end of the gold standard.) Specifically, the 1980s-centric periodization ignores the most transformational decade of the post-World War II era — the 1960s, when dramatic changes occurred in culture, foreign policy, law, and society, though somewhat less so in economics. Before the vogue for the neoliberal periodization came along, historians agreed that the span of the late 1960s and early 1970s was a hinge in American history. That was when the Cold War began to thaw, when Vietnam shattered belief in American virtue, when old manners and morals were overturned, when cultural backlash politics scuttled dreams of expanding the Great Society, when so-called hard hats beat up antiwar protesters, when lifelong Democrats gave Nixon his landslide, when the liberal vision fell on hard times. Debates today about the Democrats’ electoral struggles to recapture the working class should recall that those struggles, too, date to the early 1970s and not to the later “neoliberal” period. 

    Even by strictly economicist measures, the historical focus on Reagan instead of Nixon fails key explanatory tests. Today’s critics blame neoliberal policies for “hollowing out” manufacturing communities, sending onetime Democrats into the Republican column. But manufacturing began collapsing long before Reagan. The steel and auto industries faced competition from Japan and West Germany in the 1960s. By the 1970s, magazine stories, think-tank studies, and congressional hearings proliferated about plant closures and job losses in Rust Belt cities. When neoliberals came along in the 1980s, they were reacting to manufacturing losses, not driving them. Neoliberalism’s detractors have their chronology backwards. 

    The Reagan–Bush years were not the start of an historical era but the end of one. Beginning in 1992, with Bill Clinton’s election, a long stretch of Republican dominance ended. Presidential politics became competitive again. Since 1992, Democrats have lost the popular vote just twice. Divided government has reigned, with control of the White House and Congress seesawing between the parties. Our talk of red states and blue states and polarization dates to the year 2000 and the knife’s-edge contest between Al Gore and George W. Bush. 

    The 1990s amounted to a break, too, in America’s economic fortunes. The wage stagnation now blamed on neoliberalism actually occurred in the 1970s and 1980s, not in the 1990s. The Clinton years sparked a run of higher productivity and rising wages, along with stiffer taxes on the rich, reductions in poverty, and growth that has outpaced Europe’s. Whatever was happening in the 1990s, it marked a sharp reversal from the doldrums of the 1970s and the uneven recovery of the 1980s — historical shifts that the neoliberal periodization does not take the trouble to accommodate. 

    The Myth of the Washington Consensus 

    Once we recognize that the 1990s constituted a departure from — much more than a continuation of — the 1980s, more problems with neoliberalism as an operating concept emerge. Related to the claim that no real economic policy differences have separated the two parties is the corollary that Democrats guzzled the Reagan Kool-Aid, joining in a “Washington consensus” by ditching liberalism’s commitments to the welfare state, progressive taxation, regulation, and helping blue-collar workers. 

    This is more nonsense. The last quarter of a century has been defined not by consensus but, famously, by polarization. Americans sorted into red and blue camps, telling one another that each election was the most important of our lives, with each contest fought as if the entirety of the republic’s fate hung in the balance. In part, these stark and inflamed partisan divisions have been about sociocultural issues such as abortion, gay rights, racial progress, and immigration, as well as about political-legal questions such as civil liberties, civil rights, and the scope of presidential power. But they have also been about economics. Since Clinton’s presidency, knock-down, drag-out fights have occurred over core differences in fiscal and regulatory policy. The Democrats press for progressive taxation, increased social provision, and restraints on business; the Republicans seek to cut taxes, domestic spending, and restrictive rules. Clinton’s first major action as president was to raise taxes on the rich. Bush’s was to cut them. Obama then ended Bush’s tax cuts. Trump passed new ones, and then passed them again. Equally bright lines have separated the parties over health care, Social Security, and the whole litany of kitchen-table issues. Only on one major issue — trade — have the parties’ leaders been relatively united. (More on that below.) Yet the enveloping partisan rancor of our times is seldom noted by tellers of the neoliberalism tale, since they have no way to account for it in their fantasy of a seamless elite bipartisan comity. 

    To see how badly some people misremember even recent history, consider Clinton’s campaign in 1992. Jennifer Harris, who served in the Biden White House and now works for the Hewlett Foundation, recently wrote in Foreign Affairs that Clinton “won election in 1992 in part by stressing his adherence to Reagan’s free-market dictums.” Come again? The opposite is the case. Here is Clinton debating George Bush before seventy million viewers in 1992: “We’ve had twelve years of trickle-down economics. We’ve gone from first to twelfth in the world in wages. We’ve had four years where we’ve produced no private-sector jobs. Most people are working harder for less money than they were making ten years ago. It is because we are in the grip of a failed economic theory.” Clinton’s economic plan, published as Putting People First, excoriated Reaganism, promising instead a bottom-up path to growth including higher taxes on the rich, universal health care, and investment in transportation and communication infrastructure — which became key pieces of his blueprint for governing. 

    Neoliberalism is said nowadays to denote the rejection of New Deal economics, but neither under Clinton nor in the years after did the Democratic Party allow the dismantling of the postwar liberal tentpoles of a mixed economy, progressive taxation, robust regulation, and a welfare state. Consider the case of regulation. Contrary to the neoliberal mythology, Democrats since Clinton have reliably lined up against the Republicans’ anti-government agenda. To be sure, Clinton did loosen some constraints on business, as in repealing Glass-Steagall. His “Reinventing Government” initiative trimmed other requirements, too — not to allow business a free hand but to prune bureaucracy so that the public would again trust the government to be efficient and responsive. (It is hard to come up with any regulations phased out by the Reinventing Government project that anyone misses today.) Overall, however, the pattern of the past three decades shows Democrats fairly consistently promoting environmental protection, workplace safety, civil rights safeguards, public health, and financial oversight. In 2010 Obama gave us the Dodd-Frank legislation, which imposed tougher capital and oversight requirements on big banks and created the Consumer Financial Protection Bureau. 

    Count the pages in the Federal Register, which lists new governmental rules. Under Ronald Reagan, the count fell from 87,000 to 53,000. Under Clinton it climbed from 63,000 to 77,000, and under Obama from 80,000 to 98,000. “It’s pretty clearly true,” Matthew Yglesias observes, “that the overall scope of regulation is larger in 2024 than it was in 1974.” Think about it: if the 1990s and 2000s had been such an orgy of slash-and-burn, how could Trump’s first term have witnessed such a barrage of headlines about the trashing of vital protections? “E.P.A. to Lift Obama-Era Controls on Methane, a Potent Greenhouse Gas.” “Consumer Bureau Scraps Restrictions on Payday Loans.” “How the White House Rolled Back Financial Regulations.” “Trump Says His Regulatory Rollback Already Is the ‘Most Far-Reaching.’” Whose rules do they think Trump was undoing? 

    One can tick off the different policy areas. Fiscal policy? Democrats in the 1990s raised the minimum wage, boosted taxes on the rich, and expanded the Earned Income Tax Credit — all over Republican objections. Social provision? As he had in his primary race against Tsongas, Clinton continually prioritized the protection of Social Security: “Save Social Security first,” he vowed in his State of the Union address in 1998, announcing his plan to reserve the newfound budget surpluses for the program. He bested Newt Gingrich and Robert Dole in budget battles mainly by defending Medicare and Medicaid. Clinton and Obama both made universal health care a top priority — Clinton unsuccessfully, Obama triumphantly. Investment in infrastructure, science, and technology? Clinton poured money into building the internet and mapping the human genome; Obama’s stimulus package in 2009 was so sweeping that the journalist Michael Grunwald wrote a book about it called The New New Deal. Labor? Clinton signed the Family and Medical Leave Act, and his administration fought right-to-work laws, appointed union allies to the National Labor Relations Board, and curtailed sweatshop labor. Obama backed a controversial law to let workers unionize without a secret ballot. In none of these cases did Democrats receive much Republican help. Partisan division — not consensus — was the Washington norm. 

    Believers in the fiction of neoliberalism-as-Reaganism sometimes think they have a smoking gun in Clinton’s statement in 1996 that “the era of big government is over.” But never has a sentence been more grossly distorted. Rarely quoted is the next line, a rebuke to the laissez-faire ideologues: “But we cannot go back to the time when our citizens were left to fend for themselves.” Clinton’s original draft had contained the punchier “But we can’t go back to ‘every man for himself.’” That formulation, however, was deemed sexist and rewritten, and the graceless new iteration got dropped from headlines and soundbites. In any case, the snippet was never meant as a death knell for government’s role in helping citizens; it was an acknowledgment that ambitious Great Society–style projects such as Clinton’s failed health-care initiative were, given the congressional logjams and insuperable deficits, unlikely to be forthcoming. As John Lewis said at the time, the “era” of big government might be over — the climate of opinion that had birthed programs like Medicare in the 1960s was now in the past — but the “role” of big government was not going to change. 

    In each of the policy realms noted above, Republicans firmly opposed the Democrats’ agenda and vice versa: it was a Washington dissensus. In one realm, though, bipartisan majorities did exist: trade. And when you peer closely at the charges of Democratic perfidy, they usually boil down to the fact that Clinton, Obama, and other party leaders backed NAFTA in 1993, and permanent most-favored-nation status for China in 2000, and the Trans-Pacific Partnership in 2016. In these cases, a hefty majority of Republicans and a sizable minority of Democrats came together in favor of a freer trade regime. 

    A few complicating points bear mention. First, there is near-unanimity among economists about the benefits of free trade, just as there is among public health officials on the danger of lead exposure and among education researchers on the worth of early-childhood schooling. Although certain constituencies have over the decades called for tariffs, leading to pitched political fights, the recent trade deals all had strong expert justification and scholarly support. This reality vitiates the charge that Democrats were cravenly acquiescing in Republican dogma. If conservatives today were to stop questioning the danger of a warming planet, would they be capitulating to a Democratic ideology? Or would they simply be grounding their policymaking in an accurate, objectively established set of facts? We should at least entertain the idea that the Democrats backed free trade because it was good policy. The alternative is to understand policy and politics only cynically. 

    Relatedly — and here the periodization problem again rears its head — the neoliberal era is alleged to have begun in the 1980s or 1990s. But support for free trade was the standard liberal position since the nineteenth century. One of Woodrow Wilson’s first steps as president in 1913 was to sign the Underwood Act reducing tariffs. Franklin Delano Roosevelt produced the Reciprocal Trade Agreements Act in 1934 and the Bretton Woods Agreement in 1944. Harry Truman signed the General Agreement on Tariffs and Trade in 1947. John F. Kennedy enacted the Trade Expansion Act of 1962. If any position represents an abandonment of liberal principles, it is the protectionism that some Democrats began adopting under pressure from organized labor in the 1970s. Thus, the Democrats’ support for trade, too, turns out to be a flimsy peg on which to hang the weighty conceptual behemoth of “neoliberalism.” 

    The Fiction of Liberal Failure 

    Neoliberalism, as we have seen, can no longer be said to accurately describe a coherent body of economic thought. Nor does it designate a clear-cut political affiliation. Nor does the notion of an “age of neoliberalism” linked to Reagan’s rise survive scrutiny. And the critique implicit in today’s pejorative use of “neoliberalism” — that contemporary Democrats junked their values for pro-market cheerleading — also unravels once we review the countless policy conflicts that have riven the two parties in our fevered times. 

    But let us allow that the Democrats of the 1990s and 2000s did, even if the point has been grossly overstated, tilt their party in a somewhat more pro-market direction. That’s true enough. What of the criticism that their modifications to their party’s governing philosophy wrought horrendous economic damage, especially to society’s lower strata? Even if the Clinton–Obama agenda wasn’t the brainchild of Milton Friedman, even if it does not deserve the opprobrious epithet “neoliberalism,” didn’t it nonetheless buoy the rich and oppress the poor? 

    Here, too, history undermines the anti-neoliberal arguments. The Clinton and Obama presidencies boasted some of the strongest economic records of recent times. The Clinton numbers are so phenomenal, so jaw-droppingly enviable, that they beggar belief. Clinton presided over the longest continuous peacetime economic expansion in history, with growth averaging 4 percent annually. Unemployment fell from 7.3 to 4 percent and inflation stayed low. The stock market boomed, but prosperity also extended to the lowest rungs of the ladder: poverty fell by nearly one quarter, from 15.1 to 11.3 percent, and the two lowest-income quintiles saw their earnings increase nearly 17 percent. Real median household income grew by 13.9 percent. Blacks and Hispanics made especially strong gains. All of this was achieved as once-crippling budget deficits turned into record surpluses and Americans’ trust in government spiked for the first time since the 1960s. As Hillary Clinton later said when her husband’s record came under fire, “I always wonder what part of the 1990s they didn’t like: the peace or the prosperity?” 

    These policies succeeded politically, too. Clinton wooed many Reagan Democrats back into the fold. In both of his races, he drew more than 40 percent of the working-class white vote — a quantum leap over Carter, Mondale, and Dukakis, and a high-water mark that no subsequent Democratic presidential nominee would match. These voters had been drifting from the Democratic column before Clinton and would drift away again afterward, but Clintonomics was not the reason for their defection. 

    Obama’s economic legacy, though not as strong, also holds up well. His presidency kicked off an expansion that, while less robust than Clinton’s, lasted longer, extending into Trump’s first presidency until the pandemic hit in 2020. Taking office just after the 2008 financial crisis, Obama, in Rooseveltian fashion, followed through on the bank rescue and the auto industry rescue. Those efforts, including Obama’s huge stimulus bill, constitute, along with the Affordable Care Act, his most important achievements. Whatever name we affix to his economic policy, it, too, worked. Unemployment fell; inflation remained modest; median household income rose by 5.3 percent. Politically, Obama’s performance was also a bit weaker than Clinton’s; his reluctance in the early days to rhetorically balance the bailouts with a dose of Clinton-style left-populism gave an emergent right-wing proto-Trump movement, the Tea Party, room to grow. But Obama’s economy still performed well enough to win him reelection in 2012, thanks to a decent showing among the white working class, especially in states like Michigan and Ohio where the auto bailout saved jobs. To be sure, the slow growth of Obama’s second term hurt Hillary Clinton in 2016, a year when the economy merely inched along. But his was hardly an economic program geared toward the superrich. 

    Given this mostly admirable economic record, can it really be said that the last thirty-five years have amounted to failure — especially on the Democrats’ part? It is worth addressing two economic failures of the recent era that have been especially salient. In both cases we can fairly criticize Democratic governance, although in neither case more so than Republican governance. First is the fallout from the trade regime of the twenty-first century. While globalization benefited Americans overall, fueling growth and lowering consumer costs, the downsides hit hardest in the de-industrializing regions. The stories of constricted job opportunities, impoverished civic life, family dysfunction, and drug and alcohol abuse in these communities are legion and heartbreaking. In post-industrial cities and towns, rural areas, low-income suburbs, and other lagging regions from Appalachia to swaths of the South, the toll has been severe. 

    The second problem is also one of inequality, but on a broader societal level. We have all seen the statistics that portray the yawning gaps between the top 1 percent and everyone else, the growing chasms between CEO pay and the going hourly wage. Inequality has also deepened a sense of deprivation among the working and middle classes. As important, it has meant that a large segment of Americans has been prosperous enough to shoulder the high costs of child care, health care, housing, college, and retirement, but that a much bigger group has watched those elements of the American Dream recede from their grasp. 

    These hardships must not be minimized. They pose urgent challenges — of politics and policy, of solidarity and sympathy — for both parties. The Democrats as well as the Republicans failed to do enough to address the privations and the struggles that, while not new to our times, continued to afflict the de-industrializing regions into the 2000s. Both parties also failed to deliver effective solutions to the skyrocketing costs of big-ticket life events such as health care and housing. But the critics of “neoliberalism” imply that remedies were readily at hand for Clinton and Obama and other Democratic leaders, who turned away from them. Yet no such obvious remedies existed (or exist today). For one thing, these inequality trends stem mainly from factors other than public policy. The manufacturing decline long preceded the controversial trade deals, and its recent acceleration derives more from automation, technological advances, and turbocharged worker productivity than from Chinese imports. Drawing an analogy, the Harvard economist Robert Lawrence notes that the number of agricultural jobs in the United States has plummeted not because of trade but because of a transformation in farming technology. 

    Wealth inequality, similarly, has widened not primarily owing to any public policy decisions but owing to the huge spikes in stock market and real estate valuations. Democrats, whether left, liberal, or centrist, have generally wanted to do more to address these serious inequities, but since the Reagan years we have been hobbled by divided government. We have had no period like the 1930s or the 1960s when one party could work its will; the congressional majorities that Clinton, Obama, and Biden all briefly enjoyed were never large enough to overcome the threats of the filibuster. (Alas, we may now be embarking on such a period, led by the other side.) Democrats may pass redistributionist taxes or expand spending programs, but when the Republicans return they get blunted or reversed. Most of the time, it is simply impossible to pass a large-scale social program in the first place. That is why we say that the era of big government is over. It is a description of reality, not a wish. 

    Of the two parties, the Democrats are the ones who have consistently favored measures to mitigate the burgeoning inequality. Their efforts, unfortunately, have not helped them much politically; ironically, the hard-hit communities in places such as Arkansas, West Virginia, and rural Wisconsin have gravitated toward the GOP — worse, toward the MAGA GOP. But these voters are not moving rightward because the Republicans are delivering higher wages or more bountiful health insurance. There are many other reasons for this realignment, rooted in values, culture, identity, and style. Politics never consists entirely in economics. The Democrats’ noble words about economic fairness will not win them elections in the absence of creative and effective new ideas. (Kamala Harris’ campaign proposal of $25,000 handouts for down payments isn’t going to cut it.) But if working-class and non-college-educated voters have been abandoning the Democrats because of their economic record, they are not going to find the Republicans’ solutions any more congenial. 

    The acute suffering in these afflicted communities demands our attention. It also creates rhetorical space for the continued bashing of Democratic policies. It has provided justification, for example, for the Hewlett Foundation to pour millions into a project groping for a “post-neoliberal” vision that it hopes will amount to a reverse DLC for the 2020s. The Hewlett project — regrettably based on the sort of murky understanding of “neoliberalism” that pervades our discourse — was expected to bear fruit under Biden. Yet despite a lot of hyperventilating in early 2021 about a “transformational” Biden presidency and absurd comparisons of his decidedly non-radical program to the New Deal, Biden governed mostly in the same center-left mode as Clinton and Obama, albeit less effectively. 

    Biden touted a purportedly new economic vision, saying he would build the economy “from the middle out.” He failed to acknowledge that Obama had used and popularized the exact same phrase, and that Clinton had propounded the same basic idea. Biden also hyped the value of his child tax credit, which was more generous than past iterations, but which also had first been implemented by Clinton and then expanded by succeeding presidents. Biden talked up anti-trust actions against the tech giants, but this, too, was something Clinton had pioneered with a lawsuit against Microsoft, the Goliath of its day. Apart from keeping some of Trump’s tariffs, Biden’s main claim to policy innovation was to jack up the domestic outlays in his spending bills to dwarf even Obama’s $800 billion Recovery Act of 2009 — something that he could do because we were stumbling forth out of the pandemic. Unfortunately, just the year before, Trump had signed the gargantuan CARES Act, and, on top of that, Biden’s two huge spending bills combined with pandemic-related shortages to produce inflation rates higher than they had been since 1981 — one of the main reasons that Harris lost the election in 2024. If his approach was designed to improve on the “neoliberalism” of his Democratic predecessors, it failed. 

    The historian Tara Zahra has written about the backlash against progressivism and globalization in the aftermath of World War I. Where goods and people had moved freely across borders, restrictions now limited exchange. Governments framed migration not as an opportunity but as a threat to national strength and social cohesion. Nations pulled back from international bodies and treaties. Democracies and dictatorships alike preached self-reliance. This inward turn promised order amid chaos, rootedness in place and tradition, and protection from the dislocations of global capitalism. Fascism, communism, and antisemitism flourished. It was not the age of Trump, but it was the age of Ford, Lindbergh, Coughlin, and Mussolini. The worst war in history followed. 

    Now, too, an anti-globalization backlash is in flower. Liberal democracy is regularly derided. Elites are demonized. Strongmen are admired. Pluralism is regarded as weakness. Trade is blamed for poverty. Borders are walled and fortified. Illegal immigrants are targeted. On the left, voters flock to fantastic promises of free rent, free buses, and free food. Or they hear prophecies of a future liberated from work, so that we can all enjoy a government-provided universal basic income. On the right, Trump recklessly plays around with tariffs, wreaks economic havoc, and impulsively decimates government agencies. These are only a few of the latest proposed replacements for what has come to be disparaged as neoliberalism. If we persist now in trashing the many things that liberals, whatever their failings and flaws, have done rightly and reasonably well, we will breathe life into the poisonous ideologies that liberalism once rose up to defeat.

    Other Canons, Other Wars

    In the summer of 1981, the novelist Italo Calvino published an article on the great books in the Roman weekly news magazine L’Espresso. “Why Read the Classics?” is classic Calvino: playful, charming, erudite, skeptical, humane. It consists of fourteen “suggested definitions” of a classic that deliberately contradict each other. Per definition one, the classics are books you are always rereading, even if you are discovering them for the first time, per definition five; or they are books you have yet to read because you are still waiting for the opportune conditions to enjoy them, per definition two. Classics are pre-selected for us by the group: “they come to us bearing the aura of previous interpretations, and trailing behind them the traces they have left in the culture” (definition seven); they generate “a pulviscular cloud of critical discourse” (definition eight), and are often known through “hearsay” before they are known by experience (definition nine). But they are also chosen by the individual reader for personal reasons: “‘your’ classic is a book to which you cannot remain indifferent” (definition eleven). Ancient or modern, a classic is a book that “relegates the noise of the present to a background hum” (definition thirteen), and at the same time one that “persists as background noise even when a present that is totally incompatible with it holds sway” (definition fourteen). 

    The implication being: a classic is impossible to define. Rather, it is a designation relative to an individual reader’s position in a particular culture at a particular moment in history. In his scholium to definition fourteen, Calvino gives a reason for this. The proliferation of books in “all modern literatures and cultures” has led to “the dissolution of the library,” such as the one inherited by Giacomo Leopardi, the reclusive nineteenth-century poet and philosopher who was one of the last people who could plausibly confuse his thorough education in European literature, philosophy, history, and science with the totality of knowledge. The “eclecticism” characteristic of late twentieth-century culture is the result of its inescapable awareness of the contemporary, on the one hand, and the global, on the other. Just as the books of the past and the present are indispensable to understanding each other, Calvino told the readers of L’Espresso, the classics of his language and culture, such as Leopardi’s Canti, “are indispensable to us Italians in order to compare them with foreign classics, and foreign classics are equally indispensable so that we can measure them against Italian classics.” That we will “never be able to draw up a catalogue of classic works to suit our own times” was not a cause for worry, in his view. He proposed that each of us replace the catalogue or list model of the great books with “our own ideal library” consisting of works that have been meaningful to us and those that have been meaningful to others, making sure to leave “a section of empty spaces for surprises and chance discoveries” as we accumulate new experiences over the course of a lifelong relationship with the written word. 

    The following winter, a rather less cheerful assessment of this state of affairs appeared in the pages of National Review. In “Our Listless Universities,” Allan Bloom diagnosed “an easygoing American nihilism” among students at the country’s top schools. Already “socialized” as historicists and cultural relativists, incoming freshmen viewed “the comprehensive truth about man” as at best “opinion,” at worst “prejudice,” and in any case “unavailable” to knowledge — and nothing about their four years at the university was likely to disabuse them of this “dogma.” Encouraged by their professors, according to Bloom, students in the humanities were unwilling to acknowledge that “one culture is superior to another,” that the “old books” of the Western canon were any “better than any others” being produced in the present, let alone ones that might “contain the truth.” As a result, classics such as the Bible and Plutarch — to use his examples — no longer made up the “furniture” of the “souls” who were bypassing the liberal arts altogether for degrees in the hard sciences, where at least the aspiration to truth-finding was integral to the program of study, and the professional schools, where at least there were material rewards to be had upon graduation. In the name of an “equality of values,” Bloom concluded, students had lost the ability to discriminate in their moral and aesthetic judgments; in the name of “openness,” they had become closed-minded. The only remedy — a sustained encounter with the great books — was the one that was being foreclosed by the usual suspects: structuralists, deconstructionists, Marxist humanists, and those professors who would introduce course requirements in non-Western civilizations and cultures. 

    Although he shared Matthew Arnold’s view that culture is “the best which has been thought and said in the world,” Bloom’s denunciation of relativism is less Arnoldian in spirit than Calvino’s endorsement of it. The apocalyptic tone of Bloom’s invective causes him to make absurd claims, some of which, like his animus towards rock music, are comically square, while others, such as his claim that among his students “it is almost respectable to think and even do the deeds of Oedipus,” cross the line into hysteria. The special contempt he reserves for feminists — whose demands for equality in the workplace, the domicile, and the culture he holds responsible for the destruction of everything from the family to eroticism to literature — is downright sinister. 

    On the last point, Bloom has this to say: “In the absence (temporary, of course) of a literature produced by feminism to rival the literature of Sophocles, Shakespeare, Racine, and Stendhal, students are without literary inspiration.” It is neither here nor there, but off the top of my head I can think of dozens of female writers who are more deserving of our attention today than Racine, starting with his contemporary Madame de La Fayette. Where the canonical status of Stendhal (and, by extension, the force of that parenthetical) is concerned, I would just like to add that, as Calvino points out, when the author of Le Rouge et le Noir was still alive, he was dismissed by none other than Leopardi as the sort of faddish litterateur, admired by his sister, whose work would never stand the test of time. 

    Yet what “Our Listless Universities” lacked in Arnoldian “sweetness and light,” it made up for with popular appeal. Encouraged by his friend Saul Bellow, Bloom expanded the essay into The Closing of the American Mind, which became a surprise bestseller when it was published in 1987. The ensuing “Canon Wars,” which pitted conservative defenders of “dead white men” against “multiculturalists, feminists, and postmodernists,” were misnamed: they were more like a theater or a front in a far broader political conflict. They helped to establish a pattern whereby the intellectual habits, political views, and sexual mores of eighteen- to twenty-two-year-olds at a handful of elite universities were opportunistically turned into full-blown moral panics outside them by conservative activists, whose concern about the curricular “corruption of the youth” has proved less sincere than their desire to destroy the institutional independence of the university, a four-decade-long siege that now appears to be in its final stages. The Closing of the American Mind was nothing less than “the opening shot of the culture wars,” in the words of Camille Paglia, who meant it as a compliment. 

    For better or worse, the debate about the great books was my introduction to American intellectual life: well-thumbed copies of bestsellers such as Harold Bloom’s The Western Canon, from 1994, and David Denby’s Great Books, from 1996, could be found among the precocious-naïve collection on my high-school self’s bookshelves. (The Closing of the American Mind and Paglia’s Sexual Personae I read only later — at the insistence of my father and an ex-girlfriend, respectively.) The debate was still raging when I enrolled at Columbia in the fall of 2001 to take Literature Humanities and Contemporary Civilization, the mandatory survey courses in Western literature and philosophy that formed the core of what the university calls its Core Curriculum. During my senior year, I was one of the student representatives to the Committee on the Core Curriculum, a position that had been created in the aftermath of the campus occupations of 1968, to give students a seat at the table of university governance, along with faculty and administrators. I soon came to understand that the representation was merely symbolic and the governance was entirely nominal. The few meetings of the committee that I attended took place in one of the administrative offices in Low Library, the stately dome that is the architectural centerpiece of the upper Manhattan campus. They were largely taken up, I recall, by the same activity that was always taking place among undergraduates there: arguing about which books did or did not belong on the curriculum. 

    This is a cultural habit that is neither original nor exclusive to the West; it is simply the byproduct of any educational system that is based on a finite set of books. Such an education will be one that necessarily includes disagreement about which ones are selected, why they were selected, and what the value of reading them is to individuals as they are, to society as it is, and to both as we might prefer them to be. Definitively resolving these disagreements cannot be the aim of education, since to do so would end the debate and thus the education itself — in other words, the disagreeing is in no small part where the educating happens. Not long ago, in his review of Rescuing Socrates, Roosevelt Montás’ memoir of his time as the director of the Core Curriculum, the poet John Michael Colón concluded that the Canon Wars were a missed opportunity. For Colón, the way the debate about the great books was framed by its defenders and its critics alike presented a “false choice between two impossible options”: to treat “as the world’s sole inheritance traditions whose claim to universalism we know is false, or to live . . . without any deep connection to the past that created us.” The way out of this impasse, he wrote, was not to throw out the baby of canonicity with the bathwater of Western chauvinism, but to create a canon that was genuinely global. To the claim that a particular set of books ought to be considered canonical because it, rather than some other set, is the best which has been thought or said in the world, the first question a well-educated person ought to ask is: how do you know? 

    In 1754, when Columbia University was in its first year of existence as King’s College in the Province of New York, a man named Wu Jingzi died in Yangzhou. Born to a prosperous family of late Ming and early Qing officials from Anhui province, about three hundred miles inland from Shanghai, Wu seemed to have a promising future ahead of him when he passed the preliminary civil service exam at the age of twenty-two. But money burned a hole in Wu’s pocket: he gave it away to anyone who asked. He was also a bit of a bon vivant, spending his time in tea houses, taverns, and brothels. Subsequent examination attempts ended in failure. In his early thirties, he moved his small family to Nanjing, where he eked out what would later become known as a bohemian existence, surrounding himself with a circle of writers, philosophers, and actors. 

    In the culturally vibrant “southern capital” of the Empire, Wu wrote poetry and published a now-lost commentary on the Book of Songs, one of the Five Classics, which, along with the Four Books, comprises the core of the Confucian canon. He built enough of a reputation as an independent scholar to be personally invited to Beijing to sit for a special round of exams, but for reasons that are unclear he did not attend. Romantically inclined historians interpret this as a principled rejection of corrupt officialdom; others say he was sick on the day of the exam. In 1739, he spent what remained of his funds helping to dedicate a temple to an ancient sage in Nanjing, which he considered the pinnacle achievement of his life. The following year he started work on a long piece of prose fiction — a satire of life under the Qing dynasty centered on the literati, the class of scholar-bureaucrats who managed the Empire and the examination system through which they were selected — for the amusement of his friends, who all belonged, however peripherally, to this class. Written over the course of the next ten years, the completed book, consisting of fifty-five chapters, circulated in manuscript for decades after Wu’s death, until a Yangzhou firm published it as Rulin Waishi, or The Unofficial History of the Scholars, in 1803. 

    Along with four novels from the Ming dynasty — Romance of the Three Kingdoms, Water Margin, Journey to the West, and The Plum in the Golden Vase — and Dream of the Red Chamber by Wu’s younger contemporary Cao Xueqin, The Scholars is sometimes considered one of the six “classic Chinese novels.” The designation — having gained currency following the publication in 1968 of a book of that title by C. T. Hsia, the Shanghai-born scholar of Chinese literature who spent three decades on the faculty at Columbia — represents a moment of cultural syncretism, adding long vernacular prose fiction to the extant canons of Confucian and neo-Confucian philosophy, imperial historiography, Taoist and Buddhist scripture, and anthologies of poetry and short stories. 

    If you had to pick only one of the six classic novels to read, you would probably choose Dream of the Red Chamber, which is “one of the great novels of world literature” — what “Proust is to the French, or Karamazov is to the Russians,” in the words of the critic Anthony West. But what interests me about The Scholars is that its central subject is a society whose cultural, legal, and administrative institutions are grounded in the humanistic study of a canonical body of texts. 

    Hsia praises The Scholars for its “shrewd realism” and “intelligent satire” whose “stylistic and technical innovations” were of “revolutionary importance” for “the development of the Chinese novel.” In “pure and functional” narrative prose, Wu manages to paint a panorama of the entirety of Chinese society from the Emperor, his generals, and high-ranking officials to provincial judges, merchants, booksellers, and farmers to mendicant monks, swordsmen, actors, and prostitutes. With one exception — the painter-sage Wang Mien, whose tale opens the book — these characters are drawn not from history or legend, as is the case with the Ming dynasty classics, but from the imagination or experiences of their author. Some are based on Wu’s friends and acquaintances; some are based on his rivals and nemeses; the account of the prodigal poet Tu Shao-ching is undoubtedly a self-portrait. Wu puts the tenets of Confucianism and the folk beliefs of Buddhist and Taoist popular religion into the mouths of his characters, but these are treated with an irony not present in earlier Chinese fiction. The tragicomic sensibility expressed in The Scholars is his own. 

    But any resemblance between The Scholars and the novel as it was then being developed in England by Samuel Richardson and Henry Fielding — to say nothing of its evolution during the century of Jane Austen and Henry James — ends there. To readers whose expectations of the form were set by Pride and Prejudice and Portrait of a Lady, the most distinctive and puzzling feature of The Scholars is, first of all, its structure. The critic Steven Moore compares it to a long-distance relay race. The narrative follows one minor character for a few chapters until it is handed off to another, and so on, creating a cast of more than sixty principals, none of whom, not even Tu Shao-ching, function as its protagonist. Nor are any of them truly “round,” in E. M. Forster’s sense, since Wu is less interested in the psychological interiority of individuals than in the networks of social relations that connect them. The unified chronotope that organizes Western realist fiction is also absent from The Scholars. The book ranges widely in both space — from the Lower Yangtze region where most of the book is set, up to the imperial capital in Beijing, and down to Guizhou where the military brutally pacifies Hmong rebels — and time — after the prologue, set in 1368, the book spans the years 1487 to 1595, without organizing the narrative around the familiar allegorical unit of the single, multigenerational family. (It is thus set entirely in the Ming dynasty, no doubt because Wu wished to avoid any trouble for his satirical barbs against the current rulers: as Manchus, the Qing emperors suspected, not without reason, that they were regarded, like the Mongol Khans before them, as ethnic usurpers by the Han literati who staffed their civil service. They were known for conducting so-called “literary inquisitions,” which involved burning seditious books and imprisoning or executing their authors.) Because of its sheer mass, the book has been described as plotless, though, as we will see, its apparently episodic structure is subtended by a deeper thematic logic. 

    Perhaps it would be better to think of The Scholars as the culturally specific “unofficial history” its title says it is, rather than as a novel, whose use as a catch-all term for “long fictional prose narrative” tends to obscure more than it illuminates. As the name suggests, an unofficial history is a parody of official or orthodox history — a genre that extends from the Records of the Grand Historian, written in the first century B.C., to The History of the Ming, completed in 1739 — in which a chronicle of the noble lineages and heroic deeds of emperors is replaced with a chronicle of mostly petty, pompous, and vicious scholar-bureaucrats; classical Chinese is replaced with the vernacular; and ostensibly factual persons and events are replaced with those that are ostensibly fictional. To this burlesque of high literary tradition, Wu grafts one from the other end of the class spectrum. For centuries, professional storytellers had entertained popular audiences at tea houses with tales of lovers, ghosts, warriors, and criminals. By turns didactic and bawdy, and often interlaced with topical observations and social commentary, these tales, whose episodes a skilled performer could parcel out over the course of months, began to be collected and published toward the end of the Ming dynasty for the consumption of literate audiences. 

    The imprint of the storytelling tradition on The Scholars can be seen in the short synopsis that opens each chapter and the formulaic sentence that concludes all but the last (“if you would like to know what happened next, you must read on”), devices that live on today in the recap sequences and cliffhangers of soap operas and other serial narratives. It can also be seen in the proem, which states the “moral of the book”: 

    Dynasties rise and fall, 

    Morning changes to evening . . . 

    And fame, riches, and rank 

    May vanish without a trace. 

    Then aspire not to these, 

    Wasting your days 

    “The idea expressed in this poem,” the narrator acknowledges, “is a commonplace one.” Indeed it is: the same idea can be found in Ecclesiastes and the Meditations of Marcus Aurelius, to cite just two examples. Wu illustrates the point with the story of Wang Mien. A good son from a humble background and an autodidact of genius and genuine curiosity, Wang chooses not to apply for an official career. Instead, he uses his talents to become a painter, which only serves to bolster his reputation among powerful officials. To keep his integrity intact, Wang is forced to come up with a series of increasingly elaborate and comical ruses to avoid meeting with them, ultimately becoming a hermit who lives in voluntary poverty. Near the end of his life Wang receives a visitor he can neither escape nor refuse: Chu Yuan-chang, the founding Emperor of the Ming dynasty, who seeks his advice on the management of his kingdom. That the powerful are irresistibly drawn to those who shun power and contemptuous of those who seek it is another commonplace, as Diogenes and Plato, in their respective interactions with Alexander and Dionysius II, could both attest. 

    In his famous essay on the figure of the storyteller, Walter Benjamin remarks that “the nature of every real story” is that “it contains, openly or covertly, something useful.” “This usefulness,” he continues, “may consist, in one case, in a moral; in another, in some practical advice; in a third, in a proverb or maxim. In every case, a storyteller is a man who has counsel for his readers.” For Benjamin, having counsel is what distinguishes a storyteller from a novelist, who, being “himself uncounseled . . . cannot counsel others.” Fortunately for readers whose tastes have been formed by the novel, for whom being spoon-fed moral counsel and practical advice adulterates aesthetic pleasure, the commonplace about the vanity of external things that opens Wu’s unofficial history turns out to be a red herring. 

    If The Scholars can be said to have a protagonist, it is not a person but an institution: the imperial examination system, which is the only thing that directly or indirectly touches the lives of most of the characters in the book. First instituted in the seventh century, the idea behind the exams was to select officials based on merit rather than birth, and to promote moral rectitude and ideological coherence among the group that would be in charge of administering an increasingly large and populous territory by grounding their education in a common set of culturally venerated texts, namely, the Confucian classics, and a common skill, namely, the ability to read and write hundreds of thousands of characters of non-vernacular Chinese. 

    By the time of Wu’s birth a little over a millennium later, the exam system had become a bureaucracy within the state bureaucracy. No fewer than three departments — the Imperial Secretariat, the Ministry of Rites, and the Han Lin Academy — were responsible for overseeing a bewildering array of exams. There were district exams, prefectural exams, qualifying exams, special preliminary exams, special exams, provincial exams, metropolitan exams, palace exams, and exams for the military, conducted every other year for two to three million candidates at between thirteen hundred and fourteen hundred sites around the country. Each dynasty put its distinctive stamp on the testing regime, according to the needs and the fashions of the times. One character, the publisher and bookseller Ma Chun-Shang, summarizes the history (somewhat inaccurately) thus: during the Spring and Autumn period civil servants were selected for their skills in the art of the aphorism; during the Warring States period, for their skill in rhetoric; during the Han, for their exemplary deeds and character; during the Tang, for their ability to write poetry; and during the Song, for their knowledge of neo-Confucian philosophy. In one of his first official decrees, the Ming emperor scrapped the poetry requirement in favor of examinations based on the “eight-legged essay,” so-called because of the eight elements of its structure, which candidates had to follow step-by-step to answer a question on a topic selected from one of the Four Books. (The Qing civil service retained the eight-legged essay and Song neo-Confucian orthodoxy; shortly after Wu’s death, it also reinstated the poetry requirement in order to make the exams more competitive.) 

    Although the exams afforded, in principle, a degree of social mobility, there were no public schools, so candidates from the landed gentry and the merchant class, who could pay for private tutors and study materials, were at a distinct advantage; besides, it is much easier to hold your brush steady if you have had something to eat that day. (From the numerous dining scenes, whose menus are described in greater detail than the appearances of some of the characters, one gets the impression that The Scholars was written on an empty stomach.) Wu delights in slipping errors of fact into the mouths of pompous examiners and graduates of the prestigious Han Lin Academy, who were tasked with determining what constituted the proper and orthodox interpretation of the classics. In any event, any claim the system might have had on being genuinely meritocratic was undercut by the simple fact that women were barred from sitting for exams, a point underscored in The Scholars by the story of Lu’s daughter, whose intelligence, learning, and abilities as an essayist put to shame those of the successful literatus she has been married off to. 

    Since a position in the civil service conferred a degree of financial stability and social status on the successful candidate and his extended family, making sons more marriageable, the psychological pressures on the candidates were immense. Not for nothing is the title of one of the few books available in English on the subject China’s Examination Hell. At the mere sight of an exam school, one traumatized failure, Chou Chin, blacks out; another, Fan Chin, has a mental episode when he learns that, after almost a quarter century of sitting for the provincial exams, he has finally passed. Conditions like these encouraged favoritism, corner-cutting, bribery, and cheating. An intelligent and filial scholar, Kuang Chao-jen is corrupted by success; he later impersonates an exam candidate for money and becomes involved in a number of criminal schemes. As his case demonstrates, there was no necessary connection between the study of morality and its practice — and Kuang is no outlier. In fact, there is an inverse correlation between proximity to officialdom and decency as a person; perhaps that is why one of the book’s few virtuous literati, Dr. Yu Yu-teh, encourages his son to study medicine instead, and why its most sympathetic characters are poor farmers, actors, and others from socially-despised backgrounds. 

    To top it all off, there is a pervasive sense of fatalism among the candidates that ought to have been at odds with their rationalist philosophical training. Characters consult fortune tellers, astrologers, mediums, alchemists, and dream interpreters — behaviors that are typical of those who feel that the course of their lives is beyond their control. Along with tutoring the sons of the wealthy, doing clerical work for local officials, and contributing to China’s growing marketplace for books, these were some of the services provided by members of the vast underclass of highly educated exam failures produced by the system. It was hardly a recipe for social stability. 

    For me, the main pleasure of The Scholars is reading the debates conducted by the characters about the pedagogical, aesthetic, and political implications of every aspect of the exam system, which, for those who have ears to hear them, rhyme with many of the debates conducted in the United States over the past four decades. Wu’s third-person narrator remains largely neutral, allowing the actions of the characters to stand as subtle confirmations or denials of the validity of their positions about the social value of poetry and essays, the criteria for judging exam performances, the trustworthiness of experts (or lack thereof), the relationship between scholarship and governance, the respective virtues of general knowledge and specialization, the duties of scholars to participate or refuse to participate in the civil service, and so on. In a handful of cases, however, he puts his thumb on the scale. After his audience with the emperor, for instance, Wang has this to say about using the eight-legged essay for exams: “These rules are not good. Future candidates, knowing there is an easy way to a high position, will look down on real scholarship and correct behavior.” To institutionalize something of value is to risk compromising it, and Wang predicts that linking scholarship to wealth and social status will have the effect of turning the exam, rather than the knowledge it is supposed to test, into the purpose of study. As if to prove his point, many years later Ma, the publisher of a bestselling collection of eight-legged essays, tells a young charge: “Even Confucius, if he were alive today, would be studying essays and preparing for the examinations instead of saying, ‘Make few false statements and do little you may regret.’ Why? Because that kind of talk would get him nowhere: no one would give him an official position.” Dostoevsky’s Grand Inquisitor would have been impressed by Ma’s reasoning.

    Disillusioned by the corruption of scholarship, a motley crew of failed candidates forms in Nanjing, made up of talented refuseniks and independent men of letters and led by Wu’s “romantic” alter ego Tu Shao-ching and his friend Chuang Shao-Kuang, who has just declined the Emperor’s offer of a ministerial post thanks to a timely intervention by a scorpion that crawls into his scholar’s cap. The group decides to create a counter-institution called the Tai Po Temple. Chapter Thirty-Seven, which is about the temple’s dedication ceremony, is unanimously held to be the central episode of The Scholars, the one that gives the novel its structural center; it differs in all respects from the other chapters, including its form, whose “rhetoric of repetition” and “schematic expository style,” in the words of Shang Wei, the Du Family Professor of Chinese Literature and Culture at Columbia University, are pastiches of the ancient ritual manuals that were objects of great interest in Wu’s circle in Nanjing. Wu catalogues the ritual’s seventy-six participants, led by the virtuous Dr. Yu as master of sacrifice, in a manner reminiscent of the catalogue of ships in the Iliad; he describes their rites of purification and ceremonial dress; he details the decorations put up in the temple, the various items sacrificed to the ancient sage Tai Po, and the period-specific musical instruments played to entertain his spirit; finally, he gives a blow-by-blow account of the ritual itself. By recreating this Confucian-era practice of ceremony and music, the participants hope to help “produce some genuine scholars, who will serve the government well.”

    It is a gesture that is as nostalgic as it is quixotic. “Of all the eighteenth century novelists,” Shang writes in Rulin waishi and Cultural Transformation in Late Imperial China, “Wu Jingzi was the one most engaged with issues of contemporary intellectual discourse.” During the eighteenth century, independent scholars publishing in China’s thriving literary marketplace began to subject neo-Confucian orthodoxy to a philological and evidential analysis akin to European “higher criticism” of the Bible. At the time, Shang notes, the literati were experiencing an “unprecedented degree of division and fragmentation,” and a corresponding anxiety about the “decline of the Confucian world order” that provided, in their view, the legitimacy of the state. By placing ritual at the center of its sweeping social critique, Wu’s unofficial history displays what Shang calls the “paradoxical combination of cultural iconoclasm and Confucian revivalism” characteristic of his generation’s intellectuals.

    But how paradoxical is it, really? Critique of the present is just as often legitimized by an appeal to an idealized past as by an appeal to an imagined future: look no further than Bloom’s attempt to combat what he perceived as cultural decadence by regrounding elite education in the study of the Bible and Plutarch during the Reagan administration. Wu, for what it is worth, seems to be aware of the problem. He does not offer anything so simplistic as a commonplace about the externality of rank, riches, success, and fame to a flourishing life. In the final chapter of The Scholars, which takes place forty years after the dedication ceremony, a tea-house keeper named Kai Kuan visits the Tai Po Temple only to find it abandoned, its roof collapsed, its gate in ruins, and the musical instruments gathering dust inside or missing altogether. The events in the lives of Tu Shao-ching and Chuang Shao-Kuang have begun to fade into legend; no one can remember the details with any precision. The destiny of institutions may be corruption, as The Scholars amply records, but counter-institutions that fail to become institutions lack the material base necessary to ensure their longevity.

    This is the paradox that has not ceased to be germane to the many who are concerned about the state of the humanities in particular and the American university more generally. Compared to the culture wars of today, the debate about the great books seems high-minded, even quaint. In the 1980s, the one thing that the advocates and the critics of the Western canon agreed about was the importance to society of humanistic education — a proposition that can no longer be taken for granted. While faculty argued at department meetings and at symposia and in op-eds about undergraduate humanities curricula, the material foundations for reproducing academics as a class and the liberal arts as an institution were cracking up. Skyrocketing tuition and student loan debt, dried-up tenure lines, crushed graduate student unions, adjunctification, bloated administrations, drop-offs in enrollment and faculty hires, shrinking or shuttered departments, incoming freshmen whose reading abilities were formed in the wake of No Child Left Behind, the smart phone, and now generative AI — the litany will be familiar to anyone who has been paying attention to the state of higher education for the past decade and a half. To this we can now add direct political interference in curricula and hiring decisions at Columbia, Harvard, and elsewhere by a presidential administration staffed wall-to-wall by culture-war berserkers who are in style, if not in substance, Bloom’s grandchildren.

    It seems increasingly likely that, in the future, the experience of reading any literature, let alone the classics, let alone a global canon of the classics, will be the pastime of interested amateurs building for themselves the ideal library that Calvino envisioned, rather than a course of study pursued by undergraduates receiving a formal education in a college setting as Bloom assumed. Given this, it is hardly surprising that the last few years have seen numerous para-academic, educational, and cultural counter-institutions — publications, seminars, salons, and institutes in the humanities and social sciences — set up shop outside the university system, like so many intellectual lifeboats floating alongside the hull of a sinking ship. Yet those who are prepared to abandon the university to its fate would do well to contemplate what happens to the Tai Po Temple.

    The Scholars ends on a somewhat hopeful note, with the stories of a bohemian calligrapher, a draughts player, a painter, and a lyre-player. Each engages in these traditionally aristocratic pastimes not to appear refined, or to achieve social status, but simply because they “happen to like these things.” In spirit they recall Wang Mien, the sage whose story opens the book, or Tu Shao-ching, who is content to live by his pen in the company of his family and friends, and does not complain of the decline in his social status and personal fortunes. Perhaps, in good Confucian fashion, Wu suggests, they will become the foundation of a new cycle of order emerging from the disorder he has chronicled in the relay race of his unofficial history.

    The next cycle, however, was to be the last for the imperial literati. It was to come to an end in a surprising place. Just as The Closing of the American Mind was published when the United States was on the cusp of becoming the world’s sole superpower, Wu’s unsparing critique of contemporary society was written during a period that historians now call the “High Qing.” In retrospect, it was an apex moment — in terms of territorial expansion, political influence, and cultural achievement — for imperial China. Alongside the cyclical narrative about the imperial bureaucracy, Wu tells a story whose import would be far greater in ways he could not have predicted: the rise of the merchant class, which goes hand-in-hand with a coarsening of social life — as exemplified, in Wu’s satire, by the philistines of Wuhan — and the corruption of officialdom through the power of money. 

    The Scholars is awash in silver, which is at first given away, then loaned without expectation of return, and finally loaned with interest for profit. Mined in the Spanish colonies of South America in the eighteenth century, silver reached Chinese merchants through the intermediary of the British East India Company, which used it to purchase silk, porcelain, and, most importantly, huge amounts of tea. Shortly after Wu’s death, the emperor instituted the Canton system, restricting European trade to a single port, today’s Guangzhou, at the mouth of the Pearl River. After the world’s silver reserves contracted due to overmining, Britain, which had been running a severe trade deficit with China — sound familiar? — scrambled to find a product that would help keep the tea trade afloat. The East India Company discovered one in the Indian territories of Bihar and Bengal that it had recently conquered: opium.

    As the novelist Amitav Ghosh detailed in his recent book Smoke and Ashes, the East India Company, having supplanted their Dutch rivals, proceeded to run one of the largest illegal narco-trafficking operations in history. Their monopoly was soon to be broken by a newcomer on the global scene, the United States Merchant Marine, whose ships gave them a run for their drug money. When the Qing dynasty tried to crack down on opium smuggling and smoking, EIC sepoys invaded Guangzhou, forcing the Emperor to legalize the drug at gunpoint. Following the Treaty of Nanking, which ended the First Opium War, the EIC took home millions of pounds of reparations payments, a free market for unlimited trade on British terms, the territory of Hong Kong, and a new idea: the use of exams to select potential employees. 

    For China, it was the start of “the century of humiliation,” which would go on to include a Second Opium War, a rebellion led by a failed examination candidate that left twenty to thirty million dead, and further military defeats at the hands of Britain, France, Russia, and Japan. Shocked by these events, the Qing intelligentsia blamed, among other things, the imperial examination system for holding back the modernization process that would be necessary for China to compete militarily and economically in the new globalized world. In 1905, shortly before both the dynasty and the empire collapsed, educational reformers persuaded the court to eliminate the exams. 

    That same year, halfway across the world, the president of Columbia University inaugurated a library on the school’s new campus in Morningside Heights. Since he had paid for it out of his inheritance, he decided to name the temple of learning after his father, Abiel Abbot Low, who made his fortune smuggling opium into Guangzhou.

     

    There Is No Privacy Pill

    On a warm Monday in June 1965, the Supreme Court declared that married couples had the right to use contraceptives. This was a hard-won victory for Estelle Griswold, executive director of the Planned Parenthood League of Connecticut and namesake of the case, Griswold v. Connecticut. She had previously helped displaced persons after World War II and, motivated by her conviction that contraceptives could alleviate poverty and human suffering, fought tirelessly to overturn the birth control laws in Connecticut, then some of the strictest in the country. Her persistence in the face of failed appeals, fines, and even jail time managed to transform contraceptive access from something reserved for well-resourced women to something available for all (married) women. But the legacy she left behind is far greater than the outcome of this single court case and the women her clinic personally helped with family planning. Griswold v. Connecticut set a precedent for the blockbuster reproductive health victories that followed, like Eisenstadt v. Baird, which extended contraceptive access to unmarried women, and Roe v. Wade, which granted women the federal right to an abortion. It also laid the groundwork for future court cases that decriminalized sodomy and established the rights to same-sex and interracial marriage. But that wasn’t all. Estelle Griswold’s fight for contraceptive access paved the road for something else, something she couldn’t have imagined at the time: the right to internet privacy.

    Even as a privacy researcher, I didn’t immediately see that a married woman’s right to contraceptives was somehow related to internet privacy. The connection is thanks to the argument made by Justice William O. Douglas, who delivered the majority opinion in Griswold, in which he stated that to interfere with the contraceptive use of married couples would be a violation of their — and here is the key word — privacy. To connect the dots explicitly, he asked and then answered his own question. “Would we allow the police to search the sacred precincts of marital bedrooms for telltale signs of the use of contraceptives? The very idea is repulsive to the notions of privacy surrounding the marriage relationship.” On its own, this section of his opinion is worth celebrating, at least for married couples, but it is narrow. Yet Douglas didn’t stop there. He argued that the marriage relationship was actually just one example of something that falls in the “zones of privacy” afforded to Americans, as implied by the First, Third, Fourth, Fifth, and Ninth Amendments. Essentially, Americans have the right to privacy even though it isn’t explicitly stated in the Constitution, and that right manifests across many different zones of life. Other zones include the home, and the papers and other personal effects within it, as covered by the Fourth Amendment’s protection against “unreasonable searches and seizures,” as well as our own words, which the Fifth Amendment’s self-incrimination clause turns into a zone of privacy: we cannot be forced to speak in a way that can be used against us.

    Obviously, the internet was not explicitly mentioned in a ruling made decades before it existed, but it has since become a space where we regularly make decisions akin to the ones we make in the “sacred precincts of marital bedrooms” and manage documents as sensitive as the papers we keep in our homes. The notion that at least some corners of the internet should be zones of privacy is evidently held by a majority of Americans who, according to the Pew Research Center, are concerned about the state of privacy online and in general. We have Griswold, and the subsequent case law built on top of it, to thank for that. In the sixty years since this landmark ruling, the fates of reproductive health and internet privacy have remained intertwined, even as both face an uncertain future. Perhaps this unexpected entanglement can teach us something about what is to come for them both.

    In the decades since Griswold v. Connecticut, the internet has expanded rapidly, subsuming much of our social infrastructure. As a consequence, the many pieces of data that stitch together a single life often exist as online digital traces, covering everything from shopping to taking tests, filing taxes, and making doctor’s appointments. Although the United States still has no comprehensive federal privacy law — unlike other governing entities, like the European Union — it does have some regulations that acknowledge that certain categories of data require more privacy than others. For example, the Health Insurance Portability and Accountability Act of 1996 restricts the collection and use of protected health information, like emails, phone numbers, Social Security numbers, or IP addresses, that can be used to link an individual to their health records. This regulation has taken a cue from Griswold and carved out specific digital records as belonging to a zone of privacy.

    To see it in action, consider a case from two years ago. A patient at Redeemer Health, a Catholic non-profit health system, requested that Redeemer send a prospective employer a test result from her records, likely the results of a drug test or physical exam. Rather than only send the specific result requested, Redeemer sent the prospective employer her entire medical record without her consent, including her reproductive health and other OB/GYN history. Consequently, Redeemer Health paid over $35,000 in settlement money and committed to a two-year corrective action plan. HIPAA serves as an example of how the law has codified the notion that reproductive health, even in digital form, is deserving of privacy in a way that cannot legally be breached.

    Still, even though we have some internet privacy regulation today, it is insufficient for robust privacy. Although this reality has been somewhat demystified through large and well-documented scandals, like Cambridge Analytica’s unauthorized collection of Facebook data and Edward Snowden’s NSA disclosures, it can be easy to forget or dismiss. This is especially true when we use the internet to browse more intimate matters in isolated physical spaces, under what can feel like a cloak of invisibility. When privacy violations intersect with reproductive health, however, it becomes clear just how important, and lacking, internet privacy really is. This is particularly easy to see through the lens of targeted advertising. Consider the case of a Minneapolis teenager, whose targeted advertisements exposed secrets she hadn’t yet shared with her own family. Her father, initially outraged by the ads, learned that she was pregnant from the recommendations for baby clothes and cribs she received from, ironically, the company Target. That this behavior feels spooky and inappropriate is a testament to the cultural understanding facilitated by Griswold that, when it comes to reproductive health, women deserve privacy from entities like corporations or governments. Even so, this type of tracking still happens today, often facilitated by big technology companies like Google and Meta, on hospital websites and pharmaceutical websites selling products like Plan B. This means that the personal decisions women are making about their bodies — such as searching for abortifacients — are collected and stored such that they may be made available to advertisers or, in some cases, the government. This type of privacy violation can cause harm even in a country with strong reproductive health protections. In a country without them, the consequences can be far worse.

    Today, it is unclear which category the United States falls into. By many measures, reproductive health care access is hanging on by a thread. Roe v. Wade was overturned three years ago by another landmark ruling, Dobbs v. Jackson, after which twelve states banned abortion entirely and an additional seven states enforced restrictions after a certain number of weeks. To some extent, this ruling creates challenges for those seeking an abortion in the United States similar to the ones faced by women seeking birth control in Griswold’s era. Women lucky enough to be born in the “right” state, or with the means to travel to one, can still get abortions in this country, while the rest cannot. Once again, reproductive health access is stratified across socioeconomic lines. Unlike in Griswold’s era, however, the internet is now widespread and facilitates access to contraceptives, medical professionals, and other community networks for all women, regardless of the state they are living in. This collision tests the limits of how far the internet can fulfill its original promise. Eight years ago, on its twenty-eighth birthday, World Wide Web inventor Tim Berners-Lee wrote a message calling for progress towards an internet “that gives equal power and opportunity to all.” A crucial component of an equal internet is respect for the right to privacy that William Douglas found in the penumbra of the Constitution.

    That is not the internet we have today. Much like abortion access, the internet exists in a state of stratification, with comprehensive privacy available only to those with the technical expertise to use privacy-enhancing tools. Once again we see internet privacy and reproductive health run on parallel tracks. For Jessica Burgess, a woman in Nebraska who helped her 17-year-old daughter acquire abortion medication, the dearth of internet privacy had catastrophic consequences. Their conversations about it on Facebook Messenger created a digital trace that prosecutors were able to obtain and use to send both mother and daughter to prison. In another example, New York–based doctor Margaret Carpenter prescribed abortion pills from New York, where abortion is legal, to patients in Texas and Louisiana, where it is not. Both states have since pursued her, Texas with a civil penalty and Louisiana with a criminal felony charge, at least in part thanks to a digital trail of evidence. This digital trail doesn’t even need to be so explicit as to detail specific medications or procedures to become problematic. Privacy experts have already sounded alarm bells about the role that menstrual tracking application data could play in future court cases.

    Reproductive health rights and internet privacy have been unexpectedly entangled as both have experienced a rollercoaster of expansion and contraction. The right to privacy in the United States, the guiding principle that unites them, has become deeply uncertain. We are now living in a time in which we are increasingly stripped of the right to make decisions about our bodies and conduct our intimate business online free from peering eyes. If history is any guide, those in desperate need will increasingly turn to alternative methods or technically complex tools for both abortion and internet privacy, ones that are less regulated and may be more dangerous. But hopefully that can change. With any luck, our generation’s Griswold is already gearing up for the next fight.