Historians Killing History

    In the aftermath of the Hamas attacks of October 7, the subsequent congressional hearings with university presidents, and the encampments that followed, academia has once again found itself at the center of the culture wars, from which it rarely strays far. On one side, critics denounce universities for “wokeness,” while on the other side, defenders of universities accuse the anti-woke critics of reactionary politics and bad faith. These battle lines are tediously familiar to anyone who paid attention to the history wars, which had until October 7 formed the principal theater in the academic culture wars. The combatants are occupying the same lines of trenches from which they fought over the 1619 Project, the Florida AP African-American Studies standards, and so on. Behind the coils of rusting barbed wire, the front has scarcely budged.

    It is striking to me, as a somewhat rare creature — a military historian in civilian academia who has studied the work not only of Clausewitz but also of Foucault — how incomplete and self-serving are the arguments of both sides. But one need not know German military theory to appreciate that the narrow slits of pillboxes offer less than comprehensive views of reality. Nor need one know French critical theory to grasp that the tales people tell about themselves and their opponents do psychological work for them. Analyzing these tales requires attention to what they exclude as well as what they include. In all the hue and cry about the university, there has been virtual silence about the ostensibly unexciting subject of scholarly standards — those quaint things supposed to ensure that academics generate and communicate knowledge more rather than less rigorously. Instead, we have had a great deal of talk about ideology, or rather ideological corruption, by and about both academics and their critics. The driving question is not, “what is the evidence for your argument, and is it sufficient?” but “whose side are you on?” 

    Accusing each other of ideological corruption enables both sides to avoid reckoning with the collapse of scholarly standards in their own ranks. In effect, they have colluded to misdiagnose — or at best incompletely diagnose — the nature of the problem, and in a way that serves both their interests. To paraphrase the description of the Sirius Cybernetics Corporation’s products in the Hitchhiker’s Guide to the Galaxy, the superficial design flaw of ideology hides the fundamental design flaw of declining standards. 

    The indifference to scholarly standards should be evident to even casual observers of the academic culture wars. The late Harvard law professor Charles Fried justified his refusal to consider the accusations of plagiarism against Claudine Gay on the merits because of who was making them: they were part of an “extreme right-wing attack on elite institutions.” In strikingly similar language, the hedge-fund manager Bill Ackman refused to consider the accusations of plagiarism against his wife on the merits because they were part of “attacks on my family.” This is not the language of standards. This is the language of the bunker. 

    Likely less evident to the casual observer is the indifference to scholarly standards that has characterized the history wars in recent years, or the way in which shared silence about standards between ostensible opponents in the history wars anticipated the same shared silence in the academic culture wars more broadly. Too many historians, of varying ideological stripes, mimic the forms of scholarship without reproducing its substance. They make trips to archives, consult the secondary literature, and cite sources in footnotes, but their research lacks rigor and integrity. It is not scholarship; it is pseudo-scholarship. The intellectual incompetence — or dishonesty — of many critics of the historical profession simply mirrors that of a great deal of the profession itself. No wonder neither wants to look in the mirror: they would find their enemy, themselves, staring back out at them.

    Reframing the problem in terms of standards rather than ideology is important for three reasons. First, it provides the basis for a vital center in academia, which is needed there just as much as in politics. This center, if it is to be real and not merely a band-aid over differences, cannot be defined by a priori ideological commitments; it must be defined by reinforcing commitments to process — be it scholarly or liberal-democratic — and to human dignity. A call for “process” may not sound like an inspiring blast from a trumpet, but (as I have argued previously in these pages) process secures moral substance, to which the heart should thrill. It channels historians towards humanizing and away from dehumanizing the people they study; it encourages them to treat those who lived in the past as subjects to be understood as fully as possible, not as mere objects to be over-simplified and manipulated according to historians’ whims. Thus standards, which distinguish between more and less rigorous process, also distinguish between more and less humanistic substance. Not coincidentally, liberal democracy requires the same humanism.

    Second, thinking in terms of standards rather than ideology has implications for how we understand the directionality and the substance of the relationship between universities and politics in seeking to explain the dysfunctionality of both. One formulation has it that (left-wing) politics have corrupted universities; spaces that were once about the pursuit of objective truth have succumbed to the lure of pursuing ideological victory. Another, expressed by the statement that “we all live on campus now,” has it that universities have corrupted politics by leaking out dangerous (identitarian) ideologies. While these two formulations reverse the directionality, both see the substance of the transmission as ideological in character. Hence the transmission of low scholarly standards — the decline of the quality of evidence and argument — has also largely escaped notice and analysis.

    Doubtless this transmission in and out of the academic bubble goes in both directions. But the transmission from universities to the public deserves special scrutiny, because the public is not responsible for upholding scholarly standards the way universities are. In effect, the American people have delegated to universities the responsibility for generating reliable and methodologically sound knowledge, and for educating the public (especially its young) in the rigors of the scholarly process through publication and teaching. As the price for accepting that responsibility, academics have demanded certain rights and privileges — most importantly, academic freedom and tenure — which amount to sovereign exemptions from regular political accountability and market forces. Simply put, tenured academics enjoy a degree of economic security available to few others. By accepting payment in the form of these special rights and privileges, academics have accepted special responsibility for upholding scholarly standards in trust for society. If and when they betray this trust, they must take special responsibility for the collapse of scholarly standards in society at large. The rough comes with the smooth — and whether or not academics accept it in principle, it comes in practice. What happens on campus does not stay on campus. When academic disciplines debase themselves through an aversion to rigor, precision, complexity, and non-conformism, when they behave indifferently and even adversarially toward the apparatus of evidence and logic, we should expect the same debasement of public discourse; and we find it everywhere now. This is social accountability more pervasive than legislatures can conjure.

    Finally, reframing the problem in terms of standards rather than ideology makes visible the similarities between what has happened in academia and what has happened in other socially important institutions across the ideological spectrum. The police come to mind as a parallel. In return for promising to provide a public good — security — they too have claimed exemptions — “qualified immunity” — from normal political and economic accountability, while insisting on the right to hold everyone else to it. Rather than keeping their promise, they have too often failed to uphold standards internally. 

    My purpose in indicting this institutional degradation is not to endorse knee-jerk anti-institutionalism. We need institutions. But we also need those institutions to have the honesty and the self-confidence to look in the mirror and admit when they have behaved badly, no matter how badly their critics may be behaving. The resort instead to pointing the finger at outsiders is a terrible evasion, which goes far towards explaining America’s social and political dysfunction today.

    I do not propose to attempt a historical account of why and how pseudo-scholarship conquered the historical profession, its critics, and its defenders. Instead I want to try to offer a snapshot view of the historical profession as it exists today. No such view can be comprehensive; there are too many people to know and too many things to read. Mine is based on my experience over the dozen years since I was inducted into the profession. Conversations with colleagues have led me to believe that my observations are shared by others. I offer them on that basis.

    The single most important aspect of the collapse of scholarly standards in the historical profession is the inability to distinguish between rigorous primary-source research and superficial primary-source research, especially archival research. Primary sources are those produced by the people historians are studying (as distinct from secondary sources, which, as a rule of thumb, are those produced by other historians). Primary-source research is supposed to be the heart of what historians do: it is the very basis of original knowledge about the past. Yet far too many dissertations and monographs contain far too little original research, instead offering clever ideas and new interpretations resting on slender and shallow primary-source bases. This is no surprise, since most doctoral programs in history offer little-to-no training in primary-source research. Indeed, there is now a generation of professors who cannot teach students how to do rigorous primary-source research because they never learned how to do it themselves.

    Perhaps the principal reason for this lack of training is that many historians regard research as easy and theory as hard. This is absurd. Primary-source research is a craft, and like other crafts it blends art and theory. If we loosely define “critical theory” as questioning everything that we think we know about the world as a potential attempt by powerful people to mislead us, then we might call good archival research applied critical theory. The reason is that good archival research is an exercise in examining the archives for what they are hiding and what they are missing. To do this, one must understand how archives are constructed and interpret sources in their archival context; one must not treat archives as existing in nature, or sources as possessing self-evident meanings independent of their archival contexts. The applied character of this critical theory tends to make its theory-ness invisible to historians socialized to expect theory in textual form.

    In addition to requiring theoretical sophistication, good archival research is a grind. If a particular archive is central to a historian’s research, then answering their questions should require months of research there and supplementary follow-up visits, as they realize that they merely skimmed key files because they had failed to understand the files’ potential relevance. This type of research is time consuming, difficult, and expensive. One may get lucky and have the help of a finding aid — or one may have to trawl through hundreds if not thousands of feet of uncatalogued boxes. One takes the tedium and the boredom along with the satisfaction and the pleasure because it really is the only way to develop a sense of a period and a subject, of the many roads not taken along with the few that were, of the universe of information that might once have existed. Without a sense of the whole, one cannot make arguments based on the parts that one has seen with appropriate qualification and precision.

    Too much of what passes for serious archival research these days is, to my eye, nothing of the sort. It is not the work of people who went to archives with a commitment to understanding their construction and a sense of excitement about learning new things. Instead it is the work of people who went to archives looking to anoint their argument with a few references for a patina of authenticity. Such historians chase trends rather than proceed from intellectual curiosity, tell tidy stories stripped of the messiness of real life, and produce the historical equivalent of fast fashion, which gets rewarded because the profession can no longer distinguish it from couture. It is a devastating indictment that the techniques of dumbing-down and appealing to emotion that are used to achieve social-media virality also help historians to achieve professional renown.

    The failure to master the craft of archival research leads to failures of what historians call “source criticism.” If a historian is interested in government policy, for instance, they need to understand that confining their research to the official records of the most central and highest-level bodies will not give them the same degree of insight as scholars who also take the trouble to consult the records of lower-level departments and other collections. Similarly, they have not done the same source criticism as another scholar if they order a photocopy or scan of a single document from a file cited by the other scholar and do not take the trouble to visit the archive to see the whole file. This is basic stuff, or it ought to be. It is appalling how many historians do not seem to understand these fundamentals.

    The inability of too many historians to distinguish between serious and superficial primary-source research is closely connected to what I would describe as functional historiographical illiteracy. My sense is that far too few historians today understand the difference between reading a book and reading in a book. When historians read a scholarly work, they should be reading, in the first instance, to try to understand what questions the author was trying to answer and why the author made the choices that they did. Readers may still end up disagreeing with the author’s answers or thinking that they made bad choices — but readers owe authors the courtesy of trying first to understand them on their own terms. Too many historians skip that step and default straight into criticism mode, especially if what they are reading conflicts with their existing views or their prior sense of how the world works. 

    I doubt that I am the only historian to have spent graduate school positively reveling in criticism mode. It filled my callow mind with such a sensation of power. But we are supposed to have such immaturity knocked out of us by strict yet compassionate supervisors, and by the process of writing a carefully researched dissertation and learning for ourselves how hard the scholarly craft is. If historians do not learn that lesson because they do not do careful research — if they approach their dissertation as a glorified think-piece and persist in believing that it is much easier to interpret the past authoritatively than it actually is — then they simply do not know how to read the same way as someone who has learned that lesson. They remain blind to the nuance and the qualification with which good scholars make their arguments, or they mistake these subtleties for unimportant details. They try to shoehorn into a conventional category a work that cannot be categorized because it is so original. They read in a book, looking for the key points and getting annoyed that the author did not package them conveniently. Oftentimes, I don’t think the ensuing misrepresentations of other scholars’ work reflect dishonesty — that is, deliberate misrepresentation — but incompetence. Many historians truly believe that their misrepresentations are fair and accurate.

    The collapse of standards for both primary- and secondary-source research has facilitated an epidemic of plagiarism. I am not talking about the classic form of plagiarism, which happens to be the easiest to spot: namely, the use of someone else’s words without quotation marks or citations. I am talking about the theft of ideas, which is softer and often more difficult to detect, because it requires a working knowledge of the relevant archives and an understanding of how the scholarly literature on a subject has developed over time. This type of plagiarism is sometimes camouflaged by citations to previous scholars on a relatively minor point of fact while the plagiarist makes off with their broader conceptual breakthrough — the phony attribution functions to cover up the theft. Another common form of plagiarism is source-mining. This is when historians use other scholars’ citations as a blueprint for their own archival research but fail to acknowledge their reliance on the prior work. The pioneers went to the trouble of slogging through box after (uncatalogued) box to find the out-of-the-way nuggets that the derivative historians glibly cited; the latter took a shortcut and pretended that they found those nuggets all on their own. 

    Peer review, which should have obstructed the collapse of standards and the spread of plagiarism, has done no such thing, because it, too, has broken down. Work that never should have survived peer review has been published on a large scale due to some form of corruption of the process — the reviewers were not competent, or they failed to take the responsibility seriously, or they waved the thing through because it was by a buddy of theirs. Too often, moreover, the rejection by conscientious peer reviewers of a manuscript as wholly unfit for publication does not prevent it from being published, but results in it being published in a different journal with only minor adjustments. The breakdown of peer review has led in turn to an erosion of brands that once might have provided the equivalent of a Good Housekeeping seal of approval. These days, publication by a “respected” journal or a “prestigious” university press is zero guarantee of quality.

    The corruption of the process of peer review has been accompanied by the perversion of the norm of collegiality. Pretending that lousy research is good, misrepresenting the work of other historians, and so on ought to be understood as uncollegial behavior. But collegiality today seems to be understood as being a pleasant sort of person to have lunch with. In practice, it means going along to get along — hardly a recipe for scholarly excellence or integrity. In truth, collegiality isn’t about being “nice,” nor is it confined to the colleagues in front of one’s nose. It is about upholding standards on behalf of one’s colleagues across space and time. The scholarly college is an imagined community, with many members we do not know and will never even hear of, some of whom have died and others of whom have not yet been born. Collegiality means honoring one’s obligations to the entire scholarly college, and by doing so, establishing the college’s moral right to make claims on relatively scarce public resources. We are not entitled to a living as historians; taxpayers and tuition-payers do not owe us their money. We have to make a case that we are worthy of their investment, and the bare minimum for making that case is maintaining the standards that set us apart from lay historians, that is, the public. Those standards are what preserve the value of our credential, which is held by individuals as part of a collective trust. When individuals debase the credential, they harm their colleagues.

    Although more could be added, this is sufficient to indicate what a standards-centered, rather than ideology-centered, critique of the historical profession would look like, as well as to enable readers to spot analogues in their institutions and recognize that what has happened in mine belongs to a broader pattern in American society. This critique is not an endorsement of anti-intellectualism. On the contrary. Genuine scholarly expertise — which, by its nature, is hard-won, valuable, and limited — still exists, and it deserves to be defended. My point is that it needs to be defended inside the profession as well as outside it. When we ourselves value expertise so little that we no longer recognize it in our own ranks, uphold it through peer review, or protect it with a properly defined norm of collegiality, we do not deserve to have our defenses of it taken seriously by the rest of society.

    Perhaps the strongest evidence that the professional blight I have just described is symptomatic of a collapse of scholarly standards, rather than of ideological capture, is the fact that two groups of historians often constructed — both inside and outside the profession — as ideological opposites share the same scholarly infirmities. Call it the horseshoe of scholarly incompetence.

    On the one hand, there are traditional military and diplomatic historians. Broadly speaking, they see themselves, and are seen by others, as continuing to study old topics (hard power, high politics, dead white men, and so on) using old methods of archival research. On the other hand, there are non-traditional historians of race, gender, and sexuality, who see themselves, and are seen by others, as studying new topics (knowledge production, the personal as political, historically marginalized groups, and so on) using new methods of critical theory. The methodological commitments of the two types are understood to imply corresponding ideological commitments, and vice versa: “traditional” history is coded as conservative or politically right-of-center (for its defenders: hard, manly, substantial, tangible), while “woke” history is coded as progressive or politically left-of-center (for its critics: soft, effete, flimsy, intangible). Of course this dichotomy is an over-simplification, but it is one that I think many historians would recognize, however grudgingly.

    Historians and their allies in these two camps have inverse grievances about each other, which are discouragingly similar to those of Americans fighting the culture wars. “Traditional” historians believe that “woke” historians regard them as knuckle-dragging fascists — dismissing them as methodologically unsophisticated on spurious ideological grounds. “Woke” historians believe that “traditional” historians regard their work as faddish and jargony left-wing activism — dismissing them as methodologically pseudo-sophisticated on spurious ideological grounds. Of course, framing the other as ideologically motivated does ideological work for each, implying that they are more objective than their opponents.

    This ideological framing excludes a great deal of inconvenient evidence. For one thing, as we have seen, rigorous archival research is an exercise in applied critical theory — the two are not binary opposites. For another thing, anyone who has read the ninth chapter of That Noble Dream, Peter Novick’s classic history of the historical profession, or paid attention to the “new right” in the United States today, knows that the politics of critical theory, postmodernism, relativism, or whatever one wants to call it are protean, not inherently left-wing. Conversely, anyone who got whiplash from watching left-wing historians who yesterday had denied the possibility of objectivity today insisting on “facts” as against the Trump administration’s endorsement of “alternative facts,” or who sees the tensions between the Marxian left and the Foucauldian left, knows that materialism and canons of proof are not inherently right-wing. Epistemological conservatism and political conservatism are not linked as tightly today as they were during the culture wars of the 1990s.

    No less important, the “traditional” fields of diplomatic and military history most often championed by the political right and the methodological conservatives suffer from exactly the same type of scholarly weaknesses as they accuse the non-traditional fields of having. Where an identitarian historian sniffs at archival research as the fetishization of documents, a diplomatic historian who had lunch at several archives in several countries now claims expertise in all. Where an identitarian historian began their academic journey as one of the ninety-two percent of undergraduate students in women’s, gender, and sexuality studies at Yale who got A’s, a military historian got their doctorate from a war/strategic studies degree mill. Where an identitarian historian seeks contemporary relevance by reducing the complexity of the past into a tale of white supremacy, a diplomatic historian seeks the same relevance by reading the past backwards through today’s foreign policy categories — and both twist the historical evidence to conform to their present-day agenda. I could go on. And if you think that diplomatic and military historians do not plagiarize, fake their footnotes, and wave their buddies’ lousy work through peer review, then I have an exciting time-share opportunity on a river in Egypt that I’d like to discuss with you.

    In other words, as in the broader culture wars, each side of the history wars accuses the other of behavior of which it itself is guilty. Seeing themselves and encouraging others to see them as opposites distracts attention from what they share. Much as the MAGA right and the identitarian left betray a lack of commitment to liberal values, so the supposedly opposite methodological and ideological poles of the historical profession betray a lack of commitment to scholarly standards. 

    In the historical profession, as in a liberal democracy, there must be a single set of standards that applies against all comers. If historians’ left-wing ideological commitments lead to better scholarship than that produced by historians with right-wing views — as was the case with so much women’s, African-American, and working-class history in the 1950s and 1960s — then great. If historians’ right-wing commitments lead to better scholarship than that produced by historians with left-wing commitments, then great. Ideological commitments do not delegitimate scholarship, but they do not legitimate it either. Scholarship must stand or fall on its own, according to professional standards of originality and importance, evidence and argument, much as citizens are entitled to due process regardless of their political beliefs. If “woke” history does not meet those standards, then it is bad scholarship. If “traditional” history does not meet those standards, then it is bad scholarship. 

    By all means criticize bad scholarship. But spare us the pretense that any one field or ideology has a monopoly on it. Just as right-wing illiberalism cannot provide a remedy to left-wing illiberalism (or vice versa), so “traditional” diplomatic and military history cannot provide a remedy to the very real quality-control problems in “woke” history until it addresses its own equally serious quality-control problems (or vice versa). Would that the problem of the historical profession were “merely” one of left-wing bias! It is much worse than that.


    To explain why their profession and universities more broadly are under attack, historians seem to have read Richard Hofstadter on the paranoid style as a how-to guide rather than as an explanatory analysis. They have produced a whole genre of literature blaming exogenous forces beyond their control — notably “neoliberalism” and the Republican Party — for the ills that beset them: fewer college students majoring in history, fewer economically secure jobs for newly minted PhDs in history, the redirection by states and donors of financial support away from the “useless” humanities and towards the “useful” STEM disciplines. If the profession dies, they want the world to know, it was murdered.

    There is considerable evidence to support such a narrative. The neoliberal drive to impose market discipline and market values on universities has indeed done enormous damage to academia. It is also true that many Republican politicians are not operating in good faith and propose “cures” that would be no better than the disease. But the murder narrative is incomplete and self-serving — as would be obvious to historians if they were examining anyone but themselves. Every institution facing criticism dismisses its critics as bad-faith hostile outsiders. It is what police departments do when summary executions of black people provoke accusations of racism. It is what the Trumpified Republican Party does when people with eyes and ears accuse it of gross hypocrisy. Institutions like to blame exogenous forces beyond their control for their problems because it enables them to avoid grappling with their own culpability for endogenous forces within their control. Consciences rest easier with a narrative of murder than with a narrative of suicide.

    It does not relieve illiberal critics of the historical profession (and of universities more broadly) of their culpability for murder to insist that the profession’s death is also a suicide. The critics must take responsibility for their misconduct; historians must take responsibility for ours. Neoliberalism does not force us to mistake bad work for good. A vast right-wing conspiracy does not make us hire our cronies from graduate school over better scholars. We have done that to ourselves. Like many of our critics, we are not going to be worthy of public trust until we admit this. Nothing better illustrates our transformation into a profession of pseudo-scholars than our institutional unwillingness to do so. 

    It is little short of a tragedy for the United States that universities have failed to uphold scholarly standards, which mandate the rejection of a bunker mentality. The “no enemies to the left” mentality, like its counterpart on the right, is neither scholarly nor humanistic nor democratic. It is unreflective, inhumane, and authoritarian. Scholarly standards, like liberal-democratic standards, do not come with a two-wrongs-make-a-right exception. Instead of upholding standards, universities are leading the way for other socially important institutions to excuse the abandonment of them. 

    For any institution, the right to self-governance and the obligation to self-govern are flip sides of the bargain struck with the public: the obligation is the fiduciary responsibility that an institution owes the public in return for the public respecting its autonomy. Academic freedom and tenure exist to protect academics from being policed by outsiders according to non-scholarly standards. They do not exist to relieve academics of the obligation to police ourselves according to scholarly standards. If we default on our obligation to govern ourselves, then we lose the moral high ground to complain when outsiders try to govern us.

    When institutions fail to hold themselves accountable, as mine and so many others have, Americans have every right to be angry. Speaking both as a citizen and as a member of an institution, I hope they channel their anger into a determination to make their institutions better and more faithful servants of the public good — not into a determination to destroy their institutions, or to transform corrupt institutions dominated by one political ideology into corrupt institutions dominated by a different political ideology. Such a reform campaign, conducted with the maturity befitting a free people, would require Americans across the political spectrum to examine their own attraction to pseudo-scholarly thinking. If they do not, their republic may disappear into a past with no institution left to study it. 

    On the Envelope

    Act One: Berlin and Prague

    Immanuelkirchstrasse 29 is a short walk from my house in Berlin. The five-story corner apartment building went up in the early years of the twentieth century, when Prenzlauer Berg was a mixed-class district of workers and upwardly mobile Jewish immigrants. Like those of most buildings in the neighborhood, its façade suffered damage in the Second World War and was neglected during the four decades it stood in the capital of the German Democratic Republic. It has since been immaculately restored, with a fresh coat of cream paint and a new set of angel-faced mascarons above the turquoise window sills. There is a decent trattoria on the ground floor as well as a wine bar with outside seating where it is pleasant to sip Silvaner and while away an endless July afternoon. Yet there is no plaque near the entrance to inform passersby that the building was once the site of one of the more remarkable episodes in the history of the German postal service.

    It was here that Fräulein Felice Bauer, a valued employee of Carl Lindström Inc., received or read hundreds of letters, postcards, and telegrams sent to her by Dr. Franz Kafka, a civil servant at the Workers’ Accident Insurance Institute of the Kingdom of Bohemia and a partner in the asbestos factory owned by his family. The two met on August 13, 1912 at the home of their mutual acquaintance, the novelist Max Brod. Bauer was on the first leg of a vacation through the Austro-Hungarian empire, and decided to pay a call on her distant relative in Prague. An anxious and vacillating Kafka was there to get some advice from his successful literary friend about the selection and arrangement of short stories for his debut collection, Meditation. That evening Kafka and Bauer spoke about their mutual interest in Yiddish theater, looked at photos of the trip that he and Brod had taken to Goethe’s house in Weimar a few months before, and shook hands on an audacious plan, never to be realized, to visit Palestine together. When the evening came to an end, Kafka volunteered to walk the visitor from Berlin back to her hotel. The next morning, she continued on to Budapest and Meditation went off to the Rowohlt publishing house in Leipzig.

    The first snowflake of what Kafka’s biographer Rainer Stach calls a “paper avalanche,” written with a typewriter on Workers’ Accident Insurance Institute letterhead, arrived at Bauer’s office on September 20. Two days later, lightning struck. Over the course of a single night at his desk in the tiny, cold bedroom of his family home, Kafka produced “The Judgment.” Georg Bendemann, the protagonist of “The Judgment,” is a young businessman who would like to write a letter to his old friend in St. Petersburg announcing his engagement to Frieda Brandenfeld — note the initials — but is foiled by the unexpected outburst of the grotesque and soiled patriarch he keeps locked away in a dank adjoining room. The story, which later appeared with the dedication “For F,” bears, for the first time, many of the disquieting elements of the sensibility we now call “Kafkaesque.” Its author, who was almost never satisfied with his work, was pleased with the result, and considered the ecstatic process that had given birth to it to be the epitome of the writing experience. Within a week, he had the vision of the sword-wielding Statue of Liberty that would open “The Stoker” — the first chapter of The Man Who Disappeared — and had tracked down Bauer’s home address.

    During their doomed five-year-long courtship, in which they met in person only a handful of times and were engaged twice, Kafka wrote the majority of the fiction for which he is now famous: further chapters of The Man Who Disappeared, “The Metamorphosis,” the unfinished novel The Trial, the first draft of “In the Penal Colony,” and other stories such as “The Hunter Gracchus,” “The Great Wall of China,” “A Report to the Academy,” and “A Country Doctor.” In maintaining a romantic correspondence that intersected with his literary activities, he was far from unique. (Like many writers, he also kept a diary.) What distinguished the five hundred and eleven pieces of mail that would one day be collected under the title Letters to Felice — the longest book he ever wrote — was, as media theorists such as Friedrich Kittler and Bernhard Siegert have argued, his peculiar relationship to the postal service.

    In the century before the House of Thurn and Taxis decided to grant the celebrity author of the epistolary novel The Sorrows of Young Werther and his adoring fans free use of their postal system as a promotional strategy, letters were primarily tools for communicating information across greater or lesser distances. Guidebooks offered templates that letter-writers could consult for the proper formulae for every sentence from a salutation to a valediction, for every situation and to addressees of every social status. As literacy increased, as carriage routes were replaced by railway lines, and as postcards and telegrams were introduced into circulation, the volume of written communication skyrocketed. Parallel to these developments, letter-writers began to adopt an individualized and creative attitude toward the practice; epistolary exchange regularly came to include introspection and self-fashioning alongside the conduct of business and the transfer of intimate sentiments. Siegert quotes Clemens Brentano’s letter to his sister, Bettina von Arnim, one of Goethe’s correspondents, to this effect: “The writer must at the same time write to himself, since he must become acquainted with himself through the letter…in letters you look into the mirror of your soul…”

    For male authors of the Romantic and post-Romantic periods, female correspondents — figured as some combination of Mother, Lover, and Nature — were the preferred relay points for making their own acquaintance. And writer or not, what man, in this new mass-republic of letters, didn’t secretly fancy himself, as Siegert puts it, a “miniature Goethe”? No doubt a similar notion was rippling through Kafka’s unconscious when, pondering his brief flirtation with the sixteen-year-old daughter of the custodian of the Goethe House in Weimar, he asked Brod if he thought it was true that one could “bind a girl to oneself with writing.” In the summer before he met Felice, he was a cage in search of a bird.

    With Bauer, however, he was dealing with an entirely different order of addressee. Bauer had been hired in 1909 by Carl Lindström Inc., Germany’s leading producer and distributor of phonographic machinery and office equipment, as a typist — or “typewriter,” as members of the profession were sometimes called, the worker being metonymically identified with her machine. By the time she met Kafka, she had risen up the ranks of the sales division and been entrusted with power of attorney to sign contracts on behalf of the company. She represented Lindström’s latest products — such as the Parlophone, a gramophone, and the Parlograph, a dictation device — at trade fairs across Germany. When the company wanted to advertise the Parlograph to potential customers, they created a flipbook, a precursor to the film reel, in which a bound series of photographs could be flipped through with the thumb, providing the illusion of movement. The photographs showed a typist in headphones demonstrating the use of the machine. The model they chose was Felice Bauer. 

    Bauer was thus a member of two historical vanguards. Not only was she in the first generation of female employees in white-collar settings, she was also working at the epicenter of technological media, which would shake the foundations of communication, including the part of it we call modern literature. (The aftershocks would reverberate for another fifty years. Lindström’s in-house label, which distributed the jazz and tango records Bauer liked to dance to, would expand into the British market a few months before Kafka’s death in 1924. Only two years after Bauer herself died and was buried alongside her husband, the banker Moritz Marasse, in the Angelus Rosedale Cemetery in downtown Los Angeles, not far from where I grew up, Parlophone scored its most enduring success when the A&R man and producer George Martin signed a Liverpool-based rock outfit whose name would certainly have raised the eyebrows of the author of “The Metamorphosis.” “P.S. I Love You,” the letter-themed B-side to their debut single “Love Me Do,” was a post-script in more ways than one.)

    To “bind” a modern woman like Bauer required more than technologies such as pen, ink, stationery, envelopes, and stamps; it required Kafka to integrate himself into the circuits of a communications network whose complexity would have overwhelmed a writer of Goethe’s era. Like a seasoned logistician, Kafka coordinated multiple timetables down to the minute. He coordinated collection times in Prague with delivery times in Berlin — not failing to account for variations for express mail, regular mail, postcards, and telegrams, as well as for weekday and Sunday service. Then he coordinated delivery times in Berlin with Bauer’s estimated arrival and departure times to and from home and office — not failing to account for all the potential intermediaries inside both buildings, and what he did or did not want to risk each of them seeing. (Bauer’s family and co-workers, as well as anonymous postal officials, loomed large in his imagination: he reports a dream in which the arms of the Prague postman who delivers Felice’s letters become the pistons of a steam engine.) In the earliest, most intense phase of their correspondence, between October 1912 and March 1913, when the Bauers left Immanuelkirchstrasse for a more upscale address in the western part of the city, messages traveled from Prague to Berlin several times a day. Some were letters of up to twelve pages, which Kafka divided between several envelopes and mailed simultaneously to different locations so that Bauer would receive the contents in installments as she went about her business.

    In one of them, he expressed his ambivalence about the products sold by her employer, envisioning a scenario, all too familiar to us today, in which a Parlograph in Prague carries on a telephone conversation with a gramophone in Berlin — machines communicating in the absence of human beings. Yet wasn’t this the unintended consequence, for himself at least, of his correspondence with Bauer? On the one-year anniversary of their meeting at Brod’s, Bauer had taken a sample of his handwriting to a graphologist, who detected “literary interests” in her newly minted fiancé. Kafka replied touchily: “I have no literary interests; I am made of literature; I am nothing else and cannot be anything else.” We should take him at his word, rather than gloss this as a Romantic declaration of his fidelity to his artistic vocation. Kafka had ceased to be a nineteenth-century subject embarked upon a literary project of epistolary self-knowledge; he had instead become a “machinic assemblage,” to use Deleuze and Guattari’s term, whose software was made of combinations of the alphabet, just as a computer is a machine whose software is made out of combinations of zeros and ones. The mirror of the soul that Brentano had once enjoyed peering into had been smashed; what remained of the self was a fragmented body, or if you like, a differential system for processing the networked shocks of modern life. When Kafka mailed the letter to Bauer, he could not have known that his engagement and the Archduke Franz Ferdinand each had less than a year to live. Their respective demises were to be followed by the first chapters of The Trial and “In the Penal Colony.”

    In Kafka’s most gruesome story, a European traveler to a remote tropical island is invited by an officer of the colony to witness a demonstration of a machine designed to torture and execute prisoners. As the officer explains to his visitor in their common language, French, the body of the condemned man is inserted face-down and strapped into the “apparatus,” a battery-powered “harrow” made of needles, which will tattoo, over the course of twelve hours, the condemned man’s “sentence” into his flesh, making him aware of it for the first time, and killing him in the process. The officer describes the working of the apparatus, built by the commandant who founded the penal colony (in German, frühere Kommandant — note the initials) with quasi-religious fervor. He claims that as the condemned “deciphers” his sentence — “Honor thy superiors!” — “with his wounds,” his face will radiate the beatific calm of enlightenment. Unfortunately, the apparatus has fallen out of favor with the colony’s new commandant, who regards it as barbaric. The new commandant has withheld the resources needed for its maintenance and repair, and is threatening to abolish the practice entirely. With his demonstration, the officer hopes to enlist the traveler in his mission to convince the new commandant to preserve this juridical “tradition.” When he fails, he takes the place of the condemned man in the apparatus, only for the rickety old machine to malfunction, botching the inscription of his sentence: “Be just!”

    If the literary sources for the story are Dostoevsky’s Crime and Punishment and Mirbeau’s The Torture Garden, credit for the idea of the apparatus itself is probably owed to Felice Bauer. In the early months of their courtship, she mailed him a “factory brochure,” which may have been, according to Siegert, the December 1912 issue of Phonographic Magazine, which contained an article about the distribution of gramophones to the French penal colony of New Caledonia. Siegert also points to the similarity between the Parlophone’s stylus and the needles of the apparatus, and notes that the officer compares the expression on the faces of the condemned to someone who is listening to speech (or music) rather than reading a text. Yet no less than in “The Judgment,” Deleuze and Guattari add, it is “impossible to conceive of [the apparatus] without it involving an epistolary aspect.” The envelope and the letter have long functioned as symbols of the body and the soul — the exterior and the interior, the visible and the invisible, the public and the private, the literal and the metaphorical, the letter of the law and its spirit. By revealing and applying judgment simultaneously, and by inscribing the judgment on the material envelope of human skin, the apparatus reverses these oppositions or collapses them outright. Just like Kafka, the men who are fed into the apparatus wind up being made of literature — that is, littera, letters — and nothing else.

    When, for a short time, Kafka and Bauer renewed their engagement and their correspondence, he dispensed with the use of envelopes. The majority of the remainder of Kafka’s surviving messages to her are written on postcards watched over by the small blue, green, or red portraits of that whiskered old commandant, Franz Joseph I, whose death on November 21, 1916 took place shortly after a horrified public at a reading in Munich became the first people in the world to be exposed to “In the Penal Colony.” On that day, Kafka sent a postcard to Bauer. “I cannot let you, you of all people go on accusing me of selfishness,” he wrote on its exposed paper skin. “It does affect me deeply, because it is just.”

    Interlude: Buenos Aires

    Just as the court in The Trial is said to be “attracted” to Joseph K.’s guilt, criticism has been irresistibly drawn to the puzzlements of the Kafkaesque. Whether it is a plot point such as a traveling salesman who turns into a bug or an ape who gives a report to a scientific congress, whether it is a courtroom setting that opens onto a painter’s studio or a castle that becomes harder to find the closer one approaches, whether it is a defamiliarized folk-peasant or a defamiliarized Imperial Chinese setting, whether it is a character identified only by their function or with an incongruously biblical name or by a single initial, the seemingly fantastical elements of Kafka’s fiction intersect with the narrative expectations of realism at various points of tangency, opening up an area of undecidability that vibrates with intimations of the allegorical like a fata morgana.

    Take the apparatus of the penal colony. It is difficult, first of all, to know how to locate it historically. Is it an atavism or a harbinger? A technological device put at the service of an archaic ritual, it combines, as we have seen, elements of orality and literacy in ways that do not neatly map onto the debate about tradition and progress carried out between the officer and the new commandant, who, for all his pieties about civilized punishment, is still running a penal colony. Since, like Joseph K., the condemned man is presumed to be guilty, the trial is dispensed with, while other features of the judicial process (charge, verdict, sentencing, punishment) are compressed by the apparatus into a single moment. The traveler remarks upon the fact that the condemned man remains ignorant of his crime, but even this concession to readerly expectations about the nature of the legal process turns out to be a case of sublime misdirection. What goes unremarked upon is that the “sentence” is not a sentence at all. It is not a statement of an offense (insubordination) or a statement of a punishment (death), but is instead a command, clearly modeled after the Decalogue. The command is rendered impossible to fulfill by the apparatus itself, unless what it means to honor one’s superiors is to die in it, in which case it is not, properly speaking, a punishment, but a human sacrifice.

    The officer’s reliability is similarly difficult to judge. We have good reason not to believe his claim that the condemned are able to decipher what is written on their bodies. When the traveler is shown samples of the old commandant’s handwriting with which the apparatus has been programmed, he finds it completely unintelligible. But neither can it be dismissed with any certainty: what the officer reports could simply be a feature of the fictional world Kafka has built, and not a matter of his trustworthiness. In any case, it is the officer, not the condemned man, who ends up in the apparatus. Is his substitution for the condemned man, and the botched execution that follows, in which the traveler sees “no sign of the promised redemption” on the officer’s face, a proof of the fatal command to “be just,” or a refutation of it, or just a result of the traveler’s lack of expertise in noticing salient details? It seems as impossible to definitively answer this question — one of many raised by the story — as it is impossible to resist the desire to try.

    Stach dismisses “In the Penal Colony” as a “wind-up toy…beloved of literary critics.” He might have called it a “machinic assemblage.” In this story as in Kafka’s work more generally, inviting interpretation only to frustrate it has the predictable effect of stimulating interpretation. Disappointment with Kafka criticism is nonetheless understandable. (Perhaps it is even inevitable: after all, no small part of our concept of the Kafkaesque is that nobody is able to give a satisfactory definition of it.) Starting in the 1950s, numerous attempts were made to run his fictions through the apparatuses of established hermeneutic paradigms, such that they became psychoanalytic case studies of Oedipal struggle and castration anxiety, or indictments of the irrationality of bureaucratic domination on the eve of totalitarianism, or theological parables about spiritual longing after the death of god or the “human condition” in an absurd world — readings so crude that an exasperated Susan Sontag offered the critical “mass ravishment” of Kafka as a reason to give up on the hermeneutic enterprise altogether. Even the best critic — and from Arendt and Benjamin to Jameson and Kittler, there has been no shortage of distinguished readers who have tried their hand at exegesis and decoding — stands in the same position before Kafka as the man from the country stands before the gate of the law. He could spend a lifetime at the threshold of his oeuvre, knowing that there are indefinitely many readings of it that he has yet to approach, and in the end be forced to concede that the particular gateway to understanding which he has chosen reveals more about himself than about Kafka. To paraphrase the critic Michel Chaouli: when you read Kafka, Kafka reads you.

    One way around this impasse might be to follow the strategy proposed almost off-handedly by Jorge Luis Borges, in his short essay “Kafka and his Precursors.” There he turns Kafka on his head, treating him not as an object of interpretation, but as a hermeneutic paradigm in his own right; not just as a writer to be read, but as a way of reading. Although Kafka issued few programmatic statements about literature and had little use for aesthetic theory, Borges employs his stories as instruments for perceiving his “voice” and his “habits” in earlier texts. He lists Zeno’s paradoxes of motion, a fable from the Tang Dynasty court official Han Yu, two “religious parables on contemporary and bourgeois themes” from Kierkegaard, Robert Browning’s poem “Fear and Scruples,” a short story by Léon Bloy, and one by Lord Dunsany. “Kafka’s idiosyncrasy is present in each of these writings, to a greater or lesser degree,” Borges writes, “but if Kafka had not written, we would not perceive it; that is to say, it would not exist.” The idea, taken from T.S. Eliot and later made popular by Harold Bloom, is that insofar as “each writer creates his own precursors,” every creative writer can also be read as a literary critic. On this argument, the Kafkaesque is not the sui generis phenomenon it is sometimes considered to be, but the molecularization of a sensibility whose atomic particles were already scattered throughout literary history.

    Borges disclaims any “connotation of polemic or rivalry” in his concept of the precursor. Still, one suspects that his purpose in writing the essay was to add Kafka to a list of his own. Of the six precursors he attributes to Kafka, Zeno would certainly have been familiar to him, and Kierkegaard was an acknowledged influence. Han Yu, Bloy, and Lord Dunsany represent a trinity of the antiquarian, the obscure, and the contrarian that is more characteristic of the taste profile of the future director of the National Library of Argentina. “Fear and Scruples,” however, is an inspired choice, for more reasons than one. Browning’s poem concerns an exchange of letters with an “unseen friend,” whose identity and fame are doubted by other acquaintances of the speaker, and by the graphologists he consults. The last of its twelve elegiac stanzas reads:

    “Why, that makes your friend a monster!” say you:
    “Had his house no window? At first nod,
    “Would you not have hailed him?” Hush, I pray you!
    What if this friend happened to be — God?

    More interesting than the resemblance Borges notices between the premise of “Fear and Scruples” and that of “The Judgment,” or the concluding theological revelation he borrows from Browning to give to Kafka, is that the comparison hints at the presence of another precursor hidden behind the em-dash in the final line: Emily Dickinson.

    Act Two: Amherst

    At first glance, Dickinson and Kafka could hardly seem more dissimilar. They were born more than a half century apart: she to a locally prominent family of Puritan stock in a small college town on what was then the western frontier of the United States; he to a family of assimilating Jewish shopkeepers in a mid-sized city in an industrialized, six-century-old monarchy in Central Europe. Their different genders meant that they had to navigate very different social norms and expectations surrounding familial responsibility, education, career, travel, political participation, romance and sexuality, and, above all, the business of writing. Where the latter is concerned, they did not speak the same language, come out of the same literary tradition, or work in the same genre.

    Yet the biographical overlaps between them are too numerous to dismiss. Both had domineering fathers (Edward Dickinson, Hermann Kafka) who cast long shadows over their psychic and literary lives; both sought refuge in strong bonds with a sibling (Austin, Ottla) and a friend who acted as a first reader (Susan Gilbert, Max Brod). Neither married nor had children. Neither traveled often, nor made it farther than a thousand kilometers from the desk in the small room in their family homes that, for better or worse, functioned as the omphalos of their imaginations. Each displayed a heterodox, gnostic streak and expressed ambivalence about their religious heritage (Dickinson refused numerous entreaties from teachers, friends, and family to be rebaptized; Kafka, who frequently vacillated on the subject of Zionism, once remarked, “What do I have in common with Jews? I barely have anything in common with myself.”). Their most productive periods took place against the backdrop of a political crisis and a military conflict that would consume their countries (the Civil War, World War I) but which neither ever treated more than obliquely in their writing.

    Common literary influences were few, but they included the Bible and Charles Dickens. More surprisingly, as Borges intuited, Robert and Elizabeth Barrett Browning were also touchstones for both writers (Dickinson cherished Elizabeth’s Aurora Leigh, about a young woman who is raised by an aunt who lives a “cage-bird life,” and wrote an elegy for her when she died in 1861; Kafka recommended the Brownings’ letter exchange to Felice Bauer as a model for their own correspondence). Both used the metaphors of frost and dismemberment to describe the sensory effects of great literature (Dickinson’s famous line that poetry “makes my whole body so cold no fire can warm me” and makes her “feel physically as if the top of my head were taken off”; Kafka’s no-less-famous line that “a book must be an axe for the frozen sea within”). Their work had a number of motifs in common (Moses denied entry to the promised land, waiting at the threshold of a locked door or gate, the metaphysical sea voyage, the becoming-animal of the self, the collapse or reversal of interiority and exteriority, the body in pain).

    During their lifetimes, they were reluctant to make their work public (only ten of Dickinson’s poems were published in her lifetime, all anonymously; Kafka routinely refused his publisher’s requests for new material and left all of his novels unfinished). Both demanded that their executors (Mabel Loomis Todd, Thomas Wentworth Higginson, Lavinia Norcross Dickinson; Max Brod) burn their manuscripts after their deaths (the request was partly acceded to by Lavinia and refused outright by Brod). As a result, their achievements were only recognized posthumously. Dickinson became a popular success with the bestselling first edition of her Poems in 1890 and entered the canon of world literature around the same time as Kafka, following World War II. They have since been subjected to mythologizing (the reclusive “woman in white,” the “old maid of Amherst”; “St. Kafka”) and popification (witness the 2019 Netflix series Dickinson and the 2024 ARD miniseries Kafka). Conflicts over the ownership, preservation, editing, and scholarly interpretation of the diverse papers in their respective archives have continued into this century.

    Let us take a closer look at Dickinson’s. Although she published little, Dickinson was known to her friends and family, and around Amherst, as a writer of poetry, so when Lavinia entered her sister’s room after the funeral she cannot have been entirely surprised by what she found there: a box containing over three thousand pieces of paper covered in Emily’s handwriting. There were forty small booklets of folded paper with two holes punched in them and bound with string, containing fair copies of around eight hundred poems, written between 1858 and 1864, that have become known as “the fascicles.” In addition to the fascicles, which represent the core of Dickinson’s work as a poet, there were fifteen “sets” of unbound papers containing around one hundred and fifty poems, as well as thousands of “scraps” mostly dating from after 1864 and written in pencil on such heterogeneous writing surfaces as a flier from the Home Insurance Company of New York and an advertisement for Orr’s Boneset Bitters, pages torn from family copies of The New England Primer and Washington Irving’s Sketch Book of Geoffrey Crayon, a wrapper for John Hancock writing paper and a wrapper for French chocolate, a page with a queen’s head embossed on it and one with a three-cent locomotive stamp in the center, ordinary machine-made stationery and the fifty-two envelopes now collected in a facsimile edition entitled The Gorgeous Nothings.

    Rounding out Dickinson’s known literary production are the thirteen hundred or so surviving letters that she sent to a circle of a few dozen correspondents, most extensively to her best friend, confidante, possible lover, and later sister-in-law Susan Gilbert Dickinson. Hundreds of these letters contained enclosed poems, included lines of original verse in the body of the text, or were written in rhythmic and rhyming prose. Some two hundred were simply untitled poems without any accompanying prose that were signed, addressed, and mailed to one or more recipients. The borders between the fascicles, the sets, the scraps, and the letters are not as neat as this synopsis suggests: poems that were sent as letters appear in the fascicles and sets, and many of the poems in the fascicles and sets probably started off life as scraps, which were destroyed or discarded after fair copies were made.

    Since the 1990s, scholarship has taken a great deal of interest in the materials that Dickinson used and the epistolary network that she constructed. Noting that her manuscripts contained drawings, cut-outs, collages, and other non-literary media, Susan Howe, herself a poet and visual artist, has claimed that Dickinson’s use of non-standard writing surfaces was not simple thrift on her part — employing whatever was to hand — but an entirely conscious and incipiently modernist aesthetic strategy to align her chosen forms with her chosen materials — especially in the case of the fifty-two “envelope poems,” where Dickinson inscribes on the exposed exterior surface what is intended for the concealed interior surface.

    Virginia Jackson has gone so far as to suggest that the popular conception of Dickinson as a writer of lyric is misleading at best, an artifact of a number of contingencies: her posthumous appearance in print, the many editorial hands involved in putting out the first edition of her poems, a culture primed to venerate solitariness in poets in general and sentimentality in women poets in particular, and later, paradigms of academic literary interpretation and pedagogy that operate on the assumption that poems are printed “verbal icons” and that ahistorically conflate the lyric with poetry as such. A better name for the genre she worked in, according to Jackson, would be the “letter poem”: not the formalized address by the poet to herself that is “overheard” in print by a future, far-away, and unknown group of readers, but particular messages addressed to specific audiences on specific occasions. 

    In fact, once one abandons the teleology of print, one can see that it is also technically incorrect to say that Dickinson did not publish. Rather, her network of correspondents was itself a small-scale publishing operation built by her for the express purpose of having her poems read as close to the way she intended as possible. It was a publishing operation centered on Amherst, though it extended as far as Worcester, Springfield, Boston, Brooklyn, and Geneva, New York. At first, Dickinson’s long-distance letters and letter poems were delivered by stagecoach; after her father helped secure the funding to build the first railroad connection in Amherst, an event commemorated in her poem “I like to see it lap the Miles,” they were delivered by iron horse. 

    And here we can return to Kafka. In carving her own postal network out of the existing system and integrating her literary and epistolary activities, Dickinson not only anticipated his practice, she arguably transcended it. True, like Browning, Dickinson was a more explicit allegorist than Kafka — there is no doubt, for example, about the identity of the country gentleman who offers a ride in his carriage to the speaker of her best-known poem — and thus seems easier to interpret, but through her denaturing of poetic form she arrived first at many of the effects we more commonly associate with his name. 

    The hymn (or ballad) stanza that forms the basic unit of most of her poems is a quatrain in alternating iambic tetrameter and trimeter with rhymes on the second and fourth lines. As the name suggests, it comes from the tradition of church (or folk) song, and is generally regarded as a genre of “minor literature,” compared to the statelier pentameter used in the elegiac stanzas of Wordsworth or Browning. Dickinson denatured this stanza in three ways. First, through the syntactical twists necessary to fit complex thoughts into the sprightly meter. Second, through her use of slant or feminine rhymes, which kink the poems’ sound and meaning, on the principle that one should “Tell all the truth but tell it slant—” since “Success in Circuit lies” and “The Truth’s superb surprise / …must dazzle gradually / Or every man be blind—.” (Dickinson’s justification recalls Moses’ rationale for taking over the reading of the commandments to the Israelites gathered at Mount Sinai.) Third, and most famously, through her idiosyncratic use of the dash, which has become as proprietary as the initial K is for Kafka, since no one can use the punctuation mark in a poem without recalling hers.

    As the late Helen Vendler, her most astute critic, noted, the dash serves many functions in Dickinson’s hands. By slowing down the sprightly meters of the hymn stanza, it can “[enact] separation,” “indicate a break in continuity,” or “correct a narrative.” When a dash — rather than a full stop or a question mark — concludes a poem, it “becomes especially significant” since it can convey “a state of suspended being,” or its opposite, “a state of continuing action,” and sometimes both at the same time. To this I would add only the following biographical and historical context: Dickinson’s dashes already mark the prose letters that she wrote to Austin as an eleven-year-old girl in 1842, a few years before Samuel Morse sent the message “What hath God wrought” in his eponymous telegraph code from the Capitol Building in Washington, D.C. to Baltimore.

    We know how baffling these stylistic innovations have always seemed thanks to the testimony of her first “professional” reader, Thomas Wentworth Higginson. Known today, if at all, as the man who helped introduce the world to Emily Dickinson, in the early 1860s Higginson was a writer on the make. The ex-minister, abolitionist, and women’s rights advocate had established a reputation as an up-and-coming public intellectual with a series of articles in The Atlantic, including one with advice to “young contributors” to the magazine in April 1862. That same month, he received a letter that was to change his life — and the course of American literature. The envelope contained four poems, an unsigned note requesting his verdict on them, and a smaller envelope containing a card with the name of the author written on it in pencil. (“A box within a box,” Dickinson’s biographer Alfred Habegger comments, “with herself deep inside.”) Higginson performed “surgery” on the enclosed poems, and recommended that his correspondent read Whitman and Harriet Prescott; in a follow-up letter he judged her use of meter “spasmodic” and “uncontrolled” and recommended that she “delay to publish.” She assured him that publication was the farthest thing from her mind, and where the judgment of her contemporaries was concerned, “I have no Tribunal.”

    Since all ended well — the two remained lifelong correspondents and Higginson went on, with Austin’s mistress Mabel Loomis Todd, to edit the commercially successful first edition of her Poems — we can perhaps forgive him for seeing a novice still learning the rules of poetic composition where he should have seen a genius who had long since moved past them, and for tampering in the editorial process with what he should have left alone. His incomprehension is revealing. Recalling his first impressions of her poetry in an 1891 essay on her letters for The Atlantic, he wrote of the final stanza of “Your riches taught me poverty”

    Its far – far Treasure to surmise –

    And estimate the Pearl – 

    That slipped my simple fingers through –

    While just a Girl at school!

    that “the slightest change in the order of words — thus, ‘While yet at school, a girl’ — would have given her a rhyme for the last line…” (I am quoting the version that he edited for publication, not the original letter poem addressed to Sue.) Where he wants a perfect rhyme (pearl/girl) to give the poem strong closure, she provides a slant rhyme (pearl/school) instead. What Higginson fails to see is that, just as with the inverted syntax of the penultimate line, Dickinson preserves the echo of the expected rhyme in the very line that subverts it. Read in the original context of the letter addressed to Sue, it is a poem about regret for missed opportunities, but read in the context of her reception by Higginson, her choice not only anticipates his objection — that what is regarded by experts as correct has slipped through her “simple” fingers — it also provides an ironic commentary on the difficulty of grasping beautiful things and properly estimating their value.

    What neither of them was in a position to know was that in this poem — which contains a line, incidentally, about Buenos Aires and one about becoming a Jew — her subversion of sonic expectation produces an effect analogous to the subversions of narrative and referential expectation in the stories of Kafka. Unlike the modernist writers who departed from literary norms more ostentatiously or programmatically, Dickinson and Kafka precisely calibrated the distance between the conventional and the unconventional, and it is this calibration that makes their work seem so uncanny.

    Dickinson displays her greatest range of effects, and is at her most Kafkaesque, in her poem “I felt a Funeral in my Brain.”

    I felt a Funeral, in my Brain,

    And mourners to and fro

    Kept treading — treading — till it seemed

    That Sense was breaking through —

    And when they all were seated,

    A Service, like a Drum —

    Kept beating — beating — till I thought

    My mind was going numb —

    And then I heard them lift a Box

    And creak across my Soul 

    With those same Boots of Lead, again,

    Then Space — began to toll —

    As all the Heavens were a Bell,

    And Being, but an Ear,

    And I, and Silence, some strange Race

    Wrecked, solitary, here — 

    And then a Plank in Reason, broke,

    And I dropped down, and down —

    And hit a World, at every plunge,

    And Finished knowing — then —

    Written at the outset of both the Civil War and an eruption of creative activity that saw her go from producing a few dozen to a few hundred poems per year, “Funeral” belongs to a suite exploring physical and psychological agony, including “After a great pain a formal feeling comes,” as well as the lesser-known “I like a look of Agony,” “It is easy to work when the soul is at play,” “That after Horror — that ‘twas us,” “A single Screw of Flesh,” and “A Weight with Needles on the pounds,” each of which — like “In the Penal Colony” — turns on an image of a body being tortured by one or more small, sharp, metal objects. It is usually read (by Vendler, Sharon Cameron, and others) as an account of a severe mental breakdown (or a particularly gruesome migraine), with the twenty un-full-stopped lines rhythmically propelling the speaker downward to darkness on the paratactic conjunction “and” as she encounters all the mortuary (Funeral, mourners, Service, Box) and mental (Brain, Sense, mind, Soul, Reason) elements of the conceit announced in the first line.

    The more one sits with “Funeral,” however, the stranger this simple conceit becomes. Most puzzling are the ways it twists interior and exterior space into a Möbius strip, and then concludes with a similarly ambiguating operation on the experience of time. The physical trajectory of the “I” in the poem is particularly difficult to follow. In line one, the speaker seems to be the space inside which the activities of the metaphorical funeral occur — the comparison, though unmentioned in the poem, might be to a church building — but when the “Plank in Reason” breaks in line seventeen, causing the speaker to “drop down, and down” in line eighteen, it strongly suggests that “Reason” is in turn a metonym for the wooden “Box” that is lifted by the mourners in line nine. In other words, the speaker is the body in the coffin in the funeral that takes place inside her own brain. (Recall Habegger’s characterization of the card bearing the name “Emily Dickinson,” enclosed in its own envelope within the envelope of her letter to Higginson a few months later: a “box within a box with herself deep inside.”)

    Complicating matters further is that in the intervening lines, the poem seems to turn its gaze outward from the interior of the brain to external “Space” in line twelve. What the capitalized noun means is unclear. Is it to be identified with the totality of the universe or “all the Heavens”? Does “Space” include the interior space of the speaker’s body? When the poet plunges, is she falling through herself or Space or both? Do the Heavens mean simply “sky,” in which case they are included in space, or do they mean “the afterlife,” in which case they transcend space? Do the “Worlds” she hits during her free fall also contain Heavens? Similar questions could be asked of “Being” in line fourteen, which is shrunk, synecdochally, to an exterior part of the body, namely, a single “Ear” that is able to hear the “tolling” of Space, just as the bodies of the condemned in “In the Penal Colony” are said by the officer to shrink to an organ that is able to hear. Is “Being” here Being as such, or simply the poet’s “Being”? If the latter, is Being coextensive with the poet’s body, or only a part of it, and if the latter, does it exist inside or outside the Brain where the funeral in the poem is said to take place? Is the “Ear” of Being that hears the tolling of space the same organ that hears the “creak” across the “Soul” in line ten?

    These questions are no less undecidable than the ones raised in Kafka’s stories. Whatever one’s answer to them happens to be, it is clear that what “Funeral” tracks, no less than “In the Penal Colony,” is the instability, reversibility, and even collapse of received notions about interiority and exteriority into a vertiginous mise en abyme. Like its speaker, the reader of “Funeral” arrives in the last line of the poem “Finished knowing” exactly what to make of it. But Dickinson is too clever and profound a poet to allow the consolations of a skeptical “suspension of judgment” to have the final word. Instead, with the last word “— then —” she elects to deepen the mystery, using her signature dashes in both the ways Vendler described — at once suspending being and continuing the action — to add a dimension of temporal paradox to the spatial paradoxes of the preceding nineteen lines.

    Yet the manuscript of “Funeral” suggests that Dickinson was also of two minds about where the poem should be located. She crossed out “Brain” and wrote “Soul” in the margin, only to cross out “Soul” in turn and retain her original conception. Perhaps the reason was nothing more than to preserve the appearance of “Soul” in line ten that introduces, via the rhyme with “toll” in line twelve, the metaphor of the bell in line thirteen. But the ultimate choice of the finite material substance of the brain rather than the infinite metaphysical substance of the soul as the envelope for “Funeral” strikes me as the right one, since it heightens both the poem’s pathos and its perplexities. In her poem Dickinson does not resolve — for herself or anyone else — the probably insoluble issue of the relationship between the mind and the body or the body and the soul, nor does she do so in the many other poems that return directly or obliquely to this theme. But her engagement with the corporeality and materiality of the self was unusual for the poetry of the time, something that is often obscured by the fact that the other poet to do so was none other than Walt Whitman. The discourses of contemporary physics, mathematics, botany, and anatomy all appear in her poetry, especially between 1861 and 1865, her années miraculeuses.

    Along with “Funeral,” the word “brain” appears in four other poems, including “The Brain — is wider than the Sky —” in 1862, where she writes somewhat blasphemously that

    The Brain is just the weight of God — 

    For — Heft them — Pound for Pound —

    And they will differ — if they do —

    As Syllable from Sound — 

    True, the word “Soul” appears more frequently in her Complete Poems, but along with “mind” it is usually presented through the vehicle of a corporeal metaphor, or shown as participating in some inextricable or inexplicable relation to the body. Of these, the one that caught my eye was a letter poem from November 1865 sent to Sue in Geneva, New York, to console her for the death of her two-year-old niece. “The Overtakelessness of Those / Who have accomplished Death,” Dickinson writes to Sue, “Majestic is to me beyond / The Majesties of Earth —.” But before it ascends to heaven, the soul “Inscribes” a message — “Not at Home” — “upon the Flesh.”

    Dickinson would send a similar message to her own cousins — “Called Back” — before her death on May 15, 1886. The words, taken from the title of a popular novel that Dickinson enjoyed, were later carved on her tombstone, the original writing surface for poetry. On the day of her funeral, construction had just begun on the Statue of Liberty in New York Harbor. Emile Berliner was in Washington D.C., at work on his latest invention, the gramophone record. Franz Kafka, her true inheritor, was a two-year-old boy in Prague. One day he will write a story that concludes with a visit to a tombstone, whose inscription reads: “Here lies the former commandant. His followers, who at present must remain nameless, dug a grave for him and laid this stone. A prophecy says that after a certain number of years the commandant will arise, and from this house lead his followers to re-conquer the colony. Have faith and wait!”

    Tel Aviv, June 24, 2024

    Suddenly a cry flew 

    out of nowhere, like the lash of a whip,

    piercing and sharp,

    waking us from a troubled sleep —

    furious —

    “Tell me, have you all gone mad? 

    Giving up on all this?

    Just like that, despairing already,

    Without a real fight?”

    “Leave us alone,” we said.

    “Let us withdraw into our heads 

    to mourn our dead

    until this thing passes away 

    that no words can portray. 

    We are like mutes beneath the weight of its pain, 

    before the horrors of our hostages.

    So let us be, just be,

    without understanding, without thinking, 

    until our looted land, our trampled land,

    our raped land

    stops hurting.” 

    For a moment the lights flailed.

    For a moment the tunnels wailed.

    The world was black and white.

    The world was coal and ice.

    In the middle of the night we got up to flee,

    my wife, my son, and me

    I bore the cry on one shoulder 

    and the hope on the other, 

    numbed and put under. 

    “How much more can we go on like this?”

    my wife whispered,

    so that the boy wouldn’t hear

    and be struck by fear.

    “Our high-tech filled the world with awe

    we were the start-up nation —

    but it turns out we had a flaw, 

    we were just the warm-up band 

    for the guy in the crowd with a gun in his hand 

    who said his bullets were blanks.”

    “Look,” whispered my wife.

    “This is how it happens. 

    This is what it looks like when it actually happens.”

    We saw —

    We saw long, silent convoys

    streaming from the mountains into the valleys

    and swallowed by ships that were swallowed by seas. 

    “It’s as if in a single day of atrocities this land became 

    too demanding, too much for us to handle,” my wife said in astonishment.

    “No, no,” scoffed a young man

    passing by on a scooter,

    with a gun in his holster, a shooter.

    “No, no, it’s that it took only a single day of fear 

    to extinguish — to rob you of —

    maybe you never had it —

    the desire for a land of your own.” 

    “It’s not that we’re running away,” 

    I said to my wife.

    “We’re just changing the arrangement of our elements,

    We’re just relocating, inward — ”

    Suddenly the boy spoke:

    “Yallah, rise from the ashes,

    from your school of fear and despair!” 

    So said our son, 

    as he grew stronger before our eyes.

    He showed us pictures from an album we did not recognize

    of a bloodied childhood, a war childhood, a ruined cradle, 

    images of a confiscated childhood. 

    “Because if we do not rise from the ashes now,”

    the boy said, “we will never rise again.” 

    “Or we will rise so different,” my wife said,

    “so strange and terrible, so hard and bitter —

    Foes —

    until finally we will no longer be those 

    they would be foolish to tangle with.”

    “This is the last minute,” the boy roared,

    “And even if it sounds trite, 

    for right now it is right,

    because right now rules are being written.

    Those who were left behind are leaving,

    those who were deserted are deserting. 

    Speak to me, father,

    give me breath.

    I’m nearly done for, father, I’m nearing death. 

    My soul is weary of the call-ups, weary,

    give me a hope, give me a reason —

    You are silent, father, 

    so I will speak for you:

    Men, women, now is the time to fight, 

    to go out into the streets at night.

    There is whom to fight for

    and there is what to fight for,

    because a gift like this, a gift from life, 

    we will never be given again. 

    No other state will sprout from this strife.

    It all now depends on you.

    This is the time to rise, to live,

    To be a nation or not to be —

    To be a human or not to be —

    There is for whom and there is for what —

    And everything is suspended

    over nothingness.” 

    Translated by Leon Wieseltier

    The Politics of Possession in America

    I saw The Exorcist not long ago, probably the bravest thing I have done in a while. The movie terrified me the first time around in 1973, and it did the same fifty years later. This time it got me thinking about possession. It made me wonder if milder forms of possession — no projectile green goo, no head twirling 360 degrees round — might be abroad in American culture now. Alas, there is reason to think so.

    The Exorcist, you will recall, is about the possession of a twelve-year-old girl, Regan, by a demon named Pazuzu. By the time Pazuzu is through with her, Regan has become something of a demon herself. She spits venom in every direction, at her mother, at her doctors, and at her mother’s boyfriend, whom in time she murders. Enter two priests — one old, one young — who perform an exorcism on Regan. The effort kills the older priest. The younger priest saves Regan by telling the demon to take possession of him and leave the girl alone. Which the demon does, but at a cost. Pazuzu sends the unlucky priest flying out the window to his death on a set of stairs still known in Georgetown, the site of the movie, as the Exorcist Steps. Good wins over evil, but the price is high.

    Regan’s possession turns her libidinous. When a group of clueless psychiatrists comes into her room, she elevates her skirt and roars “fuck me” three times in a deep, demonic voice. (Mercedes McCambridge does a wonderful job with the vocals.) Furious at her mother, she rises up from her bed and bellows, “Lick me! Lick me!” Wanting to get at the young priest, whose mother has just died, she screams, “Your mother sucks cocks in hell!” There’s more where that came from and plenty of it. 

    The last resort for Regan’s mother is to turn to exorcism after the medical doctors have let her down. The doctors are materialist, smug, and rather uncaring. One is not entirely displeased when Regan contrives to grab one of them by the nuts and administer a good squeeze. Suppose, though, that Regan’s mother had summoned not priests, not reductive psychiatrists, but a psychoanalyst — old school, from Vienna, let us say. What might such a figure say about Regan’s possession? 

    He would surely not be talking about demons. More likely he would be talking about some part of her own spirit or psyche that had taken Regan over. To the psychoanalyst, we are an uneasy combination of id, ego, and super-ego: desire, intelligence, and often irrational prohibition, to put it in compressed terms. (The picture has been bitterly criticized, but bear with it for now.) And those elements of the psyche generally exist in tension with one another. The id desires, but the ego and the super-ego do not always approve of its desire. Conflict occurs. In fact, conflict, sometimes mild, sometimes not, is the ongoing state of the human psyche. We want, but we also don’t want what we want, and there begin multiple riddles. For the Viennese visitor, character is conflict.

    Is it possible for one agency of the psyche to take over and suppress or even temporarily eliminate the other two? One can, I think, be taken over by the id. What we call the id is, in German, das es, which means “the it.” We are all inhabited by an identity-less creature, an it, that wants and wants. Generally, the it is mitigated by the ego and super-ego, but not always. 

    Can someone become identical with her it? This is what happens to Regan, or so our Viennese visitor would say. She becomes all libido (lick me!) and someone on the verge of puberty, as she is, is likely to have libido to spare. Dismiss all the special effects for a moment, and you see that Regan has become all brutal desire. She is identical with it. 

    The movie came out in 1973, and it is certainly a product of its time. What society, or some influential segment of it, was worried about then, and probably not without reason, were the potentially destructive effects unloosed by the voluptuary counterculture. The Washington section of the movie begins with a scene of a demonstration featuring irrational, chanting students, which is being filmed for a movie starring Regan’s mother. (Chanting can itself be a form of mild and temporary possession.) The religion of sex, drugs, and rock and roll was scary. What if everyone tuned in, turned on, and dropped out? Or simply went into a Dionysian rage? More to the point, for this movie, what happens when the great sexual unbinding reaches the middle class and the very young? You may need old-time Catholicism to fight it: fire with fire and all that.

    I have seen two different endings for the movie. In the first, Regan, who has purportedly forgotten everything that has transpired, sees the white collar on a priest and, not knowing quite what she is about, jumps forward and kisses him. (Chastely.) In the second ending, there is still the chaste kiss, but the priest goes off to lunch with a hapless detective who has gotten involved in Regan’s case. Church and state unite as a guard against future Dionysian explosions.

    The Exorcist is of a piece with the renderings, and maybe the reality, of the Manson Family murders and the bloody concert at Altamont. Both events were cast as cautionary tales about the 1960s. (Joan Didion made a career out of this.) It’s not all peace and love, Woodstock magic, and Rousseau-inspired bliss. If you let it all hang out, smoke weed, drop LSD, bathe naked, and chant along with Crosby, Stills, Nash, and Young, “Rules and regulations who needs them? / O throw them out the door,” then Sade himself is around the corner. We unyoke the dark side, as well as the bright. Those who summon Dionysus must understand that he is not only a liberator but also a destroyer.

    We are a long way now from the ‘60s reign of Dionysus. Id possession may erupt here or there, but it cannot be a point of cultural definition the way it could be around 1973, the year of The Exorcist. Our mythical God, if we have one, is Apollo — Apollo in his most reductive, cold, materialist guise. Apollo can be a wonderful force — he brings music and mathematics, architecture, and science. But like Dionysus, he can be corrupted. He can become obsessive, judgmental, cruel, even sadistic. When the mortal Marsyas challenges Apollo to a musical competition and loses, Apollo skins Marsyas alive. Apollo’s spiritual province, our Viennese colleague will tell us, is the super-ego. When it is inflamed, the super-ego judges us bitterly for not living up to the highest standards, for not being excellent in every way, for not being perfect. Like Apollo, it is prone to deliver punishment. Though it does not quite flay us, as Apollo was inclined to do, it attacks the hapless ego and causes depression, anxiety, and self-hatred in myriad forms. We live now, I believe, in the age of corrupted Apollo, the god of computers, in the age of the super-ego.


    Is it possible to be possessed by the super-ego? 

    From every direction now one hears the voice of hollow righteousness. It is monotonal, jargon-ridden, condemnatory, fierce. It calls others out for their crimes. It hunts out racism, sexism, homophobia, transphobia, Islamophobia: you name it. It is relentless, unforgiving, and cruel. It raves madly in the idiom of reason. Those who deploy it would give anything to be judges, with hanging prerogatives. These people are often suffering possession by the super-ego, just as a true-life Regan, circa 1973, would be suffering possession by the id.

    I am not saying that every person who endorses the social justice agenda is mentally ill and needs a shrink or an exorcist. People who hold these ideas sanely — and there are plenty of them — tend to be humorous, self-aware, and well-disposed to irony. They are serious in their commitments, but they don’t take themselves too seriously. Nor do I wish to endorse the perpetual application of psychoanalytical terms to cultural matters. These terms are best used to help suffering individuals begin to name and to wrestle with their problems. But there are times when a society, or a significant portion of it, takes on pathological qualities. (And we must not be cowed out of speaking about pathology.) In such times, we are lucky that we have analytical tools with which to diagnose what ails us.

    How, then, do you distinguish the sane members of the leftist tribe from the whackos? You start to recognize both versions — the sane and the pathological — by their voices. The sane speak with self-awareness and usually some self-doubt. Irony infuses what they say, as does a willingness to listen to counter-arguments. The whackos speak in one tone of voice, a tone that a dictator would be proud to deploy. (For what is a dictator but a non-stop, obsessive talker who is always right?) It says, I am authority, I am truth and power, listen to me. Their voices sound mechanical, robotic. They sound, in a word, possessed. And they are possessed, by their super-egos. 

    They create manifold problems. For a start, they are in pain themselves: they are being tyrannized by their own over-I’s. One way they find respite, a brief release from their torments, is by deploying their super-egos against others. Of course, then the super-ego redoubles its force, and the scolder scolds himself even harder than before. So back, then, to denouncing others! This exercise resembles what happens when a thirsty wayfarer on the sea takes a slug of seawater. Relief and release, but only for a minute. Then one is thirstier. Remedy? A few more gulps of sea water. Then: repeat cycle.

    The super-ego-possessed are dangerous also for another reason. They have the ability to inflame the super-egos of others. The reasonable college president or CEO hears the rant and finds herself guilty and confused. These people, who give in to the radicals and the inquisitors and the nut-jobs, have been called cowards. I am not so sure. I think they have failed to understand that ours is an age of ersatz authority, which can strike the irrational parts of ourselves as quite real authority. They do not understand that reason can be unreasonable. (Thus no straight and humane and non-lawyerly answer was given to the question of how the university presidents would respond to calls for a genocide of the Jews.) Super-ego possession is contagious.

    What is to be done? One thing is sure: you cannot talk to a super-ego-possessed person. It is tempting to try. They sound reasonable. (They often tend to be quite smart, ace test-takers, high-I.Q., in a good-speller sort of way.) But don’t try it. You are wasting your time. As the senior priest tells his younger associate in The Exorcist, don’t converse with Regan. Don’t negotiate with a demon-possessed kid. So conversation is off the table when possession — id possession or super-ego possession — rules the roost.


    Let me pause for a moment and look in the other direction, the direction of the American body politic that is in its way possessed by Donald Trump. This form of possession is different. We’re not talking about people trying to look, act, and sound like him. No, it’s something else entirely. 

    If the two priests from The Exorcist looked at the extremes of the Trump phenomenon with a cold eye, I suspect that what they would see would be a case of idolatry. “I am the Lord thy God,” the Book says, “thou shalt not have strange gods before me.” And this indeed is a strange God. But people do worship him, and in considerable numbers. A woman at a Trump rally in Iowa, awaiting the arrival of her master, said, “I’ll probably start crying. I’m going to start crying now. When he comes out on stage and I can see him face to face, it’s going to be the best day of my life. I love that man.” This is the language of worship, not of relatively sane political endorsement. 

    Is it entirely unfair to link the face-to-face business to Paul’s words in Corinthians: “For now we see in a mirror dimly, but then face to face. Now I know in part; then I shall know fully, even as I have been fully known”? (It probably is unfair.) But we can perhaps agree that this is not the language of thoughtful political allegiance, but a far more religious idiom. And, no doubt, this woman is not the only one who feels religious awe in the presence of Trump. Listen to another devotee: “Trump works for God. And God is all about America. And God is the one that’s gonna save America. He’s using Trump as one of his tools. That’s why we support Trump, because he works for God.” Trump seems closer to Jesus than to God the Father in this rendition, but the point remains.

    Idolatry, at least in the Jewish and Christian tradition, means worshiping a substitute for the one true God. Usually, it is a substitute that you can see or touch or hear, that is available to the senses. It seems that loving and serving an invisible and immaterial god is a psychological burden. A golden calf is better: more readily seen, more readily petitioned and celebrated.

    What the priest calls idolatry, the psychoanalyst names differently. What he sees is an irrational fixation on a leader, but not just any leader. To Freud, writing in Group Psychology and the Analysis of the Ego, there is a certain sort of leader who draws obsessive, even pathological, allegiance. The leader must love only himself, and others only insofar as they further his ends. He must be, or be perceived as, completely self-reliant. He needs nothing from anyone else. He trusts his own thoughts to the exclusion of all others. And he holds himself with a regal seriousness. He does not ever criticize himself. He never admits that he was wrong or could be wrong. He is inevitably male.

    He acts as something of a hypnotist on his audience, lulling them into worshipful compliance. He casts a spell. Then the clincher: he puts himself in the place of his followers’ super-egos. He absolves them of the burden of thought. They no longer need to decide what is good and bad, right and wrong. He knows, and will tell them, and what he tells them is true. They have traded their mental autonomy, yes; but in return they have received confident assurance, peace of mind. The conflict that comes with having a tri-partite self lessens and sometimes disappears. One is made whole. One is resolved and happy. 

    I do not mean to say that everyone who endorses Donald Trump is psychologically ill and in need of treatment. Two lawyer friends of mine, among the more intelligent people I know, endorse Trump for clear and cogent reasons. They find him humanly objectionable, but they believe that on a number of issues his instincts are good. They like his views on trade; they like his views on borders; they like his reluctance to use military force. This election cycle, they will probably endorse his views on crime. I believe these people are mistaken for a dozen reasons. But you can talk to them. They give reasons, they take your thoughts in, they respond to them humanely, and usually with good humor. They are not possessed by Trump. They have not put him in the place of their super-egos. Psychological terms will not suffice to illuminate their thinking.

    Surely the assassination attempt is going to augment Trump’s standing as a deific figure among his worshippers. To give only one example of the many sacralizations of Trump after the shooting: shortly after the event, Ben Carson, who ran against Trump in 2016, said, speaking at the Republican convention, “First they tried to ruin his reputation and he’s more popular now than ever. Then they tried to bankrupt him, and he’s got more money than he had before. Then they tried to put him in prison and he’s freer and has made other people free with him. Then last weekend they tried to kill him and there he is over there, alive and well.” That last “they” in “they tried to kill him” is remarkable. When Carson was talking, no evidence of any accomplice or plot had been found. The shooter was only a scrawny, confused, and lonely kid, but Carson’s “they” suggests that there are titanic forces ranged against the god-king. And he will defeat them, inevitably. Carson reached for the Book of Isaiah, addressing Trump directly: “No weapon formed against you shall prosper.” We are no longer talking about a common mortal.

    As a rule, I dislike using psychological terms to analyze culture. Those terms often work well enough to describe people who are ill. (They seem to me to have little or no purchase on the healthy and thriving.) But we have a situation now in which a significant segment of our population seems to be, well, possessed. Some so-called progressives are captive to a set of often extreme ideas; some right wingers are captive to the person of Donald Trump, a rogue patriarchal figure. Who would care, really? Except that for various reasons these people, who are probably more to be pitied than condemned, have gathered an extraordinary amount of cultural power in present-day America. 

    Is there anything to be done about that? Yes, but the response needs to be thoughtful and humane. We need to get used to hearing and deploying the term “super-ego possessed” to characterize those who have been taken over by progressive jargon. These people do not speak, rather they are spoken. They are possessed by a certain set of values and words. (Some of which, sanely deployed, are perfectly good values and perfectly good words.) Those in authority, the college presidents and the CEOs and all the rest, need to be able to identify this phenomenon and call it what it is. As in, “Yes indeed, he’s super-ego possessed. Can’t really talk to him and shouldn’t have to listen.” Others sympathetic to progressive ideas need to know how to spot the possessed in their midst and stop transferring power to them. They seem so fervent, so committed, so in love with justice. They can seem an inspiration. And they tap so effectively into the super-egos of others. It will take time to see which progressives are reasonably sane and which have become possessed. But as Freud said, the voice of reason is a still small voice, and it will not rest until it gains a hearing.

    The super-ego-possessed leftist is in thrall to a voice and to ideas that live internally. He has become possessed by the discourse and it speaks him. The Trump worshiper has put a man, Trump, in the place of his super-ego. Not only can Trump do no wrong, he can always tell his worshipers exactly what is right. I’m not sure there is much of any cure for this malady. But one can point it out and see what happens. It is embarrassing for a full-grown woman or man to internalize a paternal figure. It is an admission that one has not grown up. I sometimes tease a couple of my Trump friends by asking, “Is Trump still your daddy?” They are good-humored people and respond in kind.

    What makes someone susceptible to possession by a super-ego figure? I am not entirely sure, but I can make a guess. Every year for some years, a hypnotist visited the University of Virginia, as part of student orientation. He brought people up on stage and began trying to cast his spell. He would start with a dozen or so, but soon he would send at least half back to their seats. The others, he saw, were susceptible to hypnotism. I knew a few of the kids who were kept on stage and invited to dance, sing, and fall in love with each other. They tended to be dreamy kids, emotionally alive, but not given to strong cerebration. They were the kinds of people who were, give or take, born stoned. They did not rely much on cognitive function, which our Viennese friend would associate with the ego. They were quite willing to accept the voice of the hypnotist as their inner voice of conscience — as their temporary super-ego. 

    I think that those who yearn for a super-ego figure, a Trump or some other authoritarian leader, often tend to have moderate intellectual faculties at best. They live in a world of baffling complexity and do so in a state of confusion. This state is constantly fed by the Internet, which is a confusion machine. So many voices, so many views; cacophony wherever you look. Plenty of lies, plenty of errors. It can be a pleasure to learn from the Internet, but you had best have acquired your basic knowledge from something that is not the Internet. A book is a good idea; five hundred (good) books are better. If you do not really read or cannot read well (a common condition, actually), your head is likely to be a blooming, buzzing confusion. Half the people in our country, by definition, have an IQ of less than a hundred. They have been tossed into life at a time when the world can be a joyous place indeed, but also a rankly complicated one. And along comes a figure who clears up the confusion. He relieves citizens of the burden of intellectual activity that every open society in some measure imposes upon them. He knows what is true, right, and just. He tells you the idealized past is better than the chaotic present, and he gives you the hope that we might return there. He is never wrong. Not for nothing did the ever-shrewd Trump say that he loves uneducated people.

    There is no exorcism for this sort of possession, or at least none that I know of. Calm conversation, compassionate engagement: these things may help. But let’s not get our hopes up. When my students go off to their holiday dinners, where they will often meet up with a relative who is a fountain of Trumpism, I offer them some advice. Don’t weep, don’t wail, don’t leave the room. Respond with a simple question: What made you come to believe what you do? Then listen hard. Sometimes it works. (Sometimes.) When it works, the steam of fanaticism dissipates just a bit. But possession is the antithesis of persuasion. 

    It is no great wonder that people exist who are prone to possession; it has always been so. Our own case is unusual in part because we have powerful means for the possessed to amplify their thoughts. And for whatever reason, all too many of us are susceptible to their influence. As I say, I don’t much like flourishing the language of Vienna to describe American politics now. I would far rather speak the language of Washington and Philadelphia, of the Declaration and the Constitution. I hope that soon the psychoanalytical idiom will go back to its proper home in the learned journals and the consulting room. For now, though, what we see before us leaves little choice.

    We need to understand where we are. Let us defeat fanaticism with analysis, with ideas, if this can still be done. And also with a renewed compassion. For fanaticism is alive in the land, and on the move: its power to do harm is inexhaustible.

    The Rise of the Barbarian Right

    It’s strange how life can sometimes mimic literature. Consider the story of Jonathan Keeperman, which in crucial ways recalls American Pastoral. Like Philip Roth’s novel, it is a story of how mad ideas can take hold when history unsettles familiar normative coordinates, and when children confront a more dimly lit world than the one faced by their fathers. Even some of the basic details are reminiscent of American Pastoral. Jonathan Keeperman’s father, Fred, came into the world in 1948 at Brooklyn’s Maimonides Hospital and spent his early years in Brownsville. The family owned a candy store on Pitkin Avenue, and soon Fred was immersed in the “colorful cast of characters who inhabited the immigrant Jewish community into which he was born,” as his obituary put it. (He died two years ago.)

    The family moved to the eminently Rothian town of Metuchen, New Jersey. Fred joined his high school’s varsity wrestling team, and this in turn won him an athletic scholarship to Knox College in Galesburg, Illinois. There he met his future wife, Rita, a Galesburg local and Catholic-school grad “who taught Fred how to bale hay and put a cow back in the barn,” per the obit. After graduating from college, Fred became first a special-ed teacher and then a junior-high vice principal.

    In 1976, ambition beckoned him to the Bay Area. He went into business with his uncle and finished an evening law degree. Eventually, the family made its forever home in a cul-de-sac in Moraga, a lush, quiet suburb of San Francisco. Fred ran his own small law office, where Rita would serve as the business manager. On the side, he coached sports and led the local education foundation and baseball association, among other civic groups. Fred and Rita Keeperman, in short, enjoyed a full measure of the stability and social capital which were the boomers’ historical inheritance but which would elude later cohorts.

    Jonathan Keeperman was born in Moraga, the third of Fred’s four children. He earned a master’s degree in creative writing at the University of California, Irvine, and would teach as a non-tenured lecturer at the same institution for more than a decade, from 2009 until 2023. During his time at Irvine, Keeperman honorably defended the free-speech right of the campus Republicans to invite an obnoxious speaker, and helped to lead efforts to organize his fellow itinerant instructors under the auspices of the American Federation of Teachers. Writing that instructors “make up the highest percentage [of adjuncts] among all the disciplines in the system,” Keeperman told California Teacher, an AFT publication, in 2016 that “we wanted to look at the labor practices from campus to campus.” He complained of the low pay, the arbitrary power wielded by administrators, and the insecurity that defined the careers of adjuncts. In doing so, Keeperman channeled the anxieties of the educated precariat, which were to propel millennial socialism and the movements associated with Bernie Sanders and, a little later, the Squad.

    Yet Keeperman’s radicalization in those febrile years ran in a different direction than might have been expected from someone of his background. Unlike Merry, Seymour “The Swede” Levov’s daughter in American Pastoral, who swerves to the radical left — all the way to the Weather Underground — in opposition to the Vietnam War, Fred Keeperman’s son has emerged as one of the stars of the “dissident right”: a loose constellation of pseudonymous intellectuals and social scenesters who promote a combination of IQ-based eugenics, the worship of strength, and lifestyle self-help. 

    Roth’s Merry directs her (literally) explosive rage against America’s postwar military-industrial establishment — a discrete and familiar bogey for boomer progressives. But Keeperman and his cohort, the dissident right, identify a more fundamental force as the oppressive enemy: democratic egalitarianism, with its supposed denial of human difference, its general tendency to cut down the high to succor the low. They blame it for pervasive censorship and the H.R.-department quality of modern social life; for the snuffing out of excellence and the “disequilibrium afflicting the contemporary social imaginary,” as Keeperman has written. 

    The dissident right would bury the mildly egalitarian brand of conservatism that in the last century made its peace with equal human dignity, even seeking to extend it to subjects derogated by progressive egalitarianism, such as the unborn child. That conservatism was anchored in the “Judeo-Christian” consensus of the postwar era — a consensus that is fast slipping away, along with the shared moral memory of the horrors of the first half of the twentieth century. The dissident right would replace all that with a more heroic landscape trod by the aristocratic spirit: the one who designates value for himself, hindered neither by the demands of the dysgenic many, nor by popes and priests, nor still by the oozing tyranny of the primordial feminine — the ultimate source of democratic egalitarianism. To hell with your sacred victims, bellows the master subject of history, the noble barbarian, the online Übermensch, as he smashes down the female-dominated egalitarian order. In more extreme versions — Keeperman himself stops well short of these — the noble barbarian might also proclaim: Total N——r Death! (one of their grotesque trademark chants).

    Shocking stuff. Except, as we will see, for all its capacity to create rhetorical disturbances, the dissident right is merely affirming — in vulgar rhetoric and hateful imagery — the IQ-obsessed, biopolitical future that is already being organized under existing market societies. 

    At some point in the 2010s, Keeperman adopted “L0m3z” as his nom de plume et de guerre, an online persona who could say things that a non-tenured academic could never get away with. Such as: 

    “Lamppost” — meaning, hang or lynch — “the journos.”

    My enemies are dysgenic freaks.

    The sheer tonnage of human filth is overwhelming. An assault on the senses, on the basic right to decency and peace of mind. How does one walk through the cities and not be constantly and involuntarily muttering under his breath: “Billions…billions…”?

    That last bit is a reference to a social-media meme featuring a frowning, bespectacled figure who, aggrieved or put-upon, declares that “billions must die!” Keeperman/L0m3z would weave much of his output from the memetic threads that normally cocoon online subcultures. (The process should be called memesis.) He softened the material, making it more accessible to a wider audience of conservative “normies,” even as he added a dash of literary flair, as might have been expected from a veteran of one of the nation’s most prestigious writing programs.

    The Travis Bickle-style sentiment expressed by L0m3z — his desire to see a filthy human mass washed away from the face of the earth — is undercut by the fact that the “billions must die” meme-guy is supposed to be taken for a sad-sack creature. Such half-facetiousness is a central feature of the dissident right, a humanizing fig leaf. The movement’s political claims are rarely expressed in earnest or systematic fashion. Rather, arguments are advanced precisely via the joke and the adroit compiling, rejiggering, and interpolating of an existing set of highly mobile memes and symbols.

    The memetic joke serves multiple purposes. For one thing, it shoos away those too dull or too moralistic to get it: that “we” don’t really mean it when we say that billions must die, and also sort of do mean it — wink. More mundanely, the joke supplies a built-in defense mechanism against would-be cancelers and doxers, that is, those who would reveal the real figures lurking behind the pseudonymous avatars. L0m3z himself was recently unmasked as Keeperman in a Media Matters-style exposé in The Guardian, which denounced him for, among other things, reissuing Ernst Jünger’s The Storm of Steel under Passage Publishing, the imprint that he founded in 2021.

    Given the emphasis on edgy jokes, the dissident right’s voice can blend with that of other groups of online shitposters, all contributing their share of noise to the social-media cacophony. But Keeperman & Co. are a distinct group, in the business of articulating a distinct worldview. That worldview cannot be understood as highbrow Trumpism. For the dissident right has little to do with the populist upsurge that has engulfed most developed democracies since the mid-2010s, even if the same structural forces have provoked both. For one thing, the social base of the dissident right lies not in Trump country — not in, say, the Rust Belt or Appalachia — but among a segment of the bicoastal professional class. The core group of ideologues is composed of higher-education exiles, “independent scholars,” and non-tenured academics. Arrayed around them are concentric circles of artists and fashionistas, tech and finance bros, podcasters, fuckupnik heirs, and the like, most clustered in Lower Manhattan, Miami, and the Bay Area. Some of these characters could be described as financially stressed, but others are perfectly affluent.

    Sociologically, the dissident right has more in common with the urban left than either camp does with Trumpian America. Indeed, the movement and its hangers-on include not a few former Democratic Socialists of America types. There are, for example, Anna Khachiyan and Dasha Nekrasova, cohosts of the Red Scare podcast. Once a bastion of irreverent vocal-fry Bernie-ism, Red Scare now worshipfully covers the likes of Steve Sailer, the amateur race scientist.

    Sailer, who popularized the term “human biodiversity,” is the author of America’s Half-Blood Prince: Barack Obama’s “Story of Race and Inheritance,” which appeared in 2009. More recently, Keeperman’s imprint has published a collection of Sailer’s old columns and blog posts. The volume is studded with such hard and brilliant gems of racial pseudoscience as: “Barbados, despite an average IQ of 78, is one of the most pleasant countries in the 3rd World due to its commitment to maintaining a veddy, veddy English culture”; and “since there are so many unmarried Asian men and black women, they should find solace for their loneliness by marrying each other. Yet, when was the last time you saw an Asian man and a black woman together?”

    Many of the dissident right’s leading personalities, moreover, are what I have called “off-white ethnics”: Jews, Armenians, Romanians, Arabs and North Africans, even some Indians. This frequently puts them at odds with more straightforward white nationalists and anti-Semites — such as the “Groyper” movement led by the flamboyant video-caster Nick Fuentes — who write off even the Poles as “barely white,” let alone a “Mischling” like Keeperman. For their part, the dissident rightists consider unalloyed racial nationalism déclassé, an affront to good taste and a refuge for “low-IQ” white people, whom they hold in almost as much contempt as they do blacks. If you are an intelligent Jewish-American urbanite who wants to play around with certain Nietzschean and eugenic themes, you aren’t going to join tiki-torch-bearing marchers chanting that “the Jews will not replace us.” No, you turn to the dissident right.

    These tensions are not just a matter of ethnic- and class-based rivalry among different groups of haters. They bespeak serious ideological differences. The Groypers are uncomplicated racial-fascist goons: The Jews have orchestrated mass migration, porn addiction, and foreign wars to break our organic unity and weaken our people, etc. The dissident right is a much more complex beast, capable of entertaining sophisticated visions of the political order that might replace the current one, the better to serve the creativity of “natural” or IQ aristocrats.

    For a glimpse of these visions, consider After the War, an anthology of dissident-right flash fiction released this year by Keeperman’s Passage Publishing. In keeping with the dissident-right style, all but a couple of the forty-four contributors appear pseudonymously. This, combined with the short length of the stories, makes reading the book feel like scrolling down an especially freaky X feed. And that’s the point. As the writer “Zero HP Lovecraft” notes in the foreword, 

    a flash-fiction story is short enough that you can conceivably read one story, not only in a single sitting, but in a single interval of consciousness, with no momentary discontinuity. If you grit your teeth and muscle through it, you can read a whole two pages without even once switching contexts to check your social-media feeds.

    Zero HP Lovecraft — the pseudonym is a portmanteau of the name of the Rhode Island horror pioneer and the video-game notion of having zero “health points” — is one of the most virulently racist characters in the dissident-right sphere. “No shit, I’m racist,” he has confessed on the X app. “I have recorded entire podcasts about why blacks are dumber and more violent than whites. I have advocated for [the] mass deportation of anyone darker than cappuccino.” He complains of the “negroid warbling” — rap, hip-hop, and R&B — that supposedly permeates contemporary public spaces. He declares: “I don’t have DNA. I only have TND” (that is, Total N——r Death).

    He is also the author of a handful of strikingly inventive horror stories, written in the tradition of his literary namesake but updated for the age of surveillance capitalism. His gift lies in conveying the subjective experience of people in these milieus — cryptocurrency speculators, “fin-tech” specialists, unemployed online edgelords, and the like — in crackling prose: “We imagined ourselves as samurai-sword VR pirate pioneers, but it turns out we’re pointless argument vegetables growing in walled gardens, harvested for the benefit of robots that serve us ads.” In classic Lovecraftian fashion, he then pulls back the curtain to reveal the hidden, cosmic-scale monsters — technology and capital — that dwarf and menace his human subjects, draining their life essence until the human itself has been rendered superfluous. 

    One can’t but feel a certain awe for the absolute bleakness of his worldview — a bleakness that sometimes bears the awful ring of truth. “They say the bit of folk trivia about a goldfish having a memory of three seconds is just that,” he muses in the foreword to After the War. “But there’s still something so poignant about this image. Trapped in a glass bowl, watched on all sides, an attention span of three seconds: that’s me, that’s you.” Thus Zero HP Lovecraft laments, in substance, the tech-driven social phenomena that the anthology reflects in flash-fiction form.

    The authors featured in After the War aren’t nearly as perceptive about technology or as fantastically imaginative as Zero HP Lovecraft. Many deploy hackneyed tropes that are the mirror image of the didactic woke-ism that mars much mainstream fiction: if cartoonishly repressive white males supply the grist for the Big Five’s moralizing mill, here it’s the fat, blue-haired, rainbow-pin-wearing, they/them-pronouns-using police officer and similarly left-coded authority figures who serve as easy foils for the protagonists. Other stories are too caught up in insider symbology and jokes to rise to any degree of universal literary merit. A handful are unquestionably clever, however, combining grim humor with memorable conceits made all the more discomfiting by the authors’ foul politics.

    As its subtitle, Stories From the Next Regime, suggests, the anthology invites the reader to envision what it would mean to overthrow our current political, economic, and cultural arrangements in favor of an order more conducive to the adventure and excellence for which the dissident right yearns. The mood is one of anticipation and triumph, even if, in many cases, it is really the nihilistic sense of triumph felt by the one who burns everything down.

    Most of the stories are in the science-fiction genre, with the very best of them managing to pass off their uncanny and disturbing future scenarios as humdrum reality for their characters. Philip K. Dick was probably the grandmaster at generating this effect, and his influence is felt heavily throughout. Indeed, the anthology as a whole could be described as Dickian — that is, if Dick had cheered for the Nazis in his novel The Man in the High Castle, from 1962, now better known as the basis for Amazon’s television series. The stories generally fit inside a limited thematic matrix — a sign that they arise from within a coherent and fairly well-developed ideological movement, rather than a literary scene or sensibility. I notched the recurring themes in the back of my copy as I was reading it, and was struck by how infrequently I needed to come up with new categories for my counting system. Taken together, the handful of categories can double as a guide to dissident-right ideology.

    It should come as no surprise that a significant plurality of the stories involve race, eugenics, “natural” or genetic aristocracy, physiognomy, and the centrality of bloodlines. “The Frowners,” by a writer who goes by “Degree Studies,” is typical. The story dully restages Steve Sailer’s notion of “noticing”: that hereditary differences among large human groups — delineated by race — should be obvious to anyone prepared to take off the lenses of egalitarian piety. The protagonist is a scientist from Earth who, in the distant future, visits Jupiter’s moon Ganymede to present the results of his research into the local xeno-culture. It turns out that “there were distinct physical characteristics of those engaged in violence” among the Ganymedians: namely, “a distinct downward tilt to their face,” absent among the peaceful. But the Ganymedians are not prepared to hear this hard teaching. One of their own scholars pipes up that “there is no reason to believe” that frowners “are innately more violent. In fact, we believe the frown is caused by exposure to violence and injustice.” Substitute skin color for the Ganymedian frown and . . . get it? Noticing!

    The yearning to expose the immutable hierarchy of human types likewise animates “The Pasture,” by “Meta Prime.” It envisions a future in which humans are socially categorized as Sheep, Camels, Lions, and Children. The latter three designate the three-stage metamorphoses of the soul in Thus Spoke Zarathustra. In Meta Prime’s story, only those born as Camels, Lions, or Children are afforded respect, while the Sheep are placed at an early age in a space known as The Pasture, where they are digitally surveilled and cared for. Parents of Sheep hope desperately that their children might graduate from The Pasture into a higher status, thus “transcend[ing] their lineage” and “proving [their] soul to be something more than that of [their] ancestors.” But this rarely happens. Usually the Sheep leave The Pasture and enter society as Sheep. “Even the families who were able to get their offspring placed in ideal positions in The Pasture had little hope of making a difference… They had to live with the fact that souls of their caliber had not built society… and only now had the opportunity to enjoy it by the good graces of the Children, Lions, and Camels who had sacrificed before them.” 

    The message is unmistakable: our actually existing society is maintained only thanks to the efforts of a hereditary or natural aristocracy, whose superiority in talent and responsibility is simply inaccessible to the many. Meta Prime only wishes we had the courage to admit this bare reality and thus furnish the aristocrats with the rewards of rule and respect naturally due to them, while disabusing the small-souled many of their grubby hopes of social ascent or equality. 

    “Blood Ties,” by the writer “Mythpilot,” adds an interesting geopolitical twist to these hereditarian concerns. The story pictures James, the aristocratic ruler of a Western country, retiring to a rare moment of intimacy with his wife, Anna, after the couple has publicly announced the impending marriage of their daughter to a son of the Russian nobility. Delighted about this union, James and Anna reflect fondly on their own, a “grand romance,” a match made in heaven — or more precisely, by “biopolitics,” as Anna reminds her husband. Now, a similar fusion of noble lines promises to seal the peace between their country and the Russians — “the people who once tried to bomb us,” Anna notes. James waxes philosophical: “It’s a new world, darling, and it belongs to us. We used to trust in pieces of paper for peace. Now we trust blood… Peace has a price. Blood for blood.” Anna concurs: “Despite my mother-sadness, I think it’s a marvelous thing. Blood is human, blood is warm, blood is us. This can be the beginning of a new age, a human age.” The “old world” wasted itself for the sake of abstract principles, when all it took to establish harmony between nations was to restore the premodern politics of “personality” and intermarriage. The countless victims of Europe’s monarchic wars could not be reached for comment.

    If Mythpilot offers a farcically gentle vision of a world order built on blood and genes, a contributor who goes by “P.C.M. Christ” — the nom de plume’s theological significance will become clear in a moment — harbors no such illusions. In “Sins of the Fathers,” at once the collection’s most impressive and repellent story, P.C.M. Christ takes the dissident right’s eugenic politics to their terrifying, and ferociously anti-Christian, terminus. 

    In the future, all children born with intellectual or physical disabilities are banished beyond the pale of civilization, so that neither the family nor the state has to bear the burden of “a life that could never recover from its handicaps.” Tyler is one such child. While his older brother, Harrison, is a genetic marvel of intelligence and athleticism, Tyler is “half-formed and hideous” and “crippled by various mental disorders.” Agents of the state are headed over to collect him for removal, and his mother and father must make a choice. Parents are permitted to accompany their children into exile, on the condition that they never return to civilization. Tyler’s mother, Mary, is already firmly resolved: she is staying with Harrison among the healthy. But the father, Chris, is torn. “I need some time, Mary,” he says, but he only has thirty minutes to decide. And then: “I’m going with Tyler.” It is a touching moment, all the more notable for the elliptical brevity with which the author describes it. Yet it soon becomes clear that, for P.C.M. Christ, the father is a contemptible character. 

    Transported with Tyler to a camp far outside the city, Chris finds himself forced to permanently undress and to submit to the egalitarian diktats of a figure called simply “Mother”: a cross between matriarch, corporate diversity manager, and female correctional officer. “Our way of living is communal,” Mother instructs Chris. “Care, sex, love; all are given freely and broadly…We do not allow anyone to be above another.” Again: “There is one sin in this community, and it is unforgivable — that you would place yourself above the group.” Since the community is founded upon a constitutive “weakness,” Mother adds, those who dare inject strength or excellence or hierarchy face “unrelenting punishment,” lest they jeopardize communal “safety.”

    Skeptical at first, Chris learns to accept this state of affairs. So much so that after six months, his male breasts begin to emit milk — “a sign, he was told, of his growing empathy.” Chris’s transformation from husband to milkmaid reaches its apotheosis when a family with a Down syndrome daughter approaches him, and he happily breastfeeds the lot of them, as their bodies and feelings melt into each other. Tyler, meanwhile, shows no improvement, “only growing more and more demanding” over time. But Chris counts himself blessed, feeling “righteousness through pain” and — P.C.M. Christ adds with all the Nietzschean venom he can muster — “gratitude for the hobbling weight of a cross to bear.”

    In addition to its eugenic dimension, P.C.M. Christ’s story introduces a second major theme of the anthology and, by extension, of the dissident right: namely, a fear of female power as the biological engine of social egalitarianism. The idea goes back to the movement’s leading thinker, Costin Alamariu, the Romanian-born political scientist behind the pseudonym-cum-online persona Bronze Age Pervert or BAP (complete with a broken faux-primitive syntax). Alamariu’s Yale dissertation from 2015 — published independently last year as Selective Breeding and the Birth of Philosophy — contends that philosophy at its classical origins was foremost concerned with “the problem of breeding.” (The book briefly topped the Amazon charts.) Eugenics, good breeding, “the standard of nature” as conveyed by Plato’s and Aristotle’s endless talk of excellences among horses and other animals — all this for Alamariu represents the rebellion of the noble barbarian against the female-led egalitarianism that is society’s default form. Alamariu, and after him Keeperman and dozens of lesser dissident-right figures, use the metaphor of the “longhouse” — the communal living space supposedly associated with sedentary agricultural civilization — to represent this matriarchal despotism.

    In “Sins of the Fathers,” the camp for disabled children ruled by a female disciplinarian — promoting equality even as she flexes her own power — is an obvious longhouse. The same complex of ideas appears throughout the anthology, with the authors variously equating femininity with collective organization, campus diversity nostrums, anti-racism and the removal of Confederate monuments, and the general sapping of vigor and vitality. Yet the contributors to After the War are seemingly divided over the prospects for resistance, with some staging misogynistic orgies of male triumph, while others merely assert a male right to pleasure as the condition of perpetual female domination.

    The opening story by “V.N. Ebert” — one of the more drearily on-the-nose pieces — is narrated by a spacecraft pilot in a future moon colony. “The Moon had been a libertarian thing” at first, he tells us. But the colony is now menaced by the same creeping bureaucratization that long ago suffocated adventure and heroism on Earth. A visiting female activist embodies this threat. “She didn’t acclimate well,” the pilot observes. “Wanted to organize, whatever that meant.” Her recruitment fliers “had a rainbow flag I remembered from down there.” The girl rails against “exploitation and violence,” but our pilot pays her no heed, and she finally abandons her mission and returns to Earth. Noble barbarian 1, female busybody 0.

    “Genesis Revelation,” by “Mencius Moldbugman,” transmutes the male triumph over female power into an insane Burroughs-esque mythscape. Here, the horrors of the longhouse are laid bare. It is a “dark prison” whose walls are smeared with “the blood of weak men, the blood of men cowering wild-eyed.” Overseeing this prison are the “doe-eyed but dangerous women” whose power has depended upon the “suffocation” of men down the ages. Our protagonist, identified as “the warrior,” enters the longhouse bent on ending this female despotism. “With purpose, the warrior strode toward the nearest woman and grabbed her.” Defenseless before such boldness, “the woman leaned back and opened her legs, offering herself to her new master. Her sex thanked him for his strength and moistened with relief that her reign had finally come to an end. The other women took heed and did the same.” The dissident-right warrior screws his way to freedom, winning over all womankind except for “one old crone too bitter and barren to bear the blessings of his fruit.” But the harridan, too, eventually succumbs to the warrior’s determination, her everlasting NO to life drowned out by his everlasting YES. Finally, “the warrior stepped out of the longhouse, loyal mothers to his future sons in tow.”

    “A Big Man on Campus,” by “Noble Red,” is equally heavy-handed. On her way to Drag Queen Story Hour at Ruth Bader Ginsburg College in upstate New York, the freshman Margaret spots the only boy on campus: tall, handsome, “the most beautiful young man she’d ever laid eyes on.” A friend informs her that his nickname is Shakespeare — “because everyone gets to shake his spear.” The friend adds without elaborating that “he’s the college rapist, of course.” An official at the registrar’s office unlocks the meaning of this mysterious statement for Margaret. “The number one reason why young women go to college is to get raped,” she explains. Parents don’t intervene, “because they realized that their daughters claiming to have been raped was a marker of high status. But more importantly, it was and is a marker of political affiliation. It means you’re one of the right people.” 

    Problem is, “demand very much exceeds supply.” Enter “Shakespeare”: “We employ a low-status male to rape all the students. He doesn’t really rape them, of course. They just go to his room for thirty minutes and then allege that he did. We log the complaint, inform the authorities, file all the paperwork…” The boy never faces legal jeopardy, because the girls all profess to be “too traumatized” to go through with the ordeal of a police investigation and trial. As she enters his room for her own turn, Margaret thinks she can “save” the boy-prisoner of female desire and female politics. Noble Red’s story, then, ends on a note of pessimism when it comes to overcoming the longhouse, at least among the college-attending classes. Yet for the author, the miserable fate of “Shakespeare” isn’t lacking for erotic possibility: the girls hold the power to ruin him, even as they also hold his “spear.”

    In his own story, “Vampire Island,” Alamariu/BAP proposes a similar pattern for resetting relations between the sexes: not the total defeat of the longhouse in the manner of Mencius Moldbugman’s warrior, but a renegotiation of the terms of female domination. In the wake of nuclear war, the five hundred or so remnants of the LC (“Lewis and Clark”) battalion have taken refuge on Guam, enjoying plentiful food and abandoned fuel and living as equals in generous leisure. A bust of the “Blond Beast” is enshrined at the center of their camp.

    This tropical idyll is interrupted when a few of the men vanish while on excursions. Someone or something is kidnapping the soldiers. More specifically, “the most conspicuously handsome and fit were being picked off.” A party reconnoitering the nearby jungle in search of the missing learns the truth: their comrades have been taken prisoner — for the purpose of breeding — by a band of semi-civilized amazons. “Pumping relentlessly into the frenzied gripping pussies of the ecstatic amazons,” the captives are now cheered and now flogged by the “violent vampiric cum huntresses.” The horror, the horror! But there are too many amazons, and the men of the battalion resolve to sue for peace. Under the settlement, the amazons are to perform “no more than three extractions per day, and this only a week at a time, with days of rest in between, fed shellfish, pineapple, and cured wild boar by the amazons’ dwarf-like servant class.” For BAP, at least when he is scribbling trashy post-apocalyptic erotica, female domination, rightly ordered, entails male pleasure.

    Reconciling the contradictions in dissident-right sexuality might appear impossible. One mode of fantasy vents revulsion at female sexuality: as a mysterious and “natural” power in itself and as the source of egalitarian politics that must be vanquished by the noble barbarian. The other eroticizes the status quo of female domination: hence the recurrent figure of the lone and helpless male overpowered by groups of sexually dominant women. The through-line seems to be an inability to view sex through any lens but that of power and counterpower: it’s the brawny male brute or the dominatrix all the way down.

    Interestingly, BAP himself straddles the two modes of fantasy. As a thinker, he promotes the notion that matriarchy is the stifling default state of society, and can only be resisted through heroic male exertion. As a storyteller, however, he imagines amazons lashing his Aryan heroes — like Keeperman, BAP is part-Jewish — and forcibly extracting their lifegiving seed. Then again, BAP’s idol Nietzsche advised: “Going to a woman? Do not forget the whip!” — yet he was also photographed (with the German philosopher Paul Rée) harnessed animal-like to a cart carrying his beloved Lou Andreas-Salomé, a whip in her hand and looking knowingly at the camera.

    The largest share of the stories in this sickening volume is devoted to imagining the processes of social breakdown, civil conflict, frontier settlement, and population transfer that open up new horizons of freedom for the noble barbarian — or at least, that bring the current order to a close.

    “Mog the Urbanite” contributes a tightly composed — if politically chilling — tale about a pair of boys examining their grandfather’s collection of strange trophies. The ancient objects carry labels such as: “The Skin of Senator Molembek,” “Warhead From a Minuteman Missile,” and so on. But the one that most absorbs them has a faded label, and the boys can’t figure out what it is. It is a guitar, but the boys wonder if it is a weapon before giving up. Later, as they approach the “throne room” of “the Warlord,” they wonder why such an object would have a sticker on it that reads: “This Machine Kills Fascists.”

    In “A Whole World,” the writer “Golgi Apparatus” recounts the “informal invasion” and “second colonization of Africa” by enterprising privateers. The story, written in a breathless tone and cadence, describes the “CEO monarchs and eccentric pioneers” and “tech-bro caesars” who in the late twenty-first century manage to subdue the continent, once more, to the West’s undying Promethean impulse. The United States is in decay, but “in Conakry, a solitary genius known only as the Master rules through a network of undying mechanical servants — kept alive, some whisper, through a twisted Kabbalistic occultism optimized in a laboratory…” Ethiopia is under Mormon control. Mercantile hubs and “neo-Singapores” blossom across the Gold Coast.

    “Reconquista,” L0m3z’s own contribution, treads similar ground. More competently written than the others, it offers a faux-historical narrative about the future takeover of California from Mexican cartels by a militia amid the apparent breakup of the United States. After generations during which they could only dream of possessing “the birthplace of their grandfathers,” the militiamen can now claim any mansion they wish. But for now they are celebrating victory with a barbecue on the beach. I couldn’t help but recall that this fantasy of a Californian Reconquista — the recolonization of the former Golden State by its “indigenous” population — is the work of Jonathan Keeperman, son of Fred Keeperman of Pitkin Avenue, Brooklyn.

    Other writers foresee the present egalitarian regime staying in place, and its opponents either escaping to ungoverned spaces or else mounting special operations and a low-intensity insurgency. “Demeter,” a tightly crafted horror tale by “Detective Wolfman,” features a trucker who smuggles statues of forbidden historical figures such as Thomas Jefferson and Robert E. Lee to more tolerant places in South America and Eastern Europe. Interdicted by the villainous FBI, the smuggler reveals himself to be a vampire: the undying and undead Southern spirit, avenging the Lost Cause from beyond the grave.

    A related complex of themes — After the War is a rich document of contemporary political anthropology — has to do with male self-help and self-improvement. In “Under the Willow,” a writer who goes by “William Wheelwright” pompously describes a member of a future caste of warriors, also named William, as being of “epistolary persuasion.” The William of the story, we learn, “was maniacally focused on the perfection of himself. In the gymnasium, of his body. In the library, of his mind. And in the sacristy, of his soul.” Judging by his racist musings on the X app, however, William Wheelwright has a long way to go to reach basic human decency, let alone the spiritual perfection that he ascribes to his character: last year he responded to a poll asking, “Fellas you have to pick a [girlfriend], which one of these is least objectionable: former escort; sex with dogs; a black.” “Dogs are the least evil,” he wrote.

    Time and again, we encounter young men who have prepared themselves for the coming war-apocalypse-new world through strenuous exercise and the consumption of healthful food: “Raw milk. Berries. A few slices of 100 percent grass-fed organic steaks from cows that were kept in red-light vaults five hours a day,” as one story has it. You know, not like the bugs and zogslop on which the dysgenic masses gorge themselves. The prevailing emotion in these writings is disgust, and they provoke the same feeling in their readers. 

    Between the civil-war reveries, the fantasies of exacting violent revenge against the woke disciplinarians, and the vitalist commitment to the cultivation of minds and bodies fit for armed conflict, it is tempting to view the dissident right as a serious threat of radicalization. Online, no doubt, on various dark and darker webs, they have such an effect. But I think something else is afoot here: namely, the romanticization of social developments that are already unfolding in the United States and other advanced market systems. Put another way, dissident-right culture merely lends a heroic sheen to our actually existing realities and the ideological structures used to legitimate them.

    In actually existing advanced market societies, there is no need to set up camps on the outskirts of cities to house children with Down syndrome, because many fetuses diagnosed with the condition and others of the kind are terminated before birth; in Iceland and Denmark, close to a hundred percent are. Other burdensome citizens — not just those facing terminal diseases, but also increasingly the elderly and even young people with mental illness — are goaded into medically assisted suicide in Canada and the Benelux states, with advocates fighting tirelessly to expand such MAID regimes elsewhere in the West. Categorizing people from birth as Sheep or Lions or what have you, so as to ensure that everyone knows his place, is equally superfluous in today’s market societies. On both sides of the Atlantic, social mobility has largely ground to a halt. In the United States, it takes an average of six generations for the advantages associated with inherited family wealth to disappear, according to research from the Brookings Institution. Among the libertarian right as well as some progressives, this reneging on the egalitarian promise of “meritocracy” is justified on the basis of the hereditary genius and virtue of the rich, Charles Murray-style. The “Pasture” that they recommend is a universal basic income or some form of so-called negative taxation: handouts aimed at mollifying economically useless individuals, so as to obviate reforms aimed at altering the lopsided distribution of power generated by markets. Confronted with the growing stubbornness of hierarchies in contemporary capitalist society, the dissident right, wildly and without any sense of irony, expresses its disappointment by inventing a new hierarchical thinking draped in a mystical vernacular: a different sort of inherited hierarchy, uglier and even less mutable than the one it deplores.

    Likewise, the fusion of “noble” blood and genes is a feature of advanced market societies, as Murray pointed out more than a decade ago in Coming Apart. Indeed, sociology has been aware of it as a problem for meritocracy ever since Michael Young coined the term in his dystopian novel-essay hybrid The Rise of the Meritocracy, first published in 1958. Our jet-setting meritocrats are already apt to unite their blood lines, with little to show for it by way of greater social progress or harmony between the nations. The dissident right’s fantasies of intermarried neo-aristocrats merely attach the prestige of the ancien régime to the love lives of tech bros and the consultant class, much as an earlier generation of eugenic ideologues did in the late nineteenth and early twentieth centuries. It was repulsive then and it is repulsive now.

    As for sexual relations defined solely on the basis of power, that, too, is characteristic of the world we inhabit. A sexual imaginary that can only alternate between the male brute who screws his way to freedom and the vampiric, dominant female — that’s about as radical as the ethics of contemporary pornography. As the critic Geoff Shullenberger has pointed out, moreover, there is nothing particularly novel about the notion that matriarchy is society’s default setting: the radical feminists of the 1970s got there first. Once again the creepy right inverts, and parodies, the radical left: the dissident rightists merely reverse the normative valuation of this state of sexual affairs — even as they also eroticize the matriarch who holds the whip.

    Even the new-frontier fantasies of the dissident right aren’t that far-fetched. China, Russia, and the North Atlantic powers are already mounting a second colonization of Africa. Only, the process isn’t taking place as Silicon Valley-adjacent writers besotted with the Californian Ideology might imagine. It is a project not of bitcoin-flush privateers, but of massive state-directed or state-backed enterprises that are marvels of social organization. On a less dramatic scale, numerous movements on the existing right encourage and organize the resettlement of likeminded families away from the disordered urban cores of blue cities and states and toward safer, more socially cohesive red areas. It is their right to do so; but buying real estate is not a radical act.

    Much the same could be said for the dissident right’s obsession with bodybuilding. In their fiction and “nonfiction” (such as it is), the dissident rightists insist that to overcome the schemes of woke despots and h.r. henpeckers, you must train body and mind and lift yourself above the despicable masses. Social antagonism is not to be collectively resolved, but transcended by the exertions of the heroic individual perfecting himself, as the classical sculptor chiseled elegant form out of lowly matter. Yet such self-help programs, such aspirations to a salvific superiority, are nothing new. As the left historian Charles Sellers observed, going back to the nineteenth century, striving middle classes often corralled surging social discontent into a frenzy for self-discipline and what today’s online right would call “clean eating.” As if perfection were ever the solution to a social problem.

    This is the dark transubstantiation that the barbarian right performs on market society: turning even its most humdrum offerings — going to the gym and counting your calories — into a means for defeating the woke matriarchy or any other real or imagined enemy. Much as, in American Pastoral, Merry’s fury, having erupted in violence, finally exhausts itself in the harmless pseudo-mysticism and lifestyle experimentation that were the endpoint of the New Left, so the dissident right, for all its countercultural energy and its self-congratulatory sense of its own radicalism, ratifies the deeper logic of the very society against which its adherents purport to rebel. It is only an ugly and fevered diversion from what really ails us.

    True radicalism and true dissent in contemporary America require a critical examination of the meritocratic ideal and the power relations it has served to disguise since the early twentieth century. It would mean rejecting the routine throwing away of weak and vulnerable lives. It would mean reaffirming moral and political universalism, whether rooted in the Bible or in social democracy, especially in our time, when these ideals are under assault from seemingly every quarter. Lending the present state of affairs a new legitimacy on the basis of IQ or racism is not radical. It is simply evil.

    A Poetry of Place

    When we first met, you said you hoped to write

    a place as yet unwritten, maybe here,

    the last of the café’s lunch crowd clearing out

    with a soft ceramic clink and spray of light 

    through glass to glaze your dark cascade of hair.

    It’s not Manhattan, after all: it’s not

    a place for public life, yet here we sit

    with much between us still unspoken,

    each unfamiliar blossom yet to bloom.

    One Saturday I lingered in the park

    not far from your apartment, the faint perfume

    of evening primrose floating through the dark 

    with petals cool as rain against the skin,

    the season still unchronicled, but you

    had packed your bags and flitted back to Brooklyn,

    from what, and to what end, I never knew.

    Somewhere Else

    The last time we ever spoke

    Missouri suburbs filled with snow

    and snowfall blotted out the oak

    beyond your buried patio.

    You’d never see another spring.

    Falls . . . confusion . . . vertigo . . . 

    Familiar landmarks vanishing,

    you stood up from your wheelchair.

    Where did you think you were going?

    Across the Firth of Forth to Fife,

    to a croft in Pittenweem—why there,

    a place that you would never see?

    Though stranded, at the end of life,

    you still had somewhere else to be.

    Memory Care

    Memory care makes final introductions

    to residents whose names have slipped away

    at the slightest pressure, evanescent

    syllables for those who will not be here long,

    mere bubbles in a froth of foam,

    more transient than resident, some

    transitioning, not to another gender,

    but to another state of being altogether.

    Are you in the bathtub? my father greeted me,

    perhaps replacing past with present, room

    with tub, the week since I last visited

    with a momentary absence. He glimpsed

    the cascade of errors, winced, then shrugged

    and raised his palms in mock surrender.

    Nights are worst, when Mr. D don’t act right,

    but crawls around unplugging clocks and lamps

    and grunting like an animal, until the staff

    step in to strap him down with soft restraints.

    Dreams subject him to a long examination,

    turning over his childhood to find the source of sadness,

    testing each failure, each scene of humiliation,

    as he turns over the exam to find a list of questions

    in an ancient, unfamiliar script

    with blanks that spread like spilt milk.

    Tomorrow — tomorrow he won’t remember

    where he lives, much less these ghostly neural flickers;

    but tonight the dreams remember him.

    What Comes After

    Reconstituted voices,

    scraps of cloud caught in branches,

    the morning campfire of Pu-erh tea

    or mown hay of white peony,

    an old man’s blazer hanging on its peg,

    the human funk of toasted cumin seeds,

    oak burnt to ashes, cinerulent fox fur,

    crepey grape leaves in late November,

    a shirred old pumpkin,

    the soap and pepper of walnut hulls,

    the must of summer clothes left out through fall,

    the shadow of a straw hat 

    hooked on a chair back,

    Parcheesi’s orphaned pawn,

    the clatter of thick French china,

    Bordeaux of old furniture,

    the frazzle of a bee,

    A above middle C,

    oaks in spring bright as lettuce,

    butter and apricot of chanterelles,

    brine on stone after a storm,

    the smell of lake water,

    and all night, the tentative knock 

    of a hull against the dock.

     

    Why College, or What Have We Done?

    Every fall I teach a first-year seminar called “Why College? Historical and Contemporary Perspectives.” On the first day of class I present a list of possible purposes for college and ask students to rank them. “Finding your passion” and “changing the world” are always the top vote-getters, because that is the story we tell about college. Welcoming the new students at convocation, the president declares that they can do whatever they want with their lives, so they should do something they love. And they are also reminded to live for others, not just for themselves. At the University of Pennsylvania, where I teach, that inevitably means trotting out the school’s favorite quote from its famous founder. “The noblest question in the world is, ‘What good may I do in it?’” Benjamin Franklin asked.

    I wish that were the real point of college, and so do the students. But we both know better. The point is to get ahead, and to win the game. That means giving the teacher (in this case, me) the answer that he wants to hear. And outside of class, it means competing for every trophy in sight. Indeed, the competition is what produces the value. A few years ago, a student told my seminar that she had “tried out” for the Alzheimer’s Buddies Club — which sends people to visit patients in nearby hospitals — but that she didn’t “get in.” When she applied, she said, she had to submit an essay explaining why she wanted to participate; then she had to undergo an interview with an officer in the organization, who quizzed her about her “motivations” and “qualifications” for the role. Her story saddened me. I told the class that I didn’t think Penn should sponsor a group that winnowed people so selectively for a volunteer opportunity.

    It’s a free country, I said, and if students wanted to test and interview each other, that was their own business. But if they wanted Penn’s imprimatur — and its money — that was a different story. Everyone who wished to volunteer should be able to do so; and if more people applied than the hospital could accommodate, they should draw lots to decide who went. Students looked down at their notebooks, avoiding my gaze, and the room got quiet. Finally someone broke the silence. “If they did that,” a brave student explained, “nobody would apply.” Never mind the poignant essay about Grandma and her descent into dementia, or the résumé (the Alzheimer’s club required those, too) showing that you visited nursing homes in high school. The point, again, is to win. And if the game is too easy, there is no point at all.

    As you proceed through college, the stakes get higher. The next shiny object is the post-graduation job, ideally in finance, tech, or management consulting. At last count, sixty percent of Penn’s students entered one of these three fields. We tell them to find their passions and to change the world. But somehow, after four years here, over half of them choose the same thing. Many of them — probably most of them — are not passionate about it. And does anyone seriously believe that sending more people from Penn to Wall Street will make the world a better place? It’s not about that. We have socialized these young people for a Hobbesian war of all against all, where everyone battles for a scarce good. And we rarely — if ever — challenge them to reflect on whether it really is good, and for whom.

    That’s on us. The campus protests last spring over Israel and Palestine — and the related incidents of antisemitism — have occasioned another bout of handwringing about the moral state of our students. This is as old as America itself; from the start, adults have worried about whether the kids are alright. But today’s anxieties have the wrong focus. The big problem at college is not political correctness, or wokeness, or racism, or antisemitism. The big problem is cynicism, spawned by an institution that tells young people one thing and does the opposite. If we truly believed our rhetoric about individual exploration and collective uplift, we would structure college in a very different fashion. But we don’t believe it and the students know it. They have found us out.

    The war of all against all starts well before these people get to college, of course. The first big prize is getting in. Despite all their blather about diversity, the elite universities still draw most of their students from the upper rungs of the economic ladder. And that presents a puzzle for ambitious high schoolers and their worried parents. In a situation where everyone is pretty much the same, how do you stand out? The answer is to cultivate a self with a distinct passion — that word again! — and to compile a set of corresponding experiences, all designed to show that you deserve admission more than the next person. When, in my seminar, we address the history of selective colleges, I show the students John F. Kennedy’s application to Harvard in 1935. Asked why he wants to go there, Kennedy replies that he would like to attend the same school as his father did. He also writes that he has always wanted to be a “Harvard man,” which doesn’t explain why he went to Princeton first. (He dropped out, for health reasons.) His application file also includes a remarkable letter from Joseph P. Kennedy, who admits that his son was “careless” in high school; hence the mediocre grades at Choate, which are on full display in the file. (Shout-out to the Kennedy Library in Boston, which deserves its own Profile in Courage award for posting all this material on the Web.) My students get indignant about Kennedy’s application, pointing out — correctly — that he was a rich kid who got in solely because of his name. True enough, I reply, and Kennedy would have agreed. He never imagined that he had earned his way into Harvard. And in the tradition of noblesse oblige, this meant that he also had a duty to serve others.

    Not so for today’s meritocrats. We seduce them into believing that they are special — more special, of course, than the many thousands of kids we reject. Forget about the battalions of tutors, counselors, and consultants that assisted them along the way. Not to mention the myriad other privileges that they received, simply from the circumstances of their birth. To drive home the point, I tell my students — tongue firmly in cheek — that I was a very good fetus. I didn’t just lie there in the amniotic fluid, sucking on the teat of the maternal nanny state. I pulled myself up by my umbilical cord! I made sure that I was born in a rich country, where I had plenty to eat. And I also chose educated and curious parents, who took me around the world and exposed me to its infinite complexity. The students all smile, sheepishly, because they understand what I’m saying. But we’ve got our story, and we’re sticking to it. Everyone here earned their way in, which makes them better than the people who didn’t.

    That sets them up for misery down the road, when they get turned down by a club or a fraternity. Or, later, by an employer. Or a graduate school. If your entire fate rests in your own hands, you bear all the responsibility for your failures. You will interpret them as existential judgments about your very being. And when you succeed, conversely, you will be less charitable to the people who fail. I won this race on my own merit! If you lost, you must not have enough merit. You should try harder, or maybe just accept your fate. But don’t expect anything from me, because — remember — I deserve to be here. I don’t owe you a thing. Jack Kennedy didn’t believe that for a second, which is why he devoted his life to public service: from whom much was given, much was expected. But many of my students — perhaps most of them — do believe it. This is what we have taught them. Their job is not to serve others; it is to stand out from them.

    But standing out becomes harder when you arrive at a college full of standouts. And it is still harder when almost all of them get A’s. Last year, seventy-nine percent of the grades given to Harvard undergraduates were in the A range (A+, A, or A-), compared to sixty percent a decade earlier. The same fraction of grades at Yale were A’s, up from sixty-seven percent in 2010-2011. “When we act as though virtually everything that gets turned in is some kind of A — where A is supposedly meaning ‘excellent work’ — we are simply being dishonest to our students,” the Yale philosophy professor Shelly Kagan observed. He’s right, and — again — the students know it. In a refreshingly candid essay published last spring, a Harvard undergraduate named Aden Barton admitted that students could succeed in classes without breaking a sweat — sometimes even without showing up. 

    For the past forty years, I have endured a recurring nightmare in which I arrive in a classroom — as a student, not a teacher — and realize that I am a month late. At Harvard, Barton reports, one of his friends didn’t attend any classes at all for the first month of the term. The mere thought of doing that — even for one class — still wakes me up in a panic. But it was no problem for Barton’s buddy, who blew off his entire course schedule. He still had to submit the assigned work. But there is not much of it, Barton writes, and almost anything you turn in will get you an A. The safest move — especially if, like Barton, you have not done the reading — is to mimic the professor’s opinions, which invariably fall somewhere on the left. Most students aren’t like the keffiyeh-clad protesters we saw on TV this spring. They are more like Aden Barton, who echoes a few political platitudes on his way to an easy A.

    That leaves lots of time for the realm where you can stand out: extra-curricular activities. Quoting a dean at NYU, Barton notes that students “feel the need to distinguish themselves outside the classroom” because they are “essentially indistinguishable” inside of it. The dense network of student organizations provides just the ticket, because so many of them are selective. (See: Alzheimer’s Buddies Club.) “At Harvard, one cannot simply ‘join a club,’” the Unofficial Guide to Harvard notes. “Instead, you must prove your worth.” Enter “comping,” which is Harvard-speak for competitive tryouts. “My roommate was wait-listed to volunteer at a homeless shelter,” the Unofficial Guide continues. “Some girl on my floor got cut from a Zumba class. It’s brutal.” At the University of Virginia, one parent recalled, their son was rejected from a pep group that supports the basketball team. He started a rival organization, but almost nobody signed up for it. Then he got wise to the system and required people to submit applications, which immediately quadrupled the number of students who wanted to join. Anything that is good must be competitive. And if it’s not competitive, it can’t be much good.

    To its credit, Penn has prohibited student groups from collecting résumés from first-year students or from requiring “specific attire” during early-round interviews. But even these good-faith efforts to tone down the war of all against all demonstrate its enduring power. If you cannot demand that first-round candidates wear nice clothes, that means it’s fine to make them dress up for the call-backs. And whereas résumés cannot be solicited from freshmen, Penn decreed, “a list of activities may be requested on a written application.” That sounds a lot like a résumé to me, and I doubt that the students see much difference between the two. It’s still about building a personal brand, and — most of all — about besting your peers. “Not all of you will make it in,” the Unofficial Guide to Harvard warns, in its chapter about comping. “Let the Hunger Games begin!”

    Unsurprisingly, the most competitive organizations are often the ones devoted to the prize jobs that students will seek after graduation. A search for “consulting” on the Penn Clubs website yields thirty-one different hits, including Global Research and Consulting, Penn International Impact Consulting, and Consult for America. Type in “finance” and you will get twenty-five results, including Business Brilliance, Penn Impact Investing, and M & A at Penn. (You read that right: we have a student group devoted to mergers and acquisitions.) You can find similar business-themed organizations on campuses around the country. They generate their own revenue on top of what they get from the universities, which have also found creative ways to monetize them via Corporate Partnership Programs (CPPs). In exchange for a fee from a company, the university provides it with member lists of relevant student clubs. That saves money for companies in recruiting new talent, of course, and it also allows them to target hard-to-reach populations. Noting that corporations often have a difficult time “balancing their diversity ratios,” one University of California campus promised them the email addresses of students in the Society of Women Engineers and the Society of Black Engineers.

    The CPPs also link companies to university career services departments, which have become headhunting agencies for the big finance and consulting firms. As the sociologist Amy Binder has observed, we used to think of career services as an office that helps students discover what they want to do. Now it delivers students to prospective employers, especially those who can afford to pay for prime real estate at the career-services center. Harvard even renamed its own career center after a prominent investment banker, who — not incidentally — gave the university a hefty donation. As always, the students get the message. 

    At the Harvard office, one student told Binder, there is an “entire section” devoted to finance and another to consulting. “And then they have the not-for-profits as a general clump [laughs] and then they have ‘other’ [laughs harder],” the student added. Students at Stanford likewise mocked the “gold, silver, platinum” arrangement at the university’s career center: obviously, the companies that put down the most cash got the most prominent billing. As their nervous jibes illustrated, students are uncomfortable with the nakedly transactional nature of this arrangement. But they also acknowledge that it works, in its own grim way. Asked to define a “good job,” one Harvard senior pointed to the firms that dominated the career-services center as well as on-campus recruiting events, where companies shell out big bucks to host information sessions and cocktail parties. “I guess a good job means consulting or finance, because, well, look, that’s what the Office of Career Services has,” the student said.

    Throw in a healthy dose of peer pressure, and it becomes next to impossible to resist the siren calls of finance, tech, and consulting. “There was, like, this stampede to start applying, and it wasn’t my conscious decision,” a Harvard graduate recalled. “It was more, I guess, I mean, I hate to use the term ‘fear of missing out.’” I hate it, too, but it is real. Nobody — well, almost nobody — goes to college thinking they want to become a management consultant. Then you watch your friends get dressed up, get interviewed, and land a high-salaried job. And all of a sudden you want the same thing. “There’s definitely a herd mentality,” a Harvard student told the New York Times this spring. “If you’re not doing finance or tech, it can feel like you’re doing something wrong.” It’s here, it’s available, and — most of all — people you admire are competing for it. Why not throw your own hat in the ring, and see where it takes you?

     Because the jobs stink, that’s why. You won’t hear that at the career-services office, which is paid to propagandize for the big firms. If you listen to the students, however, it comes through loud and clear. They know — in their bones — they will not find their passion or create a better planet at Bain Capital or Boston Consulting Group. “Everyone has this ‘change-the-world’ mentality when they come to Stanford,” one student said. “You come wanting to change the world and then you leave wanting to work at McKinsey.” And much of that work falls into the category of “bullshit jobs,” a term coined by the late anthropologist David Graeber. A bullshit job is one that you do not believe in, but that you do anyway; it is “a form of paid employment that is so completely pointless, unnecessary, or pernicious that even the employee cannot justify its existence,” wrote Graeber. 

    You don’t have to buy Graeber’s neo-Marxist explanation for this phenomenon — that a population kept busy with make-work will not revolt against capitalism — to see that he was onto something. We have created vast bureaucratic armies of managers, analysts, and assistants who despise what they do, and whom nobody would miss if they disappeared tomorrow. “Could there be anything more demoralizing,” Graeber asked, “than having to wake up in the morning five out of seven days in one’s adult life to perform a task that one secretly believed did not need to be performed — that was simply a waste of time or resources, or that even made the world worse?” The question answers itself. 

    Every month brings another report about burgeoning rates of anxiety, depression, and loneliness among teenagers and young adults. In a recent book, Jonathan Haidt attributes the youth mental-health crisis to the ubiquity of smartphones and especially of social media. The argument makes intuitive sense, especially to oldsters who see the kids scrolling all day — and well into the night — on their phones. If you are bombarded 24/7 with curated content from your peers, you will inevitably conclude that they are hotter and happier than you are. What could be more demoralizing — or more depressing — than that? But a monocausal explanation for a complicated phenomenon always fails, and social science has failed to document a clear causal relationship between social media use and mental illness. (In fact, one prominent study found that joining Facebook can enhance well-being.) Nobody seriously questions whether the mental health of young people has plummeted; the big question is why. “I keep asking for alternatives,” Haidt remarked recently, responding to his scholarly critics. “You don’t think it’s the smartphones and social media — what is it?” 

    Perhaps one of the causes is the scourge of bullshit jobs, and the sad sense of inevitability that surrounds them. I do believe that these jobs represent “a scar across our collective soul,” as Graeber memorably declared. But I don’t blame my students for taking them. Students often tell me it would be stupid or foolhardy not to take a bullshit job, even when they know they will hate it. When I graduated from college, I didn’t consider the possibility that I might be unemployable and a burden on my parents. And if you had suggested that I move in with them — something millions of college graduates do today — I would have laughed you out of the room. I grew up in a completely different world, where the continued prosperity and security of the American middle class was taken for granted. My students do not have the luxury of that presumption. They also come from a wider swath of the socio-economic spectrum: while rich kids are still the largest cohort, they are not the only one. And we all know that families are incurring enormous debt to pay for college, which surely increases students’ incentive to find high-paying jobs. Significantly, though, a recent survey at Harvard showed that first-generation and working-class students were no more (or less) likely than their well-heeled peers to take positions in finance or consulting. This isn’t just about the money, or the broader economic anxieties in our society. It is about the culture we have created inside the university itself.

    What can we do to change it?

    One common answer focuses on reviving the humanities, which require students to deliberate on the meaning of life — and, we might hope, make them reconsider their own. In several pieces in the New York Times, my colleague Ezekiel Emanuel has warned, like other commentators, against reducing college to a job training program that downplays — or ignores — the ethical dimensions of education. His first concern was the students who praised Hamas after its October 7 attack on Israel, which demonstrated their “moral obliviousness”: if they had received stronger instruction in the humanities, he argued, they wouldn’t have fallen for that kind of claptrap. (Whether their professors are reinforcing the claptrap is a different issue.) More recently, Emanuel has worried that vocationalism is crowding out the liberal arts tradition and its attendant values of inquiry, exploration, and civic understanding. “Ambitious students eager to land a prestigious consulting, finance or tech job . . . find it too easy to brush aside courses in the arts, humanities and social and natural sciences,” Emanuel and his co-author warned. We need “more Socrates and Plato,” they declared, and less data science and accounting.

    The trends they describe are unmistakable. Humanities departments have withered over the past two decades: smaller classes, fewer majors, and a shrinking faculty. A debacle. In part, this is because students vote with their feet: they think the jobs are elsewhere, so they gravitate to majors in business and other “practical” fields. But it also reflects conscious decision-making by the universities, which behave like businesses in their own right. In a bracing article earlier this year, the University of Chicago classicist Clifford Ando showed how “revenue-centered management” — one of several practices borrowed from the corporate realm — spurred the university’s business and public policy schools to offer undergraduate majors, so that they could siphon off more tuition dollars. (Willie Sutton said he robbed banks because that’s where the money is; at universities, it’s in the undergrads.)

    But that means less revenue for the Humanities Division, which must continue to staff the much-vaunted core curriculum classes. It has been forced to hire adjunct instructors, who are themselves so strapped financially that they cannot devote sufficient attention to their classes. Nor do most of them possess the job protections that allow tenure-track faculty to write and to teach without fear or constraint. This says all you need to know about the real priorities at the University of Chicago, which continues to tout its liberal arts bona fides even as it strips the liberal arts of resources. “The only fields that matter are ones that are essentially isomorphic with particular occupations,” Ando darkly concludes.

    A possible solution to the problem — and an increasingly popular one among university administrators — is to show how humanistic disciplines can prepare students for jobs, too. There is plenty of evidence that employers prefer someone with an English degree to a candidate who studied business, because the English major is more likely to have the skills you need in the workplace: written and oral communication, critical thinking, teamwork, and so on. Shakespeare as the road to McKinsey. In the past several decades, moreover, humanities graduates have kept pace with other fields in average annual earnings. But most of our students don’t know that. “The humanities have a marketing problem,” noted a dean at Arizona State University, using another metaphor from the business world.

    Surveys of students at Arizona State have shown that they associate “humanities” with careers in teaching and (bizarrely) human resources, but nothing beyond that. So the school brought in notable humanities alumni from different professions to spread the message: philosophy, history, and literature can help you succeed in the world. It also created internships that allow students in these fields to explore a variety of occupational paths. “Students want lucrative careers, but they don’t realize that the skills they need for those careers … are all skills that they could be acquiring in the humanities disciplines,” one English professor told the National Humanities Alliance, for its recent report on “best practices” in recruiting students. We need to “catch students early,” she added, and to “challenge the narrative they’ve been fed about the humanities.”

    But we also need to challenge the narrative about bullshit jobs and to expose our complicity — to borrow the activist term du jour — in promoting them. I want young people to study history so they can think critically about what Aristotle called “the good life,” the one that is worth living. But if we continue to sell our students’ futures to the finance and consulting industries, we answer the question for them. What is the good life? The one with lots of money and status, of course. The administrators are right: the humanities can and should prepare students for different types of work — many of which we cannot imagine yet — but we won’t do that well, or honestly, if we are simultaneously channeling them into a tiny band of jobs. That constricts their imaginations of their own lives and of the lives of others. And it makes a mockery of the liberal arts, which are supposed to liberate us from our dogmas and our preconceptions. As I tell my students, there is nothing wrong with choosing to become a management consultant, financial analyst, or tech start-up assistant, but there is something enormously wrong with an institution that advertises limitless opportunities and then funnels more than half of its population into consulting, finance, and tech.

    On the last day of my first-year seminar, we discuss William Deresiewicz’s Excellent Sheep. The book is ten years old, but it remains the best single critique of elite American universities that I have ever read. With just a semester under their belts, the students can already recognize their institution — and themselves — in Deresiewicz’s pages: the stressful admissions process, the check-the-box classes, the grim “comps” for extracurricular activities, and so on. The real purpose of a place such as Penn, Deresiewicz insists, is to enrich Penn: we produce rich alumni who give us money, which then produces more rich alumni. To get an actual education — the kind that opens your mind rather than just lines your pockets — you must resist the institution. It wants to make you into the sort of person who becomes a business consultant. If that’s not what you want, Deresiewicz warns, you will have to put up a fight.

    Which is where my students start to push back. Yes, they acknowledge, there are students who stand apart from the dominant pressures of Penn. But they are often just as miserable as the stressed-out social climbers, albeit for a different reason: they feel like they do not belong. Most of all, the students say, it’s not fair — or realistic — to expect individual eighteen-year-olds to stand against the crowd. My students can already see the tensions between what we say — find your passion, change the world — and what we do. And they can tell how it produces cynics, who mouth the clichés of personal and social transformation while they trundle off to Bain and McKinsey. Yet it is equally cynical to suggest that there is nothing we can do, as an institution, to alter this reality. It is a collective problem, my students argue, so it also requires collective solutions.

    They are right. And if enough people spoke up — students, faculty, even administrators — we could tweak and alter things for the better. We could institute a weighted lottery for admissions, whereby Penn sets a bar and randomizes among all the applicants who land above it: we would still be highly selective (our bar would be high), but we would dispense with the fiction that everyone who is accepted is better than everyone who is rejected. We could require student groups to take all comers, with obvious carve-outs for varsity athletic teams and the like; you would be free to establish your own Hunger Games, of course, but not on our dime. We could put an end to on-campus recruiting, the endless parade of students in fancy clothes lining up for fancy jobs; anyone should be able to apply to the big firms, of course, but there is no good reason that we should subsidize them. We could bar the firms from buying pride of place at the career-services office. And we could stop selling them the names of our students — and with that, their souls — which might be the biggest scandal of all.

    What about classwork? Aden Barton, the Harvard undergraduate, reports that it is an “afterthought” for most students. But we could change that, too. We could require students to attend class (imagine!); we could establish minimum reading and writing expectations; we could institute grading curves. The call to revive humanistic study — more Plato, please! — assumes that students would actually read Plato if Plato were assigned. And we know that many of them will not, unless we take specific measures to make sure that they do. None of this would be easy; real change rarely is. But we shouldn’t allow our fraught present-day moment — including the great technological war on reading and more generally on attention — to block us from imagining different futures, which is the whole point of studying the liberal arts in the first place.

    Why college? I still believe in the answers that we hear every year at convocation: to find your place in the world, and to leave it a better place than it was when you entered it. I recognize that we have not made good on that high and right promise. The cynics will tell you that it’s impossible, and sometimes the cynics are right. But if you think otherwise, come join me in the humanistic underground. Come to class on time. Readings are mandatory. Visitors are welcome.

    The Problem of “Popular” Sovereignty

    “In America, the people govern, the people rule, and the people are sovereign.” So said President Donald Trump in his first address to the United Nations, in September 2017. “In foreign affairs, we are renewing this founding principle of sovereignty. Our government’s first duty is to its people, to our citizens… As President of the United States, I will always put America first.” Trump used the terms “sovereign” and “sovereignty” some twenty-one times in his U.N. address. As this brief quote suggests, the meaning of these terms shifted throughout his remarks: first Trump said that the people govern, then he said that those who govern must protect the people, and finally he said that the nation would act in its self-interest. 

    Sovereignty is a concept as widely used as it is poorly understood in contemporary political discourse. In purely secular terms, sovereignty is the right to rule and to make the rules. Even if we can offer secular accounts of the concept of sovereignty today, the concept’s origins, at least in Western thought, are hardly secular. The sovereign of all sovereigns, of course, is a monotheistic God, at least within the Abrahamic religions of Judaism, Christianity, and Islam. And the paradigmatic assertion of (and submission to) sovereignty is God’s inexplicable command to Abraham to slay his son Isaac. Kierkegaard called this the “teleological suspension of the ethical,” under which justice and reason must give way to the duty to obey the absolute sovereignty of God.

    Earthly sovereignty is either a pale copy of Divine sovereignty or takes its authority from Divine delegation or approbation. It is no accident that Romans 13:1 was a favorite prooftext for magistrates and would-be sovereigns: “Let every person be subject to the governing authorities, for there is no authority except from God, and those authorities that exist have been instituted by God.” Relying on such theological arguments, European monarchs insisted on the Divine Right of Kings: on their right to rule and to make the rules, and indeed on the absoluteness, indivisibility, and non-accountability of their power, at least while on earth. After all, God himself ordained their status.

    This example suggests that one of the most frequent and important features of sovereignty is its ideological function. Claims of sovereignty mystify claims of authority. They disguise what is really going on in a political situation. Assertions of sovereignty are often designed to give the impression that one has the right to rule and to make the rules, and that others cannot and should not interfere with that right. As an ideological concept, sovereignty’s purpose is to generate a false belief in justified subordination to those claiming sovereign authority. The confusions and mystifications of sovereignty emerge from its totalizing rhetoric. 

    In practice, however, sovereignty is always partial and incomplete, hemmed in and limited by other forms of power. No one and no thing — with the exception of the Almighty, of course — is fully and truly sovereign. (And as Milton vividly portrayed in Paradise Lost, even God can face rebellion from his disgruntled angels.) Everyone, to the extent that it even makes sense to call them sovereign, is sovereign only to a certain degree, whether conceptual or empirical. Claims of sovereignty are always bumping up against competing claims by others, a bit like the three mental patients in Milton Rokeach’s The Three Christs of Ypsilanti, each of whom claimed to be Jesus.

    Countries within the international system often complain about unjustified intrusions on their sovereignty, insisting that they be left alone or that they can take justified retaliation against those intrusions. Needless to say, those countries in conflict with them make similar claims. Whether or not these claims make theoretical sense, they are central to contemporary international relations. Both American conservatives and American progressives sometimes argue that it is important to protect our national sovereignty from foreign encroachments and from international law, though, of course, they may offer very different examples. At the same time, people assert the importance of “state sovereignty” within the United States to limit the very same federal government that is supposed to be sovereign with respect to international institutions. It’s all very confusing.

    Scholars have often noticed the ideological functions of sovereignty and criticized them. In his book on popular sovereignty, Edmund Morgan pronounced the idea to be nothing more than a fiction, used by power-grabbing politicians seeking to legitimate their claims to power. Many years ago, Stephen Krasner pleaded with political scientists to drop the term because it had no empirical validity. More recently, Don Herzog argued that we should stop talking about “sovereignty” altogether. It is not a meaningful concept, he explains, because almost no one in the contemporary world still believes in the normative desirability of an absolute, indivisible, and unaccountable power attached to any earthly authority. The concept was invented and theoretically elaborated in the sixteenth and seventeenth centuries as a device for bringing the savage European wars of religion to an end. As part of that arrangement, sovereign princes henceforth would get to determine the one true religion that would be accepted within their domains. Cuius regio, eius religio, as the saying went. Yet even as European monarchs were proclaiming their Divine right to rule, thinkers such as Thomas Hobbes were undermining religious claims to sovereignty and attacking them as dangerous. This led to the rise of a new conception of sovereignty located not in individual monarchs but in an imagined collectivity called the people.

    After the collapse of the Divine Right of Kings, a new ideological formation arose to take its place. This idea is popular sovereignty. The early medieval phrase Vox populi, vox Dei — the voice of the people is the voice of God — might suggest that, like the sovereignty of kings, popular sovereignty can similarly be justified on theological grounds. In fact, the phrase was originally used to ridicule the pretensions of popular sovereignty: before it came to be used in an anti-monarchical spirit, it was deployed to suggest that it is both blasphemous and ridiculous to analogize the collective will of ordinary human beings to the voice of God. Today the idea of popular sovereignty is usually articulated in purely secular terms. In his Leviathan, Hobbes — who ridiculed theological explanations — argued that the true sovereign authority is that of “the people” themselves. Desperate to escape an awful state of nature, the people, using their own reason, band together to establish and authorize an all-powerful government — at least as long as it provides security, which, for Hobbes, is the greatest of all political goods.

    As Richard Tuck emphasizes, with the rise of popular sovereignty there emerged a crucial distinction between “sovereignty” and “government.” An absolute monarch citing Romans 13:1 in early modern Europe claimed to be the sovereign and the head of government. There were obvious exceptions — for example, during a regency when the monarch was under age — but for the most part an absolute monarch was both the governor and the sovereign. Louis XIV bluntly proclaimed “L’état, c’est moi.” Charles I in England had similar pretensions. But he, of course, lost his head, as did Louis’s successor a century later in France. Both succumbed, as it were, to popular sovereignty.

    Systems of popular sovereignty made the twin concepts of sovereignty and government come apart, especially as the size and the complexity of the state grew. In a system of popular sovereignty, the people, who are the sovereign, create a government, which can take many forms. But depending on how the framework of government is organized, the people may or may not directly participate in governance. In the Constitution of the United States, for example, there is no provision for direct democracy, although there is in many of the individual States. At the national level within the United States, the distinction between sovereignty and governance means that although the people in theory retain the right to rule and make the rules, they may not actually exercise that power in practice. Rather, that power is exercised on their behalf by those elected to represent them. 

    For James Madison, the lack of direct popular governance in the U.S. Constitution was a positive feature. In The Federalist No. 63, he explained that the difference between American government and the governments of ancient democracies “lies IN THE TOTAL EXCLUSION OF THE PEOPLE, IN THEIR COLLECTIVE CAPACITY, from any share” in actual decision-making. The capitalization is Madison’s own, in case we are tempted to overlook the importance of his statement. Madison does not mention Hobbes in The Federalist, no doubt for political reasons, but one can be confident that Madison fully embraced Hobbes’s notion that once a people establish and authorize their government, they should promptly become what Hobbes called the “sleeping sovereign.” Since the sovereign is asleep, it plays no role in everyday government.

    Much political and legal discourse blurs the distinction between the sovereign and government, even though the distinction is actually central to popular sovereignty. In 1819, for example, in McCulloch v. Maryland, Chief Justice John Marshall referred to Maryland as a “sovereign state.” Of course, Marshall’s point in McCulloch was that even if Maryland called itself “sovereign,” it was hemmed in by any number of features of the national Constitution, including restrictions on its power to tax. Perhaps when Marshall called Maryland a “sovereign state,” he was being ironic; perhaps one should read the passage today with a hint of sarcasm. But in 1819, many people thought otherwise, as some advocates of states’ rights do even to this day.

    The separation of sovereignty and government has two aspects. The first is that the government is not the sovereign, but merely its agent or servant. The second is that the sovereign, at least while sleeping, is not the government. The first point — that the government is not the sovereign — was articulated in 1793 in one of the earliest important cases of the United States Supreme Court, Chisholm v. Georgia. Chisholm involved a suit against the State of Georgia brought by the executor of the estate of a South Carolinian who had contracted with Georgia during the Revolutionary War. Georgia, claiming that it was a sovereign state, argued that it was immune from suit in the federal courts. Justice James Wilson, one of the most important figures at the Philadelphia Convention, emphasized that Georgia was “NOT a sovereign state” (capitalization in the original), while John Jay, the nation’s first Chief Justice (and a co-author of The Federalist), emphasized that in the United States only “the people” are sovereign. Indeed, Jay continued, not only could the states be dragged before federal courts, but the national government itself, being subordinate to “the people,” might also lack the “sovereign immunity” enjoyed by the British monarch. If the federal government enjoyed any immunity, Jay suggested, it was for wholly prudential reasons: if the government refused to pay damages there was no one to make it comply. This implied that, by contrast, the national government could (and would) enforce judgments against a recalcitrant state.

    The second aspect of the distinction between sovereignty and government — that the sovereign is not the government — emerges from the rise of popular governments that displace older monarchical forms. Yet in the very act of displacing monarchies, the division of sovereignty from government created repeated theoretical problems that have not been solved to this day. The gap between sovereignty and government creates the possibility of a similar gap between popular sovereignty and popular governance — in other words, it creates the possibility that those who actually govern can simultaneously claim the authority of “the people” while effectively shutting the people out of governance. If democracies involve sovereignty (that is, rule) by the people, but the sovereign is perpetually asleep, in what sense is popular sovereignty at work, and in what sense is it actually democratic? In other words, popular sovereignty creates the possibility of governmental arrangements that undermine democracy in the name of democracy, and that mystify the actual exercise of power by governing elites. As with the Divine Right of Kings, the concept of popular sovereignty might easily become an ideological weapon in the hands of ruling elites who wish to solidify their claims to power.

    To be sure, there may be no gap between sovereignty and governance, and therefore no democratic deficit, if the people are properly represented. If elections produce a perfect representation of the people — what John Adams idealistically called “mimetic” representation — and if the people’s elected representatives speak truly and faithfully for the people’s genuine interests, the problem of the gap between sovereignty and governance does not arise so urgently. Indeed, assume proper representation of the people and the problem of liberty is also solved. Under the older republican theory, people are slaves when they are subject to the arbitrary command of others, and they are free when they make laws for themselves. It follows that in a well-functioning system of representative government, the people are both sovereign and free. The laws they live under are also the laws that they — or, at least, their genuine representatives — make for themselves. 

    There are, of course, many theoretical problems with the theory of representative government, as thinkers from Rousseau to the present have explored. What constitutes proper, much less perfect, representation is often a matter of serious theoretical dispute. Other problems are practical and empirical. Representation is never close to being mimetic. To adopt Bill Clinton’s famous phrase, we do not have a government that “looks like America,” and people may disagree about which groups in the American mosaic would have to be added, and in which institutions, in order to achieve the necessary resemblance. Historically speaking, many people have not been represented at all in government, and defenders of the status quo have sometimes relied on implausible theories of virtual representation to keep it that way. 

    Even when voting rights have been extended to include more and more of the population, modern representative systems retain deep flaws. The wealthy and the powerful will inevitably corrupt the system, and self-dealing is a perpetual problem. Even if the problem of the corrupting influence of money on politics were fully and finally solved, all systems of representing popular values and preferences involve difficult tradeoffs. In an America with a population of over three hundred and thirty million inhabitants, wouldn’t “mimesis” require a House of Representatives with thousands of representatives? That is far too many to allow genuine deliberation and debate. In sum, there is little reason to believe that the political systems we have today adequately address the gap between democratic sovereignty and democratic governance, even if some are assuredly better than others.

    There is yet another problem, which becomes ever more urgent as time goes on. One of the ironies of the modern period is that the theory of popular sovereignty caught on just as states were getting larger and more populous, and their systems of governance more elaborate. One could no longer assume the kind of small homogeneous polities of which Athens was the model. In the twenty-first century, for example, Texas has seven times the population of the entire United States in 1790; California, the largest state, has ten times that population. Moreover, the population (and the citizenry that participates in governance) is far more diverse than could possibly have been imagined in 1787. That is one reason why Montesquieu’s The Spirit of the Laws was so important to the American Founders, cited more often than any other single work in The Federalist. But the citations were almost always critical. Montesquieu had argued against the possibility of a successful extended (and heterogeneous) republic, and it was crucial for thinkers like Madison to prove him wrong, and explain why popular sovereignty made sense in an extended republic.

    Precisely because popular sovereignty rests on a distinction between sovereignty and governance, it creates a series of practical and theoretical problems, each of which troubles democracies to this day.

    1. How do “the people” change the form of government? Once a government exists, how, if at all, does the popular sovereign alter it? Hobbes, like Locke, imagined that the people would come together in the state of nature to form a government, but human beings rarely create new governing institutions in the total state of anarchy imagined in Leviathan. Rather, in most cases human beings act in medias res, with already constituted authorities and powers in place. The powers that be may resist any potential threats to their own authority. So they will usually oppose awakening the popular sovereign if it means that they will lose power — and they will insist that calls for constitutional reform do not reflect the people’s will, because the sovereign is still snoring — or at best sleepwalking.

    This is the problem of recognizing and organizing “constituent power,” or, to use the fancier French, pouvoir constituant. As a leading contemporary constitutional theorist, Martin Loughlin, has explained, “constituent power articulates the power of the multitude: constituent power is the juristic expression of the democratic impetus.” The problem of constituent power emerges from the separation of governance and sovereignty. To speak as sovereigns, the people must be organized in some way, but the most obvious way for them to be organized — and thus to speak — is through legal forms created by the existing government. This creates the possibility that the people’s voice will be muffled or distorted by these forms, or, possibly, that the voice of the people will be that of a ventriloquist’s dummy. More generally, the problem of constituent power creates the puzzle that the voice that underwrites the legitimacy of government must itself be shaped and constituted by the existing government. (The only exceptions are the rare instances in which catastrophic defeats in war or political revolutions wipe out any existing form of government.) One way around this is to create special conventions that exist solely for the purpose of enacting basic law, but these conventions, too, cannot arise spontaneously in a large country; they must be organized by law.

    2. Who are “the people”? Constituent power is a power of the people as sovereign. But who are “the people” who are sovereign? How are they defined, and who has the power to define them? Here again the existing authorities are likely to shape how and which people are heard. Do “the people” consist of every human being living within the geographical confines of a state, or are only some of these human beings part of “the people”? Are all members of the community equal in their peoplehood, or are there gradations? Thomas Jefferson notably spoke in the name of one “people” at the very beginning of the Declaration of Independence. But surely that did not refer to each and every individual person inhabiting the territory of the British colonies (including, for example, visiting sojourners and, more importantly, enslaved persons and members of Indigenous Nations). A similar problem arises for the “We the People” that opens the preamble to the U.S. Constitution and “ordains” what follows. To hear the people speak, one must decide who is allowed to speak in the first place.

    It is an uncomfortable truth that the major twentieth century theorist of pouvoir constituant was the odious Carl Schmitt, who, in his masterpiece Constitutional Theory, posited that ein Volk stood behind all constitutions at all times. They (or it) retained the Hobbesian capacity to awaken from its/their slumbers and transform any existing constitutional order as it/they wished, without limits on its/their power. But to theorize in this way already poses the danger that some segment of the population will suppress or ignore other parts in order to assert its primacy. This presents, once again, the ideological mystification of the concept of sovereignty: a dominant group presents itself as the singular Volk that purports to speak authoritatively on behalf of all the various communities and individuals that live within the borders of the state, and thereby suppresses those who do not conform homogeneously to its imagined model.

    3. The populist temptation. The gap between sovereignty and governance naturally anticipates the rise of populism. Populism rests on the claim that “the people” are being unjustly shut out of government by elites, who, according to populist ideology, are opposed to the interests of the people and whose power is therefore illegitimate. One recurrent pathology of populism is its tendency to turn politics into a clash between the virtuous true members of “the people” and the “others,” consisting of a hated ruling class (and other internal enemies of the people). In older days, that might have referred to the capitalist owners of the means of production, though today it is more likely to apply to “meritocratic” elites, intellectuals in general, or members of minority groups whose exclusion helps define “the people.” (One of the ironies of populist politics is that populists who come to power and run important aspects of the government may still imagine that they are not members of the elite.) That is to say, populism can simply be a claim about the (unjustified) gap between sovereignty and governance, and a claim that the people’s agents should pay more attention to the needs and wants of the public — or, more darkly, in the name of that gap, it can be an excuse to engage in status competition and the social subordination of particular groups within a nation.

    Populism, we might add, is not necessarily the same thing as popular constitutionalism. Popular constitutionalism is a claim that the right to interpret the Constitution rests with ordinary people as much as with legal and juridical elites. It asserts that all individuals may practice constitutional interpretation — by analogy to the Protestant idea of a priesthood of all believers directly interpreting Scripture themselves. To be sure, such constitutional protestantism may be combined with an attack on elite culture and institutions. Yet claims of popular constitutionalism can be and have been made by elites and by leaders of social movements as well as ordinary citizens, and by members of the political branches as well as members of the general public. 

    4. Ensuring faithful representation. The distinction between sovereignty and governance leads to still another problem. If the government is the agent of the people, what ensures faithful agency by the people’s servants? This question lies at the heart of constitutional design. The point of regular elections, fixed terms of office, separation of powers, and checks and balances is to ensure that the “servants” do not entrench themselves and oppress their erstwhile masters. 

    But the very need for these structures presupposes that representatives may not actually speak for the people they represent. That is why they need to be “checked and balanced.” So the question remains: Who, whether inside or outside government, speaks for “the people,” and how do we know that they speak on the people’s behalf? Moreover, what if the people do not speak in a single voice? Who is it, precisely, that can legitimately claim to speak for “the people,” as distinguished from the multiple “peoples” that make up a contemporary diverse and pluralistic society? Politicians regularly speak of what the American people want. But if there is not a single “people,” then any particular claim to speak on its behalf is fictional. At best one must stipulate that the vector sum of all of the cacophonous voices represented in political institutions somehow magically reflects the actual will of the people. But this is a stipulation rather than a plausible conclusion. 

    Perhaps, if one is thinking of the people of ancient Athens, one can imagine all of them — or, more to the point, all of the free enrolled male citizens — coming together in collective deliberation leading to a decision by majority vote in lieu of unanimous consent. The idea of a Quaker meeting, where everyone strives to reach consensus, may be an attractive proposition, but it is hard to imagine a sizable organization, let alone a modern nation-state, run on such principles. In any case, the Athenian model quickly gave way to much larger entities, the Roman Empire being the most obvious successor. This created yet another version of Montesquieu’s challenge: the larger the population, the greater the problem of making popular sovereignty a reality. In 1790, the total population of the United States was roughly four million, which is less than that of contemporary Oklahoma, currently the twenty-eighth largest state in the Union. A follower of Montesquieu might well doubt whether the concept of popular sovereignty makes any sense in such a world.

    The question becomes especially important when one thinks either of constitutional formation or significant constitutional change. In both cases, almost by necessity, very small groups claim a mandate to speak for “the people.” They might be bourgeois revolutionaries like the American Founders, or Leninists who style themselves as the vanguard of the working class. How does one justify treating a part as the whole — one political faction as instantiating “the people” even while denouncing others as illegitimate radicals (or reactionaries)? As critics of “populism” such as Jan-Werner Muller note, all such groups posit not only a potentially dangerous unitary notion of peoplehood, but also arrogantly assert that they have the unique ability to “represent” this far larger “people” and to discern the common good, even in the absence of formal elections or any other plausible modes of designation.

    One might be tempted to solve the problem by turning to plebiscitary notions of government or other forms of direct democracy. But plebiscites, a favorite of the egregious Schmitt, or even issue-based referenda, almost always involve up-or-down votes on proposals made by others, who may manipulate the wording of the proposals to ensure their desired outcome. Even if one is generally sympathetic to leavening representative democracy with direct forms of popular participation, the record of initiatives and referenda in the contemporary United States is decidedly mixed. At the end of the exercise, someone in our society is going to feel manipulated and unrepresented. The voice of the people is never simply given; it must be constructed, and the question is always by whom.

    This fact brings us to one of the unintended but important consequences of the rise of popular sovereignty. Since popular sovereignty must always be expressed in some representational form, the age of popular sovereignty also turns out to be the age of constitutions and constitutionalism. Modern democracies, to the extent that they are able to remain democracies, depend on well-functioning constitutions. Conversely, poorly functioning constitutional systems tend to create cumulative problems that, over time, threaten the stability of democracies and lead to their decay and demise. To the extent that American democracy has been successful, it is because its constitutional system has proved surprisingly adaptable in spite of its many flaws and its many seriously undemocratic features. (We note that the system did break down during the Civil War and had to be reconstructed through new amendments that made it somewhat more democratic.)

    The many flaws and undemocratic features of the American constitutional system have started to catch up with us. That is one reason why our political system seems so dysfunctional and the future of American democracy seems so fraught. Behind the popular anger and frustration that we see all around us is a veritable chasm between the ideology of popular sovereignty and the reality of unresponsive governance. We suffer from a deeply deficient system of popular representation, of which the malapportioned Senate and the broken system of campaign finance are only the most egregious examples. 

    If American democracy is to survive, Americans will once again have to engage in serious constitutional reforms that repair their broken system of representation. The good news is that there is a long history of such reforms in the United States, both at the state and federal levels, and both through formal constitutional amendment and through ordinary legislation (such as the Voting Rights Act of 1965). In fact, nothing is more characteristic of the long American experiment with popular sovereignty. During a seven-year period from 1913 to 1920, the United States adopted four constitutional amendments, and in 1933 alone the country adopted two more (one of which repealed the earlier Prohibition Amendment). This says nothing of the explosion of constitutional creativity at the state level during the Progressive Era. 

    The United States once had many able politicians — Woodrow Wilson and Theodore Roosevelt among them — who pushed for constitutional reform. Although serious discussions of changing our representational system have been off the table for many years, especially on the left, we think that there is now increasing interest. (For example, Steven Levitsky and Daniel Ziblatt, whose earlier book How Democracies Die largely ignored the role of constitutional design in undermining democracy, more recently published Tyranny of the Minority: Why American Democracy Reached the Breaking Point, which places the question of reform at its center.) 

    It is not too late for the United States to repair its creaky and increasingly antiquated system of popular representation. For this to happen, however, the American popular sovereign will have to find a way once again to awake from its slumbers and, in the words of the Declaration of Independence, “alter or … abolish” the old “destructive” forms and “institute new government.” As at the Founding and during the Progressive Era, that will require a genuine political conversation between an aroused public and a new generation of enlightened political leaders willing to think audaciously.

    We have noted that the shift from the sovereignty of monarchs to the sovereignty of the people creates a series of problems that reappear in various guises throughout the history of democracies. Since they continuously reappear, we doubt that there are final answers to these problems; there is only a series of evolutionary accommodations that will change over time.

    Don Herzog has argued that the theoretical problems with sovereignty are so great that we should stop using the word. Yet he draws back from suggesting that we must stop talking about popular sovereignty as well, which might be understood as a rejection of democracy itself. One might doubt the real possibility of junking a concept so central to American self-identity and the American constitutional project. Yet the issue cannot be resolved by mere nomenclature. Rather, we believe that the problems of popular sovereignty, which have been present from the beginning of the modern era, mean that modern democrats must decide what degree of misalignment between sovereignty and government they are willing to live with, and how they can best adapt their constitutional systems to the changing needs of changing populations. 

    At the same time, we think that modern states face increasing problems of justifying themselves in terms of popular sovereignty, as tensions within the paradigm become more pronounced. We do not believe that a naïve version of popular sovereignty can adequately explain the governance of a large transcontinental state of three hundred and thirty million people like the United States. We need a new way of imagining popular sovereignty if the concept is to remain workable. 

    We might make an analogy to Thomas Kuhn’s model of scientific revolutions, which explains how widely accepted scientific theories — such as Newtonian mechanics — generate anomalies that eventually lead to new conceptions, like the theory of relativity or quantum mechanics. For many years, popular sovereignty has been the standard model of how to justify and legitimate political power. Most political and legal theorists work within the “normal science” — to use Kuhn’s terminology — of liberal democratic theory. But various problems, or in Kuhn’s terms, anomalies, have always existed within the model. We either ignore them or we construct the equivalent of Ptolemaic epicycles to explain them. As states get larger and more elaborate, and as technological change proceeds, the problems of popular sovereignty that have existed since the beginning start to look even more problematic. The difficulties now seem more urgent. At some point the anomalies in the theory of popular sovereignty might call for a deep rethinking of the concept and produce a new system of political authority with a new justifying ideology.

    The sovereignty of monarchs eventually gave way to popular sovereignty. Perhaps, as society changes, popular sovereignty will someday give way to a new political conception of the right to rule, and humanity will find itself in a post-popular sovereignty world. Following the fall of the Berlin Wall, Francis Fukuyama famously argued that liberal democracy had won out over its rivals and that it would dominate political thinking for the rest of history. The decades that followed his bold claim produced many historical rejoinders to it, in the form of democratic backsliding and the rise of new forms of authoritarianism. But there may be still another objection to Fukuyama’s argument. It is possible that theories of popular sovereignty only make sense under certain conditions, and once those conditions disappear, new theories of sovereignty will displace them in whole or in part, just as democracies and authoritarian states displaced monarchies.

    What would a successor to popular sovereignty look like? We don’t know yet. Not only has the Owl of Minerva not spread her wings; she still seems to be soundly asleep on her perch. If we look around us, the most obvious candidate for a new political system would be technocracy, in which the right to rule passes to unelected experts. As the state becomes increasingly complex, and technology companies assume more and more practical governing power, perhaps the fiction of popular sovereignty will give way to a new form of technocratic aristocracy. Yet technocracy has serious problems, which should already seem obvious. The very features of technological advancement that bring technocracy to the fore also tend to undermine its authority. It is true, for example, that the digital age has given companies such as Facebook and OpenAI increasing powers of influence and governance. What it has not done is bestow the sort of philosophical and sociological legitimacy that should accompany this increased power. Quite the contrary; so far the digital age seems to have created stiff headwinds to the political authority of large technology companies.

    The rise of the printing press undermined the authority of the Church and paved the way for the creation of the public sphere, which, as Jurgen Habermas contended, created a political space for the rise of representative democracies. Similarly, as Benedict Anderson argued, the rise of newspapers helped make possible the belief in nations of large populations spread over vast territories. One cannot tell quite the same story about digital technologies. The Internet makes experts ever more important to successful governance, but it also facilitates populist upheavals. The ability of everyone to be a broadcaster has greatly advanced the growth and spread of knowledge, but also the growth and spread of falsehoods, propaganda, and shared hallucinations. It has also helped generate increasing distrust of technocratic expertise, in much the same way that the mass distribution of religious texts and multiple versions of the Bible encouraged distrust in the expertise of the Catholic clergy and a desire for lay individuals to “do their own research” into what God wanted. Judging by the early years of the digital revolution, new communications technologies seem to have undermined the authority of all experts and made technocrats a despised group among much of the public. Perhaps the tide will turn as the age of artificial intelligence proceeds. Perhaps a new political formation will emerge out of technocracy, encompassing rule by experts assisted by artificial intelligence and algorithmic decision making. Perhaps the form of politics that the digital age facilitates is authoritarian. Or, more hopefully, perhaps the ideal of popular sovereignty still has life in it, suitably transformed in a digital economy.

    We can see the future only dimly at present. But we can draw one important lesson from the past. Democracy is a moving target, and it tends to evolve together with the dominant communications technologies of the time. The democracy of newspapers and books that characterized the political culture of the eighteenth and nineteenth centuries gave way to a democracy shaped by radio, television, and other mass media in the twentieth century. Twenty-first century democracy, if it is to survive, will have to find a way to adapt to digital technologies, pervasive data collection, and artificial intelligence. 

    Lawyers, in their role as the servants of political power, have always had a hand in theorizing and legitimating political change within an overarching status quo. Currently, however, they are mired in ancient discourses of sovereignty — like those promulgated by the U.S. Supreme Court — that ignore its many problems. Realizing popular sovereignty in the twenty-first century will take more imagination. To respond to the political challenges of the future, lawyers will have to put aside their lawbooks and broaden their vision. 

    And those who profess to be political leaders must learn to take a page from our Founders, who were not afraid to experiment with a broken system — the Articles of Confederation. Alexander Hamilton’s initial essay in The Federalist begins with the question whether nations “are forever destined to depend for their political constitutions on accident and force,” or “are really capable… of establishing good government from reflection and choice.” Hamilton argued for the latter, and his answer should be our answer today.

    Respect, or The Missing Relation

    I contemplate a bird. In fact it is a photo of a bird, many times larger than life, hanging on the wall of a café. I’ve never had a chance to scrutinize a bird so carefully before. After I finish admiring its beauty, I turn my attention to its claws, which are pointed and hard, its beak, which is open in a cry, its eyes, which are empty of pity or warmth. I think: this creature is intensely alien to me. It is not a cute little bird, a sweet little bird, look at the pretty little bird. It is not a bird in a children’s book. And it comes to me that I have never understood an animal this way before. That whether in a zoo, on a farm, in my yard — still more with a photo or video clip of the kind that are forever being passed around online — my response to animals has always been to anthropomorphize them, to project my subjectivity onto them, to slobber over them with my emotions, with my needs. To place them in relation to myself. And it comes to me as well that to refrain from doing so, to let the bird, the goat, the possum be exactly what it is, in itself and for itself, without reference to me, to accept it in its otherness, would be to treat it with profound respect.

    I am talking with a former professor of mine. She is telling me that she believes that part of our job as teachers of undergraduates is to help our students, as she puts it, “instrumentalize” the things they learn from us — instrumentalize them, she means, for the sake of social change. I’m skeptical. What do academics know about instrumentalizing anything? More to the point, what business do we have telling students what they ought to do with what we teach them? “Fine,” I say at last (this is some years ago), “as long as you would be okay with one of your students instrumentalizing what they learn from you to try to overturn Roe v. Wade.” She is stunned. The possibility has clearly never crossed her mind — the possibility, that is, that students might have goals that conflict with hers. That they possess an otherness that we as educators must respect.

    A few years later, I come across an essay by this same professor in The Chronicle of Higher Education, the principal organ of news and opinion about the academy. Titled “In Praise of the Academic Cliché,” it champions buzzwords such as “performativity,” “intersectionality,” and “heteronormativity” as agents of transformative social potential, especially once “they quietly wriggle through discourse, swimming from theory to classrooms” and thence, beyond the college walls, to essays, podcasts, Twitter, “mainstream journalism and popular entertainment.” The student’s function in the process is to carry them, the way that a deer might carry a tick. “Not all of our students will be original thinkers,” she writes, “nor should they all be. A world of original thinkers, all thinking wholly inimitable thoughts, could never get anything done. For that, we need unoriginal thinkers, hordes of them, cloning ideas by the score and broadcasting them to every corner of our virtual world. What better device for idea-cloning than the cliché?” Instead of teaching undergraduates to avoid clichés, as generations of instructors have done, “we should instead strive to send our students forth — and ourselves too — armed with clichés for political change.” 

    My professor had progressed from wanting to teach her students to instrumentalize ideas to wanting to instrumentalize her students: to recruit, enlist, train, mobilize, and deploy them — “armed,” in “hordes” — for the purpose (her purpose) of “getting things done.” Or rather, she had shown me that the second impulse was implicit in the first. Forget teaching people to think; forget uniqueness, individuality, the soul. The ideas are cloned, and so are the students. Nor is she alone in her desire, as anyone familiar with contemporary academia will know. Quite the opposite, in fact. Some years ago, to take one data point, I spent a couple of weeks at a moderately selective Catholic university: not an elite institution, not one you would think of as a redoubt of progressive ideology. Across the board, across the disciplines, the dean informed me, younger faculty believe their job to be indoctrination. Which means they think their mission is to serve the cause, not the students. The students are tools.

    This is the antithesis of what I am calling respect, be it for a creature or a student: the recognition of another as other and the willingness to let them be such. Call this antithesis projection, intolerance, the will to power, it is surely a persistent part of being human, and it is also surely getting worse. Academics are not the only professionals who have decided that their mission is to save the world and that their clients must be proselytized and propagandized, their personhood be damned, in order to do so. So have teachers, librarians, and social workers. So, perhaps worst of all, have counselors and psychotherapists, who practice the one line of work in which it is even more important than in education to treat one’s charges as individuals, people with their own particular histories, qualities, needs. All around us we are witnessing the loss of this thing that I’m calling respect. 

    The problem is bipartisan. The left speaks constantly of “difference,” but it cannot abide it. This is, again, an old phenomenon freshly intensified. “Deviation” from the party line was a cardinal sin in twentieth-century communism. Leftist groups, accordingly, were notoriously fissiparous, splintering into ever-smaller factions of doctrinally pristine believers. As class politics became self-expressive lifestyleism, the purism seamlessly transferred. A friend from redneck small-town southwest Georgia lived for many years on a commune in central Virginia largely populated by Northeast liberals. They were, she told me, the most intolerant people she’d ever met: accepting and open to all, as long as you were exactly like them. (Those acquainted with other progressive bastions — Berkeley, Cambridge, Brooklyn, et al. — will know what she was talking about.) “Do your own thing,” went the countercultural slogan, with the tacit addendum, “as long as I approve of it.” You could wear beads or berets or dashikis (or later, mohawks or dreadlocks or flannel), but never khakis or a suit, let alone a cross. And now that politics and self-expression have become coterminous (the personal is political, the political is personal), it is all Stalinism all the time. To be my “ally” means that you agree with me, not on a specific issue, as it once would have done, but on all of them, unquestioningly. There’s no more ordering à la carte; you have to swallow the entire menu.

    But the right is no better these days, having likewise largely extirpated its liberal commitments in the name of an epochal moral crusade. Epistemic modesty, à la Edmund Burke, is gone, as is libertarian toleration. Red America, as David French and others have reported, is as heavily policed as Blue. MAGA admits no dissent; its idol is a jealous god; Never-Trumpers are “human scum.” Progressive social power is answered by state censorship, Kendi and Butler by DeSantis and Abbott. Christian nationalism, including in its juridical manifestation as “common good constitutionalism,” promises to make us do what’s best for us, whether we like it or not. In his Dobbs concurrence, Clarence Thomas started to prepare the ground for the repeal of rights to contraception, gay marriage, and gay sex, and thus for their legislative suppression. The Libertarian Party, as Cato’s Andy Craig remarked not long ago, having “experienced a hostile takeover by far-right culture warriors,” has embraced “a program of openly bigoted authoritarianism.” In this it is aligned with conservatism’s Orbanists and Putinists, with their retreat from democratic pluralism to a fantasy of church and volk.

    This is politics, but beneath it, it is narcissism. Or rather, politics has become a mode of narcissism, which can be defined as the need to make the whole world over in one’s image, to fill it with the self. For its hypertrophy, which has gone beyond the darkest dreams of Christopher Lasch, there are many things to thank, but above all the internet. We now have the ability not only to create our own reality, but to live uninterruptedly within it. The phrase “my truth” originated as an assertion of the validity of subjective experience, of feelings as real and important. Now it signifies the triumph of subjectivity, its abolition of the objective, the external, the empirical. “The Bible was written by Africans,” I overheard a fellow author confide at a booksellers’ convention. “I know some people disagree,” she added, having clocked the nearby Semite, “but that’s my truth.” Her right-wing counterparts include the individuals who, dying of Covid, continued to insist that the pandemic was a hoax.

    Narcissism governs the contemporary stance toward art as well. Instead of going to it in the fearful hope that art will trouble us out of ourselves — confront us with genuine difference, and therefore make us different — we insist that it affirm us. Women will aver that they prefer to read books by female authors; sometimes that that’s the only kind they read. Pedagogical authorities concur that children should be given stories about people who “look like them,” that anything else is an injury. When a work remains refractory to our desire for validation — often because it belongs to the past, that foreign country — we rewrite it. Shakespeare is “queered,” Austen is revealed to be a radical feminist, and so forth. Once again, this is an oldish story — it dates to the rights revolutions and the reading practices they spawned — that has in our century become immensely worse. For with its two-way social traffic, the internet has given rise to the phenomenon of fandom, with its enormous powers of insistence. Not just fans — “fandom,” like “kingdom.” Now the audience is able not only to project its desires onto its idols (devotees of Elvis or the Beatles could do that as well), but to make those figures answerable to its projections. Now artists and audience mirror each other, the ego duplicated in an infinite regression.

    I have struggled with these things myself (and not just with regard to birds): with intolerance, with projection, with the impulse to convert. When I started teaching in my late twenties, still in my militant-atheist phase, I had a student, fresh from Catholic school, named John Luke. I really gave that kid a hard time — not quite explicitly, which was maybe worse, because I never said anything that he could argue with directly. It was all insinuation, a subtle sort of intellectual bullying. I remember bringing in some Nietzsche once (this was freshman composition; other graduate instructors — we were a bunch of smug little bastards — were sneaking in swatches of Marx or Foucault). This will give my students something challenging to chew on, I thought, and if I can win one away from the pale Galilean, then so much the better. I feel a kind of psychic nausea when I think of this today. One definition of evil, I later discovered, understands it as the effort to impose one’s will on others.

    Slowly, however, I managed to learn. Many years later, I had another avowedly Catholic student. One day, in office hours, she mentioned that she belonged to a campus organization called CLAY, or Choose Life at Yale. I inwardly recoiled. Holy crap, I thought, she’s one of them. (I also thought, good name.) But I managed to keep my mouth shut. She’s got a right to her belief, I reflected, and what’s more, I respect her for standing up for it under what are surely challenging conditions. (It was she whom I was thinking of, in fact, when I reminded my professor that there might be people in her classes who want to overturn Roe v. Wade.) I had gotten to know a lot of students over the years, and I cherished those connections, but there was something special about this one — something cleaner or purer, and precisely because of our differences. It is pleasant to have disciples, but it can also be corrupting. The moment I accepted her for who she was, she got a little realer — became more of an actual person, not an idea of myself echoed back to me — and so, I think, did I. I was over here; she was over there. I didn’t like her for being like me, and she didn’t like me for being like her. We eventually grew to be friends, and some twenty years later we still are.

    Friends. I have an old one, someone who has perpetually disappointed the expectations that people have had of her. We met in a Zionist youth movement, but she later stepped away from any form of Jewish practice or affiliation. She went to a leading professional school, but she abjured the prestigious career paths that her classmates pursued. Raised in an affluent suburban environment, she went off to live in a working-class rural community. “The worst thing you can do to your friends,” she once remarked, “is not be the person they want you to be.” I thought of that when I was having dinner with another friend, another former student, a young man who was taking his time getting his life on track, in a way that was making me impatient. He had just broken up with his girlfriend, he told me. Again? I thought. “I’m sorry to hear that,” I said, though he didn’t seem sorry at all. And then I caught myself. “That was a stupid thing to say,” I said. “Why should I care if you have a girlfriend?” Why indeed? Who was I to be “impatient”? I was too identified with him. I needed to let him be who he was going to be, whatever he was going to be.

    If this is difficult to do with friends, it is virtually impossible to do with children. Virtually, but not completely. A parent was telling me that she couldn’t wait for her teenage daughter to go off to college so that she, the parent, could finally get some distance from her. I thought, for years I’ve been advising young adults that they need to separate from their parents. It hadn’t even crossed my mind that parents ought to try to separate from their children, because I hadn’t imagined that such a thing was possible. Later, I read this in Louise Glück, a writer who knew about separateness:

    I’ll never understand
    the claim of a mother
    on a child’s soul.

    So many times
    I made that mistake
    in love, taking
    some wild sound to be
    the soul exposing itself…

    The soul is silent.
    If it speaks at all
    it speaks in dreams. 

    A child’s being is their own. Mothers and fathers, it’s not about you.

    Respect, as I am calling it, shows up in the political realm as tolerance. I used to hate that concept, back in my days as an angry young Jew. Who are you, I thought, to merely tolerate me? Am I supposed to be grateful for that? But our politics of mutual negation has made me wiser. Tolerance, compared to what we have, would be tremendous, would be a terrific achievement. Tolerance, in a democracy, signifies the recognition that the people whom you hate the most — Nazis, let us say, to put it in the starkest terms — have a right to share the political community with you: to speak, vote, advocate, educate, organize, assemble, just like you do. That they are your equals as citizens — I would add, as human beings. Being a Nazi is a civil right; being a Nazi is a human right. To grasp that is to understand the stony way of tolerance.

    My own instruction in this virtue came courtesy of Dave Chappelle. It was one of his Jew jokes, about the Jews controlling Hollywood or some such. My first thought was, fuck this guy. My second was, I’m never going to watch a thing he does again. My third was, idiot, this is what tolerance means, spiritually if not literally: not having to approve of everything another person does, and not disengaging from them even when they anger you, even when they offend you. Being okay with not being okay. What you are tolerating, ultimately, is your own discomfort.

    It’s hard. It’s definitely hard. And it goes the other way as well. I am white, middle-aged, middle-class. When I encounter someone from the other side of one of those divides, my self-consciousness kicks in. It isn’t guilt; it’s a feeling of inauthenticity, like I can suddenly see through my act (I’m so white, so stiff, so deeply uncool) because I imagine that they can. My instinct is to pander, to assimilate myself to them: to fall in with their way of speaking, standing, holding themselves, with their point of view. (Anyone who’s watched a grown-up try to talk to a bunch of teenagers will understand what I mean.) But in time I’ve learned to check that impulse. When faced with difference now, I don’t reject it and I don’t surrender to it. (Keep your back straight, I sometimes literally tell myself.) I’ve also learned that people will respect you more (in the familiar sense) if you just be yourself. And it’s the only way, of course, to build a genuine relationship.

    So what am I saying here? What exactly is this “respect” that I’ve been mulling over ever since I saw that picture of a bird? To help me consider the matter a little more rigorously, I turned to Martin Buber, with his famous I-Thou and I-It. Where does respect fall in relation to that distinction? I-It is instrumental: you use the thing, the creature, the person, for your own ends. I-Thou is relational, being to being. “I contemplate a tree,” Buber writes. “I become bound up in relation to it. The tree is now no longer It. I have been seized by the power of exclusiveness.” In the moment of I-Thou, in other words, the Thou is all there is. I apprehend it in its wholeness, its unity, its being. “The tree is no impression,” no bundle of separate sensations, nor is my relation a communion with some kind of indwelling spirit: “I encounter no soul or dryad of the tree, but the tree itself.”

    There is much to admire in I and Thou, but also much, I find, to question. Buber tells us that to meet the Thou — which finally means the divine — to enter into what he calls a genuine relation, one needs a “full acceptance of the present,” of reality (“The Word of revelation is I am that I am… That which is is, and nothing more”). One needs to practice not a “seeking” but a “waiting.” Very good. What Buber is describing is a mystical experience, the suspension of time and space and ego, such as we learn about also in other traditions. It is rare; it cannot be achieved by will alone; it is a gift of grace. But it doesn’t, for that very reason, help us much in ordinary life. And the only alternative, he says, is I-It. All is relation (“There is no I taken in itself”), and there are only two relations.

    This will not do. Conversations with friends, acts of love and care, the connection of teacher to student — all these are instrumental? No. There must be something in between his two extremes. We do not need to “Thou” the other in order to refrain from instrumentalizing them. The essence of respect, in fact, is non-identification. It is a refusal of projection. For projection, I think, is what the I-Thou ultimately is. It is a strangely non-relational relation. He’s vibing with the tree, as the kids would say, but is the tree vibing with him? It’s just a tree, after all. He may feel a reciprocity (the tree “has to do with me”), but a feeling does not tell you anything except that you are having a feeling.

    Buber gives the game away when he turns from trees to human beings. “Even if the man to whom I say Thou is not aware of it,” he writes, “yet relation may exist.” But to call this a relation is to strain the term beyond its breaking point. It seems, instead, a private experience, however exalted, one in which the other person functions as a kind of spiritual trampoline. As for creatures, wild animals, Buber has the hardest time with them of all, perhaps because, unlike a tree, they visibly respond to us. Contemplate an animal, be it a backyard bird or a deer in a forest, meet its eyes, and what you are likely to register is not “relation” but a sense of threat, as in, what is this ape and why is it staring at me?

    For Buber, I-Thou is the ground of morality. Its essence is love, the “responsibility of an I for a Thou.” But if we need to love the other in order to treat them correctly, then we’re all in a great deal of trouble. We should not have to empathize or sympathize or understand or “leap the chasm of otherness” or “be in relation.” We only have — but this is not an easy thing — to see the other in itself and for itself. I think of an acquaintance who lives in western Massachusetts and has spent some time in California. He much prefers the Yankees, he has told me, as dour and unfriendly as they often are, because when push comes to shove, you can count on them. They don’t have to like you to help you. The Californians appall him, precisely because their morality is based on feeling, on spurts of universal love. No love today? No help, no recognition, no concern — go soak your head.

    No doubt this is partly a matter of temperament — Buber is terribly moist — but I am for a dry morality. I am for detachment, even alienation, as a hedge against over-identification. I am for letting the other be other (including the universe, which some ventriloquize as the divine). You can love the other, but you can equally leave them alone. Buber speaks of community, the form that “relation” assumes in collective life, but this is not to be confused with actual communities. The latter means that everyone is in each other’s grill, whereas my ideal is the city, where people mind their business unless otherwise requested. Many years ago, I spent some months on a kibbutz. “People here,” a resident told me, “will let you into their living room, but they won’t let you into their bedroom.” He wasn’t talking about polyamory; he was talking about the fact that close quarters can militate against intimacy, because they force you to defend your boundaries. But cities, with their ethic of noninterference, can make not only for strong individuals, but also, in my experience, for strong attachments. No distance, no crossing. No separation, no connection. If I am over here and you are over there, then at least we can say that we know where we are.

    I had been trying to come up with an alternative to Thou and It, a third term for a third dyad: I-That? I-Them? Then I realized that the real problem is that pesky “I,” with its knack for getting in the way. We can’t be rid of it, and so we must constrain it. I think of the concept of tzimtzum — the act, in Kabbalah, whereby an infinite God creates the world by contracting himself to make room for it. The ego, too, tends to fill immensity. Self-contraction is a decent rule of conduct, and a useful prayer would be: Lord, help me to make myself small.