The Doctrine of Hate

    Julius Margolin was born in 1900 in Pinsk. After studying philosophy in Germany in the 1920s, he moved to Poland with his family, where he became active in Revisionist Zionism and published a Yiddish book on poetry. From there he and his family moved to Palestine. For economic reasons, Margolin returned to Poland in 1936, where he was trapped by the Nazi invasion, and was eventually imprisoned in Soviet labor camps. In July 1945, he was released and made his way back to Tel Aviv, where he wrote a pioneering memoir of the Gulag and died in 1971. The full text of Journey into the Land of the Zeks and Back was not published in his lifetime.

    After my release from Maxik’s hospital, having had an opportunity to rest, and armed with certification as an invalid, I returned to the camp regime. In Kruglitsa, a certified invalid with a higher education has a wealth of possibilities. You can choose: assist the work supervisor in compiling the lists of personnel in the brigades; work in the Cultural-Educational Sector (KVCh); or be an orderly in the barrack. Until a prisoner is taken off the official work register, he will not be sent to such unproductive work. The place for a healthy, able person is in the forest or field, where hands and shoulders are needed. The work boss will not allow an able-bodied worker to have an office or service job. An invalid is another matter. Whatever he is able and willing to do without being obliged to do so is a pure gain for the state.

    At first, I was amused at the accessibility of work from which I had been barred as a third-category worker. When they found out that Margolin had been deactivated, people immediately invited me to work in various places, and I succumbed to temptation. An invalid is allotted the first level food ration and 400 grams of bread. By working, I received the second level and 500 grams.

    For an entire month, I tried various places. After a ten-week stay in the hospital, it was pleasant to be occupied and to be listed in a job. After a month, however, I came to feel that I had been deactivated for a reason. I lacked strength. The job with the work supervisor dragged on until late at night. Work at the KVCh entailed being in motion all day, making the rounds of the barracks, rising before reveille. As a worker in the Cultural-Educational Sector, I had to get up an hour before everyone else: by the time the brigades went out to work, I had to list on the huge board at the gate the percentage of the norm that each brigade had fulfilled the previous day.

    A worker calculated these norms in the headquarters at night and, before going to sleep, he left the list for me in a desk drawer in the office. The camp was still sleeping, the dawn reddened behind the barracks, and the guards were dozing on the corner watchtowers, when I would climb with difficulty onto a stool that I had placed in front of the giant chart and begin writing in chalk on the blackened board the figures for the twenty brigades.

    This work bored me. The thought that as an invalid I was not obliged to endure this misery gave me no rest. I had been an invalid for an entire month and had not yet utilized the blessed right to do nothing; I had not taken advantage of my marvelous, unbelievable freedom. In the middle of the summer in 1943, I declared a grand vacation. At the same time, it represented a great fast: 400 grams of bread and a watery soup. It was June. Blue and yellow flowers bloomed in the flowerbeds in front of the headquarters; under the windows of the infirmary, the medical workers had planted potatoes and tobacco. In the morning, the patients crawled out to the sun and lay on the grass in their underwear or sunned themselves in the area around the barracks. When I went by, barefoot, in my mousy gray jacket without a belt, fastened by one wooden button near the collar, they shouted to me: “Margolin, you’re still alive? We thought you were gone already!”

    Without stopping, I went on to the farthest corner of the camp territory. I had a blanket, a little pencil, and paper. There was lots of paper: in the past month, I had hoarded a respectable amount. I even had a little bottle of ink from my work in the KVCh. I would take a rest from people, the camp, work, and eternal fear. I lay on my back, watching the clouds float above Kruglitsa. A year earlier, I had worked in the bathhouse and ran into the forest for raspberries. Amazingly, then I was able to carry three hundred buckets of water a day. That year depleted me. Now there were no raspberries, but neither did I have to drag water buckets. I was satisfied; it was a profound rest.

    In the summer of 1943, a storm raged over Kursk, and Soviet communiqués spoke of gigantic battles, as if all the blood receded from this great country and flowed to the single effort in that one spot. One hardly saw healthy males in Kruglitsa. Women guarded the prisoners and conducted the brigades to work. Gavrilyuk, who the past summer had been a Stakhanovite wagoner, now, like me, had been retired from work, and women prisoners worked as wagon drivers in camp. Women, like reservists, went to the first line of work. We knew from the newspapers that, throughout the country, women were working as tractor drivers, in factories, and in the fields. The free men held the battle front while the male prisoners in the camp melted like snow in the spring sun and descended under the ground. I knew that in another year I would be weaker than I was at present. If the war dragged on, I would die and not even know how it ends. Out of pure curiosity, I wanted to make it to the end of the war.

    That summer, my first grand interlude as an invalid, I wrote “The Doctrine of Hate.” That summer I was preoccupied with thoughts about hate. Lying in the grass behind the last infirmary, I returned to the topic from day to day and turned out chapter after chapter. I experienced a profound and pure enjoyment from the very process of thought and from the awareness that this thinking was outside-the-camp, normal, free thought, despite my current conditions and despite the barbed wire fence and guards. This was “pure art.” There was no one to whom I could show it or who could read what I was writing, and I felt pleasure from the very activity of formulating my thoughts, and as the work advanced I also felt proud that to a certain degree I was prevailing over hatred, was able to grasp it, and to subject it to the court of Reason.

    This subject was dictated by my life. What I had endured and seen around me was a true revelation of hate. In my previous life, I only heard or read about it, but I never encountered it personally. Neither racial nor party hatred had crossed the threshold of my peaceful home. In camp, for the first time, I heard the word “kike” directed at me, felt that someone wanted me to perish, saw victims of hate around me, and witnessed its organized apparatus. In camp, I, too, for the first time learned to hate.

    Now it was time for me to elaborate all this material theoretically. How simple it would be to go away from the haters to that bright kingdom of warmth and humanity in which I, unawares, lived before the Holocaust. It is natural for a person to live among those who love and are loved by him, not among enemies and haters. But this was not my fate. Nor was I able to resist hatred actively. The only thing that remained free in me was thought; only by thought could I respond. There was nothing else I could do but try to understand the force that wanted to destroy me.

    I was less interested in the psychology of individual hatred than in its social function, its spiritual and historical meaning. I saw hatred as a weapon or as a fact of contemporary culture.

    The most important thing, with which I began, was the dialectic of hate. Hatred is what unites people while dividing them. The link via hate is one of the strongest in history. Souls come together in hate like the bodies of wrestlers — they seek each other like wrestlers in a fight. You cannot understand hate as pure negation, because if we merely do not love or do not want something, we simply walk away from it and try to eliminate the unnecessary and unpleasant from our life. There was something in my hatred of the camp system that forced me to think about it, and I knew that my hatred would not let me forget it even when I got out of here. Hate arises in conditions when we cannot escape. Hate is a matter of proximity. Personal, class, or national hatred — it is always between cohabitants, between neighbors, between Montague and Capulet, over borderline and frontier.

    The paradox of hate is that it leaves us in spiritual proximity to that which we hate until, ultimately, there arises rapprochement and similarity. Sometimes, the hate itself turns out to be merely a concealed fear of what attracts us, as in Catullus’ poem Odi et amo, as in Hamsun’s “duel of the sexes,” as in a lackey’s hatred for the lord, and finally, in antisemitism of the maniacal type, when people cannot do without the Jews. Here is an acute example. Adolf Nowaczyński, a talented Polish writer, was a malicious hater of everything Jewish. When he approached old age, he took off for Palestine to see things with his own eyes, and it turned out that he felt quite good in Tel Aviv. This man’s life would have been empty without Jews. If they had not existed, he would have had to invent them, and ultimately that is what he did all his life. There is hatred toward fascism and even hatred of communism that derives from a certain moral closeness and, in any case, leads toward it over time. We cannot hate what is absolutely incomprehensible and alien. The incomprehensible arouses fear. Hatred, however, needs an intimate knowledge and multiplies it, and it endlessly forces us to take an interest in what we detest.

    This was the paradox of hatred that I examined from all sides while lying in the sun in the corner of the camp yard. Hatred was not only before me — it was also inside me. In me, however, it was different from the hatred against which my entire being rebelled. It was thus necessary to differentiate the various forms of hatred, in order to distinguish between the hatred that was inside me and what to me was an odious and evil hatred.

    I began by identifying some bogus and altered forms, the pseudo-hatred that only obscures the essence of the matter. I saw that things which did not belong, or which bore only an external resemblance to it, paraded under the label of hatred. Away with counterfeits!

    First: juvenile hatred, odium infantile. Children are capable of the fiercest, most frantic hatred, but that is only “ersatz,” not serious. Juvenile hatred is a momentary reaction, an acting out. It boils up in an instant and passes without leaving a trace; it rises and bursts like a soap bubble. In essence, it is an outburst, a fit of emotional distress. This is precisely the reason why, in its mass manifestation, by virtue of its qualities of easy arousal, easy manageability, and evanescence, it is particularly suitable for the purposes of the cold-blooded producers and inciters of this hatred, who always mobilize it in the masses when it is necessary to stimulate them to an extraordinary effort, to struggle in the name of changing goals. Hatred goes to the masses, flows along the channels of calculated propaganda, but it is all on the surface; it has neither depth nor stability. Left to itself, it dies out or unexpectedly changes direction, as in 1917, when the masses, filled by Tsarist governments with pogromist and front-line hatred, turned against the government itself. The savage hatred of the incited mass, like fuel in a car, turns the wheels of the military machine, but the ones at the steering wheel are calm and cool.

    Ripe, mature hatred does not have the nature of a momentary reaction; it is a person’s automatic, internally determined and stable position. It does not exhaust itself in one ferocious outburst but gnaws at a person’s entire life and lurks behind all his manifestations and deeds. Psychologically it is manifested in a thousand ways. From open hostility to blind nonrecognition, all shades of dislike, malice, vengefulness, cunning and envy, mockery, lies, and slander form the vestments of hatred, but it is not linked exclusively with any one of them. There is no specific feeling of hatred; in its extreme form, it ceases to need any kind of “expression.”

    A child’s hatred is expressed in screaming, foot stamping, and biting. The hatred of a savage, which is the same as a child’s hatred, elementary, bestial fury, is expressed in a pogrom, in broken skulls and bloodletting. There is, however, mature hatred that is expressed only in a polite smile and courteous bow. Perfect hatred is Ribbentrop in Moscow, kissing the hands of commissars’ wives, or Molotov, smiling at the press conference. We adults have learned to suppress and regulate manifestations of our hatred like a radio receiver, turning it off and on like a light switch. Our hatred is a potential force; therefore, it can be polite and calm, without external manifestations, but woe to the one who shakes an enemy’s extended hand and walks along with him.

    The second form of pseudo-hatred is odium intellectuale: the hatred of scientists, philosophers, and humanists — it is the hatred of those incapable of hating, the academic hatred of intellectuals, which was introduced as an antidote and placed as a lightning rod against barbarism. This vegetarian, literary hatred would have us hate abstract concepts — not an evil person but the evil in man, not the sinner, but sin. This hatred unceasingly exposes vices and fallacies, mistakes and deviations against which we are ordered to fight. This theoretical hatred completely fences itself off from the practical. Unfortunately, the street does not understand these fine distinctions: mass hatred recognizes only that enemy whose head one can break.

    Humanism in its essence cannot oppose hatred. We know of two attempts in the history of culture to eliminate hatred from human relations: “nonresistance to evil” and the view that the end does not justify immoral means. Passive resistance to evil, however, invariably switches to active resistance against the bearers of evil, and the question of “ends and means,” with its artificial division of the indivisible, remains intractable so long as we do not know what specific means are being used for precisely what goals. Historically, butchers and murderers invariably used abstract, theoretical hatred for their own purposes, expertly contriving to turn every intellectual product into a weapon of mass murder and unlimited slaughter.

    Christ drove the money lenders out of the Temple. His successors excommunicated the heretics from the church and lit the bonfires of the Inquisition, up to Torquemada and that papal legate who, upon suppressing the Albigensian heresy, said, “Kill all of them; God will recognize his own.” The Encyclopédistes and Rousseau hated vice and believed in the triumph of virtue. The French Revolution introduced the guillotine. Marx started with the liquidation of classes and of exploitation in human relations. His followers turned Marxism into a formula of mass terror, when a “class” is destroyed not as an economic category but as millions of living, innocent people. “Kill them all; history itself will revive what it needs.” The process contains a tragically inevitable progression, and, unavoidably, the warrior-humanist becomes a captive of an alien element, as in the case of Maxim Gorky in the role of a Kremlin dignitary. The teachers either capitulate in the face of the conclusions that the pupils derive from their lessons or perish in prison or on the scaffold.

    Odium intellectuale, the theoretical hatred of scholars, thus either fails to achieve its goal or leads to results that are diametrically opposite to the original intention. Luther throws an inkpot at the devil. The devil turns the philosopher’s ink into blood and a sea of tears.

    The third form of hate that I isolated in my analysis is odium nationale, the well-meaning hatred of those who take up arms in order to halt the force of evil. Evidently, there was never a dark force that did not try to pass itself off as just and worthy. Evidently, we have no other means of distinguishing between good and evil than by Reason and Experience, which teach us to recognize the essence of phenomena from their manifestations and consequences. There is, thus, a hatred that is rational and transparent in all its manifestations. It is clear to us why and when it arises. Its logical basis is at the same time the reason for its conditional nature, as it disappears along with the causes that evoked it. This hatred is so secondary and reactive that we can easily designate it as counter-hatred. We do not need it intrinsically, but when an enemy imposes it upon us, we do not fear to take up the challenge, and we know that there are things in the world that are worth fighting against — the passion and force of survival which do not yield to the enemy’s force and passion but have nothing in common with them in their inner essence.

    Having thus carefully differentiated the historically present forms of pseudo-hatred — mass-juvenile and intellectual-abstract, and the rational counter-hatred of the warrior — I approached the eyeless monster that at the time of my imprisonment had spread over all of Europe.

    Unlike the superficially emotional, infantile hatred of the crowd, the theorizing hatred of the intellectual, and the sober, clear conviction of the defenders of humankind, there is a force of primal and pure hatred, active despite its blindness, and blind despite its initiative, and the more active the less causally provoked. It fears only the light of day. Reason is its natural enemy. 

    Haters of the world are united in their negation of freedom of the intellect. The mark of Cain by which one can recognize genuine hate is scorn of free thought, rejection of the intellect. For Hitlerism, free thought is “a Jewish invention”; for the Inquisition, it is a mortal sin; for the ideologues of communism, it is counterrevolution and bourgeois prejudice. Every basis for such hate is imaginary and pseudo-rational. It is therefore natural that the people who established forced-labor camps in Russia simultaneously eradicated freedom of discussion and the right of independent investigation there. 

    In a pure, undiluted form, hatred is self-affirmation via another’s suffering. People become haters not because their surrounding reality forces them to that. There is no sufficient basis for hatred in the external world. There is nothing in the world that could justify the annihilation of flourishing life and proud freedom undertaken by Hitler, the fires of the Inquisition, or the prisons and pogroms and the camp hell of the Gestapo and the NKVD.

    There is a pyramid of hate, higher than the Palace [of the Soviets] that is being constructed in Moscow at the cost of hundreds of millions while people are dying of starvation in the camps. At the base of this pyramid are people similar to children, wild savages, like the one who hit me with a board on the road to Onufrievka, or the SS man who shot my elderly mother on the day the Pinsk ghetto was liquidated. These people rape, destroy, and murder, but tomorrow they themselves will be the most mild and obedient and will serve the new masters and believe the opposite of what they believed yesterday, and others — just like them — will come to their homes to murder and rape. Above these people stand others who teach them and direct them to do what they do. Above them are still others, who engage in ideology and theoretical generalizations, and those embellishers, who service the hatred, deck it out, put it to music, and dress it in beautiful words.

    Ultimately, however, at the very top of the pyramid stands a person who needs all this: the incarnation of hatred. This is the organizer, the mastermind, the engineer and the chief mechanic. He has assembled all the threads in his hands, all the subterranean streams and scattered drops of hatred; he gave it direction, a historic impetus and scope. At his signal, armies cross borders, party congresses adopt resolutions, entire peoples are exterminated, and thousands of camps are erected. And he may be kind and sweet: he may have six children as Goebbels did or a “golden heart” like Dzerzhinsky’s, an artistic nature like Nero’s or Hitler’s, and the Gorkys and Barbusses will not stop slobbering over him. He, however, decreed that somewhere people must suffer. He executed them in his mind when no one yet knew about his existence. Even then he needed this.

    This brings up a central question in the doctrine of hate: What is the makeup of a person, a society, an epoch if naked hatred has become such a necessity for them, if the senseless tormenting of their victims becomes a necessary condition of their own existence? It is not at all easy to answer this question if one does not adduce the familiar so-called arguments that the German people “were defending themselves against the Jews,” that the Inquisition was “saving souls,” or that Stalin is re-educating and reforming “backward and criminal elements” with the help of the camps. This is obvious nonsense. Of course, I in no way harmed the Germans or needed a Stalinist re-education, but even if that had been the case, it would not justify the gas chambers or turning millions of people into slaves. Germany did not need the gas chambers; the Russian people did not need the camps. But they are truly necessary for the big and little Hitlers and Himmlers, Lenins and Stalins, of the world. What, indeed, is going on?

    One must clearly recognize that the people holding the keys of power are fully aware of and admire the extent of the avalanche of human and inhuman suffering that seems like an elemental misfortune to us little people. Those people are responsible for its existence every minute and second. They have started it and control it, and it exists not because of their ignorance or impotence but precisely because they know well what they are doing, and they are doing precisely what meets their needs. Only a dull, wooden German lacking imagination, such as Himmler, needed to visit Auschwitz in person in order to look through a little window of the gas chamber to see how hundreds of young Jewish girls choked to death, girls who had been specially dispatched to execution that day for that purpose. People of the Kremlin do not need to observe personally; they have statistics about the camp death toll.

    There is no answer to why this is necessary other than to analyze the known pathological peculiarities of human nature. There is no rational, “economic,” or other explanation of hatred. The logic of hatred is the logic of madness.

    That man [Stalin] hates: He cannot do without this attitude to people; without it, he suffocates. Hate is the oxygen that he breathes. Taking hatred away from him would leave him destitute.

    That man hates, which means that some kind of inner weakness develops into hate, the result of some organic problem. Some kind of lack, defect, or unhappiness may remain within the bounds of his sense of self, but it may also spread to his social milieu and be transmitted to other people. There are wounded people, vulnerable classes, ready to turn into breeding grounds of collective hate. There are situations when people, groups, or societies are unable or unwilling to look truth in the face.

    In Vienna, young Hitler discovered that the Jews are responsible for depriving him and the German people of their deserved place in the sun. This is preposterous but, indisputably, this man started with some feeling of pain; he was deeply hurt. Had he wanted the truth, he would have found a real cause, but the truth was too much for him to bear. He therefore began to search for external guilty parties. Here the mechanism of hate begins to operate. The real pain turns into an imagined insult. An enemy and offender must be found. 

    The need for an enemy is radically different from the need for a struggle that is characteristic of every strong person. Strong people seek an arena, an outlet for strength. The hater seeks offenders to accuse. On the one hand, the need for a struggle engenders courage and initiative. On the other, the need to deal with a cunning enemy engenders aggressiveness and malice. The offender is always nearby. If he is not visible, that means he is in disguise and must be unmasked.

    All haters are great unmaskers. Instead of a mask, however, they tear off live skin, the true nature, and they replace reality with a creation of their inflamed fantasy. Hatred starts with an imaginary unmasking and ends with real flaying, not in theory but in practice.

    The analysis of our epoch given by Marx and developed by Lenin crossed all bounds of a reasonable interpretation of reality. Pseudo-rational theory turned into a Procrustean bed that did not accommodate real life. It is sufficient to compare the tirades of Mein Kampf with Lenin’s passionate polemics and his thunderous charges against capitalism to sense their psychological affinity. It is the language of hate, not of objective research. We can learn as much about reality from Marxist-Leninist scholastics as we can from the Protocols of the Elders of Zion.

    Every hatred that reworks pain into insult carries out “transference,” in the language of modern psychoanalysis. The source of the pain is internal but we transfer it to the outside. Others are to blame when things go wrong for us, when our plans do not succeed and our hopes are crushed. We thus find an outlet, a relief, but only an illusory one. Hate acquires an address — a false one. Revenge, dictated by hate, misses the mark, like a letter sent to an incorrect address. Hatred engenders a constantly hungry vengefulness. 

    An imagined or real hatred becomes a pretext for hateful acts if a person has a need and desire to hate. Sooner or later, this need will be expressed in aggression. Either way, there is a real, objective murderous force at work, one that derives from a hopeless attempt to build one’s own cursed existence on the misfortune and death of those around.

    In order to find support in the external world, this deadly force needs to falsify it. The world is not suitable as is. It is literally true that Streicher and Goebbels could not hate Jews because they did not know them at all. If they had known this people with true, live knowledge, this hatred could not have developed. Their hatred related to that distorted, deformed notion of the Jewish people that they themselves had created and that was dictated by their need to hate. In the institutions of the National Socialist Party, in the Erfurt Institute, there were enormous piles of material about the Jewish people, but the thousands of pieces served them only to create a monstrous mosaic of slander. 

    In the same way, the people who sent me to this camp did not know me. Their hatred consisted precisely of their not wanting to know me and not having hesitated to turn my life and face into a screen onto which to project an NKVD film: “A threat to society, a lawbreaker. Henceforth this person will be not what he thought he was but what we want him to be and what we shall make of him.” In order to erase my existence as they did, one had to harbor a great, formidable hatred of humanity.

    Until we uproot this hatred, it will not stop slandering people and their real impulses, will not cease circling around us, seeking out our every weakness, mistake, and sin, which are numerous, not in order to understand us and to help us but in order to blame us for its own thirst for cruelty and blood.

    Pathological hate reflects the primal instinct of the rapacious beast who knows that he can appease his agonizing hunger by the warm blood of another. Millennia of cultural development infinitely distanced and complicated this instinct by pseudo-rational sophistry and self-deception. Human rapaciousness exceeded that of the beasts, differing from it in that it manifested itself under senseless pretexts in the name of imaginary goals. The struggle against hatred is thus not limited by humankind’s biological nature but encompasses all the specifically inhuman, the perversions, and the lies that comprise the anomaly of highly developed culture and cannot be destroyed until its existence becomes common knowledge. 

    Free and perspicacious people someday will destroy hatred and create a world where no one will need to hate or oppose hatred. The human striving for freedom is incompatible with hate. Without going into complex definitions of freedom, one can agree that as it develops, freedom will steadfastly expel lies and hatred not only from the human heart but also from human relationships and the social order. Opposition to lies and hatred is thus already the first manifestation of human freedom.

    Having finished my investigation of hatred with this proud phrase, I turned over onto my back and looked around: I was lying in a meadow, on green grass at the end of the camp. The forbidden zone started five steps away and a tall palisade with barbed wire spread around. Several prisoners swarmed in the forbidden zone; they were cutting the grass and digging up earth. Under the windows of the hospital kitchen, a line of medics with buckets for soup and kasha had formed.

    The manuscript was written in the most minuscule hand, and I erased all dangerous hints. I read it with the eyes of the security operative: it was an “antifascist” document written by a stranger, but it was not blatantly counterrevolutionary. Understandably, there was not a word about Soviet reality in this manuscript. I had to keep in mind that it could be taken away at any moment in a search.

    But I had pity on my manuscript. There was no chance of hiding a work of that size for a long time in the camp. Suddenly, I had a fantastic idea. I got up and went to the KVCh, where two girls were sitting at two tables. 

    “What do you want?”

    “This is what I want,” I said slowly. “I have a manuscript of about a hundred pages. … I am an academic and wrote something in my specialty. In the barrack, you know, it’s dangerous. They’ll tear it up to use for rolling cigarettes. I want to give it to the KVCh for safekeeping. When I leave here, you’ll return it to me.”

    The girl was taken aback. She and her friend looked at me in dull astonishment, suspiciously, as at someone abnormal. In the end, she went to the phone and asked the guardhouse to connect her to the security supervisor.

    “Comrade supervisor, someone came here, brought a manuscript, and asks that we take it for safekeeping. He says that he is a scientific worker.”

    She repeated this several times over the telephone, then she turned to me:

    “Your name?” 

    I gave it.

    The girl conveyed my name, listened to the answer, and hung up the receiver. “The supervisor said,” she turned to me, hardly keeping back laughter, “‘Let him throw his manuscript into the outhouse.’”

    The Wonder of Terrence Malick

    The best American film of 2019, A Hidden Life, was little seen, and nominated for nothing. Why be surprised? Or assume that our pictures deserve awards any more than the clouds and the trees? Try to understand how movies may aspire to a culture that regards Oscars, eager audiences, and fame as relics of our childhood. The ponderous gravity of The Irishman and its reiterated gangster fantasy, the smug evasiveness of Once Upon a Time … in Hollywood, were signs that old movie habits were defunct. Parasite was no better or worse than cute opportunism. It was a wow without an echo. Whereas A Hidden Life was like a desert, known about in theory but ignored or avoided. I use that term advisedly, for Malick is a student who knows deserts are not dull or empty. They are places that can grow the tree of life as well as any forest. Simply in asking, what is hidden here?, Terrence Malick was leading us to ponder, What should a movie be?

    He had never volunteered for conventional schemes of ranking. His creative personality can seem masked or obscure, but his reticence is portentous too, and it belongs to no one else. Had he really taught philosophy at M.I.T. while doing a draft for Dirty Harry? Please say yes: we so want our auteurs to be outlaws. His self-effacement, his famous “elusiveness,” was often seen as genius. Yet some early admirers felt he had “gone away” in the twenty-first century, or migrated beyond common reach. People regarded his private and idiosyncratic work as egotism, no matter how beautiful it might be. Some were disinclined even to try A Hidden Life after the inert monuments that had preceded it. But it was — I say it again — the best American film of 2019, a masterpiece, and it invited us to try and place Malick, and to ponder if our “map” was part of the problem. To put it mildly, A Hidden Life does not seem American (or even Austrian, though it was set and filmed in Austria). It is occurring in cultural memory as a sign of what we might have been.

    There was never a pressing reason to make up our minds about Malick. He was casual, yet lofty; he might be an artist instead of a regular American moviemaker in an age when it was reckoned that tough pros (like Hawks and Hitchcock) made the best pictures. Thus he began with two unwaveringly idiosyncratic films — Badlands in 1973 and Days of Heaven in 1978. He took in their awed reception and then stopped dead for twenty years, and let his reputation become an enigma. Did he really prefer not to appear with his movies, or give helpful interviews, so that he could be free to pursue ornithology and insect life? Was he unpersuaded by careerist plans, or cleaning up in the manner of Spielberg or Lucas? In never winning an Oscar, he has made that statuette seem a stooge.

    It has always been hard to work out his intentions. Going on the titles, Badlands could be a perilous vacation, while Days of Heaven might promise transcendence. In the first, across the empty spaces of the Dakotas and Montana, Kit Carruthers found his daft halcyon moment of aimlessness while being taken for James Dean, while in the latter, in its gathering of rueful magic hours, we encountered a broken family where a guy was shot dead, his girl was thinking of being a hooker to survive, and the kid sister was left alone with her mannered poetry (like Emily Dickinson voiced by Bonnie Parker). In its locusts and fire, and a screwdriver thrust in the farmer’s fragile chest, Days of Heaven spoke to the ordeal of frontier people in 1916 going mad, skimming stones at entropy, or posing for the pictures in Wisconsin Death Trip (published by Michael Lesy in the year Badlands opened). The two films together said that America was an inadvertently gorgeous place full of terrors. 

    Those early films were filled with love and zest for drab characters buried in the hinterland yet nursing some elemental wonder. But decades later, in 2012, To the Wonder felt like a crushing title for a film that had lost touch with ordinary poetry. Its women were models fleeing from Vogue. Whereas Sissy Spacek as Holly in Badlands (twenty-four yet doing fourteen without strain or condescension) was somehow routine as well as brilliant. Her unwitting insights hovered at the brink of pretension, but any doubt we had was lost in captivation for this orphan who packed vivid party dresses for her violent spree into emptiness. This was after Kit had shot her father dead, not just because dad didn’t approve of a garbage collector outlaw going with his Holly, but because he hadn’t the patience to listen to the rambling solo that was so folksy and menacing — “Oh, I’ve got some things to say,” Kit promised. “Guess I’m lucky that way.”

    And Holly did feel wonder for this vagrant actor. It was there in the flat adoration that Spacek offered him. She slapped his face for killing Dad, but then went along with him, too matter of fact to pause over spelling out love, but utterly transported by this signal for young getaway. Badlands was akin to Bonnie and Clyde, but you felt that Kit and Holly were in a marriage they did not know how to express. And they were sustained by Malick’s amused affection. He was close to patronizing his couple, maybe, making them babes in the woods in a surreal frame, but he felt their romance as much as he was moved by sunsets and the childish tree houses that they built. They were savage yet humdrum, and Kit’s killings were as arbitrary or impulsive as his funny chat. Yes, he was psychotic and headed for the electric chair, but the sweet interior desolation of their America understood them and treated them kindly. When Kit was captured at last, the horde of cops, sheriffs, and soldiers recognized that he was a cockeyed hero, the escapee they had dreamed of.

    One can love Bonnie and Clyde, but that love story is self-conscious about its lust for fame; it really was the heartfelt cry of Beatty and Dunaway and a generation yearning to be known. Badlands, by contrast, is so casual or inconsequential, and so appreciative of a wider span of history, the one we call oblivion. It has a notion that vagrancy and lyricism were responses to the heart of it all, the vast stretch of land where badness is as implicit as beauty. Bonnie and Clyde do not notice where they are, but Kit and Holly are specks in an emptiness as infinite as breathing. It’s only now, in retrospect, that the film seems so intoxicated with talk and its futile liberty, when Malick was headed towards a sadder future in which his stunned characters said less and less, and sometimes became so reduced to half-stifled sighs you wished they’d shut up. That early Malick loved loose talk. Compared with the directors of the early 1970s he was like a muttering Bartleby alone in a crew of insistent press-agented Ahabs.

    This leaves you wondering how few authentic outsiders American film has permitted.

    Malick was thirty-five when Days of Heaven opened, the son of a geologist, born in a small town in Illinois, of Assyrian and Lebanese descent. He graduated from Harvard, and then went on to Oxford as a Rhodes scholar, without getting any degree there. The general estimate is that he was brilliant, as witness his published translation of Heidegger’s The Essence of Reasons. But who has read that book, or is in a position to judge the translation? So it’s part of the uncertain myth that includes our wondering over whether Malick has had private money. Or some lack of ordinary need for it. How has he seemed so unprofessional?

    He is credited with the script for Pocket Money (1972), a Stuart Rosenberg film with Paul Newman and Lee Marvin that is more odd than striking. But it led to Badlands, for which he had some money from the young producer Edward Pressman, from the computer pioneer Max Palevsky, and from a few people he knew. All of which meant it wasn’t a regular production like other films of 1973 — The Exorcist, Mean Streets, The Sting, American Graffiti, The Way We Were. Caught between two parts of The Godfather, it didn’t seem to hear them or know them. Badlands may have cost $300,000. Warner Brothers bought the picture and released it: it closed the New York Film Festival in 1973, and if it perplexed audiences, there was a sense that something rare and insolent had passed by. Badlands didn’t care what we felt: suspense and attention were mere horizons in its desert, not luxury hotels. It was an American product, but it had a more hushed European ambition. You could label it a Western if you were ready to agree that Hollywood, born and sited in the West, never knew or cared where it was.

    Some tried to see Badlands as a slice of picaresque life. We knew it was derived from a real case, a minor outrage on the remote prairie. In 1957-1958, in Nebraska mostly, the nineteen-year-old Charles Starkweather had killed ten people with fourteen-year-old Caril Ann Fugate as his companion. Fugate actually served seventeen years in prison, no matter that in the movie she says she married her lawyer. (That was a prettification or a kind of irony.) And there was more real grounding in the steady assertion that Martin Sheen’s Kit was a lookalike for James Dean and therefore rooted in popular culture. Kit and Holly dance to Mickey and Sylvia singing “Love is Strange,” from 1956.

    Strange was only half of it. In 1973, the feeling that sex was at hand on the screen was still pressing. As Kit took off with Holly, it was natural to think they would soon be fucking. Malick allowed an offhand obligation to that craze — $300,000 carried some box office responsibility — but he was too unimpressed or philosophical to get excited about it. Married three times by now, he doesn’t do much sex on screen. “Did it go the way it’s supposed to?” Holly asks Kit about their unseen coupling. “Is that all there is to it? Well, I’m glad it’s over.” All said without prejudice to their affinity or their being together.

    The absent-minded talk meant more than the way Sissy Spacek secured the top button on her dress “afterwards.” After all, her character was fifteen and he was twenty-four. Yet they were both children in Malick’s art. And then, like kids, they lost interest in their adventure, even in sex, the sacrament in so many pictures of the 1970s. The novelty of Badlands was its instinct that life was boring or insignificant. And that was asserted amid a culture where movies had to be exciting, urgent, and “important.”

    Malick knew that “importance” was bogus. Or he had his eye on a different order of significance. And other truths and differences were inescapable in his film: that no runaway kids had the temerity or the rhythm for talking the way these two did; that stranger than “Love is Strange” was the way Carl Orff and Erik Satie played in their summer as warnings against “realism.” The people in the film were not just movie characters, they were shapes in a mythology. A similar thing happened in Days of Heaven with its triangle of attractions, where Richard Gere, Brooke Adams, and Sam Shepard seemed unduly pretty for the Texas Panhandle. Malick had narrative problems on that picture which he solved or settled by summoning the voice of Linda Manz as the kid sister — a laconic, unsentimental, yet dreamy observer of all the melodrama. (The voice was sister to Holly, too.) She was part of the family, but her voiceover let us feel the narrative was already buried in the past, and nothing to fret over. Life itself was being placed as an old movie.

    Days of Heaven was extreme in its visualization: it included a plague of locusts, which was an epic of cinematography and weird special effects, involving showers of peanut shells and characters walking backwards. But the quandary of the Brooke Adams character, in love with two men, both unlikely in the long term, was the closest Malick had come to novelistic drama. I still feel for Shepard’s farmer, a rich man at a loss with feelings, though Malick had the sense to save the reticent Shepard from “acting.” Instead he was simply photographed, as gaunt and theoretical as his great house planted on the endless prairie. Just as he was shy of sex, so Malick the director was hesitant over what the world called story.

    No great American director has carried himself with such indifference as to whether he was being seen, let alone understood. To see Malick’s work has always been a way of recognizing that the obvious means of doing cinema — appealing stories with likeable actors that move us and make money — was not settled in his mind. I think that is one reason why he stopped for twenty years — just to remain his own man, and not to yield to the habit of eccentric beauty in case it became studied, precious, or crushingly important.

    Thus, in 1998, The Thin Red Line seemed to believe in a new kind of authenticity and directness. Wasn’t it “a war movie”? Didn’t it make more money than Malick has ever known? Wasn’t it about killing the enemy, that blessed certainty that films provide for fearful guys? It offered Guadalcanal in 1942, and it came from a James Jones novel, the writer of From Here to Eternity, which for someone of Malick’s age really was the Pacific War, despite being short on combat and going no farther west than Hawaii. The Thin Red Line is the infantry, landing on an island, and reckoning to take its peaks and destroy the enemy. It is a man’s world that male audiences might relax with. There are only fragmentary glimpses of women left at home — a rapturous shot of an occupied dress on a summer swing, something that would become an emblem of happiness in Malick’s work.

    But nothing competes with the ferocity of Colonel Tall, played by Nick Nolte in the most intense performance in a Malick picture, as a commander whose orders were abandoned and denied. That is not how war films are supposed to work: no one ever challenged John Wayne at Iwo Jima or Lee Marvin in The Big Red One. But Malick’s thin red line is less conventional or reliable. It finds its example in the wistful instinct to desert on the part of a common soldier, Private Witt (Jim Caviezel). For Jones, Witt was an extension of the brave victim Prewitt whom Montgomery Clift played in From Here to Eternity, but for Malick the lonely private is another version of Bartleby, who gives himself up finally not just in heroism but in almost yielding to hesitation.

    Maybe this was once a regular combat picture, to be set beside the work of Sam Fuller or Anthony Mann. But not for long: inscape pushes battle aside for a contemplation of tropical grasses tossing in the wind, insect life, parrots and snakes, intruded on for a moment by war but not really altered by it. Malick has an Emersonian gift for regarding human affairs from the standpoint of nature. It is in the perpetuity of nature that Malick perceives the strangeness, and the humbling, in Earth’s helpless duration. This war prepares us for the bizarre little dinosaurs in The Tree of Life, and the unnerving perspective in which we observe or suffer the earnestness of Sean Penn in that film.

    That touches on a fascinating atmosphere attached to Malick and his blithe treatment of stars. In his long absence from the screen, the glowing characters in those first two films seemed to attract actors, as if to say it might be them, too. He seemed as desirable for them as Woody Allen — and sometimes with a similar diminution of felt human reality. He must have been flattered that so many stars wanted to work for him; he may have forgotten how far he had excelled with newcomers or unknowns. Still, I found it disconcerting when John Travolta or George Clooney suddenly turned up in spiffy, tailored glory in The Thin Red Line, and one had the feeling with The Tree of Life that Sean Penn was vexed, as if to say, “Doesn’t Terry know I’m Sean Penn, so that I deserve motivation, pay-off, and some scenes, instead of just wandering around?” Led to believe he was central to The Thin Red Line, Adrien Brody was dismayed to find he had only a few minutes in the finished film.

    Was this just an experimenter discovering that his film could remain in eternal post-production? Or was it also a creeping indifference to ordinary human story? Was it an approach that really required novices or new faces? How could big American entertainments function in this way? How was Malick able to command other people’s money on projects that sometimes seemed accidental or random, on productions that had several different cuts and running times? He seemed increasingly indecisive and fond of that uncertainty, as if it were a proof of integrity. Was he making particular films, or had the process of filming and its inquiry become his preoccupation? How improvisational a moviemaker is he? And what were we to make of its end products — or was “the end” a sentimental destination mocked by the unshakable calm of duration? How could anyone get away with The Thin Red Line costing $90 million and earning back only a touch more? I could make a case for The Thin Red Line as Malick’s best film and the most intellectually probing of them all. But “best” misses so many points. To shoot it, Malick had gone to the jungles of northern Queensland and even the Solomon Islands. The weapons and the uniforms seemed correct, but the hallowed genre of war movie was perched on the lip of aestheticism and absurdity and surrealism.

    As a world traveler and a naturalist — his nature films are certainly among his most marvelous achievements — Malick was especially sensitive to terrain. For The New World, in 2005, he went to the real sites, the swampy locations, of early settlement in Virginia. He researched or concocted a language such as the natives might have spoken. His tale of John Smith, Pocahontas, and John Rolfe has many enthusiasts for its attempt to recreate a time so new then and so ancient now. This was also a historical prelude to the wildernesses in Badlands and Days of Heaven. It might even begin to amount to a history of America.

    I had worries about the film, and I have never lost them. Its Pocahontas was exquisite and iconic, even if the picture tried to revive her Powhatan language. But the actress, Q’orianka Kilcher, was also part German, part Peruvian, raised in Hawaii, a singer, a dancer, a stunt performer, a princess of modernity, with evident benefit of cosmetics and a gymnasium. Whereas Sissy Spacek in Badlands had a dusty, closed face credible for the look of a kid from Fort Dupree in South Dakota in the 1950s, uneducated, indefatigably unradiant, born bored, more ready for junk food than primetime fiction. That background was what made Holly so absorbing, and it was Kilcher’s emphatic beauty that shifted The New World away from urgency or naturalism. It was as if Angelina Jolie or Joan Crawford were pretending to be the Indian maiden.

    In a way, Pocahontas was the first adult female in Malick’s work, but was that a warning sign that maybe he didn’t fathom grown-up women once they had got past the wry baby talk that makes the first two films so endearing? The New World did not really have much caring for Native Americans, for women, or for the challenge of Europeans determined to take charge of any viable Virginia. It was a film that opted for the picturesque over history, whereas Badlands and Days of Heaven lived on a wish to inhabit and understand America in the unruly first half of the twentieth century as a wilderness succumbing to sentimentality. But the picturesque has always been a drug in cinema, and it had been lurking there in the magic hours in Days of Heaven.

    There was a gap of six years before the pivotal The Tree of Life, perhaps Malick’s most controversial film. Here was a genuinely astonishing picture, ambitious enough to range from intimacy to infinity. In so many ways, it was an eclipsing of most current ideas of what a movie might be. At one level, it was entirely mundane, the portrait of two parents and their three sons in a small town in Texas in the 1950s. For Brad Pitt (a co-producer on the project), the father was a landmark role in which he allowed his iconic status to open up as a blunt, stubborn, unenlightened man of the 50s. Jessica Chastain was the mother, and she was placid but eternal — she was doing her pale-faced best, but surely her part deserved more substance to match not just Pitt but the wondrous vitality of the boys (Hunter McCracken, Finnegan Williams, Michael Koeth, and Tye Sheridan).

    All his working life, Malick has excelled with the topic of children at play, and as emerging forces who jostle family order. Don’t forget how in his first two pictures adult actors were asked to play child-like characters. The family scenes in The Tree of Life are captivating and affirming with a power that is all the more remarkable because the subject of the film is the family’s grief at the death of one of these children. The Tree of Life insists that the death of a child is a cosmic event. Not long after the young man’s death is announced, and before the story of the family is told in flashback, there is an unforgettable yet pretentious passage shot with almost terrifying vividness from nature — the bottom of the sea, the fires of a volcano, the reaches of space — accompanied by religious music. With an epigraph from Job, the real subject may be sublimity itself.

    No one had ever seen a film quite like it. Reactions were very mixed. The picture won the Palme d’Or at Cannes; it had many rave reviews; it did reasonable business. There were those who felt its perilous edging into pretension and a sweeping universality in which the movie vitality of the family succumbed to the melancholy of grazing dinosaurs who had never been moviegoers. But there were more viewers who recognized an exciting challenge to their assumptions. The Tree of Life prompted a lot of people in the arts and letters to revise their ideas about what a movie might be. Pass over its narrative situation, and this was a film to be measured with Mahler’s ruminations on the universe or with the transcendent effects of a room full of Rothkos.

    And then Malick seemed to get lost again. He veered away from the moving austerity of Days of Heaven to a toniness more suited to fashion magazines. There was widespread disquiet about his direction, owing to the modish affectation in To the Wonder (2012), Knight of Cups (2015) and Song to Song (2017). From a great director, these seemed confoundingly hollow films that almost left one nostalgic for the time when Malick worked less.

    Ironically, To the Wonder is the one film for which he has owned up to an autobiographical impulse. It grew out of hesitation over his third and fourth wives, presented in the movie as Olga Kurylenko and Rachel McAdams, two unquestioned beauties. McAdams delivers as good a performance as Brooke Adams in Days of Heaven, but there are moments where her character’s frustrations could be interpreted as the actress’ distress over poorly written material. Malick was now running scared of his ear for artful, quirky talk. But the women in To the Wonder are betrayed by the worst example of Malick’s uninterested stars. Ben Affleck is the guy here, allegedly an “environmental inspector.” That gestural job allows some moody depictions of wasteland and some enervated ecstasy over the tides around Mont-Saint-Michel in France. Yet the situation feels the more posed and hollow because of Affleck’s urge to do as little as possible. His hero is without emotional energy; he deserves his two women as little as male models earn their expensive threads in fashion spreads. The film’s clothes are credited to the excellent Jacqueline West, but they adorn a fatuous adoration of affluence.

    West was part of Malick’s modern team: the film’s producer was Sarah Green; the engraved photography was by the extraordinary Emmanuel Lubezki; the production design was still from Jack Fisk, who had held that role since Badlands, where he met and then married Sissy Spacek; the aching music was by Hanan Townshend in a glib pastiche of symphonic movie music — it was so much less playful or spirited than the score for Badlands. The only notable crew absentee was Billy Weber, who had been the editor on many Malick pictures. To the Wonder is said to have earned $2.8 million at the box office, and it’s hard to believe it cost less than $20 million. If that sounds like a misbegotten venture, wait till you struggle through it and then wonder what let Malick make another film in the same clouded spirit, Knight of Cups. And then another: Song to Song, the ultimate gallery of beautiful stars, supposedly about the music world of Austin, which came off semi-abstract no matter that Malick had lived there for years.

    Any sense of experience and vitality seemed to be ebbing away. Was he experimenting, or improvising, or what? The several loyalists involved, as well as those players who were filmed but then abandoned, might say it was a privilege to be associated with Terry. I long to hear some deflating rejoinders to that from Kit Carruthers. There was a wit once in Malick that had now gone missing. I say this because a great director deserves to be tested by his own standards, which in Malick’s case are uncommonly high. Even with the more adventurous Christian Bale as its forlorn male lead — a jaded movie screenwriter — Knight of Cups is yet more stultifyingly beautiful and Tarot-esque, with a placid harem of women (from Cate Blanchett to Isabel Lucas, from Imogen Poots to Natalie Portman), all so immediately desirable that they do not bother to be awake. Richard Brody said it was “an instant classic,” which only showed how far “instant” and “classic” had become invalid concepts. The film earned a touch over $1 million, and it had disdain for any audience. It was a monument to a preposterous cinephilia and to a talent that seemed in danger of losing itself.

    Those are harsh words, but I choose them carefully, after repeated viewings, and in the confidence that Badlands, Days of Heaven and The Thin Red Line are true wonders. The Terrence Malick of early 2019, passing seventy-five, was not a sure thing. And then he retired all doubt about his direction and released his fourth great film; and surely four is enough for any pantheon.

    Malick had been contemplating A Hidden Life and the historical incident upon which it is based for a few years. In 1943, Franz Jagerstatter was executed in Berlin for refusing to take an oath of loyalty to Adolf Hitler. He was a humble farmer high in the mountains of northern Austria, where he lived with his wife, his three daughters, his sister-in-law, and his mother. They were valued members of a small community and worked endlessly hard to sustain their meager living. They were devout Catholics, and Franz had done his military service without thinking too much about it. His farm and his village are surrounded by breathtaking natural beauty, and Malick lingers long over the fields and the peaks and the clouds in a way that teaches us that even Nazism is ephemeral.

    The film has few long speeches in which Jagerstatter spells out his reluctance to honor the Nazi code. He is more instinctive than articulate. He knows the fate he is tempting; he understands the burden that will put upon his wife and children; he appreciates that he could take the oath quietly and then do non-combatant service. It is not that he understands the war fully or the extent of Nazi crimes. He is not a deliberate or reasoned objector. But just as he feels the practical truths in his steep fields and in the lives of his animals, and just as he is utterly loyal to his wife, so he believes that the oath of allegiance will go against his grain. He does not show a moral philosophy so much as a moral sense. He cannot make the compromise with an evasive form of words.

    There is no heavy hint in A Hidden Life about how Americans in our era might withhold their own allegiance to a leader. But the film rests on a feeling that such cues are not needed for an alert audience living in the large world. We are living in a time that will have its own Jagerstatters. That is part of the narrative confidence that has not existed in Malick since Days of Heaven. It amounts to an unsettling detachment: he shares the righteousness of Jagerstatter, but he does not make a fuss about his heroism. In the long term of those steep Alps and their infinite grasslands, how much does it matter? Do the cattle on the farm know less, or are they as close to natural destiny as the farmer’s children?

    That may sound heretical for so high-minded a picture. And there is no escaping — the final passages are shattering — how Jagerstatter is brutalized and then hanged by the Nazi torturers and executioners. The Catholic church would make a saint of him one day, and Malick has taken three hours to tell what happened, but the film has no inkling of saintliness or a cause that could protect it. The farmer’s wife, rendered by Valerie Pachner as sharp and uningratiating, does not need to agree with her man, or even to understand him. People are alike but not the same, even under intense pressure. No one could doubt Malick’s respect for Jagerstatter, and August Diehl is Teutonically tall, blond, and good-looking in the part. But he is not especially thoughtful; his doubts over the oath are more like a limp than a self-consciously upright attitude. Certainly the old Hollywood scheme of a right thing waiting and needing to be done leaves Malick unmoved; he would prefer to be a patient onlooker, a diligent chronicler, attentive and touched, but more rapt than ardent, and still consumed by wonder.

    Malick has admitted that he got into the habit of working without a script (or a pressing situation), so that he often filmed whatever came into his head. But he seems to have learned how far that liberty had led him astray. So A Hidden Life has as cogent a situation as those in Badlands and Days of Heaven. That does not mean those three films are tidy or complacent about their pieces clicking together. They are all as open to spontaneity and chance as The Thin Red Line. But just as it is trite and misleading to say that The Thin Red Line was a film about war, so A Hidden Life feels what its title claims: the existence of an inwardness that need not be vulgarized by captions or “big scenes.” The film concludes with the famous last paragraph of Middlemarch, about the profound significance of “hidden lives” and “unvisited tombs.” Yes, this is what a movie, a heartbreaking work, might be for today. As for its relative neglect, just recall the wistful look on the dinosaur faces in The Tree of Life.

    We can do our best, we can make beauty and find wisdom, without any prospect of being saved from oblivion.

    Owed To The Tardigrade

    Some of these microscopic invertebrates shrug off temperatures
    of minus 272 Celsius, one degree warmer than absolute zero.
    Other species can endure powerful radiation and the vacuum of space.
    In 2007, the European Space Agency sent 3,000 animals
    into low Earth orbit, where the tardigrades survived
    for 12 days on the outside of the capsule.

    The Washington Post, “These Animals can survive until the end
    of the Earth, astrophysicists say”

    O, littlest un-killable one. Expert
    death-delayer, master abstracter

    of imperceptible flesh. We praise
    your commitment to breath.

    Your well-known penchant
    for flexing on microbiologists,

    confounding those who seek
    to test your limits using ever more

    abominable methods: ejection
    into the vacuum of space, casting

    your smooth, half-millimeter frame
    into an active volcano, desiccation

    on a Sunday afternoon, when the game
    is on, & so many of us are likewise made

    sluggish in our gait, bound to the couch
    by simpler joys. Slow-stepper, you were

    called, by men who caught first
    glimpse of your eight paws walking

    through baubles of rain. Water bear.
    Moss piglet. All more or less worthy

    mantles, but I watch you slink
    through the boundless clarity

    of a single droplet & think
    your mettle ineffable, cannot

    shake my adoration
    for the way you hold fast

    to that which is so swiftly
    torn from all else living,

    what you abide in order
    to stay here among the flailing

    & misery-stricken, the glimpse
    you grant into limitless

    persistence, tenacity
    under unthinkable odds,

    endlessness enfleshed
    & given indissoluble form.

    A Democratic Jewish State, How and Why

    The question of whether Israel can be a democratic Jewish state, a liberal Jewish state, is the most important question with which the country must wrestle, and it can have no answer until we arrive at an understanding of what a Jewish state is. A great deal of pessimism is in the air. Many people attach to the adjective “Jewish” ultra-nationalistic and theocratic meanings, and then make the argument that a Jewish democratic state is a contradiction in terms, an impossibility. On the left and on the right, among the elites and the masses, people are giving up on the idea that both elements, the particular and the universal, may co-exist equally and prominently in the identity of the state. This way of thinking is partly responsible for the recent convulsions in Israeli politics, for the zealotry and the despair that run through it. Yet it is an erroneous and unfruitful way of thinking. It rigs the outcome of this life-and-death discussion with a tendentious and dogmatic conception of Judaism and Jewishness.  

    There is another way, a better way, to arrive at an answer to this urgent and wrenching question. Let us begin by asking a different one, a hypothetical one. Let us imagine the problem in a place that is not Israel or Palestine. Could a Catalan state, if it were to secede from Spain, be a democratic Catalan state, a liberal Catalan state? Catalan nationalism is a powerful force, and many Catalans wish to establish an independent state of their own with Barcelona as its capital, based on their claim that they constitute a distinct ethnocultural group that deserves the right to self-determination. Though recent developments in Spain have shown that the establishment of an independent Catalan state is far from becoming a reality in the near future, let us nonetheless consider what it might look like. In this future state — as in other European nation-states, such as Denmark, Finland, Norway, Germany, the Czech Republic, and others that have a language and state symbols that express an affinity to the dominant national culture — the Catalan language would be the official language, the state symbols would be linked to the Catalan majority, the official calendar would be shaped in relation to Christianity and to events in Catalan history, and the public education of Catalans would ensure the vitality and the continuity of Catalan culture, transmitting it to the next generation. Revenues from taxation would be distributed solely among Catalan citizens and not across Spain, and the foreign policy of the Catalan state would reflect the interests of the ethnocultural majority of the state. It is very probable that Catalunya’s immigration policy, like that of all contemporary European and Scandinavian states, would attempt to safeguard the Catalan majority in its sovereign territory.

    It is important to note that these aspects of a Catalan state would not reflect anything unusual in the modern political history of the West. The Norwegians, for example, demanded all these characteristics of statehood in 1905, when they seceded from Sweden (under threat of war), since they saw themselves as a separate national group. In the matter of identity, Catalunya, like Norway, would not be a neutral state in any meaningful fashion, and there is no reason that it should be a neutral state. Members of the Catalan group deserve a right to self-determination, which includes a sovereign territory inhabited by a Catalan majority in which a Catalan cultural public space is created and the culture of the majority is expressed.

    But this is not all we would need to know about a Catalan nation-state that purports to be a democracy. The test of the question of whether Catalunya, or any other state, is democratic is not dependent upon whether it is neutral with respect to identity. Its moral and political quality, its decency, its liberalness, will be judged instead by two other criteria. The first is whether its character as a nation-state results in discriminatory policies towards the political, economic, and cultural rights of the non-Catalan minorities that reside within it. The second is whether Catalunya would support granting the same right of self-determination to other national communities, such as the Basques. Adhering to these two principles is what distinguishes democratic nation-states from fascist ones. 

    Ultra-nationalist states are sovereign entities in which the national character serves as a justification for depriving minorities of political, economic and cultural rights. In the shift to ultra-nationalism that we are witnessing around the world today, such states also attack and undermine the institutions that aim at protecting minorities — the independent judiciary, the free press, and NGOs dedicated to human and minority rights. In addition, ultra-nationalist states do not support granting rights of self-determination to nations that reside within them or next to them. They generally claim that no such nations exist, or that the ethnic groups that call themselves a nation do not deserve the right to self-determination.

    The legitimacy of Israel as a nation-state should be judged just as we would judge any other nation-state, according to these two principles. If, in the name of the Jewish character of the state, the Arab minority in Israel is deprived of its rights, the very legitimacy of the State of Israel as a Jewish nation-state will be damaged. Discrimination in the distribution of state resources in infrastructure, education, and land, and the refusal to recognize new Arab cities and villages in the State of Israel, threaten to transform it from a democratic nation-state into an ultra-nationalist state. Such a threat to the democratic character of the state is posed also by recent legislative attempts (which fortunately have failed) to demand a loyalty oath solely from Israel’s Arab citizens. The threat is heightened by a political plan put forth by elements of the Israeli radical right, which, in a future agreement with the Palestinians, would deny Israeli citizenship to Israeli Arabs, by virtue of a territorial exchange that would include their villages in the territory of a future Palestinian state. This is to act as if the Israeli citizenship of the Arabs of Israel is not a basic right, but a conditional gift granted to them by the Jewish nation-state — a gift that can be rescinded to suit the interests of Jewish nationalism. The Nation-State law that was passed by the Israeli parliament in 2018, which formulates the national identity of the country in exclusively Jewish terms, is an occasion for profound alarm, in particular in its glaring omission of an explicit commitment to the equality of all Israeli citizens, Jews and Arabs alike. Such a commitment to the equality of all citizens was enshrined in Israel’s Declaration of Independence, the founding document that to this day contains the noblest expression of the vision of Israel as Jewish and democratic. The commitment to the equality of all citizens might be legally and judicially ensured in relation to other basic laws in Israel’s legal system, yet its striking absence from this latest official articulation of the character of the state is yet another marker of the drift to ultra-nationalism.

    The structural discrimination manifested in these examples constitutes an unjustified bias against the Arab citizens of Israel. It also serves to undermine the very legitimacy of the Jewish state. A Jewish nation-state can and must grant full equality to its Arab citizens in all the realms in which it has failed to do so until now. It must recognize them as a national cultural minority, with Arabic as a second official language of the state and the Islamic calendar as an officially recognized calendar. The public educational system must be devoted, among other goals, to the continuity of the Arab cultural traditions of Israel’s citizens. 

    In the recent elections held in Israel, three within a single year, the participation of the Arab citizens of Israel in the vote increased by 50%, reaching very close to the percentage of the vote among Jewish citizens. This is a wonderful and encouraging sign of the greater integration of the Arab population in larger Israeli politics. As a result the Joint List, the Israeli Arab party, which encompasses different ideological and political streams in the Arab community of Israel, increased its seats in Israel’s Knesset from ten to fifteen — an extraordinary achievement. But its positive impact was undone by the disgraceful failure of the left and center to form a government with the Joint List on the grounds that a government that rests on the Arab vote is unacceptable. Thus was lost an historic opportunity to integrate the Arab minority as an equal partner in sharing governmental power.  

    As is true of all other legitimate democratic nation-states, the second condition that Israel must maintain is the recognition of the right of the Palestinian nation to self-determination in Gaza and the West Bank — the same right that Jews have rightly demanded for themselves. The denial of such a right, and the settlement policy that aims at creating conditions in which the realization of such a right becomes impossible, similarly damage the legitimacy of Israel as a Jewish nation-state. The Trump plan for peace includes, among its other problematic aspects, the annexation of the Jordan Valley to the state of Israel, which would constitute yet another significant impediment to the possibility of a two-state solution. If any Israeli government includes such an annexation in its plans, it will also create de facto conditions that will undermine the possibility of a Jewish democratic state in the future. 

    It is important to stress that the fulfillment of the first condition — equal rights to minorities — is completely within Israel’s power. Discrimination against citizens of your own country is always a self-inflicted wound. The second condition, by contrast, the recognition of the Palestinian right to self-determination, is not exclusively in the hands of Israel. The conditions of its realization are much more complicated. It depends to a significant degree upon Palestinians’ willingness to live side by side with the State of Israel in peace and security. The situation with regard to the possibility of such co-existence is difficult and murky and discouraging on the Palestinian side — and yet Israel must nevertheless make clear its recognition of the Palestinian right to self-determination, not least for the simple reason that achieving it will lend legitimacy to Israel’s own claim to the same right.

    If democracy and decency do not require cultural neutrality from a nation-state, then how should the identity of the majority be recognized in such a state without vitiating its liberal principles? There are four ways, I believe, that the Jewish nature of the State of Israel should be expressed. The first is to recognize the State of Israel as the realization of the Jewish national right to self-determination. In this era, when the meaning of Zionism is mangled and distorted in so many quarters, it is important to recognize what Zionism incontrovertibly is: a national liberation movement aimed at extracting a people from the historic humiliation of dependence on others in defining their fate. That remains its central meaning. Zionism gave one of the world’s oldest peoples, the Jewish people, the political, military, and economic ability to define themselves and defend themselves.  

    The most fundamental feature of Israel as a Jewish state resides, therefore, in its responsibility for the fate of the Jewish people as a whole. If the responsibility of the State of Israel were confined only to its citizens, it would be only an Israeli state. In light of this responsibility to a people, it has the right and the duty to use the state’s powers to defend Jews who are victimized because they are Jews.

    The second feature that defines Israel as a Jewish state is the Law of Return. This law, which was established in 1950, and is intimately connected to the first feature of national self-determination, proclaims that all Jews, wherever they are, have a right to citizenship in the State of Israel, and can make the State of Israel their home if they so desire. The State of Israel was created to prevent situations — plentiful in Jewish history — in which Jews seeking refuge knock on the doors of countries that have no interest in receiving them. For the same reason, Palestinian refugees in the Arab states ought to have immediate access to citizenship in the state of Palestine when it is established.

    Yet the justification of the Law of Return does not rest exclusively on conditions of duress. If national groups have a right to self-determination — the right to establish a sovereign realm where they constitute the majority of the population, and where their culture develops and thrives — it would be odd not to allow Jews or Palestinians a right of citizenship in their national territory. It is also important to emphasize that the Law of Return is legitimate only if accompanied by other tracks of naturalization. If the Law of Return were the only way of acquiring Israeli citizenship, its exclusively national character would harm the rights of minorities and immigrants who are not members of the ethnic majority. Safeguarding the ethnocultural majority in any state is always severely constrained by the rights of minorities. Thus the transfer of populations, or the stripping of citizenship by the transfer of territory to another state, are illegitimate means of preserving a majority. It is crucial, therefore, that other forms of naturalization exist as a matter of state policy, including granting citizenship to foreign workers whose children were born and grew up in Israel, and to men and women who married Israeli citizens.

    The third expression of the Jewishness of the State of Israel relates to various aspects of its public sphere, such as its state symbols, its official language, and its calendar. These symbolic institutions are derived from Jewish cultural and historical symbols, including the menorah and the Star of David; Hebrew is the official language; Israel’s public calendar is shaped according to the Jewish calendar; and the Sabbath and Jewish holidays are official days of rest. Yet a democratic state demands more. The public expression of the majority culture must go along with granting official status to the minority cultures of the state, including Arabic as the second official language of the state of Israel, and recognizing the Islamic calendar in relation to the Arab minority. Again, official symbols and practices that have an affinity to the majority culture exist in many Western states: in Sweden, Finland, Norway, Britain, Switzerland and Greece, the cross is displayed on the national flag. In all those cases, the presence of state symbols that are connected to the religion and culture of the majority does not undermine the state’s democratic and liberal nature. In many of those states, however, there are powerful political forces that wish to limit democracy to the dominant ethnicity. The historical challenge in these multiethnic and volatile societies — and Israel also faces this challenge — is to prevent the self-expression of the majority from constraining or destroying the self-expression of the minority. 

    The fourth essential feature of a democratic nation-state, and the most important one, relates to public education. In the State of Israel, as a Jewish state, the public system of education is committed to the continuity and reproduction of Jewish cultures. I emphasize Jewish cultures in the plural, since Jews embrace very different conceptions of the nature of Jewish life and the meaning of Jewish education. In its commitment to Jewish cultures, the State of Israel is not different from many modern states whose public education transmits a unique cultural identity. In France, Descartes, Voltaire, and Rousseau are taught, and in Germany they teach Goethe, Schiller, and Heine. The history, the literature, the language, and sometimes the religion of different communities are preserved and reproduced by the system of public education, which includes students of many ethnic origins. Jews who happen to be German, American, or French citizens and wish to transmit their tradition to their children must resort to private means to provide them with a Jewish education. In Israel, as in other modern states (though not in the United States), such an education should be supported by state funds. This commitment does not contradict — rather, it requires — public funding for education that, alongside the public education system, ensures the continuity of the other traditions represented in the population of the state, the Islamic and Christian cultures of the Arab minority in Israel. The culture of a minority has as much right to recognition by the state as the culture of the majority.

    There are voices that maintain that the only way to secure Israel’s democratic nature is to eliminate its Jewish national character and turn it into a state of all its citizens, or a bi-national state. This sounds perfectly democratic, but it would defeat one of the central purposes of both national communities. In this territory there are two groups that possess a strong national consciousness — Jews and Palestinians; and there is no reason not to grant each of them the right of self-determination that they deserve. Moreover, a state of all its citizens in the area between the Jordan River and the Mediterranean Sea would, in fact, be an Arab nation-state with a large Jewish minority. It would become a place of exile for the Jewish minority. Historical experience in this region, where national rights and civil liberties are regularly trampled, suggests that Greater Palestine would be one of the harshest of all Jewish exiles.

    Honoring the status of the Arab citizens of Israel and espousing the establishment of a Palestinian state ought not to focus on — and does not require — the impossible and unjust annulment of the Jewish character of the State of Israel. It should focus instead on the effort to create full and rich equality for the Arab minority in Israel, and on the possibility of establishing a Palestinian nation-state alongside the state of Israel.

    In a Jewish state, the adjective “Jewish” carries within it another crucial challenge to liberal democracy, which is not tied to its national content but to its religious implications. This Jewish character, or the religious meaning of the adjective “Jewish,” might harm the freedom of religion in the state. Indeed, some currents in Israeli Judaism — and some religiously inspired ideological and political trends in the Jewish population of Israel — constitute a powerful and complex challenge to Israeli liberalism. Some voices assert that the Jewish identity of the state justifies granting the weight of civil law to Jewish law, and the use of the coercive machinery of the state for the religious purposes of the dominant community. 

    But a Jewish state conceived in this way could not be democratic in any recognizable manner, for two reasons: it would harm both the religious freedom of its citizens and the religious pluralism of the communities that constitute it. The attempt to “Judaize” the state through religious legislation, above and beyond the four features mentioned above, would undermine Israel’s commitment to liberalism and destroy some of its most fundamental founding principles. It would take back the pluralism that was explicitly and stirringly guaranteed in Israel’s Declaration of Independence. 

    Since the nineteenth century, Jews have been deeply divided about the meaning of Jewish identity and their loyalty to Jewish law. Jews celebrate the Sabbath in a variety of ways. They disagree ferociously about basic religious questions, including the nature of marriage and divorce. Any attempt to use the power of the state to adjudicate these deep divisions would do inestimable damage to freedom of religion and freedom from religion. In this case it would be the freedoms of Jews that would be violated. 

    The role of the state is not to compel a person to keep the Sabbath or to compel her to desecrate it. The state must, instead, guarantee that every person has the right to behave on the Sabbath as she sees fit, as long as she grants the same right to individuals and communities who live alongside her. All attempts at Judaizing the state through religious legislation — such as the law prohibiting the selling of bread in public during Passover, or the law prohibiting the raising of pigs — are deeply in error, since it is the obligation of a liberal democratic state to allow its citizens to decide these matters autonomously, as they see fit.

    The Sabbath, like other Jewish holidays, ought to be part of the official calendar of Israel as a Jewish state. A shared calendar, with Islamic and Christian holidays on it too, is an essential feature of the life of a state, and it enables a kind of division of cultural and spiritual labor, a pluralist form of cooperation among its citizens. If state institutions do not function during the Sabbath, it is not only because we would like religious citizens to be able to take equal part in the running of those institutions, but also because Israel ought to respect the Jewish calendar. The same applies as well to factories and businesses that must be shuttered during days of rest. 

    Such a policy, moreover, should be supported not for religious reasons, but owing to secular concerns about fairness. First, it allows equal opportunity to workers and owners who wish to observe the Sabbath. Historically, in the various Jewish exiles, the observance of the Sabbath sometimes caused Jews a great deal of economic hardship owing to the advantage that it conferred upon competitors who did not observe the same day of rest. In a Jewish state, Jews who observe the Sabbath ought to be free from such an economic sacrifice. The second reason for closing businesses and factories on the Sabbath concerns the rights of workers. The institution of the Sabbath is more widespread than most Jews know, and it is consistent with universal ethical considerations. Constraining the tyranny of the market over individual and family life by guaranteeing a weekly day of rest for workers and owners is common in European states which, in accordance with the Christian calendar, enforce the closing of businesses on Sunday. In a similar spirit, factories, malls, stores, and businesses ought to be closed during the Sabbath in a Jewish state — but art centers, theaters, museums, and restaurants should continue to function, so that Israeli Jews may choose their own way of enjoying the day of lovely respite.

    The abolition of the coercive power of the state in matters of religion should be applied as well to the primary domain of religious legislation in Israel: divorce and marriage. The monopoly granted to rabbinical courts in issues of divorce and marriage must finally be terminated. It is an outrageous violation of the democratic and liberal ethos of the state. Alongside religious marriage, Israel must recognize civil marriage. Such a reform would allow a couple that cannot marry according to Jewish law to exercise their basic right to form a family. It would also recognize the legitimate beliefs of many men and women who do not wish to submit to the rabbinical court, which is often patriarchal in its rulings and financially discriminates against women in divorce agreements.

     The claim of some religious representatives that establishing civil marriage would cause a rift among Jews, since members of the Orthodox Jewish community would not be able to marry Jews who did not divorce according to rabbinical procedure, is not persuasive. Many Jews all over the world marry and divorce outside the Orthodox community, and this is de facto the case in Israel as well, since many Israelis obtain civil marriages outside Israel, or live together without marrying under the jurisdiction of the rabbinate. The establishment of two tracks of marriage and divorce, religious and secular-civil, would not create division, which already exists in any case, but it would remove the legal wrong caused to Israelis who cannot practice their right to marry within Jewish law, and it would liberate those who aspire to gender equality from the grip of the rabbinical courts.

    I should confess that my analysis of the place of religion in Israel does not rest exclusively upon my liberal commitments. It is grounded also in my concern for the quality of Jewish life in Israel. Religious legislation has had a devastating impact on Jewish culture and creativity in Israel. The great temptation to settle the debate over modern Jewish identity through the coercive mechanism of the state justifiably alienates major segments of the Israeli public from Jewish tradition, which comes to be perceived by many Israelis as threatening their way of life. The deepening of alienation from the tradition, and its slow transformation into hostility, suggests that the more Jewish the laws of Israel become, the less Jewish the citizens of Israel become. 

    The Israeli parliament is not the place to decide the nature of the Sabbath, or which Jewish denomination is the authentic representation of Judaism, or who is a legitimate rabbi. Such controversies have corrupted the legislature, creating cynical political calculations in which religious matters have served as political payoffs to maintain government coalitions. The unavoidable debate on Jewish culture and religion must move from parliament to civil society. The nature of Jewish life in Israel must be determined by individuals and communities who will themselves decide how to lead their lives without interference from the state. For instance, there is no law in Israel prohibiting private transportation during the sacred day of Yom Kippur, yet the sanctity of the day is generally observed without any coercion. Wresting Judaism from the control of the politicians will unleash creative forces for Jewish renewal and allow for new ways of refreshing the tradition and extending its appeal. 

    Among the precious and time-honored institutions of Judaism which have been corrupted by the state is the rabbinate. The methods used for nominating and choosing the chief rabbis, and the rabbis of cities and neighborhoods, demonstrate that the rabbinate has turned into a vulgar patronage system, used by politicians to distribute jobs to their supporters. In many places, there is no affinity between the state-appointed rabbis and their residents. It is urgently in the interest of both Judaism and Israel that the state rabbinate be abolished.

    I do not support the total separation of religion and state as practiced in the United States. It seems to me that the model of some European countries is better suited to Israel. The establishment of synagogues and the nomination of rabbis ought to be at least partially supported by public funds, in the same way that museums, community centers, and other cultural activities are supported by the state. But this funding should be distributed in accordance with the communities’ needs and preferences, without allowing for a monopoly of any particular religious denomination over budgets and positions. Each community should choose its own rabbi according to its own religious orientation, as was the practice of Jewish communities for generations. And these same protections of freedom of religion must be granted to Muslim and Christian communities of Israel.

    Israel can and should be defined as a Jewish state, where the Jewish people exercises its incontrovertible right to self-determination; where every Jew, wherever he or she lives, has a homeland; where the public space, the language, and the calendar have a Jewish character; and where public education allows for the continuity and flourishing of Jewish cultures. These features do not at all undermine the democratic nature of the state, so long as Israel’s cultural and religious minorities are also granted equal and official recognition and protection, including state funding of Muslim and Christian public education systems, and the recognition of Arabic as a second official language of the state and the Muslim and Christian calendars as state calendars. In this sense, there is nothing contradictory or paradoxical about the idea of a Jewish democratic state.

    The pessimism is premature. These essential principles can be reconciled and realized. Yet there are significant limits in such an experiment that must be vigilantly respected. Any attempt to “Judaize” the state of Israel beyond those limits would transform it into an undemocratic nation-state, and compromise its liberal nature, and undo its founders’ magnificent vision, and damage the creative Jewish renewal that may emerge from the great debate about modern Jewish identity. The tough question is not whether a Jewish state can be both democratic and liberal, but rather what kind of Jewish state we wish to have.

    Dark Genies, Dark Horizons: The Riddle of Addiction

    In 2014, Anthony Bourdain’s CNN show, Parts Unknown, travelled to Massachusetts. He visited his old haunts from 1972, when he had spent a high school summer working in a Provincetown restaurant, the now-shuttered Flagship on the tip of Cape Cod. “This is where I started washing dishes … where I started having pretensions of culinary grandeur,” Bourdain said in a wistful voiceover. For the swarthy, rail-thin dishwasher-turned-cook, Provincetown was a “wonderland” bursting with sexual freedom, drugs, music, and “a joy that only came from an absolute certainty that you were invincible.” Forty years later, he was visiting the old Lobster Pot restaurant, cameras in tow, to share Portuguese kale soup with the man who still ran the place.

    Bourdain enjoyed a lot of drugs in the summer of 1972. He had already acquired a “taste for chemicals,” as he put it. The menu included marijuana, Quaaludes, cocaine, LSD, psilocybin mushrooms, Seconal, Tuinal, speed, and codeine. When he moved to the Lower East Side of New York to cook professionally in 1980, the young chef, then 24, bought his first bag of heroin on the corner of Bowery and Rivington. Seven years later he managed to quit the drug cold turkey, but he spent several more years chasing crack cocaine. “I should have died in my twenties,” Bourdain told a journalist for Biography.

    By the time of his visit to Provincetown in 2014, a wave of painkillers had already washed over parts of Massachusetts and a new tide of heroin was rolling in. Bourdain wanted to see it for himself and traveled northwest to Greenfield, a gutted mill town that was a hub of opioid addiction. In a barebones meeting room, he joined a weekly recovery support group. Everyone sat in a circle sharing war stories, and when Bourdain’s turn came he searched for words to describe his attraction to heroin. “It’s like something was missing in me,” he said, “whether it was a self-image situation, whether it was a character flaw. There was some dark genie inside me that I very much hesitate to call a disease that led me to dope.”

    A dark genie: I liked the metaphor. I am a physician, yet I, too, am hesitant to call addiction a disease. While I am not the only skeptic in my field, I am certainly outnumbered by doctors, addiction professionals, treatment advocates, and researchers who do consider addiction a disease. Some go an extra step, calling addiction a brain disease. In my view, that is a step too far, confining addiction to the biological realm when we know how sprawling a phenomenon it truly is. I was reminded of the shortcomings of medicalizing addiction soon after I arrived in Ironton, Ohio, where, as the only psychiatrist in town, I was asked whether I thought addiction was “really a disease.”

    In September 2018, I set out for Rust Belt Appalachia from Washington, D.C., where I am a scholar at a think tank and was, at the time, a part-time psychiatrist at a local methadone clinic. My plan was to spend a year as a doctor-within-borders in Ironton, Ohio, a town of almost eleven thousand people in an area hit hard by the opioid crisis. Ironton sits at the southernmost tip of the state, where the Ohio River forks to create a tri-state hub that includes Ashland, Kentucky and Huntington, West Virginia. Huntington drew national attention in August 2016, when twenty-eight people overdosed on opioids within four hours, two of them fatally.

    I landed in Ironton, the seat of Lawrence County, by luck. For some time I had hoped to work in a medically underserved area in Appalachia. Although I felt I had a grasp on urban opioid addiction from my many years of work in methadone clinics in Washington, D.C., I was less informed about the rural areas. So I asked a colleague with extensive Ohio connections to present my offer of clinical assistance to local leaders. The first taker was the director of the Ironton-Lawrence County Community Action Organization, or CAO, an agency whose roots extend to President Johnson’s War on Poverty. The CAO operated several health clinics.

    Ironton has a glorious past. Every grandparent in town remembers hearing first-person accounts of a period, stretching from before the Civil War to the early twentieth century, when Ironton was one of the nation’s largest producers of pig iron. “For more than a century, the sun over Ironton warred for its place in the sky with ashy charcoal smoke,” according to the Ironton Tribune. “In its heyday in the mid-nineteenth century there were forty-five [iron] furnaces belching out heat, filth, and prosperity for Lawrence County.” After World War II, Ironton was a thriving producer of iron castings, molds used mainly by automakers. Other plants pumped out aluminum, chemicals, and fertilizer. The river front was a forest of smokestacks. High school graduates were assured good-paying if labor-intensive jobs, and most mothers stayed home with the kids. The middle class was vibrant.

    But then the economy began to realign. Two major Ironton employers, Allied Signal and Alpha Portland Cement, closed facilities in the late 1960s, beginning a wave of lay-offs and plant closings. The 1970s were a time of oil shocks emanating from turmoil in the Middle East. Inflation was high and Japanese and German car makers waged fierce competition with American manufacturers. As more Ironton companies downsized and then disappeared, the pool of living wage jobs contracted, and skilled workers moved out to seek work elsewhere. At the same time, the social fabric began to unravel. Domestic order broke down, welfare and disability rolls grew, substance use escalated. Most high school kids with a shot at a future pursued it elsewhere, and the place was left with a population dominated by older folks and younger addicts.

    Ironton continues to struggle. Drug use, now virtually normalized, is in its third, sometimes fourth, generation. Almost everyone is at least one degree of separation away from someone who has overdosed. Although precise rates of drug involvement are hard to come by, one quarter to one third is by far the most common answer I hear when I ask sources for their best estimate of people dealing with a “drug problem of any kind.” Alluding to the paucity of hope and opportunity, one of my patients told me that “you have to eradicate the want — why people want to use — or you will always have drug problems.”

    When Pam Monceaux, an employment coordinator in town, asked me whether I thought addiction was “really a disease,” she was thinking about her own daughter. Christal Monceaux grew up in New Orleans with her middle-class parents and a younger sister, and started using heroin and cocaine when she was nineteen. Pam blamed the boyfriend. “Brad sucked her in. Finally, she dumped him, went to rehab and did well, but a few months later took him back and the cycle began all over again.” Eventually Christal’s younger sister, who had moved to Nashville with her husband, persuaded her to leave New Orleans and join them. Pam, a serene woman who had over a decade’s time to put her daughter’s ordeal into perspective, said that relocating — or the “geographic cure,” as it is sometimes called — worked for Christal. A new setting and new friends allowed her to relinquish drugs. She got married, had children, and lived in a $400,000 house. The happy ending was cut short by Christal’s death at the age of forty-two of a heart attack. “If she could kick it for good when she was away from Brad and then when she moved to Nashville, how is that a disease?” Pam asked in her soft Louisiana drawl. “If I had breast cancer, I’d have it in New Orleans and in Nashville.”

    Unlike Christal, Ann Anderson’s daughter had not left drugs behind for good. So, at age 66, Ann and her husband were raising their granddaughter, Jenna. Ann, who worked for my landlord, was bubbly, energetic, and, curiously, sounded as if she were raised in the deep South. The welcome basket she put together for me when I arrived, full of dish towels, potholders, and candies, foretold the generosity that she would show me all year. Ann makes it to every one of Jenna’s basketball games. Jenna’s mom lives in Missouri and has been on and off heroin for years. “I love my daughter, but every time she relapsed, she made a decision to do it,” said Ann, matter-of-factly, but not without sympathy. “And each time she got clean she decided that too.”

    Another colleague, Lisa Wilhelm, formed her opinions about addiction based on her experience with patients. Lisa was a seen-it-all nurse with whom I had worked at the Family Medical Center located across highway 52 from the Country Hearth, a drug den that passed itself off as a motel. She did not ask for my opinion about addiction; she told me hers. “I think it is a choice. And I’ll devote myself to anyone who made that choice and now wants to make better ones,” Lisa said, “But it’s not a disease, I don’t think.”

    Then there was Sharon Daniels, the director of Head Start. Sharon managed programs for drug-using mothers of newborns and toddlers. “I see opportunities our women have to make a different choice,” she said. She is not pushing a naive “just say no” agenda, nor is she looking for an excuse to purge addicted moms from the rolls. This trim grandmother with bright blue eyes and year-round Christmas lights in a corner of her office is wholly devoted to helping her clients and their babies. But she thinks that the term disease “ignores too much about the real world of addiction. If we call it a disease, then it takes away from their need to learn from it.”

    Before coming to Ironton, I had never been asked what I thought about addiction by the counselors at the methadone clinic at which I worked in Washington. I am not sure why. Perhaps abstractions are not relevant when you are busy helping patients make step-wise improvements. Maybe the staff already knew what I would say. On those rare occasions when a student or a non-medical colleague asked me, generally sotto voce, if addiction were really a disease, my response was this: “Well, what are my choices?” If the alternatives to the disease label were “criminal act,” “sin,” or “moral deprivation,” then I had little choice but to say that addiction was a disease. So, if a crusty old sheriff looking to justify his punitive lock-‘em-up ways asked me if addiction were a disease, I would say, “Why yes, sir, it is.”

    But Pam, Ann, Lisa, and Sharon had no concealed motives. They were genuinely interested in the question of addiction. And they were fed up with the false choice routinely thrust upon them in state-sponsored addiction workshops and trainings: either endorse addicts as sick people in need of care or condemn them as bad actors deserving of punishment. With such ground rules, no one can have a good faith conversation about addiction. Between the poles of diseased and depraved is an expansive middle ground of experience and wisdom that can help explain why millions use opioids to excess and why their problem can be so difficult to treat. The opioid epidemic’s dark gift may be that it compels us to become more perceptive about why there is an epidemic. The first step is understanding addiction.

    Most people know addiction when they see it. Those in its grip pursue drugs despite the damage done to their wellbeing and often to the lives of others. Users claim, with all sincerity, that they are unable to stop. This is true enough. Yet these accounts tell us little about what drives addiction, about its animating causal core — and the answer to those questions has been contested for over a century. In the mid-1980s the Harvard psychologist Howard J. Shaffer proclaimed that the field of addiction has been in a century-long state of “conceptual chaos.” And not much has changed. For behaviorists, addiction is a “disorder of choice” wherein users weigh benefits against risks and eventually quit when the ratio shifts toward the side of risk. For some philosophers, it is a “disorder of appetite.” Psychologists of a certain theoretical stripe regard it as a “developmental” problem reflecting failures of maturity, including poor self-control, an inability to delay gratification, and an absence of a stable sense of self. Sociologists emphasize the influence of peers, the draw of marginal groups and identification with them, and responses to poverty or alienation. Psychotherapists stress the user’s attempt at “self-medication” to allay the pain of traumatic memories, depression, rage, and so on. The American Society of Addiction Medicine calls addiction “a primary, chronic disease of brain reward, motivation, memory and related circuitry.” For the formerly addicted neuroscientist Marc Lewis, author of Memoirs of an Addicted Brain, addiction is a “disorder of learning,” a powerful habit governed by anticipation, focused attention, and behavior, “much like falling in love.”

    None of these explanations best captures addiction, but together they reinforce a very important truth. Addiction is powered by multiple intersecting causes — biological, psychological, social, and cultural. Depending upon the individual, the influence of one or more of these dimensions may be more or less potent. Why, then, look for a single cause for a complicated problem, or prefer one cause above all the others? At every one of those levels, we can find causal elements that contribute to excessive and repeated drug use, as well as to strategies that can help bring the behavior under control. Yet today the “brain disease” model is the dominant interpretation of addiction.

    I happened to have been present at a key moment in the branding of addiction as a brain disease. The venue was the second annual “Constituent Conference” convened in the fall of 1995 by the National Institute on Drug Abuse, or NIDA, which is part of the National Institutes of Health. More than one hundred substance-abuse experts and federal grant recipients had gathered in Chantilly, Virginia for updates and discussions on drug research and treatment. A big item on the agenda set by NIDA’s director, Alan Leshner, was whether the assembled group thought the agency should declare drug addiction a disease of the brain. Most people in the room — all of whom, incidentally, relied heavily on NIDA funding for their professional survival — said yes. Two years later Leshner officially introduced the concept in the journal Science: “That addiction is tied to changes in brain structure and function is what makes it, fundamentally, a brain disease.”

    Since then, NIDA’s concept of addiction as a brain disease has penetrated the far reaches of the addiction universe. The model is a staple of medical school education and drug counselor training and even figures in the anti-drug lectures given to high-school students. Rehab patients learn that they have a chronic brain disease. Drug czars under Presidents Bill Clinton, George W. Bush, and Barack Obama have all endorsed the brain-disease framework at one time or another. Featured in a major documentary on HBO, on talk shows and Law and Order, and on the covers of Time and Newsweek, the brain-disease model has become dogma — and like all articles of faith, it is typically believed without question.

    Writing in the New England Journal of Medicine in 2016, a trio of NIH- and NIDA-funded scientists speculated that the “brain disease model continues to be questioned” because the science is still incomplete — or, as they put it, because “the aberrant, impulsive, and compulsive behaviors that are characteristic of addiction have not been clearly tied to neurobiology.” Alas, no. Unclear linkages between actions and neurobiology have nothing to do with it. Tightening those linkages will certainly be welcome scientific progress — but it will not make addiction a brain disease. After all, if explaining how addiction operates at the level of neurons and brain circuits is enough to make addiction a brain disease, then it is arguably many other things, too: a personality disease, a motivational disease, a social disease, and so on. The brain is bathed in culture and circumstance. And so I ask again: why promote one level of analysis above all of the others?

    Of course, those brain changes are real. How could they not be? Brain changes accompany any experience. The simple act of reading this sentence has already induced changes in your brain. Heroin, cocaine, alcohol, and other substances alter neural circuits, particularly those that mediate pleasure, motivation, memory, inhibition, and planning. But the crucial question regarding addiction is not whether brain changes take place. It is whether those brain changes obliterate the capacity to make decisions. The answer to that question is no. People who are addicted can respond to carrots and sticks, incentives and sanctions. They have the capacity to make different decisions when the stakes change. There is a great deal of evidence to substantiate faith in the agency of addicts. Acknowledging it is not tantamount to blaming the victim; it is, much more positively, a recognition of their potential.

    The brain-disease model diverts attention from these truths. It implies that neurobiology is necessarily the most important and useful level of analysis for understanding and treating addiction. Drugs “hijack” the reward system in the brain, and the patient is the hostage. According to the psychiatrist and neuroscientist Nora Volkow, who is currently the head of NIDA, “a person’s brain is no longer able to produce something needed for our functioning and that healthy people take for granted, free will.” Addiction disrupts the function of the frontal cortex, which functions as “the brakes,” she told a radio audience, so that “even if I choose to stop, I am not going to be able to.” Volkow deploys Technicolor brain scans to bolster claims of hijacked and brakeless brains.

    Rhetorically, the scans make her point. Scientifically, they do not. Instead they generate a sense of “neuro-realism” — a term coined by Eric Racine, a bioethicist at the Montreal Clinical Research Institute, to describe the powerful intuition that brain-based information is somehow more genuine or valid than is non-brain-based information. In truth, however, there are limits to what we can infer from scans. They do not allow us, for example, to distinguish irresistible impulses from those that were not resisted, at least not at this stage of the technology. Indeed, if neurobiology is so fateful, how does any addict ever quit? Is it helpful to tell a struggling person that she has no hope of putting on the brakes? It may indeed seem hopeless to the person caught in the vortex of use, but then our job as clinicians is to make quitting and sustained recovery seem both desirable and achievable to them.

    We start doing this in small ways, by taking advantage of the fact that even the subjective experience of addiction is malleable. As Jon Elster points out in Strong Feelings: Emotions, Addiction, and Human Behavior, the craving for a drug can be triggered by the mere belief that it is available. An urge becomes overpowering when a person believes it is irrepressible. Accordingly, cognitive behavioral therapy is designed precisely to help people understand how to manipulate their environment and their beliefs to serve their interests. They may learn to insulate themselves from people, places, and circumstances associated with drug use; to identify emotional states associated with longing for drugs and to divert attention from the craving when it occurs. These are exercises in stabilization. Sometimes they are fortified with anti-addiction medications. Only when stabilized can patients embark on the ambitious journey of rebuilding themselves, their relationships, and their futures.

    I have criticized the brain disease model in practically every lecture I have given on this wrenching subject. I have been relentless, I admit. I tell fellow addiction professionals and trainees that medicalization encourages unwarranted optimism regarding pharmaceutical cures and oversells the need for professional help. I explain that we err in calling addiction a “chronic” condition when it typically remits in early adulthood. I emphasize to colleagues who spend their professional lives working with lab rats and caged monkeys that the brain-disease story gives short shrift to the reality that substances serve a purpose in the lives of humans. And I proselytize that the brain changes induced by alcohol and drugs, no matter how meticulously scientists have mapped their starry neurons and sweeping fibers, need not spell destiny for the user.

    Yet despite my strong aversion to characterizing addiction as a problem caused primarily by brain dysfunction, I genuinely appreciate the good ends that the proponents of the brain model have sought to reach. They hoped that “brain disease,” with its intimation of medical gravitas and neuroscientific determinism, would defuse accusations of flawed character or weak will. By moving addiction into the medical realm, they could lift it out of the punitive realm. And if addicts were understood to suffer from a brain disease, their plight would more likely garner government and public sympathy than if they were seen as people simply behaving badly. But would it? Research consistently shows that depictions of behavioral problems as biological, genetic, or “brain” problems actually elicit greater desire for social distance from afflicted individuals and stoke pessimism about the effectiveness of treatment among the public and addicted individuals themselves.

    Evidence suggests that addicted individuals are less likely to recover if they believe that they suffer from a chronic disease, rather than from an unhealthy habit. More radically, there is a grounded argument to be made for feelings of shame, despite their bad reputation in therapeutic circles. “Shame is highly motivating,” observes the philosopher Owen Flanagan, who once struggled mightily with alcohol and cocaine; “it expresses the verdict that one is living in a way that fails one’s own survey as well as that of the community upon whose judgment self-respect is legitimately based.” But under what conditions do feelings of shame end up prodding people into correcting their course, as opposed to making matters worse by fueling continued consumption to mute the pain of shameful feelings? The psychologists Colin Leach and Atilla Cidam uncovered a plausible answer. They conducted a massive review of studies on shame (not linked to addiction per se) and approaches to failure, and found that when people perceive that damage is manageable and even reversible, shame can act as a spur to amend self-inflicted damage. They underscored what clinicians have long known: only when patients are helped to feel competent — “self-efficacious” is the technical term — can they begin to create new worlds for themselves.

    Thinking critically about the disease idea is important for conceptual clarity. But a clinician must be pragmatic, and if a patient wants to think of addiction as a disease I do not try to persuade them otherwise. Yet I do ask one thing of them: to be realistic about the kind of disease it is. Despite popular rhetoric, addiction is not a “disease like any other.” It differs in at least two important ways. First, individuals suffering from addiction can respond to foreseeable consequences, while individuals with conventional diseases cannot. Second, this “disease” is driven by a powerful emotional logic.

    In 1988, Michael Botticelli, who would go on to become President Obama’s second drug czar over two decades later, was charged with drunk driving on the Massachusetts Turnpike. A judge gave him the choice of going to jail or participating in a treatment program. Botticelli made a decision: he went to a church basement for help, joined Alcoholics Anonymous, and quit drinking.

    Yet on CBS’ 60 Minutes he contradicted his own story when he drew an analogy between having cancer and being addicted. “We don’t expect people with cancer to stop having cancer,” he said. But the analogy is flawed. No amount of reward or punishment, technically called “contingency,” can alter the course of cancer. Imagine threatening to impose a penalty on a brain cancer victim if her vision or speech continued to worsen, or offering her a million dollars if she could stay well. It would have no impact and it would be cruel. Or consider Alzheimer’s, which is a true brain disease (true insofar as the pathology originates in derangements of brain structure and physiology). If one held a gun to the head of a person addicted to alcohol and threatened to shoot her if she consumed another drink, or offered her a million dollars if she desisted, she could comply with this demand — and the odds are high that she would comply. In contrast, threatening to shoot an Alzheimer’s victim if her memory further deteriorated (or promising a reward if it improved) would be pointless.

    The classic example of the power of contingency is the experience of American soldiers in Vietnam. In the early 1970s, military physicians in Vietnam estimated that between 10 percent and 25 percent of enlisted Army men were addicted to the high-grade heroin and opium of Southeast Asia. Deaths from overdosing soared. Spurred by fears that newly discharged veterans would ignite an outbreak of heroin use in American cities, President Richard Nixon commanded the military to begin drug testing. In June 1971, the White House announced that no soldier would be allowed to board a plane home unless he passed a urine test. Those who failed could go to an Army-sponsored detoxification program before they were re-tested.

    The plan worked. Most GIs stopped using narcotics as word of the new directive spread, and most of the minority who were initially prevented from going home produced clean samples when given a second chance. Only 12 percent of the soldiers who were dependent on opiate narcotics in Vietnam became re-addicted to heroin at some point in the three years after their return to the United States. Heroin had helped soldiers endure wartime’s alternating bouts of boredom and terror; once they were stateside, most were safe from it. At home, they had different obligations and available rewards, such as their families, jobs, friends, sports, and hobbies. Many GIs had needed heroin to cool the hot anger they felt at being sent to fight for the losing side by commanders they did not respect. Once home, their rage subsided to some extent. Also, heroin use was no longer normalized as it was overseas. At home, heroin possession was a crime and the drug was harder and more dangerous to obtain. As civilian life took precedence, the allure of heroin faded.

    We know the value of “contingencies.” Hundreds of studies attest to the power of carrots and sticks in shaping the behavior of addicted individuals. Carl Hart, a neuroscientist at Columbia University, has shown that when people are given a good enough reason to refuse drugs, such as cash, they respond. He ran the following experiment: he recruited addicted individuals who had no particular interest in quitting, but who were willing to stay in a hospital research ward for two weeks for testing. Each day Hart offered them a sample dose of either crack cocaine or methamphetamine, depending upon the drug they used regularly. Later in the day, the subjects were given a choice between the same amount of drugs, a voucher for $5 of store merchandise, or $5 cash. They collected their reward upon discharge two weeks later. The majority of subjects chose the $5 voucher or cash when offered small doses of the drug, but they chose the drug when they were offered a higher dose. Then Hart increased the value of the reward to $20, and his subjects chose the money every time.

    One of my patients, I will call her Samantha, had been using OxyContin since 2011 when she was working in the kitchen at Little Caesar’s in downtown Ironton. The 20 mg pills belonged to her grandmother, whose breast cancer had spread to her spine. Samantha visited her grandma after work, watched TV with her, and went through the mail. She would also remove three or four pills per day from the massive bottle kept by the fancy hospital bed that Samantha’s brother moved into the living room. When Samantha’s grandmother died in 2016, so did the pill supply. “I just couldn’t bring myself to do heroin, and, anyway, I had no money for drugs,” Samantha said.

    When the pills were almost gone, Samantha drove to an old friend’s house, hoping that the friend would give her a few Oxy’s in exchange for walking Snappy, her arthritic chihuahua. “My friend wasn’t home, but her creepy boyfriend Dave answered the door and told me he’d give me some Oxy’s if I gave him a blow job.” Samantha was feeling the warning signs of withdrawal — jitteriness, crampy stomach, sweaty underarms. Desperate to avoid full blown withdrawal, she gave a minute’s thought to the proposition. “Then I felt revolted and I said no way and drove straight here because I knew I could start buprenorphine the same day,” she said.

    What of Samantha’s “hijacked” brain? When she stood before Dave, her brain was on fire. Her neurons were screaming for oxycodone. Yet in the midst of this neurochemical storm, at peak obsession with drugs, Samantha’s revulsion broke through, leading her to apply the “brakes” and come to our program. None of this means that giving up drugs is easy. But it does mean that an “addicted brain” is capable of making a decision to quit and of acting on it.

    On Tuesday nights, I co-ran group therapy with a wise social worker named John Hurley. In one group session, spurred by a patient sharing that he decided to come to treatment after spending some time in jail, the patients went around the room reciting what brought them to the clinic. Without exception, they said that they felt pressured by forces inside or outside themselves.

    “I couldn’t stand myself.”

    “My wife was going to leave me.”

    “My kids were taken away.”

    “My boss is giving me one more chance.”

    “I can’t bear to keep letting my kids down.”

    “I got Hep C.”

    “I didn’t want to violate my probation.”

    Ultimatums like these were often the best things to happen to our patients. For other addicts, the looming consequences proved so powerful that they were able to quit without any professional help at all.

    The psychologist Gene Heyman at Boston College found that most people addicted to illegal drugs stopped using by about age thirty. John F. Kelly’s team at Massachusetts General Hospital found that forty-six percent of people grappling with drugs and alcohol had resolved their drug problems on their own. Carlos Blanco and his colleagues at Columbia University used a major national database to examine trends in prescription drug problems. Almost all individuals who abused or were addicted to prescription opioids also, at some point in their lives, had a mental disorder, an alcohol or drug problem, or both. Yet roughly half of them were in remission five years later. Given low rates of drug treatment, it is safe to say that the majority of remissions took place without professional help.

    These findings may seem surprising to, of all people, medical professionals. Yet it is well-known to medical sociologists that physicians tend to succumb to the “clinicians’ illusion,” a habit of generalizing from the sickest subset of patients to the overall population of people with a diagnosable condition. This caveat applies across the medical spectrum. Not all people with diabetes, for example, have brittle blood sugars — but they will represent a disproportionate share of the endocrinologist’s case load. A clinician might wrongly, if rationally, assume that most addicts behave like the recalcitrant ones who keep stumbling through the emergency room doors. Most do not. Granted, not everyone can stop an addiction on their own, but the very fact that it can be done underscores the reality of improvement powered by will alone: a pathway to recovery rarely available to those with conventional illness.

    The second major difference between addiction and garden-variety disease is that addiction is driven by powerful feelings. Ask an alcoholic why she drinks or an addict why he uses drugs and you might hear about the pacifying effect of whisky and heroin on daunting hardship, unremitting self-persecution, yawning emptiness, or harrowing memories. Ask a patient with Parkinson’s disease, a classic brain disease, why he developed the neurological disorder and you will get a blank stare. Parkinson’s is a condition that strikes, unbidden, at the central nervous system; the patient does not consciously collude in bringing it about. Excessive use of a drug, by contrast, serves some kind of need, an inner pain to be soothed, a rage to be suppressed. It is a response to some sort of suffering.

    Memoirs offer portals into the drama of addiction. One of my favorites is Straight Life, by the master alto saxophonist Art Pepper. Self-taught on the instrument by the age of thirteen, Pepper endured a childhood of psychological brutality at the hands of a sadistic alcoholic father, an icicle of a grandmother, and an alcoholic mother who was fourteen years old when he was born and who did not hide her numerous attempts to abort him. “To no avail,” he writes. “I was born. She lost.” What preoccupied him as a child was “wanting to be loved and trying to figure out why other people were loved and I wasn’t.” Pepper’s self-loathing bubbled like acid in his veins. “I’d talk to myself and say how rotten I was,” he wrote. “Why do people hate you? Why are you alone?” At 23, after years of alcohol and pot, he sniffed his first line of heroin through a rolled-up dollar bill and the dark genie dissolved. He saw himself in the mirror. “I looked like an angel,” he marveled. “It was like looking into a whole universe of joy and happiness and contentment.”

    From that moment on, Pepper said, he would “trade misery for total happiness… I would be a junkie… I will die a junkie.” Indeed, he became a “lifelong dope addict of truly Satanic fuck-it-all grandeur,” in the words of his passionate admirer, the critic and scholar Terry Castle. He was in and out of prison on possession charges. Pepper lived without heroin for a number of years after attending Synanon, a drug-rehabilitation center in California, from 1969 to 1972, and was treated with methadone for a period in the mid-1970s. Eventually, though, he returned to drugs, mainly consuming massive doses of amphetamine, and died from a stroke in 1982. He was 56.

    Addicts can appear to have everything: a good education, job prospects, people who love them, a nice home. They can be people who “are believed to have known no poverty except that of their own life-force,” to borrow the words of Joan Didion, and yet suffer greatly. The malaise is internal. Or they can be in dire circumstances, immiserated by their lives, moving through a dense miasma. “There was nothing for me here,” said one patient whose child was killed in a car accident, whose husband cheated on her, and who was trapped in her job as a maid in a rundown motel with an abusive boss. OxyContin made her “not care.” She reminded me of Lou Reed’s song “Heroin”:

    Wow, that heroin is in my blood
    And the blood is in my head
    Yeah, thank God that I’m good as dead
    Oooh, thank your God that I’m not aware
    And thank God that I just don’t care

    Pharmacologists have long classified opioid drugs as euphoriants, inducers of pleasure, described often as a feeling of a melting maternal embrace, but they could just as easily be called obliviants. According to the late Harvard psychiatrist Norman Zinberg, oblivion seekers yearned “to escape from lives that seem unbearable and hopeless.” Thomas De Quincey, in Confessions of an English Opium Eater, which appeared in 1821, praised opium for keeping him “aloof from the uproar of life.” Many centuries before him, Homer had likely referred to it in the Odyssey when he wrote that “no one who drank it deeply…could let a tear roll down his cheeks that day, not even if his mother should die, his father die, not even if right before his eyes some enemy brought down a brother or darling son with a sharp bronze blade.” When the Hollywood screenwriter Jerry Stahl surveyed his life in 1995 in his memoir Permanent Midnight, he concluded that “everything, bad or good, boils back to the decade on the needle, and the years before that imbibing everything from cocaine to Romilar, pot to percs, LSD to liquid meth and a pharmacy in between: a lifetime spent altering the single niggling fact that to be alive means being conscious.” Drugs helped him to attain “the soothing hiss of oblivion.”

    According to ancient myth, Morpheus, the god of dreams, slept in a cave strewn with poppy seeds. Through the cave flowed the river Lethe, known as the river of forgetfulness, also called the river of oblivion. The dead imbibed those waters to forget their mortal days. Unencumbered by memory, they floated free from the aching sadness and discomforts of life. The mythological dead share a kinship with opioid addicts, oblivion-seekers, and all their reality-manipulating cousins. The difference, mercifully, is that actual people can “un-drink” the numbing waters. Aletheia, truth, is a negation of lethe, the Greek word for forgetting. Recovery from addiction is a kind of unforgetting, an attempt to live in greater awareness and purpose, a disavowal of oblivion.

    Addiction is a cruel paradox. What starts out making life more tolerable can eventually make it ruinous. “A man may take to drink because he feels himself a failure,” said Orwell, “but then fail all the more completely because he drinks.” The balm is a poison. Drugs that ease the pain also end up prolonging it, bringing new excruciations — guilt and grief over damage to one’s self, one’s family, one’s future — and thus fresh reason to continue. The cycle of use keeps turning. Ambivalence is thus a hallmark of late-stage addiction. The philosopher Harry Frankfurt speaks of the “unwilling addict” who finds himself “hating” his addiction and “struggling desperately…against its thrust.” This desperate struggle is what Samuel Taylor Coleridge, himself an opium addict, called “a species of madness” in which the user is torn between his current, anguished self who seeks instant solace and a future self who longs for emancipation from drugs. This explains why the odds of treatment dropout are high — over half after six months, on average. The syringe of Damocles, as Jerry Stahl described the vulnerability to relapse, dangles always above their heads. Many do not even take advantage of treatment when it is offered, reluctant to give up their short-term salvation. They fear facing life “unmedicated” or cannot seem to find a reason for doing so. My friend Zach Rhoads, now a teacher in Burlington, Vermont, used heroin for five years beginning in his early twenties and struggled fiercely to quit. “I had to convince myself that such effort was worth the trouble,” he said.

    Thomas De Quincey consumed prodigious amounts of opium dissolved in alcohol and pronounced the drug a “panacea for all human woes.” For Anthony Bourdain, heroin and cocaine were panaceas, defenses against the dark genie that eventually rose up and strangled him to death in 2018. But not all addicts have a dark genie lurking inside them. Some seek a panacea for problems that crush them from the outside, tribulations of financial woes and family strain, crises of faith and purpose. In the modern opioid ordeal, these are Americans “dying of a broken heart,” in Bill Clinton’s fine words. “They’re the people that were raised to believe the American Dream would be theirs if they worked hard and their children will have a chance to do better — and their dreams were dashed disproportionally to the population as the whole.” He was gesturing toward whites between the ages of 45 and 54 who lack college degrees — a cohort whose life expectancy at birth had been falling since 1999. They succumbed to “deaths of despair,” a term coined by the economists Anne Case and Angus Deaton in 2015, brought on by suicide, alcoholism (specifically, liver disease), and drug overdoses. Overdoses account for the lion’s share. Falling wages and the loss of good jobs have “devastated the white working class,” the economists write, and “weakened the basic institutions of working-class life, including marriage, churchgoing, and community.”

    Looking far into the future, what so many of these low-income, undereducated whites see are dark horizons. When communal conditions are dire and drugs are easy to get, epidemics can blossom. I call this dark horizon addiction. Just as dark genie addiction is a symptom of an embattled soul, dark horizon addiction reflects communities or other concentrations of people whose prospects are dim and whose members feel doomed. In Ironton, clouds started to gather on the horizon in the late 1960s. Cracks appeared in the town’s economic foundation, setting off its slow but steady collapse.

    Epidemics of dark horizon addiction have appeared under all earthly skies at one time or another. The London gin “craze” of the first half of the eighteenth century, for example, was linked to poverty, social unrest, and overcrowding. According to the historian Jessica Warner, the average adult in 1700 drank slightly more than a third of a gallon of cheap spirits over the course of a year; by 1729 it was slightly more than 1.3 gallons per capita, and hit 2.2 gallons in 1743. A century later, consumption had declined, yet gin was still “a great vice in England,” according to Charles Dickens. “Until you improve the homes of the poor, or persuade a half-famished wretch not to seek relief in the temporary oblivion of his own misery,” he wrote in the 1830s, “gin-shops will increase in number and splendor.”

    During and after the American Civil War, thousands of men needed morphine and opium to bear the agony of physical wounds. In his Medical Essays, the physician Oliver Wendell Holmes, Sr., a harsh critic of medication, excepted opium as the one medicine “which the Creator himself seems to prescribe.” The applications of opium extended to medicating grief. “Anguished and hopeless wives and mothers, made so by the slaughter of those who were dearest to them, have found, many of them, temporary relief from their sufferings in opium,” Horace B. Day, an opium addict himself, recorded in The Opium Habit in 1868. In the South, the spiritual dislocation was especially profound, no doubt explaining, to a significant degree, why whites in the postbellum South had higher rates of opiate addiction than did those in the North — and also, notably, one reason why southern blacks had a lower rate of opiate addiction, according to the historian David T. Courtwright: “Confederate defeat was for most of them an occasion of rejoicing rather than profound depression.”

    A similar dynamic was seen when Russia’s long-standing problem with vodka exploded during the political instability and economic uncertainty of the post-Communist era. The majority of men drank up to five bottles a week in the early 1990s. Back home, heroin was a symptom of ghetto life for millions of impoverished and hopeless Hispanics and blacks in the 1960s and 70s, followed by crack among blacks in the mid-80s. The rapid decline of manufacturing jobs for inner city men, writes the historian David Farber in his recent book Crack, “helps explain the large market of poor people, disproportionately African Americans, who would find crack a balm for their troubled, insecure, and often desperate lives.”

    Children raised by dark horizon parents often bear a double burden. Not only do they suffer from growing up with defeated people in defeated places where opportunities are stunted and boredom is crushing. Often they are casualties of their parents’ and their grandparents’ addictions. One of my patients, Jennifer, described herself as a “third generation junky.” Patches of acne clung to her cheeks, making her look younger than thirty. Her maternal grandmother managed well enough with an ornery husband who drank too much on weekends until he lost his job at a local casting plant in the 1970s and became a full-fledged alcoholic, bitter, aimless, and abusive to his wife. The grandmother worked cleaning motel rooms and began staying out late, using pills and weed. Jennifer’s mother, Ann, was the youngest in a household that had devolved into havoc.

    When Ann was sixteen, Jennifer was born. Not one reliable adult was around. “No one really cared if I went to school,” Jennifer recalls. No one urged her to succeed or expressed confidence in her. “I learned that when something bothered you, you got high.” Her mother, Ann, was aloof, Jennifer said, except for the stretch they were both in jail at the same time: she was 19, her mother was 42. “My mother was assigned to be the chaperone for my group of inmates,” Jennifer recalled. “She did my laundry and saved me extra food in jail. It was the only time she acted like a mom towards me.” Children raised in such homes are greatly disadvantaged. The absence of a steady protector in their lives often derails their developing capacity for tolerating frustration and disappointment, controlling impulses, and delaying gratification. They have difficulty trusting others and forming rewarding connections, and they often see themselves as damaged and worthless. When adults around them do not want to work regularly, children cannot imbibe the habits of routine, reliability, and dependability. At worst, the cycle repeats itself, inflicting wounds across generations and communities as their collective disenchantment with the future mounts. Sociologists call this “downward social drift.”

    The germ theory of addiction: that is my term for one of the popular if misbegotten narratives of how the opioid crisis started. It holds that the epidemic has been driven almost entirely by supply — a surfeit not of bacteria or viruses, but of pills. “Ask your doctor how prescription pills can lead to heroin abuse,” blared massive billboards from the Partnership for a Drug-Free New Jersey that I saw a few years ago. Around that time, senators proposed a bill that would have limited physician prescribing. “Opioid addiction and abuse is commonly happening to those being treated for acute pain, such as a broken bone or wisdom tooth extraction,” is how they justified the legislation.

    Not so. The majority of prescription pill casualties were never patients in pain who had been prescribed medication by their physicians. Instead, they were mostly individuals who were already involved with drugs or alcohol. Yes, some actual patients did develop pill problems, but generally they had a history of drug or alcohol abuse or were suffering from concurrent psychiatric problems or emotional distress. It is also true, of course, that drug marketers were too aggressive at times and that too many physicians overprescribed, sometimes out of inexperience, other times out of convenience, and in some cases out of greed.

    As extra pills began accumulating in rivulets, merging with pills obtained from pharmacy robberies, doctor shopping, and prescription forgeries, a river of analgesia ran through various communities. But even with an ample supply, you cannot “catch” addiction. There must be demand — not for addiction, per se, but for its vehicle. My year in Ironton showed me that the deep story of drug epidemics goes well beyond public health and medicine. Those disciplines, while essential to management, will not help us to understand why particular people and places succumb. It is the life stories of individuals and, in the case of epidemics, the life story of places, that reveal the origins. Addiction is a variety of human experience, and it must be studied with all the many methods and approaches with which we study human experience.

    Dark genies can be exorcised and dark horizons can be brightened. It is arduous work, but unless we recognize all the reasons for its difficulty, unless we reckon with the ambiguity and the elusiveness and the multiplicity of addiction’s causes, unless we come to understand why addicts go to such lengths to continue maiming themselves with drugs — compelled by dark genies, dark horizons, or both — their odds of lasting recovery are slim, as are the odds of preventing and reversing drug crises. The complexity of addiction is nothing other than the complexity of life.

    America in the World: Sheltering in Place

    I

    In the third week of America’s quarantine against the pandemic, a new think tank in Washington had a message for the Pentagon. “The national security state, created to keep us safe and guard our freedoms, has failed,” Andrew Bacevich, the president of the Quincy Institute for Responsible Statecraft, told viewers on a Skype video from home, interspersed with the sounds of sirens and images of emergency rooms. While microbes from China were mutating and coming to kill us, he preached, we were wasting our time hunting terrorists and projecting military power abroad. It was a sequitur in search of a point — as if America ever faces only one danger at a time. When the black plague struck Europe and Asia in the fourteenth century, it did not mean that Mongol hordes would no longer threaten their cities. Nor does the coronavirus mean that jihadists are not plotting terror or that Russia is not threatening its neighbors or that China is not devouring Hong Kong.

    His casuistry aside, Bacevich was playing to the resentments of Americans who sincerely believe that American foreign policy is driven by an addiction to war. For the first two decades of post-cold war politics, this argument was relegated to the hallucinations of the fringe. But no more. A new national consensus had started to form before the plague of 2020: that there are almost no legitimate uses for American military power abroad, that our wars have been “endless wars,” and that our “endless wars” must promptly be ended. On the subject of American interventionism, there is no polarization in this notoriously polarized country. There is a broad consensus, and it is that we should stay out and far away.

    The concept of “endless wars” has its roots in the middle of the twentieth century. Most famously, in the novel 1984, George Orwell depicted a totalitarian state that invents its own history to justify perpetual war between the superpowers to keep its citizens in a state of nationalist fervor. In American political discourse, the concept of a war without end was baked into the influential notion of “the manufacture of consent,” a notion manufactured by Noam Chomsky, according to which the media teaches the American people to support or acquiesce in the nefarious activities of the military-industrial complex. But the “endless wars” that so many Americans wish to end today are not like the ones that Orwell imagined. Today Americans seek to end the war on terror, which in practice means beating back insurgencies and killing terrorist leaders in large swaths of the Islamic world. Orwell’s wars were endless because none of the world’s states possessed the power to win them. The war on terror, by contrast, endures because of a persistent threat to Western security and because weaker states would collapse if American forces left. The war on terror pits the American Gulliver against fanatical bands of Lilliputians. But the asymmetry of military power does not change the magnitude — or the reality — of the carnage that “stateless actors” can wreak.

    To get a feel for the new consensus on American quietism, consider some of the pre-pandemic politics surrounding the war in Afghanistan. In a debate during the presidential primaries, Elizabeth Warren insisted that “the problems in Afghanistan are not problems that can be solved by a military.” Her Democratic rivals on the stage agreed, including Joe Biden. This is also Donald Trump’s position. As Warren was proclaiming the futility of fighting for Afghanistan’s elected government, the Trump administration was negotiating that government’s betrayal with the Taliban. (And the Taliban was ramping up its violence while we were negotiating with it.) Before the coronavirus crisis, the Trump administration was spending a lot of its political capital on trying to convince skeptical Republican hawks that the planned American withdrawal would not turn Afghanistan into a haven for terrorists again, which of course is nonsense.

    The emerging unanimity about an escape from Afghanistan reflects a wider strategic and historical exhaustion. Despite the many profound differences between Trump and Obama, both presidents have tried to pivot away from the Middle East to focus on competition with China. (Obama never quite made the pivot.) Both presidents have also mused publicly about how NATO allies are “free riders” on America’s strength. And both presidents have shown no patience with the use of American military force. In 2012, even as the world was once again becoming a ferociously Hobbesian place, the Obama administration’s national defense strategy dropped the longstanding post-cold war goal of being able to win two wars in different geographical regions at once. (The Obama Pentagon seemed to think that land wars are a thing of the past and that we can henceforth make do with drones and SEALs.) Trump’s first defense strategy in 2018 affirmed the Obama formulation.

    Moreover, a majority of Americans agreed with their political leaders. A Pew Research poll in 2019 found that around sixty percent of all Americans did not believe it was worth fighting in Iraq, Syria, or Afghanistan. That percentage is even higher among military veterans. Indeed, Pew Research polling since 2013 has found that more Americans than not believe that their country should stay out of world affairs. Hal Brands and Charles Edel, in their fine book The Lessons of Tragedy, point out that in the late 2010s majorities of Americans still agreed that America should possess the world’s most powerful military, still supported alliances, and still favored free trade, but they conclude that many Americans are now resistant to the “sacrifices and trade-offs necessary to preserve the country’s post-war achievements.”

    All of that was before COVID-19 forced most of the country to “shelter in place.” In truth, sheltering-in-place has been the goal of our foreign and national security policy for most of a decade. And it will be much harder to justify a continued American presence in the Middle East, West Asia, Africa, and even the Pacific after Congress borrowed trillions of necessary dollars for paycheck protection and emergency small business loans. In addition to all of the older muddled arguments for retreat, there will now be a strong economic case that the republic can no longer afford its overseas commitments, as if foreign policy and national security are ultimately about money. In other words, there are strong indications that the republic is undergoing a profound revision of its role in leading and anchoring the international order that it erected after World War II. The days of value-driven foreign policy, of military intervention on humanitarian grounds, and even of grand strategy, may be over. Should every terror haven, every failed state, every aggression against weak states, and every genocide be America’s responsibility to prevent? Of course not. But should none of them be? America increasingly seems to think so. We are witnessing the birth of American unexceptionalism, otherwise known as “responsible statecraft.”

     

    II

    At the end of the cold war, the spread of liberal democracy seemed inevitable. The Soviet Union had collapsed, and with it the communist governments of the Eastern European countries it dominated. China had momentously made room for a market in its communist system, a strange state-sponsored capitalism that brought hundreds of millions of people out of subsistence poverty. In the West, juntas and strongmen toppled, and elected governments replaced them. In every region except for the Middle East and much of Africa, the open society was on the march.

    One of the first attempts to describe the thrilling new moment was a famous, and now infamous, essay by Francis Fukuyama. In 1989, in “The End of History?,” he surveyed a generation that saw the collapse of pro-American strongmen from Spain to Chile along with the convulsions behind the Iron Curtain and concluded that the triumph of liberalism was inevitable. (He has since revised his view, which is just as well.) His ideas provided the intellectual motifs for a new era of American hegemony. “The triumph of the West, and the Western idea, is evident first of all in the total exhaustion of viable systematic alternatives to western liberalism,” Fukuyama wrote. What he meant, in his arch Hegelian way, was that the age of ideological conflict between states was over. History was teleological and it had attained its telos. Fukuyama envisioned a new era in which great power wars would be obsolete. He did not predict the end to all war, but he did predict that big wars over competing ideologies would be replaced by a more mundane and halcyon kind of competition. The principled struggles of history, he taught, “will be replaced by economic calculation, the endless solving of technical problems, environmental concerns, and the satisfaction of sophisticated consumer demands.”

    Fukuyama’s predictions were exhilarating in 1989 because the consensus among most intellectuals during the Cold War had been that the Soviet Union was here to stay. Early theorists of totalitarianism such as Hannah Arendt and Carl Friedrich had portrayed the Soviet state as an unprecedented and impermeable juggernaut that was terrifyingly strong and durable. The legendary dissident of Orwell’s dystopia, Emmanuel Goldstein, resisted Big Brother but was never a real threat to the state. In the Brezhnev era, analysts of the Soviet Union began to notice that the juggernaut was crumbling from within and had lost the ideological allegiance of its citizens, even as its military and diplomatic adventures beyond its borders continued. Building on this increasingly realistic understanding of the failures of the communist state, Fukuyama observed that totalitarian systems were overstretched and brittle. The West could exhale.

    Not everyone agreed. Samuel Huntington argued that conflict between great powers would remain because identity, not ideology, is what drives states to make war. While it was true that communism was weakening after the collapse of the Soviet Union, other illiberal forces such as religious fundamentalism and nationalism remained a threat to the American-led liberal world order. The hope that China or Iran could be persuaded to open their societies by appealing to prosperity and peace ignored that most nations were motivated not by ideals, but by a shared sense of history and culture. Leon Wieseltier similarly objected that the end of the Soviet Union and its empire would release ethnic and religious and tribal savageries, old animosities that were falsely regarded as antiquated. He also observed that the concept of an “end of history” was borrowed from the very sort of totalitarian mentality whose days Fukuyama believed were over. The worst fiends of the twentieth century justified their atrocities through appeals to history’s final phase; the zeal required for their enormous barbarities relied in part on a faith that these crimes are advancing the inevitable march of history. For Wieseltier, there is no final phase and no inevitable march, and the liberal struggle is endless. “To believe in the end of history,” he wrote, “you must believe in the end of human nature, or at least of its gift for evil.”

    As international relations theories go, “The End of History” was like a medical study that found that ice cream reduced the risk of cancer. Fukuyama’s optimistic historicism instructed that the easiest choice for Western leaders was also the wisest. Why devise a strategy to contain or confront Russia if it was on a glide path to democratic reform? Why resist American industrial flight to China if that investment would ultimately tame the communist regime and tempt it to embrace liberalism?

    Every president until Trump believed that it was possible to lure China and Russia into the liberal international order and attempted to do so. Instead of preparing for a great power rivalry, American foreign policy sought to integrate China and Russia into global institutions that would restrain them. Bill Clinton and George W. Bush expanded NATO, but they also invited Russia into the Group of 7 industrialized nations. Clinton, Bush, and Obama — the last of whom liked to invoke “the rules of the road” — encouraged Chinese-American economic interdependence. Until Obama’s second term, the United States did next to nothing to stop China’s massive theft of intellectual property. Until June 2020, Chinese corporations could trade freely on U.S. stock exchanges without submitting to the basic accounting rules required of American companies. The assumption behind these Panglossian views of China and Russia was that democratic capitalism was irresistible and the end of communism marked the beginning of a new era of good feelings. (Communism never ended in China, of course.) And it was certainly true that trade with China benefitted both economies: Chinese and American corporations prospered and American consumers enjoyed cheaper goods.

    This is not to say that there were no bouts of dissent. In his presidential campaign in 1992, Bill Clinton attacked George H. W. Bush for his capitulation to China after the uprising at Tiananmen Square. And even though Clinton did not alter the elder Bush’s approach to China during his presidency, there was a lively debate about China’s human rights abuses in the 1990s. Clinton expanded NATO, something the elder Bush opposed, but he and later George W. Bush and Barack Obama did little to push back against Russia’s own regional adventures and aggressive behavior. Consider that no serious U.S. war plan for Europe was developed between the end of the Cold War and 2014, the same year that Russia invaded Ukraine and eventually annexed Crimea, and five years after Russia invaded and occupied the Georgian provinces of South Ossetia and Abkhazia. We preferred to look away from Russia’s forward movements — with his cravenness about Syria, Obama actually opened the vacuum that Russia was happy to fill — just as we preferred to look away from the growing evidence of China’s strategic ambitions and human-rights outrages. We were reluctant to lose those good feelings so soon after we acquired them.

    None of this meant that American presidents would not use force or wage war after the collapse of the Soviet Union. They did. But they did not engage in great power wars. The first Bush saved Kuwait from Saddam Hussein and saved Panama from the lesser threat of Manuel Noriega. Clinton intervened in the Balkans to stop a genocide and launched limited air strikes in the Middle East and Afghanistan. In the aftermath of September 11, George W. Bush waged a war on terror and toppled the tyrannies that held Iraq and Afghanistan captive. Obama intervened reluctantly and modestly and ineffectively in Libya; he withdrew troops from Iraq only to send some of them back; and he presided over a “surge” in Afghanistan, even though its announcement was accompanied by a timetable for withdrawal. Trump has launched no new wars, but he has killed Iran’s most important general and the architect of its campaign for regional hegemony, and he has launched strikes on Syrian regime targets in response to its use of chemical weapons, though his strikes have not added up to a consistent policy. But even as optimism about world order has become less easy to maintain, even as the world grows more perilous in old and new ways, the American mood of retirement, the inclination to withdrawal, has persisted. Fukuyama, who acknowledged that the threat of terrorism would have to be met with force, has remarked that our task is not “to answer exhaustively the challenges to liberalism promoted by every crackpot messiah around the world.” But what about the genocides perpetrated by a crackpot messiah (or a rational autocrat)? And what about answering great power rivals? At the time, to be sure, we had no great power rivals. We were living in the fool’s paradise of a “unipolar” world.

    *

    Bill Clinton came to the presidency from Little Rock without a clear disposition on the use of military force. He was at times wary of it. He pulled American forces out of Somalia after a militia downed two American helicopters. In his first term he dithered on the Balkan wars and their atrocities, favoring a negotiation with Serbia’s strongman Slobodan Milosevic. He did nothing to stop Rwanda’s Hutu majority from slaughtering nearly a million Tutsis for three months in the spring and summer of 1994. He was more focused than any of his predecessors or successors on brokering a peace between Israelis and Palestinians. Over time, of course, he evolved, but how the world suffers for the learning curve of American presidents! Clinton punished Saddam Hussein’s defiance of U.N. weapons inspectors. He bombed suspected al Qaeda targets in Sudan and Afghanistan after the bombings of American embassies in Africa in 1998. He prevented Milosevic from cleansing Kosovo of Albanians and helped push back Serb forces from Bosnia.

    Clinton was a reluctant sheriff, to borrow Richard Haass’ phrase. In his first term he was unsure about using American force abroad. By the end of his second term, he had come to terms with the responsibilities of American power. “The question we must ask is, what are the consequences to our security of letting conflicts fester and spread?” Clinton asked in a speech in 1999. “We cannot, indeed, we should not, do everything or be everywhere. But where our values and our interests are at stake, and where we can make a difference, we must be prepared to do so.” He was talking about transnational threats and rogue states. In his second term, Clinton took a keen interest in biological weapons and pandemics. This meant using military power to prevent the proliferation of weapons of mass destruction and deter terrorists. As Madeleine Albright, Clinton’s second secretary of state, memorably put it, America was the world’s “indispensable nation.”

    Yet Clinton’s activism did not extend to Russia or China. He helped to expand the NATO alliance, but also secured debt forgiveness for the Russian Federation and used his personal relationship with Russian president Boris Yeltsin to reassure him that NATO’s expansion was no threat to Moscow. Clinton also reversed his campaign promise on China and granted it most favored nation status as a trading partner, paving the way for the economic interdependence that Trump may be in the process of unraveling today. At the time, Clinton explained that “this decision offers us the best opportunity to lay the basis for long-term sustainable progress on human rights and for the advancement of our other interests with China.” This reflected the optimism of 1989-1991. What other model did China have to emulate, but our own? Allow it to prosper and over time it will reform.

    When Clinton left office, the consensus among his party’s elites was that his foreign policy mistakes were errors of inaction and restraint. Clinton did nothing to prevent the genocide in Rwanda. He waited too long to intervene in the Balkans. It seemed that Americans had gotten over their inordinate fear of interventions. Why had it taken Clinton so long? There was an activist mood in Washington before the attacks of September 11. And after hijacked commercial planes were turned into precision missiles and the towers fell, the sense that America needed to do more with its power intensified.

    *

    In the Bush years, American foreign policy fell first into the hands of neoconservatives. For their critics, they were a cabal of swaggering interlopers who twisted intelligence products and deceived a dim president into launching a disastrous war. In fact they were a group of liberals who migrated to the right and brought with them an intellectual framework and appreciation for social science that was absent from the modern conservative movement. In foreign policy they dreaded signs of American weakness or retreat, and in 1972 supported Scoop Jackson against George McGovern in the Democratic primaries. As that decade progressed, the wary and disenchanted liberals migrated to the former Democrat Ronald Reagan. In Reagan, they found a president who despised Soviet communism as much as they did.

    In the 1990s, a new generation of neocons wanted to seize the opportunity of American primacy in the world after the Soviet Union’s collapse. As Irving Kristol observed, “With power come responsibilities, whether sought or not, whether welcome or not. And it is a fact that if you have the kind of power we now have, either you will find opportunities to use it, or the world will discover them for you.” In that spirit, the neoconservatives of the 1990s advocated an activist foreign policy. They argued that the United States should help to destabilize tyrannies and support democratic opposition movements. They were not content with letting history take its course; they wanted to push it along in the direction of freedom. Their enthusiasm for an American policy of democratization was based on both moral arguments and strategic arguments.

    The focus in this period was Iraq. Neoconservatives had rallied around legislation known as the Iraq Liberation Act that would commit the American government to train and to equip a coalition of Iraqi opposition groups represented in the United States by Ahmad Chalabi, a wealthy Iraqi political figure who had been trained in America as a mathematician. For the first half of the 1990s, the CIA funded Chalabi’s Iraqi National Congress, but he had a falling out with the agency. The Iraq Liberation Act was a way to save the opposition group by replacing a once covert intelligence program with one debated openly in Congress. It should be noted that Chalabi’s initial plan was not to convince America to invade Iraq, but to secure American training and equipment to build a rebel army composed of Iraqis to topple Saddam Hussein. Clinton allowed the legislation to pass in 1998, but his government never fully implemented it.

    George W. Bush ironically ran his campaign in 2000 with the promise of a humble foreign policy. Condoleezza Rice memorably declared at the Republican convention that America cannot be the world’s 911. Not long afterward, the attacks of September 11 forced Bush to renege on his promise. Three days after the attacks, Congress voted to authorize what we know today as the war on terror: the “endless wars” had begun. Over the last nineteen years, that authorization has justified a global war against a wide range of targets. Bush used it as the legal basis for strikes on terrorists in south Asia. Obama used it to justify his military campaign against the Islamic State, even though it was a battlefield enemy of al Qaeda’s Syrian branch. And while every few years some members of Congress have proposed changes to the authorization, these efforts have yet to succeed. Today many progressives believe the war on terror deformed America into an evil empire, patrolling the skies of the Muslim world with deadly drones, blowing up wedding parties in Afghanistan, torturing suspected terrorists and aligning with brutal thugs. Even Obama has not escaped this judgment. Some of these are fair criticisms. The war on terror was indeed a war. Innocent people died. At the same time, the other side of the ledger must be counted. Since 9/11, there have been no mass-casualty attacks by foreign terrorists inside our borders. On its own terms, from the rather significant standpoint of American security, this “endless war” has produced results.

    In the first years of the war on terror, the pacifist left had little influence over the national debate. A better barometer of the country’s mood was a column, published a month before the Iraq War, by Charles Krauthammer. He denounced what he called Clinton’s “vacation from history,” and asked whether “the civilized part of humanity [will] disarm the barbarians who would use the ultimate knowledge for the ultimate destruction.” Those words, and many others like them, helped to frame the rationale for the American invasion of Iraq. Note that Krauthammer did not write that Clinton’s vacation from history was his failure to prepare for China’s rise and Russia’s decline. It was his failure to prevent the arming of smaller rogue states and terrorist groups. Krauthammer was still living in Fukuyama’s world. And so was Bush. In his first term, Bush not only failed to challenge Russia or China, he sought to make them partners in his new global war. Bush famously remarked that he had looked into the eyes of Vladimir Putin and found a man he could trust. (“I was able to get a sense of his soul.”) Bush’s government would also designate a Uighur separatist organization as a terrorist group, giving cover to the persecution of that minority. The world learned in 2018 that China had erected a new Gulag in its far west that now imprisons at least a million Uighurs.

    China and Russia did not support Bush’s Iraq war. Many Democrats did. In 2002, a slim majority of Democrats in the House opposed a resolution to authorize it, but in the Senate, 29 out of 50 Democrats voted for it. Most significant, every Democrat with presidential aspirations — from Hillary Clinton to Joe Biden — voted for the war, a vote for which they would later apologize. At the time of that vote, the ambitious Democrats who supported it did not know that opposition to that war would define their party for years to come. Neither did the establishment Democrats who opposed it. Al Gore, speaking at the Commonwealth Club of San Francisco, explained his opposition to the war: “If we go in there and dismantle them — and they deserve to be dismantled — but then we wash our hands of it and walk away and leave it in a situation of chaos, and say, ‘That’s for y’all to decide how to put things back together now,’ that hurts us.” Gore was not concerned that America might break Iraq; he was acknowledging that it was already broken. Nor was he worried about an “exit strategy.” He worried that if America went to war in Iraq under a Republican president, the war might not be endless enough. America might leave too soon.

    The Iraq war was also opposed by a group of international relations theorists who advocated for what is known as foreign policy realism. Unlike Fukuyama, the realists do not think it matters how a state chooses to organize itself. All states, according to the realists, pursue their own survival, or their national interest. Thirty-three prominent realists purchased an advertisement in the New York Times in 2002 urging Bush not to invade Iraq. They argued that the coming war would distract America from the campaign against al Qaeda and leave it in charge of a failed state with no good options to leave. It is worth noting that neither the pacifist left nor the foreign policy realists argued before the war that Saddam Hussein had no weapons of mass destruction, the liquidation of which was Bush’s justification for the war. Both camps warned instead that an American invasion of Iraq could prompt the tyrant to use the chemical and biological weapons that everyone agreed he was concealing. As the professors wrote in their open letter, “The United States would win a war against Iraq, but Iraq has military options — chemical and biological weapons, urban combat — that might impose significant costs on the invading forces and neighboring states.” The argument was that removing Saddam Hussein would further destabilize the Middle East.

    Over the course of 2003, it became clear that the casus belli for Operation Iraqi Freedom — Saddam’s refusal to come clean on his regime’s weapons of mass destruction — was wrong. The teams of American weapons inspectors sent into the country could not find the stockpiles of chemical weapons or the mobile bio-weapons labs. The Bush administration sought to portray this error as an intelligence failure, which was largely correct. And so the war’s unanticipated consequences, some of them the result of American error, eclipsed the fact that Iraqis had drafted a constitution and were voting for their leaders. In America, a great popular anger began to form, not only against the Iraq war but more generally against American interventionism. The Democrats became increasingly eager to take political advantage of it. Talk of American hubris proliferated. Progressives were growing wary of the institutions of national security, particularly the intelligence agencies.

    Republicans under Bush were also divided between an embrace of the president’s own idealism to make Iraq a democracy and the unsentimental realism of his vice president, who darkly warned after 9/11 that the war against terror would have to be fought in the shadows. Bush’s own policies were inconsistent. Sometimes he pressured dictator allies to make democratic reforms, but he also empowered those same dictators to wage war against jihadists with no mercy. In the Palestinian territories, Bush supported legislative elections that resulted in empowering Hamas in Gaza. (That was in 2006, the last time Palestinians voted for their leaders.) By the end of Bush’s second term, however, great power competition had re-emerged. While America was preoccupied with the Muslim world, Russia invaded the former Soviet Republic of Georgia. Bush did what he could. He sent humanitarian supplies to Tbilisi packed on U.S. military aircraft. He tried to rally allies to support a partial ban on weapons sales to Moscow. But Russia had the good fortune of timing its aggression just as the world’s financial markets collapsed. It was also lucky that the next American president would be Barack Obama.

    *

    Barack Obama had been a state senator in Illinois during the run-up to the Iraq War, when his primary rival, Hillary Clinton, was a U.S. senator. She voted for the war. He gave a speech opposing it. At the time of the election, in a political party incensed by the Iraq war, Obama’s speech in Chicago in 2002 functioned as a shield: he may have lacked Clinton’s experience, but at least he did not support Bush’s war. Back in 2002, though, Obama’s speech was barely noticed. The Chicago Tribune news story led with Jesse Jackson’s speech and made no mention of the ambitious state senator. When Obama was at the lectern, he struck two distinct themes. First, he wanted the protestors to know that he, too, understood the evil of neoconservatism. “What I am opposed to is the cynical attempt by Richard Perle and Paul Wolfowitz and other armchair, weekend warriors in this administration to shove their own ideological agendas down our throats,” he said. At the same time, Obama rejected the apologies for tyrants common on the hard left. Of Saddam, he said, “He is a brutal man. A ruthless man. A man who butchers his own people to secure his own power.” But the young Obama did not think that Saddam threatened American interests. Echoing Fukuyama’s optimism, he declared that “in concert with the international community he can be contained until, in the way of all petty dictators, he falls away into the dustbin of history.”

    Obama’s patience with history, with its dustbins and its arcs, turned out to be, well, endless. His Chicago speech should have been a warning for the left wing of the Democratic Party that over time it would be disappointed by his presidency. As Obama said, he was not against war. (The tough-minded Niebuhrian speech that he delivered in Oslo when he accepted his ridiculous Nobel Prize underscored his awareness of evil in the world.) He was merely against dumb wars — or as he later put it, “stupid shit.” He had come into office when the world was growing more dangerous, and he chose to respond to these dangers with careful and scholarly vacillations. He wanted the American people to know that he was thoughtful. The most salient characteristics of his foreign policy were timidity and incoherence, and a preference for language over action.

    Thus, Obama withdrew American forces from Iraq in 2011, only to send special operators back to Iraq in 2014, after the Islamic State captured the country’s second largest city. He “surged” forces in Afghanistan in his first term, but fired the general he chose to lead them, and spent most of his administration trying, and failing, to withdraw them. He spoke eloquently about the disgrace of Guantanamo, but never closed it. He declassified a series of Justice Department memos that made specious legal arguments to allow the CIA to torture detainees, but his Justice Department never prosecuted the officials responsible, as many in his base wanted. He sided with peaceful protestors in Egypt in 2011 at the dawn of the Arab Spring and urged Hosni Mubarak to step down, but after Egypt elected an Islamist president, the military toppled him in a coup thirteen months later and Obama declined to impose sanctions. He did manage to reach a narrow deal with Iran to diminish, but not demolish, its nuclear weapons program. By this time Iran was on a rampage in the Middle East, and the windfall that its economy received from the nuclear bargain would be reinvested in its own proxy wars in Syria, Iraq and Yemen. The deal alienated America’s traditional allies in the Middle East and brought Israel closer to its Arab rivals.

    The most spectacular failure of Obama’s foreign policy, of course, was Syria. After the Arab Spring, Syrians demanded the same democratic freedoms that they saw blooming in Tunisia and briefly in Egypt. Obama supported them, at first. But the tyrant was watching: Bashar al-Assad had learned from what he considered the mistakes of Mubarak and Ben Ali. Assad was also fortunate that his patrons were Russia and Iran, who also lived in fear of popular uprisings. So began the Syrian civil war that to this day rages on. That war has flooded Europe and Turkey with refugees, with dire political consequences, and threatened for a few years in the middle of the 2010s to erase the borders established after World War I for the Middle East.

    It is not the case that Obama did absolutely nothing to support the Syrian opposition. In 2012, he approved a covert program known as Timber Sycamore, in which the CIA endeavored to build up an army of “moderate rebels” against Assad. The plan was always flawed. Obama did not want American forces to fight inside Syria and risk an open clash with Iranian and Russian forces who were on the side of the Assad regime. (Obama was reluctant to offend the Russians and he was actively seeking détente with the Iranians.) America clung to its passivity as Syria’s civil war and Iraq’s embrace of Shiite majoritarian rule created the conditions for the emergence of the Islamic State. A few years later, Obama authorized a Pentagon program to arm and support a largely Kurdish army fighting the Islamic State. With the help of American air power, the Kurds and U.S. special forces eventually smashed the “caliphate” during Trump’s first term in office.

    Artlessly and in accord with his principles, Obama painted himself into a corner. He called on Assad to leave, but he never used American power to assist with that mission. Obama also warned of consequences if Assad used chemical weapons, which he called a “red line.” In 2013, when Assad crossed this line, Obama threatened air strikes against Assad’s regime. The moment of truth — about Syria, about American interventionism — had arrived. Obama punted. He gave a bizarre speech in which he asserted that he had the constitutional prerogative to strike Syria without a resolution from Congress but was asking Congress to authorize the attack anyway. In his swooning memoir of the Obama White House, Ben Rhodes recalls that the president told him, “The thing is, if we lose this vote it will drive a stake through the heart of neoconservatism — everyone will see they have no votes.” Never mind the heart of Bashar al Assad! Rhodes continues: “I realized then that he was comfortable with either outcome. If we won authorization, he’d be in a strong position to act in Syria. If we didn’t, then we would potentially end the cycle of American wars of regime change in the Middle East.”

    The episode broaches the early roots of the bipartisan consensus against “endless war.” When the resolution came up for a vote, it barely got out of the Senate Foreign Relations Committee. As the Senate debated, Republican hardliners began to wobble. “Military action, taken simply to save face, is not a wise use of force,” said Senator Rubio. “My advice is to either lay out a comprehensive plan using all of the tools at our disposal that stands a reasonable chance of allowing the moderate opposition to remove Assad and replace him with a stable secular government. Or, at this point, simply focus our resources on helping our allies in the region protect themselves from the threat they and we will increasingly face from an unstable Syria.” In other words, Rubio would not support a modest air strike to impose some costs on a breach of an important international norm because it did not go far enough. The result of this twisted reasoning, and of the failure of the resolution, was the emboldening of Assad. Finally, at the last minute, Obama was saved by Assad’s most important patron. Russian foreign minister Sergei Lavrov and Secretary of State John Kerry quickly patched together a plan whereby Syria, for the first time, would declare its chemical weapons stockpiles and allow international inspectors to get them out of the country. Over time, the deal proved worthless. Assad would gas his people again and again, eroding what was once a powerful prohibition on the use of chemical weapons in the twenty-first century. But if the deal did nothing to end the misery of Syria, it did a lot to end the misery of Obama. In 2013, Obama portrayed the bargain as a triumph of diplomacy, which it was — for Putin.

    One of the first foreign policy priorities for Obama after his election was to mend relations with Moscow. This was called the “reset.” Obama was most exercised by transnational threats: climate change, arms control, fighting terrorism, Ebola. He wanted Russia to be a partner. And Russia wanted recognition that it was still a great power.

    After Obama folded on his “red line” in Syria, Putin made his move. Russian forces invaded Ukraine in 2014 to stop a democratic revolution and eventually annexed Crimea. Obama imposed a series of economic sanctions on Russian industries and senior officials, but he declined to arm Ukraine’s government or consider any kind of military response. (He worried more about escalation than injustice.) His administration’s advice to Kiev was to avoid escalation. The following year Obama did not challenge Russia when it established airbases inside Syria. He still needed the Russians for the Iran nuclear deal. By 2016, when the U.S. intelligence community was gathering evidence that Russians were hacking the Democratic National Committee and Hillary Clinton’s campaign, Obama’s White House waited until after the election to punish Moscow. Three weeks before the next president would take the oath of office, Obama announced the expulsion of thirty-five spies and modest sanctions on Russia’s intelligence services. It was a fine example of “responsible statecraft.”

    *

    The thoughtful incoherence of Barack Obama was succeeded by the guttural anarchy of Donald Trump. It was nearly impossible to discern from Trump’s campaign what his actual foreign policy would be if he won. His ignorance of international affairs was near total. He simultaneously pledged to pull America out of the Middle East and to bomb ISIS indiscriminately. He could sound like Michael Moore one minute, thundering that George W. Bush lied America into the Iraq War, and in the next minute like a Stephen Colbert imitation of a right-wing neanderthal, claiming that Mexico was deliberately sending its rapists into our country. And yet there was a theme in Trump’s hectoring confusion. He hearkened back to a very old strain of American politics. One could see it in his slogan “America First,” a throwback to the isolationism of Charles Lindbergh in the 1930s. When Trump asked mockingly what America was getting from its interventions in the Middle East or the protection its troops provided Europe through the NATO alliance, he was unknowingly channeling Senator Robert Taft and his opposition to the Marshall Plan. Past presidents, Republicans and Democrats, understood that the small upfront cost of stationing troops overseas in places such as Korea or Bahrain paid much greater dividends by deterring rivals and maintaining stability. Military and economic aid was a small price to pay for trade routes and open markets. But Trump rejected all of this.

    As president, Trump has not conducted an altogether catastrophic foreign policy. (That is faint praise, I know.) He has used force in constructive flashes, such as the drone strike that killed Qassem Suleimani or the air strikes against Syrian airfields after the regime gassed civilians. He never pulled America out of NATO as he said he would, though he declined to say publicly that America would honor the mutual defense commitments in the treaty’s charter. He pulled out of Obama’s nuclear deal with Iran, a deal whose merits were always a matter of controversy. He began to reverse the spending caps imposed on the Pentagon’s budget during Obama’s presidency. On China, the Trump administration has begun aggressively to target Beijing’s thievery and espionage and takeover of international institutions.

    Most consistently, Trump’s foreign policy has been marked by an amoral transactionalism. Modern presidents of both parties have made bargains with tyrants, but they did so sheepishly, and often they appended talk of human rights to their strategic accommodations. Trump was different. He went out of his way to pay rhetorical tribute to despots and authoritarians who flattered him — Kim Jong Un, Vladimir Putin, Xi Jinping, Viktor Orban, Jair Bolsonaro. When Trump’s presidency began, senior advisers such as General James Mattis and General H.R. McMaster tried to soften, and at times to undermine, his appetite to renounce American leadership in the world. McMaster made the president sit through a PowerPoint presentation about life in Afghanistan before the Taliban to persuade him of the need for a small military surge there. After Trump abruptly announced the withdrawal of the small number of American forces in Syria, his advisers persuaded him that some should stay in order to protect the oil fields. And so it went until most of the first cabinet was pushed out in 2018 and 2019. The new team was more amenable to Trump’s instincts. Trump’s new secretary of state, Mike Pompeo, empowered an envoy to negotiate an American withdrawal from Afghanistan with the Taliban, without including the Afghan government, our ally, in the talks. Instead of undermining Trump’s push to leave the Iran nuclear deal, as James Mattis and Rex Tillerson had done, the president’s new team kept escalating sanctions.

    Trump was erratic. Never has foreign policy been so confusing to anyone outside (and to some inside) the White House. Trump would impetuously agree with heads of state to major policy changes before the rest of his government could advise him of his options. Since Trump shares his internal monologue with the world on Twitter, these lunges became policies, until he would later reverse them just as fitfully. To take one example: the sequence of tweets that announced Trump’s deal in 2019 with Turkey to pull American support for its Kurdish allies in Syria had real consequences, even though Trump would later reverse himself. As the Turkish military prepared to enter an autonomous Kurdish region of Syria, the Kurdish fighters who had bled to defeat ISIS were forced to seek protection from Russia, Iran, and Bashar al Assad.

    During that crisis, Trump tweeted about one of his favorite themes: “The endless wars must end.” For the first fifteen years of the post-9/11 era, that kind of talk would have been heresy for Republicans. Aside from a few outliers inside the party, like Ron Paul and Rand Paul, the party of Bush and Reagan supported what it called a “long war,” a multi-generational campaign to build up allies so they could defeat terrorists without American support. Until very recently, Republicans understood that as frustrating as training local police in Afghanistan and counter-terrorism commandos in Iraq often could be, the alternative was far worse, both strategically and morally. The same was true of American deployments during the Cold War. To this day there are American troops in South Korea and Germany, in part because their very presence has deterred adversaries from acting on their own aggressive or mischievous impulses. But Trump disagreed. And he echoed a growing consensus. “No more endless wars” is the new conventional wisdom.

     

    III

    The Quincy Institute for Responsible Statecraft was founded in 2019 as a convergence of opposites, with money from George Soros’ Open Society Foundations and the Koch brothers. There is one thing about which the opposites agree, and that is the end of American primacy, and consequent activism, in the world. The new think tank hopes to mold the wide but inchoate opposition to “endless wars” into a coherent national strategy.

    On the surface, the Quincy Institute presents itself in fairly platitudinous terms. “The United States should respect established international laws and norms, discourage irresponsible and destabilizing actions by others, and seek to coexist with competitors,” its website says. “The United States need not seek military supremacy in all places, at all costs, for all time.” That boilerplate sounds like the kind of thing one would hear in the 2000s from what were then known as the netroots: wars of choice are bad, international law is good. But there is an important distinction. The progressives who obsessed over the neoconservatives in the Bush years argued that the ship of state had been hijacked. The Quincy Institute is arguing that the institutions those progressives once sought to protect from the ideological interlopers were themselves in on the heist. The problem is not the distortion of our foreign policy by foreign interests. The problem is the system that created our foreign policy in the first place.

    Consider this passage by Daniel Bessner on Quincy’s website: “While there are national security think tanks that lean right and lean left, almost all of them share a bipartisan commitment to U.S. ‘primacy’ — the notion that world peace (or at least the fulfillment of the “national interest”) depends on the United States asserting preponderant military, political, economic, and cultural power. Think tanks, in other words, have historically served as the handmaidens of empire.” Bessner is echoing an idea from Stephen Walt, the Harvard professor who is also a fellow at the institute. At the end of The Hell of Good Intentions, which appeared in 2018, Walt called for a “fairer fight within the system,” and recommended the establishment of a broader political movement and the creation of new institutions — a think tank? — to challenge what he perceives as the consensus among foreign policy elites in favor of a strategy of liberal hegemony. American primacy in the world he deemed to be bad for America and bad for the world.

    The Quincy Institute hired the perfect president for such a program. A retired Army colonel and military historian who lost his son in the Iraq War, Andrew Bacevich has emerged as a more literate and less sinister version of Smedley Butler. That name is largely forgotten today, but Butler was a prominent figure in the 1930s: a retired Major General who, after his service to the country, declared that “war is a racket” and that his career as a Marine amounted to being a “gangster for capitalism.” Butler later admitted that he was approached by a cabal to lead a military coup against President Roosevelt, but he remains to this day a hero of the anti-war movement. In 2013, in Breach of Trust, Bacevich presented Butler as a kind of dissident: “He commits a kind of treason in the second degree, not by betraying his country but calling into question officially sanctioned truths.” In this respect, Butler is the model for other retired military officers who dare to challenge official lies. Not surprisingly, Breach of Trust reads like the military history that Howard Zinn never wrote. It is a chronicle of atrocities, corruption, and government lies. Like Bacevich’s other writings, it is a masterpiece of tendentiousness.

    More recently, Bacevich has sought to recast the history of the movement to prevent Roosevelt from entering World War II, known as America First. He has acknowledged that America was correct to go to war against the Nazis, but still he believes that the America Firsters have gotten a bad rap. Until Donald Trump, the America First movement was seen as a cautionary tale and a third rail. When Pat Buchanan tried to revive the term in the 1980s and 1990s, there was bipartisan outrage. After all, America First was led by Charles Lindbergh, an anti-Semite and an admirer of the Third Reich. Bacevich acknowledges this ugly provenance. And yet he chafes at Roosevelt’s judgment that Lindbergh’s movement was promoting fascism. “Roosevelt painted anti-interventionism as anti-American, and the smear stuck,” Bacevich wrote in 2017 in an essay in Foreign Affairs charmingly called “Saving America First.”

    There is a grain of truth in Bacevich’s account. The America First movement was largely a response to the unprecedented horrors of World War I, in which armies stupidly slaughtered each other and chemical weapons were used on a mass scale. And the war was sparked by miscalculations and secret alliances between empires and smaller states in Europe: it lacked the moral and strategic purpose of defeating the Nazis and the Japanese fascists. It is quite understandable that two decades after World War I ended, many Americans would be reluctant to fight its sequel. But Bacevich goes a bit further. In his Foreign Affairs essay, he instructed that “the America First Movement did not oppose Jews; it opposed wars that its members deemed needless, costly, and counterproductive. That was its purpose, which was an honorable one.” But was it honorable? While it is true that in the 1930s major newspapers did a terrible job in covering the Third Reich’s campaign against Jews and other minorities, those persecutions were hardly a secret. Nazi propaganda in the United States was openly anti-Semitic. The war weariness of post-World War I America does not confer nobility on America First’s cause. In a recent interview Bacevich became testy when asked about that remark. “Come on now,” he said. “I think that the anti-interventionist case was understandable given the outcome of the First World War. They had reason to oppose U.S. intervention. And, again, let me emphasize, their calculation was wrong. It’s good that they lost their argument. I do not wish to be put into a position where I’m going to make myself some kind of a defender for the people who didn’t want to intervene against Nazi Germany.” Good for him.

    That exchange tells us a lot about the Quincy Institute. The think tank’s foreign policy agenda and arguments echo the anti-interventionism of the 1930s. Most of its scholars are more worried about the exaggeration of threats posed by America’s adversaries than the actual regimes doing the actual threatening. In May, for example, Rachel Esplin Odell, a Quincy fellow, complained that Senator Romney was overstating the threat of China’s military expansion and unfairly blaming the state for the outbreak of the coronavirus: “The great irony of China’s military modernization is that it was in large part a response to America’s own grand strategy of military domination after the Cold War.” In this, of course, it resembled most everything else.

    The institute has hired staff who come out of the anti-neoconservative movement of the 2000s. Here we come to a delicate matter. The anti-neoconservatives of that era flirted with and at times embraced an international relations sort of anti-Semitism: the obsession with Israel and its influence on American statecraft. Like the America Firsters, the anti-neoconservatives worry about the power of a special interest — the Jewish one — dragging the country into another war. A few examples will suffice. In 2018, Eli Clifton, the director of Quincy’s “democratizing foreign policy” program, wrote a post for the blog of Jim Lobe, the editor of the institute’s journal Responsible Statecraft, arguing that three Jewish billionaires — Sheldon Adelson, Bernard Marcus, and Paul Singer — “paved the way” for Trump’s decision to withdraw from Obama’s Iran nuclear deal through their generous political donations. It is certainly fair to report on the influence of money in politics, but given Trump’s well-known contempt for the Iran deal, Clifton’s formulation had an odor of something darker.

    Then there is Trita Parsi, the institute’s Swedish-Iranian vice president, who is best known as the founder of the National Iranian American Council, a group that purports to be a non-partisan advocacy group for Iranian-Americans but has largely focused on softening American policy towards Iran. In 2015, as the Obama administration was rushing to finish the nuclear deal with Iran, his organization took out an ad in the New York Times that asked, “Will Congress side with our president or a foreign leader?” The question was a reference to an upcoming speech before Congress by the Israeli prime minister Benjamin Netanyahu. The National Iranian American Council’s foray into the dual loyalty canard is ironic considering that Parsi himself has been a go-between for journalists and members of Congress who seek access to Mohammad Javad Zarif, Iran’s foreign minister.

    This obsession with Israeli influence in American foreign policy is a long-standing concern for a segment of foreign policy realists, who believe that states get into trouble when the national interest is distorted by domestic politics — an affliction that is particularly acute in democratic societies which respect the rights of citizens to make their arguments to the public and to petition the government and to form lobbies. The most controversial instance of the realists’ scapegoating of the domestic determinants of foreign policy was an essay by Stephen Walt and John J. Mearsheimer (both Quincy fellows) that appeared in the London Review of Books in 2006. It argued that American foreign policy in the Middle East has been essentially captured by groups that seek to advance Israel’s national interest at the expense of America’s. “The thrust of US policy in the region derives almost entirely from domestic politics, and especially the activities of the ‘Israel Lobby,’” they wrote. “Other special-interest groups have managed to skew foreign policy, but no lobby has managed to divert it as far from what the national interest would suggest, while simultaneously convincing Americans that US interests and those of the other country — in this case, Israel — are essentially identical.”

    Walt and Mearsheimer backed away from the most toxic elements of their essay in a subsequent book. The essay sought to explain the Iraq War as an outgrowth of the Israel lobby’s distortion of American foreign policy. The book made a more modest claim about the role it plays in increasing the annual military subsidy to Israel and stoking American bellicosity toward Israel’s rivals, such as Iran. They also took pains to denounce anti-Semitism and acknowledge how Jewish Americans are particularly sensitive to arguments that present their organized political activity as undermining the national interest. Good for them. But the really important point is that events have discredited their claims. The all-powerful “Israel Lobby” was unable to wield its political influence to win the fight against Obama’s Iran deal. It was not able to stop Obama’s public pressuring of Israel to accept a settlement freeze. Decades earlier, it had not been able to thwart Reagan’s sale of AWACS aircraft to the Saudis. Anyone who believes in an omnipotent AIPAC is looking for conspiracies.

    *

    Walt himself, along with the Quincy Institute, now has a much more ambitious target: the entire foreign policy establishment. This is the central thesis of The Hell of Good Intentions — that the machinery of American foreign policy is rigged. It will always favor a more activist foreign policy, a more dominant military, and liberal hegemony. All the pundits, generals, diplomats and think tank scholars in Washington are just too chummy with one another. A kind of groupthink sets in. (This never happens at the Quincy Institute.) The terms of foreign policy debate are narrowed. And analysts who seek an American retrenchment from the world are shunted aside.

    To prove this point, Walt spends several pages observing how former government officials land jobs at prestigious think tanks and get invited to speak at fancy dinners. The result is that no one is ever held to account for their mistakes, while the courageous truth-tellers are ignored and isolated. (At times the book reads like a very long letter by a spurned friend asking why he never got an invitation to last month’s retreat at Aspen.)

    To illustrate this desperate problem, Walt turns to the annual conference for the World Affairs Councils of America. He ticks off speakers from past years — Susan Glasser, Vali Nasr, Paula Dobriansky — and observes, “These (and other) speakers are all dedicated internationalists, which is why they were invited.” So whom does Walt want the World Affairs Councils of America to invite? “Experts with a more critical view of U.S. foreign policy, such as Andrew Bacevich, Peter Van Buren, Medea Benjamin, Glenn Greenwald, Jeremy Scahill, Patrick Buchanan, John Mueller, Jesselyn Radack, or anyone remotely like them.”

    There is so much to be said about all of these figures. Patrick Buchanan’s ugly isolationist record is well known. But consider, at the other end of the ideological spectrum, Medea Benjamin. She is the founder of an organization called Code Pink, known mostly for disrupting public meetings, which last year briefly took control of the Venezuelan embassy in Georgetown to prevent representatives of the country’s internationally recognized interim anti-Maduro government from taking over. A group of American anti-imperialists were defending the prerogatives of a dictator who had sold off his country’s resources to China and Russia while his people starved. People like Benjamin are not dissidents. They are stooges.

    In this way the hard-nosed centrist post-Iraq realists converge with the radicals of the left even as they converge with the radicals of the right. This is realism in the style not of Henry Kissinger but of Noam Chomsky. As in Chomsky, the aggression of America’s adversaries is explained away as responses to American power. And as in Chomsky, the explanation often veers into apologies for monsters. Consider “Why the Ukraine Crisis Is the West’s Fault,” an essay by Mearsheimer in Foreign Affairs in 2014. There he argues that the expansion of NATO and the European Union, along with American democracy-promotion, created the conditions in which the Kremlin correctly assessed that its strategic interests in Ukraine were threatened. And after street demonstrations in Kiev resulted in the flight of the Ukrainian president, Viktor Yanukovych, to Russia, Putin had little choice but to snatch Crimea from his neighbor. “For Putin,” the realist writes, “the illegal overthrow of Ukraine’s democratically elected and pro-Russian president — which he rightly labeled a ‘coup’ — was the final straw.” Of course the heroic agitation of the Maidan was about as much of a coup as the Paris Commune of 1871. But like Putin, Mearsheimer argues that this “coup” in Ukraine was supported by Washington. His evidence here is that the late Senator John McCain and former assistant secretary of state Victoria Nuland “participated in antigovernment demonstrations,” and that an intercepted phone call broadcast by Russia’s propaganda network RT revealed that Nuland favored Arseniy Yatsenyuk for prime minister and spoke approvingly of regime change. “No Russian leader would tolerate a military alliance that was Moscow’s mortal enemy until recently moving into Ukraine,” Mearsheimer writes. “Nor would any Russian leader stand idly by while the West helped install a government there that was determined to integrate Ukraine into the West.”

    What Mearsheimer leaves out of his essay is that Yanukovych campaigned for the presidency of Ukraine on a promise to integrate his country into the European Union, an entirely worthy goal. But he violated his pledge with no warning, and under Russian pressure; and his citizens became enraged. Nor does Mearsheimer tell his readers about the profound corruption discovered after Yanukovych fled. Ukrainians did not rise up because of the imperialist adventures of Victoria Nuland or the National Endowment for Democracy. They rose up because their elected president tried to bamboozle them by promising to join Europe only to join Russia. Mearsheimer also makes no mention of the Budapest memorandum of 1994, in which Russia, America, and the United Kingdom gave security assurances to Ukraine to protect its territorial integrity in exchange for relinquishing its Soviet-era nuclear weapons. The fact that Putin would so casually violate Russia’s prior commitments should give fair-minded observers reason to fear what else he has planned. But Mearsheimer is not bothered by Putin’s predations. Putin, Mearsheimer writes, knows that “trying to subdue Ukraine would be like swallowing a porcupine. His response to events there has been defensive, not offensive.”

    Mearsheimer’s excuses for Putin and his failure to grasp the meaning of Ukraine’s democratic uprising in 2014 illuminate a weakness in his broader theory of international relations. In Mearsheimer’s telling, the only meaningful distinction between states is the amount of power they wield. States, he writes in his book The Great Delusion, “are like balls on a billiard table, though of varying size.” He goes on to say that “realists maintain that international politics is a dangerous business and that states compete for power because the more power a state has, the more likely it is to survive. Sometimes that competition becomes so intense that war breaks out. The driving force behind this aggression is the structure of the international system, which gives states little choice but to pursue power at each other’s expense.” This is not a novel idea. Thucydides relates what the Athenians told the Melians: “the strong do what they can and the weak suffer what they must.” For Mearsheimer, it does not matter that twenty years before its invasions of Crimea and Ukraine Russia had pledged to respect and protect Ukraine’s territorial integrity. Russia was strong and Ukraine was weak. Russia’s perception of the threat of an enlarged European Union mattered, whereas the democratic choice of Ukrainians did not. Realists are not moved by democratic aspirations, which are usually domestic annoyances to high strategy. Nor are they bothered by the amorality of their analysis of history.

    As for American behavior around the world, the Thucydidean framework describes it, but — unlike Russian behavior — does not extenuate it. For the Quincy intellectuals, there is no significant difference between America and other empires. America is not exceptional. It is only a larger billiard ball. It stands, and has stood, for nothing more than its own interests. But this equivalence is nonsense. Important distinctions must be made. When France booted NATO’s headquarters out of Paris in the middle of the Cold War, Lyndon Johnson did not order an army division to march on Paris. Trump’s occasional outbursts aside, America does not ask countries that host military bases to pay tribute. After toppling Saddam Hussein, America did not seize Iraq’s oil. Compare this to the Soviet Union’s response to a dockworkers’ strike in Poland, or for that matter to the Dutch East India Company. These realists do not acknowledge the value of preserving the system of alliances and world institutions that constitute the American-led world order, or the fact that they have often enriched and secured America’s allies, and at times even its adversaries. In this respect they are not only anti-interventionists, they are also isolationists, in that they believe that the United States, like all other states, naturally and in its own best interest stands alone.

    All of this is emphatically not to say that the American superpower has always acted with prudence, morality, and benevolence. There have been crimes, mistakes, and failures. There have also been national reckonings with those crimes, mistakes, and failures. No nation state has ever not abused its power. But behind these reckonings lies a larger historical question. Has America largely used its power for good? A great deal depends on the answer to that question. And the answers must be given not only by Americans but also by peoples around the world with whom we have (or have not) engaged. The valiant people on the streets of Tehran in 2009 who risked their lives to protest theocratic fascist rule shouted Obama’s name — were they wrong? About Obama they were certainly wrong: while they were imploring him for help he was brooding about American guilt toward Mossadegh. But were they wrong about America? And the Ukrainians in the Maidan, and the Egyptians in Tahrir Square, and the Kurds, and the women of Afghanistan, and the masses in Hong Kong, and the Guaido movement in Venezuela, and the Uighurs in their lagers — why have they all sought American assistance and intervention? Perhaps it is because they know that the American republic was founded on a sincere belief that the freedom enjoyed by its citizens is owed to all men and women. Perhaps it is because they have heard that the United States created, and stood at the helm of, a world order that has brought prosperity to its allies and its rivals, and even sometimes came to the rescue of the oppressed and the helpless. The case can certainly be made that America in its interventions damaged the world — the anti-interventionists make it all the time — but the contrary case is the stronger one. And contrary to the anti-interventionists, there are many ways to use American power wisely and decisively: the choice is not between quietism and shock and awe. No, the people around the world who look to us are not deluded about our history. They are deluded only about our present.

    American exceptionalism was not hubris. It was a statement of values and a willingness to take on historical responsibility. Nor was it in contradiction to our interests, though there have been circumstances when we acted out of moral considerations alone. It goes against the mood of the day to say so, but we must recover the grand tradition of our modern foreign policy. It is not remotely obsolete. Reflecting on the pandemic last spring, Ben Rhodes declared in The Atlantic, very much in the spirit of his boss, that the crisis created an opportunity to reorient America’s grand strategy: “This is not simply a matter of winding down the remaining 9/11 wars — we need a transformation of what has been our whole way of looking at the world since 9/11.” Rhodes said that he still wants America to remain a superpower. He proposed new national projects to fight right-wing nationalism, climate change, and future pandemics — all excellent objectives. He also questioned why America’s military budget is several times larger than its budget for pandemic preparedness or international aid. But what if the world has not entirely changed, pandemic and all? What if the world that awaits us will be characterized by great power rivalry and persistent atrocities? What if corona does not retire Westphalia?

    If you seek to know what the world would look like in the absence of American primacy, look at the world now. Hal Brands and Charles Edel make this point well in The Lessons of Tragedy: “It is alluring to think that progress can be self-sustaining, and that liberal principles can triumph even if liberal actors are no longer preeminent. To do so, however, is to fall prey to the same ahistorical mindset that so predictably precedes the fall.” And so the first task of those seeking to counter American unexceptionalism is to resist the urge to believe that the past is entirely over, the urge to reject wholesale the old ends and the old means and therefore to scale back America’s commitments to allies and to decrease the military budget. Even when we are isolationist we are not isolated. There are threats and there are evils, and whatever should be done about them, it cannot be little or nothing. We need to become strategically serious.

    It was as recently as 2014 that Obama dismissed ISIS as a junior varsity team, and even he was forced to reconsider his narrative that the killing of Osama bin Laden was the epitaph for the 9/11 wars, when a more virulent strain of Islamic fascism emerged in the Levant. In the summer of 2014, he sent special operations forces back to Iraq and began the air power campaign against ISIS that continued through 2019. Would ISIS have come into being if America had kept a small force inside of Iraq after 2011 and continued to work quietly with Iraq’s government to temper its sectarian instincts against the Sunni minority? It is impossible to know. What is known, though, is that in 2011 American officers and diplomats on the ground who had worked with Iraq’s security forces warned that without some American presence in the country, there was a risk that the army would collapse; and it did. This same cautionary lesson also applies to Afghanistan. No serious person should trust the Taliban’s promise that it would fight against al Qaeda if it were to take back power. And while it is true that the Afghan government is corrupt and often hapless, foreign policy consists in weighing bad and worse options. The worse option for Afghanistan is a withdrawal that gives al Qaeda’s longstanding ally a fighting chance to consolidate power, to turn the country once again into a safe haven of international terrorism, and once again to oppress its people. This is not idle speculation.

    The continuing battle against terrorism, which is a continuing threat, must not blind us, as it did George W. Bush, to the new era of great power rivalry. Americans must surrender the pleasant delusion that China and Russia will mature into responsible global stakeholders, or that outreach to Iran will temper its regional ambitions. In this respect Fukuyama was wrong and Huntington and Wieseltier were right. The pandemic has shown how China hollows out the institutions of the world order that so many had hoped would constrain and tame it. After prior pandemics, the United States invested more in its partnership with China and the World Health Organization, reasoning that as China industrialized it needed assistance to track new diseases before they were unleashed on the rest of the world. That system failed in late 2019 and 2020 not because China lacked the public health infrastructure to surveil the coronavirus. It failed because China is a corrupt authoritarian state that lied about the threat and punished the journalists, doctors, and nurses who tried to warn the world about it. This suppression of the truth cost the rest of the world precious time to prepare for what was coming. It turns out that states are not just billiard balls of varying sizes. If China were an open society, it would not have been able to conceal the early warnings. The nature of its regime is an important reason why COVID-19 was able to mutate into a global pandemic.

    As former Soviet dissidents or Serbian student activists can attest, tyrannies appear invincible right up to the moment they topple. This does not mean that America should always use its power to speed this process along. Nor does this mean that America should lead more regime change wars like Iraq. The best outcome for countries such as Iran, China, and Russia is for their own citizens to reclaim their historical agency and take back their societies and their governments from their oppressors. But when moments arise that reveal fissures and weaknesses in the tyrant’s regime, when there are indigenous democratic forces that are gaining ground, America must encourage and assist them. This is a matter of both strategy — the friendship of peoples is always better than the friendship of regimes — and morality. When opportunities for democratic change emerge in the world, the wiser strategy is to support the transition and not save the dictator. Again, this is not a license to invade countries or foment military coups. It is rather a recognition that any arrangements America makes with despots will at best be temporary. America’s true friends are the states that share its values. But the triumph of the open society is not at all preordained. It requires historical action, a rejection of narcissistic passivity, in an enduring struggle. This historical action can take many forms, and it is not imperialism. It is the core of the republic’s historical identity. It is responsible statecraft.

    Ancient Family Lexicon, or Words and Loneliness

    “Whoever knows the nature of the name… knows the nature of the thing itself,” Plato observed in his Cratylus. To know is a complex verb, difficult but rich. According to the dictionary, it means “to have news of a thing,” “to know that it exists or what it is.” In classical languages, the concept of knowing was linked with being born. Thus by coming into the world others have “news” about us: their recognition of us is part of our birth.

    Knowing the roots of the words at the basis of human relationships permits us to revive a world in which individuals existed as men and women or boys and girls with no middle ground. I will explain what that means. The ancestors of these appellations (woman, girl, man, boy) denoted a particular way of being that subsequent cultures have lost. As the meaning of the words changed, the beings themselves changed. Back then, before these semantic developments, it was understood that the condition of boyhood was synonymous with immaturity, and the divide between childhood and adulthood had to be put to the test of life. Moreover, youth and old age were not personal categories but attitudes of soul and mind. What follows is a sort of Indo-European family lexicon, and a portrait of a lost world.

    Mother
    The word comes from the Indo-European mater, formed by the characteristically childish elementary root ma– and the suffix of kinship –ter. In Greek it is mētēr, in Latin mater, in Sanskrit mātar, in Armenian mayr, in Russian mat, in German Mutter, in English mother, in French mère, in Italian, Spanish and Portuguese madre, in Irish máthair, in Bosnian majka.

    Father
    The word comes from the Indo-European pater, formed by the elementary root pa- and the suffix of kinship –ter. In Greek it is patèr, in Latin pater, in Sanskrit pitar, in ancient Persian pita, in Spanish, Italian and Portuguese padre, in French père, in German Vater, in English father.

    These terms are so ancient, so primordial that they have survived the history of languages and the geography of peoples. Since they were first uttered, these words have consistently been among the first spoken by human beings. They are solid words, like a brick house, like a mountain. It is our fathers and our mothers who teach us first to name things. It is natural that a child should first articulate ma– or pa-. There is no child who does not seek to be loved and held, who is not in need of care and protection from a mother and father. And we never forget these words; we hold them inside ourselves all the way to the end. Studies of patients with Alzheimer’s or senile dementia who have spoken a second language throughout their lives, a language different from that of their country of origin, show that they refer to dear ones using their original language. Native language. Mother-tongue.

    Human
    The classical etymology of the word man — meaning a human being — comes from the Latin homo, which dates back to the Indo-European root of humus, “earth,” a result of a primordial juxtaposition, perhaps even opposition, between mortal creatures and the gods of heaven. In the Bible, the Creator infuses earth with soul, creating the human compound. In French the term became homme, in Spanish hombre, a root that disappears in the Germanic languages, where we have man in English and Mann in German. The usage may now seem archaic, but it contains a universal idea.

    The Greek ànthrōpos has a disputed etymology. According to some, it is linked to the words anō, “up,” athréo, “look,” and òps, “eye,” a very fine combination of roots that indicates the puniness of men faced with the immensity of the divine and bound to raise their eyes to heaven from the ground. According to others, it is a descendant of the term anèr, “male,” “husband,” corresponding to the Latin vir. In both cases, the condition of “adult man” is colored by the concepts of strength, energy, ardor — of overcoming childhood through tests of courage, which reverberate in the Latin and Greek words vis and andreìa.

    Thus we have the universal concept of a human being who is small, humble, tied to the earth on which she has her feet firmly planted until the day of her death but not entirely material, puny but bent towards heaven – and also strong, therefore heroic, because she has succeeded in enlarging herself. In order to transition from girlhood to womanhood and from boyhood to manhood, one must pass a test. Through this test — or tests: the trials of a human life — girls and boys prove the measures of their strength, tenacity, and courage and in so doing become adults. Once the test is past, their nature itself is forever altered as their name is changed — no middle ground from girl to woman, from boy to man.

    Son, Daughter
    “Son” is connected with the Latin filius, “suckling,” linked to the root fe-, “sucking,” an affective and infantile term typical of the Indo-European –dhe, “to suckle,” which is found today in some Germanic languages, as in the English word daughter, and in the Bosnian dijete, “child.”

    The further we move away from the linguistic essence, from the primeval universality of the Indo-European roots, the more complicated things become, and the more the words grow apart and differ from Romance languages to Germanic ones. The notion of “boy” or “girl” as adolescents still unprepared for adult life does not surface until the fourteenth century. This concept is a foreign loan that dates back to the late Middle Ages and derives from the Arabic raqqās, meaning “gallop,” or “courier,” or more specifically “boy who carries letters,” a term of Maghrebian origin probably spread from Sicily through port exchanges in the Mediterranean, which was so rich in Arabisms. (We may note that this etymology has been made irrelevant by the conditions of modern work, in which many adults are treated as boys who carry letters, that is, are employed in infantilizing jobs that do not make full use of their adult skills.)

    Young, Old
    “Young” is a very pure and powerful word, and an imprecise one, not tied to any registry of age, in the same way that “old” is not. It clearly comes from the Indo-European root yeun-, from which come the Sanskrit yuvā, the Avestan yavan-, the French jeune, the English young, the Latin iuvenis, the Spanish joven, the Portuguese jovem, the Romanian juve, the Russian junyj, the Lithuanian jaunas, the German jung. “Young” is the calf or foal tenaciously striving to balance on thin and trembling legs, trying and trying again, falling ruinously to the ground until it stands up, bleeding and covered with straw — but ready to go, to walk, to wander. Youth is strength, a drive, an arrow already fired.

    At the opposite extreme of the life cycle is the old, the elderly, which means worn out, weary, weak, too tired to move, to go further — like a car worn down by too many roads, a car that suddenly stops, the engine melted. Elderly is the worn sole of a shoe that has walked too far. It is the hands of the elderly, like cobwebs that have caught too much wind in life. This idea comes from the Latin vetulus, a diminutive of vetus, which means “used,” “worn out,” “old.” In French it is called vieil, in Spanish viejo, in Portuguese velho, in Romanian vechi. Old age is an attitude and not an age; it means stopping, even surrender. The string of the bow collapsed, the quiver empty.

    Love
    Love is a pledge, as the etymology shows. The notion of betrothal, the ideas of bride and bridegroom, derive from the Latin sponsum and sponsam, from the past participle of the verb spondeo, which means “to promise,” corresponding to the Greek spèndō. In French it is called époux and épouse, in Spanish and Portuguese esposo, esposa. The original meaning of those words lay in the idea of the indissolubility of the promise of love. Once made, it cannot be revoked. The trust and the faith expressed in the promise were so sacred that they were celebrated by the couple with a libation to the gods.

    In the Romance languages, however, the meaning of that promise has slipped into the future, to the rite that has yet to happen, in the word fiancé, which derives from fides in Latin, which means “faith.” It is this faith in the promise of love, in its futurity, that gives strength to lovers such as Renzo and Lucia, made immortal by Alessandro Manzoni in I promessi sposi, who did everything possible to fulfill that promise of love contained, primordially, in the definition of “betrothed.”

    Mom.

    As I mentioned, the word comes from the Indo-European root ma-, a universal utterance of affection, which has its basis in the elementary sequence ma-ma. This childish word has identical counterparts in all Indo-European languages, a sound of affection that extends beyond borders in the welter of different languages around the world.

    Memory is often full of italicized passages, experiences that remain fresh despite the passage of time, but sometimes deletions overshadow the italics. For a long time I had forgotten the sound of the word mom. I could not say it anymore because I had not said it out loud for over fifteen years. I had even stopped thinking it.

    Stabat mater, “the mother stood” next to the son, reads a thirteenth-century religious poem attributed to Jacopone da Todi, which later became universal in the Christian liturgy to indicate the presence of the sad mother next to the suffering son. Once, beside me, the daughter, there stood my mother. We celebrated our birthday on the same day, she and I: born premature, I was, as long as we both lived, her birthday present. When I was a child we always had a double party for the “women,” as my father called us. Since she died, every birthday of mine has been cut in half. And since then I have never been sure of exactly how old I am.

    Every January I get closer and closer to the age my mother was when she died. Meanwhile, like the tortoise in Zeno’s paradox, I move further and further away from that lost, skinny, lonely girl who was between the third and the fourth year of high school when her mother died of a cancer as swift as a summer: she fell ill in June and passed in September, on the first day of school. For years I never told anyone of my early loss; it was one of my surgical choices. The silence gave me relief from the empty words of the others: poor girl, so young. I discovered a new space inside me, a sorrow that I did not know before and could now explore, unseen, unheard. I was an orphan.

    It seems impossible to admit it now, like all the admissions of the “imperfect present perfect” that we are, but there was a long period in which I practically stopped talking. I am fine was the only sentence in my stunted girlish vocabulary. Not until I was seventeen did I begin to understand the value that the ancients attributed to words — and I began to respect them in silence with an uncompromising loyalty, learning to say little and to keep almost everything quiet.

    After high school I moved to Milan, enrolled at the university, and started a new life, which I call my second one. For years I never said anything to the people I met, to my friends, to my boyfriends, about my mother’s death. As a daughter I was mute. Anyway, almost nobody ever asked me. My silence was unchallenged. And then, with the publication of my first book, in which I shared my passion for ancient Greek, my third life began — my linguistic life, the era of saying — the advent of the words that I use to make everything real, especially death.

    I remember the exact moment that my verbal mission, my reckoning with mortality through language, started. I was presenting my book to the students in a high school in Ostuni when, at question time, a sixteen-year-old boy asked me, with the frankness of those who believe that I must know the most intimate things in the world because I wrote a book on Greek grammar, “Why in Greek is a human being also called brotòs, or destined to die?” “Because death is part of life,” I said, almost without thinking about it. I was disconcerted by the rapidity of my response: I already knew the answer, even if I had not read it in any book or treatise. I reminded myself that I had no need of a book to know this. She had died; I had lived it. And so on that day I reclaimed the first word that I uttered in my life, like so many of the women and men who have come and will come into the world and have gone and will go out of it. They gave it back to me, those high school boys. I started to say mom again.

    My mother, mine, who went away a long time ago and whom I resemble so much, the one who taught me my first words.

    The ancients believed that there was a perfect alignment between the signifier and the signified, between word and meaning, between name and reality, owing to the power of naming, to the descriptive force of a word to denote a thing.

    The Greek adjective etymos means “true,” “real,” from which the word “etymology” was later derived. It was coined by the Stoic philosophers to define the practice of knowing the world through the origin of the words that we use — the words that make us what we are. I fell in love with the strange study of etymology in high school, and never gave up trying to understand the world according to it, to squeeze what surrounds me out of the language that surrounds me — notwithstanding my friends’ teasing that I cannot say anything without a reference to Greek or Latin.

    Many centuries later, taking up a thought of Justinian, Dante remarked in the Vita Nuova that nomina sunt consequentia rerum, “names are consequences of things” — that is, words follow things, they are upon them, they adhere to them, they reveal reality. Reality’s debt to language is very great. Words are the gates to what is. And to what is not: the opposite is also true, that if something has no name, or is not articulated in thought or speech, then it is not there. Silence about a thing does not mean that it is not real, but without a name and without words it is unrecognized and so, in a sense, not here, not present, now and now and now again, among us.

    Much that cannot now be said was once certainly said, about things that were once here but are gone, about a reality that has been lost. Dust.

    Two years ago I read an article in The New York Times that left me with such uneasiness that I was prompted to look more deeply inside myself and the people around me. The journalist declared that these first years of the new millennium are the “era of anxiety.” “The United States of Xanax,” he called the present era in his country’s history, after the most famous pill among the anxiolytics, whose percentage of diffusion in the population, including children, is in the double digits, and whose cost at the local pharmacy is slightly higher than the price of an ice cream and slightly less than a lunch at McDonald’s. Depression — that disease of the soul that until the twenties of the last century was considered as incurable, as inconsolable, as its name, melancholia — is today no longer fashionable, said the Times. It has been usurped. The years of bewilderment in the face of the abyss sung about by Nirvana — and which led to the suicide of Kurt Cobain — are over. Instead we suffer from a different kind of disease, an anxiety that makes us disperse ourselves busily, and scatter ourselves in the name of efficiency, so as not to waste time but instead  to manage it frantically. And as we strive not to lose time, we lose ourselves.

    The author of the article cited the case of Sarah, a 37-year-old woman from Brooklyn working as a social media consultant who, after having informed a friend in Oregon that she was going to visit her over the weekend, was seized by worry and fear when her friend did not reply immediately to her email. A common experience, perhaps: how many times do we fear that we have hurt a loved one without knowing exactly how? Is such worry a sincere concern about the other, or is it a  narcissistic, self-focused guilt? How often are we out of breath as if we were running when in fact we are standing still?

    But Sarah took her worry to an uncommon extreme. Waiting for the answer that was slow to arrive and that presaged her worst fear, she turned to Twitter and her 16,000 followers, tweeting, “I don’t hear from my friend for a day — my thought, they don’t want to be my friend anymore,” adding the hashtag “#ThisIsWhatAnxietyFeelsLike.” Within a few hours, thousands of people all over the world followed her example, tweeting what it meant for them to live in a state of perpetual anxiety, prisoners of a magma of indistinct, inarticulate emotions. At the end of the day, Sarah received a response from her friend: she had simply been away from her house and had not read the email. She would be more than happy to meet her, she had been hoping to see her for so long. A few days later Sarah remarked without embarrassment to journalists who were intrigued by the viral phenomenon: “If you are a human being who lives in 2017 and you are not anxious, there is something wrong with you.”

    Is that really so? Must we surrender to this plague of anxiety? Are we supposed to forget what we know — that friendship is measured in presence and memory, and not in the rate of digital response or the speed of reply? Are we required to infect our most significant relationships with the spirit of highly efficient customer service? Is it a personal affront if a loved one or a friend allows herself half a day to live her life before attending to us? Have we so lost the art of patience that we must be constantly reassured that we have not been abandoned? Are we living out of time, out of our time, if we do not agree to be prisoners of anxiety? Must we conform and surrender and live incompletely, making others around us similarly incomplete?

    I think not. It is perverse to regard anxiety as an integral and indispensable part of our life and our contemporaneity. It is difficult to admit, especially when we are unhappy, but we come into the world to try to be happy. And to try to make others happy. 

    Sarah may have suffered from an anxiety disorder, a serious illness that required appropriate treatment, or perhaps, as she later admitted, she simply felt guilty because, too busy with her work, she had not communicated with her friend for months and was now embarrassed about her absence, about suddenly making herself heard. When we abdicate the faculty of speech, we can only reconstruct the thoughts and feelings of others by means of clues. Often we interpret them incorrectly. Silence confuses us.

    I was once like that. There was a time when anyone could read the words senza parole — “speechlessness” — on my wrist. It was the expression that I got tattooed on my skin when I lost my mother: I can’t say a word, I don’t want to speak. It was my first tattoo, an indelible warning whenever someone held out his hand to help me. I pushed away from everyone after my mother died, especially from myself. I even dyed my hair black so as not to see in the mirror a reflection which resembled the mother I no longer had.

    But “speechlessness” is now the word I hate most, because I understood later, much later, that the words you need to say are always available to you, and you have to make the effort to find them. Just as Plato said, words have the power to create, to form reality — real words, which have equally real effects on our present. As Sarah’s sad story reveals, the absence of words is the absence of reality. Without words there is no life, only anxiety, only malaise.

    I covered up that tattoo in Sarajevo, a few days before my first book was published, because I had finally found my words. When people smile at the black ink stain that wraps my right wrist like a bracelet, I smile too, because only I know what is underneath, the error that was stamped on my flesh that I have now stamped out. How much life was born after the muzzle was destroyed!

    Whatever production of ourselves we stage, there will always be a little detail — a precarious gesture, a forced laugh, an uncertainty, an imbalance — that exposes the inconsistency between what we are doing and what we really want to do.

    We are not films, there is no post-production in life, and special effects lose their luster quickly. We are perpetually a first version, opera prima, drafts and sketches of the tragedy or comedy of ourselves, as in that moment at sunset in Syracuse or Taormina when the actors entered the scene to begin the show.

    Today we all live entangled in a bizarre situation. We have the most immense repository of media in human history and we no longer know what or how or with whom to communicate. I am convinced that we have never before felt so alone. The reason is not that we are silent. Quite the contrary. We talk and talk and talk, until talking exhausts us. But the perpetual cacophony allows us to ignore that we communicate little of substance. We tend to say the bare minimum, to speak quickly and efficiently, to abbreviate, to signal, to hide, to be always easy and never complex. We seem, simultaneously, afraid of being misunderstood and afraid of being understood. The human act of saying has become synthetic, a constant pitch, a transactional practice borrowed from business in which we must persuade our interlocutors in just a few minutes to commit everything they have. Our speech is an advertisement, a performance. Joy is a performance, pain is a performance — and a speedy one. If we do not translate our sentiments into slogans and cliches, graphics and “visualizations,” if we do not express ourselves in the equivalents of summaries, slides, and abstracts, if our presentation of our feelings or our ideas exceeds a commonly accepted time limit (reading time: three minutes), then we fear that nobody will have the patience to listen to us.

    We have swapped the infinity of our thoughts for the stupid finitude of 280 characters. We send notices of our ideas and notifications of our feelings, rather like smoke signals. Is there anything more like a smoke signal than Instagram stories, which are similarly designed to disappear? 

    Brevity is now the very condition of our communication. We behave like vulgar epigrammatists, electronically deforming the ancient art of Callimachus and Catullus. We condense what we have to say into each of the many chats on which we try desperately to make ourselves heard by emoticons and phrases and acronyms shot like rubber bullets that bounce here and there as in an amusement park. We refuse subordinate clauses, the complicated verbal arrangement — appropriate for the complexity of actual ideas and feelings — known as hypotaxis, fleeing from going hypò, or “below” the surface, and preferring instead to remain parà, or “next,” on the edge of the parataxis, the list of the things and people we love.

    We refuse to know each other and in the meantime we all talk like oracles.

    It is a fragile paradox, which should be acknowledged without irony (that hollow armor) and which demands love rather than bitter laughter: the less we say about ourselves, the more we reveal about ourselves. Only we do it in a skewed, precarious way. And we do it deceptively, even treasonously.

    Our brevity is only a postponement of what sooner or later will be expressed, but in a twisted way. Surely others have observed the tiny breakdowns, the personal explosions that plague any person forced to live in a perpetual state of incompleteness. Have you never seen someone who, finding herself without words, ends up screaming and madly gesticulating? Everywhere we end up sabotaging the image of perfection that we impose on ourselves with small, miserable, inhuman actions. An unjustified fit of anger on a train: a wrong seat, a suitcase that doesn’t fit, a crying baby, a dog, an insult at the traffic light, and suddenly we are hurling unrepeatable shrieks out the window before running away like thieves. Or perhaps you have observed another symptom of this unhealthy condition: anxious indecision — an unnerving slowness to order at the restaurant, you choose, I don’t know, I’m not sure, maybe yes, of course not, in front of a bewildered waiter, while we collapse as if the course of our whole life depended on the choice of a pizza. 

    Once upon a time, revolutions were unleashed to obtain freedom from a master. Today the word “revolution” is thrown around in political discourse, but in our inner lives it makes us so afraid that we prefer to oppress ourselves, to renounce the treasures of language and the strengths they confer. And so silence has become our master, imprisoning us in loneliness. A noisy silence, a busy loneliness. The result is a generalized anxiety that, when it explodes, because it always explodes sooner or later, makes us ashamed of ourselves.

    When we give our worst to innocent strangers, we would like immediately to vanish, to erase the honest image of ourselves unfiltered. We tell ourselves that it was only what we did there — on the subway at rush hour when an old lady cluttered us with her shopping bags, or in the line at the post office, annoyed because we lost our place while we were fiddling with the phone or with a post on Facebook in which we commented on something about which we do not care and about which we have nothing to say because there is nothing to say about it. That is not who we really are. It was a mistake. It was not representative — or so we tell ourselves.

    If we are ashamed, if we want to disappear after these common eruptions, it is for all that we have not done, for all that we have not said, to these strangers and to others we have encountered before. By remaining silent, or by speaking only efficiently, before the spectacle of life, without calling anything or anyone by name, without relishing descriptions, not only do we not know things, as Plato warned, but we do not even know ourselves.

    Who are we, thanks to our words?

    Futilitarianism or To the York Street Station

    Wednesday, April 8th…a date etched in black for socialists and progressives, marking the end of a beautiful fantasy. It was on that doleful day that Senator Bernie Sanders — acknowledging the inevitable, having depleted his pocketful of dreams — announced the suspension of his presidential campaign. It was the sagging anticlimax to an electoral saga that came in like a lion and went out with a wheeze. For months the pieces had been falling into place for Sanders to secure the Democratic nomination, only to fall apart in rapid slow motion on successive Super Tuesdays, a reversal of fortune that left political savants even more dumbstruck than usual. Taking to social media, some of Sanders’ most fervent and stalwart supporters in journalism, punditry, and podcasting responded to the news of his withdrawal with the stoical grace we’ve come to expect from these scarlet ninja. Shuja Haider, a high-profile leftist polemicist who’s appeared in the Guardian, The Believer, and the New York Times, tweeted: “Well the democratic party just officially lost the support and participation of an entire generation. Congratulations assholes.” (On Twitter, commas and capital letters are considered optional, even a trifle fussy.) Will Menaker, a fur-bearing alpha member of the ever popular Chapo Trap House podcast (the audio clubhouse of the self-proclaimed “dirtbag left”), declared that with Bernie out of the race, Joe Biden, “has his work cut out for him when it comes to winning the votes of a restive Left that distrusts and dislikes him. It’s not impossible if he starts now by sucking my dick.” Others were equally pithy.

    It fell upon Jacobin, the neo-Marxist quarterly and church of the one true faith, to lend a touch of class to the valedictory outpourings. Political admiration mingled with personal affection as it paid homage to the man who had taken them so far, but not far enough. On its website (the print edition is published quarterly) it uncorked a choral suite of tributes, elegies, and inspirational messages urging supporters to keep their chins up, their eyes on the horizon, their gunpowder dry, a song in their hearts: “Bernie Supporters, Don’t Give Up,” “We Lost the Battle, but We’ll Win the War,” “Bernie Lost. But His Legacy Will Only Grow.” In this spirit, the magazine’s editor and founder, Bhaskar Sunkara, author of The Socialist Manifesto: The Case for Radical Politics in an Era of Extreme Inequality, conducted a postmortem requiem on YouTube with his Jacobin comrades processing their grief and commiserating over their disappointment. Near the end of the ceremony, Sunkara declared that Bernie’s legacy would be as a moral hero akin to Martin Luther King, Mother Jones, and Eugene V. Debs. Which offered a measure of bittersweet consolation, but was not what Sunkara had originally, thirstily desired. “I wanted him to be fucking Lenin. I wanted him to take power and institute change.” But the Bernie train never reached the Finland Station, leaving the Jacobins cooling their heels on the platform and craning their necks in vain.

    Politically and emotionally they had banked everything on him. “Socialism is the name of our desire,” Irving Howe and Lewis Coser had famously written, and for long fallow seasons that desire lay slumbrous on the lips until awakened by Bernie Sanders, the son of Jewish immigrants from Poland, the former mayor of Burlington, Vermont, the junior senator of that state, and lifelong champion of the underdog. Where so many longtime Washington figures had been led astray by sinecures, Aspen conferences, and unlimited canapes, Sanders had been fighting the good fight for decades without being co-opted by Georgetown insiders and neoliberal think tanks, like a protest singer who had never gone electric. He might not be a profound thinker or a sonorously eloquent orator (on a tired day he can sound like a hoarse seagull), and his legislative achievement may be a bit scanty, but his tireless ability to keep pounding the same nails appealed to youthful activists that had come to distrust or even detest the lofty cadences of Barack Obama now that he was gone from office and appeared to halo into Oprah-hood. Eight years of beguilement and what had it materially gotten them? grumbled millennials slumped under student debt and toiling in unpaid internships. What Bernie lacked in movie-poster charisma could be furnished by Jacobin, which emblazoned him as a lion in winter.

    So confident was Jacobin that the next great moment in history was within its grasp that in the winter of 2019 it devoted a special issue to the presidency of Bernie Sanders, whose cover, adorned with an oval portrait of Sanders gazing skyward, proclaimed: “I, President of the United States and How I Ended Poverty: A True Story of The Future.” Subheads emphasized that this was not just an issue of a magazine, a mere collation of ink and paper, it was the beginning of a crusade — a twenty-year plan to remake America. Avengers, assemble! At the public launch of the “I, President” issue, Sunkara rhetorically asked, “Is there a point in spending all day trying to explain, like, the Marxist theory of exploitation to some 18-year-old? Yes! Because that kid might be the next Bernie Sanders.” 

    Alas, Jacobin made the mistake of counting their red berets before they were hatched, and now the issue is fated to become a collector’s item, a poignant keepsake of what might have been. Had Sanders remained in the race and won the presidency, Jacobin would have been as credited, identified, and intimately associated with the country’s first socialist administration as William F. Buckley, Jr.’s National Review was with Ronald Reagan’s. Jacobin could have functioned as its ad hoc brain trust, or at least its nagging conscience. From that carousel of possibilities the magazine instead finds itself reckoning with the divorce of its socialist platform from its standard bearer, facing the prospect of being just another journal of opinion jousting for attention. No longer ramped up as a Bernie launch vehicle, Jacobin must tend to the churning ardor for grand-scale structural change and keep its large flock of followers from straying off into the bushes, which is not easy to do after any loss, no matter how noble. “In America, politics, like everything else, tends to be all or nothing,” Irving Howe observed in Socialism and America. And after working so hard on Bernie’s behalf, it’s hard to walk away with bupkis. 

    Jacobin possesses a strong set of jaws, however. It will not be letting go of its hold in the marketplace of ideas anytime soon. For better or ill, it will continue to set the tone and tempo on the left even in the absence of its sainted gran’pop. Since initiating publication in 2010, Jacobin has established itself as an entrepreneurial success, a publishing sensation, and an ideological mothership. It has built up its own storehouse of intellectual capital, an identifiable brand. Taking its name and sabre’d bravado from the group founded by Maximilien Robespierre that conducted the French Revolution’s Reign of Terror (an early issue featured an IKEA-like guillotine on the cover, presumably for those fancying to stage their own backyard beheadings — “assembly required,” the caption read), Jacobin located a large slumbering discontent in the post-Occupy Wall Street/Great Recession stagnancy among the educated underemployed and gave it a drumbeat rhythm and direction.

    From the outset the magazine exuded undefeatable confidence, the impression that history with a capital H was at its back. Its confidence in itself proved not misplaced. Where even before the coronavirus most print magazines were on IV drips, barely sustainable and in the throes of a personality crisis, Jacobin’s circulation has grown to 40,000 plus (more than three times that of Partisan Review in its imperious prime); it has sired and inspired a rebirth of socialist polemic (Why Women Have Better Sex Under Socialism, The ABCs of Socialism, Why You Should Be a Socialist, and the forthcoming In Defense of Looting), and helped recruit a young army of activists to bring throbbing life to Democratic Socialists of America, whose membership rolls as of late 2019 topped 56,000, with local chapters popping up like fever blisters. 

    The editorial innovation of Sunkara’s Jacobin was that it tapped into animal spirits to promote its indictments and remedies, animal spirits normally being the province of sports fans, day traders, and bachelorette parties but not of redistributionists, egalitarians, and social upheavers. Even its subscription form is cheeky: “The more years you select, the better we can construct our master plan to seize state power.” Although the ground game of socialism was traditionally understood as a conscientious slog — meetings upon meetings, caucusing until the cows come home, microscopic hair-splitting of doctrinal points — Jacobin lit up the scoreboard with rhetoric and visuals that evoked the heroic romanticism of revolution, history aflush with a red-rose ardor. The articles can be dense and hoarse with exhortations (“we must build…,” “we must insist…” we must, we must), the writing unspiced by wit, irony, and allusion (anything that smacks of mandarin refinement), and the infographics more finicky than instructive, but the overall package has a jack-in-the-box boing!, a kinetic aesthetic that can be credited to its creative director, Remeike Forbes. Not since the radical Ramparts of the 1960s, designed by Dugald Stermer, has any leftist magazine captured lightning in a bottle with such flair.

    Effervescence is what sets Jacobin apart from senior enterprises on the left such as The Nation, Dissent, New Left Review, and that perennial underdog Monthly Review, its closest cousin being Teen Vogue, Conde Nast’s revolutionary student council fan mag — the Tiger Beat of glossy wokeness. When not extolling celebrity styling (“Kylie Jenner’s New Rainbow Manicure Is Perfect for Spring”), Teen Vogue posts junior Jacobin tutorials on Rosa Luxemburg and Karl Marx, whose “writings have inspired social movements in Soviet Russia, China, Cuba, Argentina, Ghana, Burkina Faso, and more…” (most of those movements didn’t pan out so well, but they left no impact on Kylie’s manicure). 

    Jacobin recognized that hedonics are vital for the morale and engagement of the troops, who can’t be expected to keep chipping away forever at the fundament of the late-capitalist, post-industrial, Eye of Sauron hegemon. No longer would socialists be associated with aging lefties in leaky basements cranking the mimeograph machine and handing out leaflets on the Upper West Side — socialism now had a hip new home in Brooklyn where the hormones were hopping and bopping pre-corona. “‘Everybody looks fuckin’ sexy as hell,’” shouted [Bianca] Cunningham, NYC-DSA’s co-chair. ‘This is amazing to have everybody here looking beautiful in the same room, spreading the message of socialism.’” So recorded Simon van Zuylen-Wood in “Pinkos Have More Fun,” his urban safari into the dating-mating, party-hearty socialist scene for New York magazine.

    In the middle of the dance floor I ran into Nicole Carty, a DSA-curious professional organizer I also hadn’t seen since college, who made a name for herself doing tenant work after Occupy Wall Street. (DSA can feel like a never-ending Brown University reunion.) “Movements are, yeah, about causes and about progress and beliefs and feelings, but the strength of movements comes from social ties and peer pressure and relationships,” Carty said. “People are craving this. Your social world intersecting with your politics. A world of our own.”

    Jacobin’s closest companion and competitor in the romancing of the young and the restless is The Baffler, founded in 1988, at the height of the Reagan imperium, allowed to lapse in 2006, revived from cryogenic slumber in 2010, and going strong ever since. Both quarterlies publish extensive and densely granulated reporting and analytical pieces on corporate greed, treadmill education, factory farming, and America’s prison archipelago, though The Baffler slants more essayistic and art-conscious, a Weimar journal for our time. The chief difference, however, is one of temperament and morale. Where Jacobin, surveying the wreckage and pillage, holds out the promise that the cavalry is assembling, preparing to ride, The Baffler often affects a weary-sneery, everything-sucks, post-grad-school vape lounge cynicism, as if the battle for a better future is a futile quest — the game is rigged, the outcome preordained. “Forget it, Jake, it’s Chinatown.” 

    The Baffler’s bullpen of highly evolved futilitarians leans hard on the words “hell” and “shit” to register their scorn and disgust at the degradation of politics and culture in our benighted age by rapacious capital with the complicity of champagne-flute elitists and the good old dumb-ox American booboisie. It’s Menckenesque misanthropy (minus Mencken’s thunder rolls of genius) meets Blade Runner dystopia with a dab of Terry Southern nihilism, and it’s not entirely a warped perspective — the world is being gouged on all sides by kleptocratic plunder. But The Baffler offers mostly confirmation of the system’s machinations, the latest horrors executed in fine needlepoint, no exit from the miasma. Each issue arrives as an invitation to brittle despair.

    Jacobin, by contrast, acts as more of an agent of transmutation, a mojo enhancer for the socialist mission. This is from “Are You Reading Propaganda Right Now?” by Liza Featherstone, which appeared in its winter 2020 issue:

    One of the legacies of the Cold War is that Americans assume propaganda is bad. While the term “propaganda” has often implied that creators were taking a manipulative or deceptive approach to their message — or glossing over something horrific, like World War I, the Third Reich, or Stalin’s purges — the word hasn’t always carried that baggage. Lenin viewed propaganda as critical to building the socialist movement. In his 1902 pamphlet What Is to Be Done?, it’s clear that his ideal propaganda is an informative, well-reasoned argument, drawing on expertise and information that the working-class might not already have. That’s what we try to do at Jacobin.

    It is worth asking how much these excitable Leninists actually know about their Bolshie role model. Did they notice Bernie’s response to Michael Bloomberg’s use of the word “communist” to describe him at one of the debates? He called it “a cheap shot.” Say what you will about Sanders, but he recoiled at the charge. He, at least, is familiar with Lenin’s work.

    Jacobin’s mistake was to think it could play kingmaker too. In It Didn’t Happen Here: Why Socialism Failed in the United States, Seymour Martin Lipset and Gary Marks delineated the unpatchable differences between “building a social movement and establishing a political party,” or, in this case, taking over an existing one. (As Irving Howe cautioned, “You cannot opt for the rhythms of a democratic politics and still expect it to yield the pathos and excitement of revolutionary movements.”) Political parties represent varied coalitions and competing interests, requiring expediency, horse trading, and tedious, exhausting staff work to achieve legislative ends. Lipset and Marks: “Social movements, by contrast, invoke moralistic passions that differentiate them sharply from other contenders. Emphasis on the intrinsic justice of a cause often leads to a rigid us-them, friend-foe orientation.” 

    The friend-foe antipathy becomes heightened and sharpened all the more in the Fight Club of social media, where the battle of ideas is waged with head butts and low blows. In print and online, Jacobin wasn’t just Sanders’ heraldic evangelist, message machine, and ringside announcer (“After Bernie’s Win in Iowa, the Democratic Party Is Shitting Its Pants” — actual headline), it doubled as the campaign’s primary enforcer, methodically maligning and elbowing aside any false messiah obstructing the road to the White House, ably assisted by the bully brigade of “Bernie Bros” and other nogoodniks who left their cleat marks all across Twitter. Excoriation was lavished upon pretenders who had entered the race out of relative obscurity and momentarily snagged the media’s besotted attention, such as Texas’ lean and toothy Beto O’Rourke, whose campaign peaked when he appeared as Vanity Fair’s cover boy and petered out from there (“Beto’s Fifteen Minutes Are Over. And Not a Moment Too Soon,” wrote Jacobin’s Luke Savage, signing the campaign’s death certificate).

    Pete Buttigieg received a more brutal hazing, ad hominemized from every angle. Jacobin despised him from the moment his Eddie Haskell head peeped over the parapet — that this Rhodes scholar, military veteran who served in Afghanistan, and current mayor of South Bend, Indiana had written a tribute to Bernie Sanders when he was in high school only made him seem more fishily Machiavellian in their minds. A sympathetic, personally informed profile by James T. Kloppenberg in the Catholic monthly Commonweal portrayed Buttigieg as a serious, driven omnivore of self-improvement, but in Jacobin he barely registered as a human being, derided as “an objectively creepy figure” by Connor Kilpatrick (“That he is so disliked by the American public while Sanders is so beloved…should hearten us all”), and roasted by Liza Featherstone for being so conceited about his smarts, an inveterate showoff unlike you-know-who: “Bernie Sanders, instead of showing off his University of Chicago education, touts the power of the masses: ‘Not Me, Us.’ The cult of the Smart Dude leads us into just the opposite place, which is probably why some liberals like it so much.”

    There was no accomplishment of Buttigieg’s that Jacobin couldn’t deride. Buttigieg’s learning Norwegian (he speaks eight languages) to read the novelist Erlend Loe would impress most civilians, but to Jacobin it was more feather-preening, and un-self-aware besides: “Pete Buttigieg’s Favorite Author Despises People Like Him,” asserted Ellen Engelstad with serene assurance in one of the magazine’s few stabs at lit crit. Even Buttigieg’s father — the renowned Joseph Buttigieg, a professor of literature at Notre Dame who translated Antonio Gramsci and founded The International Gramsci Society — might have washed his hands of this upstart twerp, according to Jacobin. By embracing mainstream Democratic politics, “Pete Buttigieg Just Dealt a Blow to His Father’s Legacy,” Joshua Manson editorialized. The American people, Norwegian novelists, the other kids in the cafeteria, Hamlet’s ghost — the message was clear: nobody likes you, Pete! Take your salad fork and go home!

    Buttigieg may have betrayed his Gramscian legacy but it was small beans compared to the treachery of which another Sanders rival was capable. In “How the Cool Kids of the Left Turned on Elizabeth Warren,” Politico reporter Ruairi Arrieta-Kenna chronicled Jacobin’s spiky pivot against Elizabeth Warren, that conniving vixen. Arrieta-Kenna: “It wasn’t so long ago that you could read an article in Jacobin that argued, ‘If Bernie Sanders weren’t running, an Elizabeth Warren presidency would probably be the best-case scenario.’ In April, another Jacobin article conceded that Warren is ‘no socialist’ but added that ‘she’s a tough-minded liberal who makes the right kind of enemies,’ and her policy proposals ‘would make this country a better place.’” Her platform and Sanders’ shared many of the same planks, after all.

    Planks, schmanks, the dame was becoming a problem to the Jacobin project, cutting into Bernie’s constituency and being annoyingly indefatigable, waving her arms around like a baton twirler. Warren needed to be sandbagged to open a clear lane for Bernie. Hence, “in the pages of Jacobin,” Arrieta-Kenna wrote, “Warren has gone from seeming like a close second to Sanders to being a member of the neoliberal opposition, perhaps made even worse by her desire to claim the mantle of the party’s left.” The J-squad proceeded to work her over with a battery of negative stories headlined “Elizabeth Warren’s Head Tax Is Indefensible,” “Elizabeth Warren’s Plan to Finance Medicare for All Is a Disaster,” and “Elizabeth Warren Is Jeopardizing Our Fight for Medicare for All,” and warned, quoting Arrieta-Kenna again, “that a vote for Warren would be ‘an unconditional surrender to class dealignment.’” When Warren claimed that Sanders had told her privately that a woman couldn’t defeat Donald Trump and declined to shake Bernie’s hand after the January 14 Democratic debate, she completed the arc from valorous ally to squishy opportunist to Hillary-ish villainess. Little green snake emojis slithered from every cranny of Twitter at the mention of Warren’s name, often accompanied by the hashtag #WarrenIsASnake, just in case the emojis were too subtle. Compounding her trespasses, Warren declined to endorse Sanders after she withdrew from the race, blowing her one shot at semi-redemption and a remission of sins. Near the end of Jacobin’s YouTube postmortem, Sunkara expressed sentiments that seemed to be universal in his cenacle: “Fuck Elizabeth Warren,” he explained, “and her whole crew.”

    Once Buttigieg and Warren dropped out of serious contention, the sole remaining obstacle was Joe Biden, whom Jacobin considered a paper-mache relic in a dark suit loaned out from the prop department and seemingly incapable of formulating a complete sentence, much less a coherent set of policies — an entirely plausible caricature, as caricatures go. Occasionally goofy and even surreal in his off-the-cuff remarks, Biden doesn’t suggest deep reserves of fortitude and gravitas. In February 2020, Verso published Yesterday’s Man: The Case Against Joe Biden by Jacobin staff writer Branko Marcetic, its cover photograph showing an ashen Biden looking downcast and abject, as if bowing his weary head to the chopping block of posterity. But on the first Super Tuesday, the Biden candidacy, buoyed by the endorsement by the formidable James Clyburn and the resultant victory in South Carolina, rose from the dusty hallows and knocked Sanders sideways. It was the revenge of the mummy, palpable proof that socialism may have been in vogue with the media and the millennials but rank and file Democrats, especially those of color, weren’t interested in lacing up their marching boots. For them, the overriding imperative was not Medicare for All or the Green New Deal but denying Donald Trump a second term and the opportunity to reap four more years of havoc and disfigurement. In lieu of Eliot Ness, Joe Biden was deemed the guy who had the best shot of taking down Trump and his carious crew.

    For a publication so enthralled to the Will of the People and the workers in their hard-won wisdom, it’s remarkable how badly Jacobin misread the mood of Democratic voters and projected its own revolutionary ferment on to it — a misreading rooted in a basic lack of respect for the Democratic Party, its values, its history, its heroes (apart from FDR, since Sanders often cited him), its institutional culture, its coalitional permutations — all this intensified with an ingrained loathing for liberalism itself. From its inception, Jacobin, like so many of its brethren on the Left, has displayed far more contempt and loathing for liberals, liberalism, and the useless cogs it labels “centrists” than for the conservatives and reactionaries and neo-fascists intent on turning the country into a garrison state with ample parking. It has a softer spot for hucksters, too. It greeted libertarian blowhard podcaster Joe Rogan’s endorsement of Sanders as a positive augury — “It’s Good Joe Rogan Endorsed Bernie. Now We Organize” — and published a sympathetic profile of the odious Fox News host Tucker Carlson. This has been its modus operandi all along. In a plucky takedown of the magazine in 2017 called “Jacobin Is for Posers,” Christopher England noted, “It can claim two issues with titles like ‘Liberalism is Dead,’ and none, henceforth, that have shined such a harsh light on conservatism.” For Jacobin, liberalism may be dead or playing possum but it keeps having to be dug up and killed again, not only for the exercise but because, England writes, “conservatism, as its contributors consistently note, can only be defeated if liberalism is brought low.” Remove the flab and torpor of tired liberalism and let the taut sinews of the true change-maker spring into jaguar action. 

    Which might make for some jungle excitement, but certainly goes against historical precedent. “In the United States, socialist movements have usually thrived during times of liberal upswing,” Irving Howe wrote in Socialism and America, cautioning, “They have hastened their own destruction whenever they have pitted themselves head-on against liberalism.” Tell that to Jacobin, which either didn’t learn that lesson or considered it démodé, irrelevant in the current theater of conflict. With the Democratic Party so plodding and set in its ways, a rheumy dinosaur that wouldn’t do the dignified thing and flop dead, the next best thing was to occupy and replenish the host body with fresh recruits drawn from young voters, new voters, disaffected independents, blue-collar remnants, and pink-collar workers. Tap into this vast reservoir of idealism and frustration to unleash bottoms-up change and topple the status quo, writing fini to politics as usual. Based on 2016 and how strongly Sanders ran above expectations, this wasn’t a reefer dream.

    The slogan for this campaign was “Not Me. Us,” and it turned out there were a lot fewer “us” this time around. “Mr. Sanders failed to deliver the voters he promised,” wrote John Hudak, a deputy director and senior fellow at the Brookings Institution, analyzing the 2020 shortfall. “Namely, he argued that liberal voters, new voters, and young voters would dominate the political landscape and propel him and his ideas to the nomination. However, in nearly every primary through early March, those voters composed significantly smaller percentages of the Democratic electorate than they did in 2016.” It wasn’t simply a matter of Sanders competing in a more crowded field this time, Hudak reported. In the nine primaries after Warren’s withdrawal, when it became a two-person race, “Mr. Sanders underperformed his 2016 totals by an average of 16.0%, including losing three states that he won in 2016 (Idaho, Michigan, and Washington).” How did Jacobin miss the Incredible Sanders Shrinkage of 2020? 

    It became encoiled in its own feedback loop, hopped up on its own hype. “Twitter — a medium that structurally encourages moral grandstanding, savage infighting, and collective action — is where young socialism lives,” van Zuylen-Wood had observed in “Pinkos Have More Fun,” and Twitter, to state the obvious, is not the real world, but a freakhouse simulacrum abounding with trolls, bots, shut-ins, and soreheads. Jacobin and its allies so dominated online discourse that they didn’t comprehend the limits of that dominance until it hit them between the mule ears. They fell victim to what has come to be known as Cuomo’s Law, which takes its name from the New York gubernatorial contest in 2018 between Andrew Cuomo and challenger Cynthia Nixon, a former cast member of Sex and the City and avowed democratic socialist. On Twitter, Nixon had appeared the overwhelming popular favorite, Cuomo the saturnine droner that no one had the slightest passion for. But Cuomo handily defeated Nixon, demonstrating the disconnect between online swarming and actual turnout: ergo, Cuomo’s Law. 

    Confirming Cuomo’s Law, Joe Biden probably had less Twitter presence and support than any of the other major candidates, barely registering on the radar compared to Sanders, and yet he coasted to the top of the delegate count until the coronavirus hit the pause button on the primary season. Sanders’ endorsement of Biden in a joint livestream video on April 13th not only conceded the inevitable but delivered a genuine moment of reconciliation that caught many off-guard, steeped in the residual rancor of 2016. Whatever his personal disappointment, Sanders seems to have made peace with defeat and with accepting a useful supporting role in 2020; he refuses to dwell in acrimony. The same can’t be said about many of the defiant dead-enders associated with Jacobin, who, when not rumor-mongering about Biden’s purported crumbling health, cognitive decline, incipient dementia, and basement mold, attempted to kite Tara Reade’s tenuous charges of sexual harassment and assault at the hands of Biden into a full-scale Harvey Weinstein horror show, hoping the resultant furor would dislodge Biden from the top of the ticket and rectify the wrong done by benighted primary voters. For so Jacobin had written and so it was said: “If Joe Biden Drops Out, Bernie Sanders Must Be the Democratic Nominee.”

    Like Norman Thomas, the longtime leader of the Socialist Party in America, Bernie Sanders bestowed a paternal beneficence upon the left that has given it a semblance of unity and personal identity. He is the rare politician one might picture holding a shepherd’s crook. The problem is that identification with a singular leader is an unsteady thing for a movement to lean on. Long before Thomas died in 1968, having run for the presidency six times, the socialist movement had receded into gray twilight, upstaged by the revolutionary tumult on campuses and in cities. Jacobin is determined to make sure history doesn’t reprise itself once Sanders enters his On Golden Pond years. Preparing the post-Bernie stage of the socialist movement, a pair of Jacobin authors, Meagan Day and Micah Uetricht, collaborated on Bigger Than Bernie: How We Go from the Sanders Campaign to Democratic Socialism (Verso), a combination instruction manual and inspirational hymnal.

    The duo doesn’t lack for reasons to optimize the upside for the ardent young socialists looking to Alexandria Ocasio-Cortez as their new scoutmaster. The coronavirus crisis has laid bare rickety infrastructure, the lack of preparedness, near-sociopathic incompetence, and widespread financial insecurity that turned a manageable crisis into a marauding catastrophe, making massive expansion of health coverage, universal basic income, and debt relief far more feasible propositions. The roiling convulsions following the death of George Floyd once again exposed the brutal racism and paramilitarization of our police forces. A better, more humane future has never cried out more for the taking. But there is a catch: it can be seized only in partnership with liberal and moderate Democrats, no matter how clammy the clasping hands might be, no matter how mushy the joint resolutions, and this will be galling for Jacobin’s pride and vocation, making it harder for them to roll out the tumbrils with the same gusto henceforth. The magazine, after conducting introspective postmortems (“Why the Left Keeps Losing — and How We Can Win”) and intraparty etiquette lessons (“How to Argue with Your Comrades”), finds itself feeling its way forward, with the occasional fumble. When Bhaskar Sunkara announced on Twitter that he intends to cast his presidential vote for Green Party candidate Howie Hawkins (who he?), one of those showy public gestures that leaves no trace, he received pushback from fellow comrades in The Nation (“WTF Is Jacobin’s Editor Thinking in Voting Green?”) and elsewhere. Clarifying his position in The New York Times, where clarifications learn to stand up tall and straight, Sunkara assured the quivering jellies who read the opinion pages that “contrary to stereotypes, we are not pushing a third candidate or eager to see Mr. Trump’s re-election. Instead we are campaigning for core demands like Medicare for All, saving the U.S. Postal Service from bipartisan destruction, organizing essential workers to fight for better pay and conditions throughout the coronavirus crisis and backing down-ballot candidates, mostly running on the Democratic ballot line… Far from unhinged sectarianism, this is a pragmatic strategy.”

    Jacobin pragmatism? This is a historical novelty. By November we will know if they are able to make it to the altar without killing each other. It’s hard to settle once you’ve had a taste of Lenin.

    Night Thoughts

    Long ago I was born.
    There is no one alive anymore
    who remembers me as a baby.
    Was I a good baby? A
    bad? Except in my head
    that debate is now
    silenced forever.
    What constitutes
    a bad baby, I wondered. Colic,
    my mother said, which meant
    it cried a lot.
    What harm could there be
    in that? How hard it was
    to be alive, no wonder
    they all died. And how small
    I must have been, suspended
    in my mother, being patted by her
    approvingly.
    What a shame I became
    verbal, with no connection
    to that memory. My mother’s love!
    All too soon I emerged
    my true self,
    robust but sour,
    like an alarm clock.

    Mahler’s Heaven and Mahler’s Earth

    Gustav Mahler: the face of a man wearing glasses. The face attracts the attention of the viewer: there is something very expressive about it. It is a strong and open face, we are willing to trust it right away. Nothing theatrical about it, nothing presumptuous. This man wears no silks. He is not someone who tells us: I am a genius, be careful with me. There is something energetic, vivid, and “modern” about the man. He gives an impression of alacrity: he could enter the room any second. Many portraits from the same period display men, Germanic and not only Germanic men, politicians, professors, and writers, whose faces disappear stodgily into the thicket of a huge voluptuous beard, as if hiding in it, disallowing any close inspection. But the composer’s visage is naked, transparent, immediate. It is there to speak to us, to sing, to tell us something.

    I bought my first recording of Gustav Mahler many decades ago. At the time his name was almost unknown to me. I only had a vague idea of what it represented. The recording I settled on was produced by a Soviet company called Melodiya — a large state-owned (of course) company which sometimes produced great recordings. There was no trade in the Soviet Union and yet the trademark Melodiya did exist. It was the Fifth Symphony, I think — I’ve lost the vinyl disc in my many voyages and moves — and the conductor was Yevgeny Svetlanov. For some reason the cover was displayed in the store window for a long time; it was a modest store in Gliwice, in Silesia. Why the display of Mahler’s name in this provincial city which generally cared little for music?

    It took me several days before I decided to buy the record. And then, very soon, when I heard the first movement, the trumpet and the march, which was at the same time immensely tragic and a bit joyful too, or at least potentially joyful, I knew from this unexpected conjunction of emotions that something very important had happened: a new chapter in my musical life had opened, and in my inner life as well. New sounds entered my imagination. At the same time I understood — or only intuited — that I would always have a problem distinguishing between “sad” and “joyful,” both in music and in poetry. Some sadnesses would be so delicious, and would make me so happy, that I would forget for a while the difference between the two realms. Perhaps there is no frontier between them, as in the Schengen sector of contemporary Europe.

    The Fifth Symphony was my gateway to Mahler’s music. Many years after my first acquaintance with it, a British conductor told me that this particular symphony was deemed by those deeply initiated in Mahler’s symphonies and Mahler’s songs as maybe a bit too popular, too accessible, too easy. “That trumpet, you know.” “And, you know, then came Visconti,” who did not exactly economize on the Adagietto from the same symphony in the slow, very slow shots in Death in Venice, where this music, torn away from its sisters and brothers, the other movements, came to serve a mass-mystical, mass-hysterical cultish enthusiasm, floating on the cushions of movie theater chairs. Nothing for serious musicians, nothing for scholars and sages…. But I do not agree. For me the Fifth Symphony remains one of the living centers of Gustav Mahler’s music and no movie will demote it, no popularity will diminish it, no easily manipulated melancholy in a distended Adagietto will make me skeptical about its force, its freshness, its depth.

    As for that trumpet: the trumpet that I heard for the first time so many years ago had nothing to do with the noble and terrifying noises of the Apocalypse. It was nothing more than an echo of a military bugle — which, the biographers tell us, young Gustav must have heard almost every week in his small Moravian town of Jihlava, or Iglau in German, which was the language of the Habsburg empire, where local troops in their slightly comic blue uniforms would march in the not very tidy streets to the sounds of a brass orchestra. Yet there was nothing trivial or farcical about this almost-a-bugle trumpet. It told me right away that in Mahler’s music I would be exposed to a deep ambivalence, a new complication — that the provincial, the din of Habsburgian mass-culture, would forever pervade his symphonies. This vernacular, this down-to-earth (down to the cobblestones of Jihlava’s streets) brass racket, always shadows Mahler’s most sublime adagios.

    The biographical explanation is interesting and important, but it is not sufficient. An artist of Mahler’s stature does not automatically or reflexively rely on early experiences for his material. He uses them, and transposes them, only when they fit into a larger scheme having to do with his aesthetic convictions and longings. The strings in the adagios seem to come from a different world: the violins and the cellos in the adagios sound like they are being played by poets. But then in the rough scherzo-like movements we hear the impudent brass. From the clouds to the cobblestones: Mahler may be a mystical composer, but his mysticism is tinged with an acute awareness of the ordinary, often trite environment of all the higher aspirations.

    His aesthetic convictions and longings: what are they? Judging from the music, one thing seems to be certain: this composer is looking for the high, maybe for the highest that can be achieved, for the religious, for the metaphysical — and yet he cannot help hearing also the common laughter of the low streets, the unsophisticated noise of military brass instruments. His search for the sublime never takes place in the abstract void of an inspiration cleansed of the demotic world which is his habitat. Mahler confronts the predicament well known to many artists and writers living within the walls of modernity but not quite happy with it, because they have in their souls a deep yearning for a spiritual event, for revelation. They are like someone walking in the dusk toward a light, like a wanderer who does not know whether the sun is rising or setting. They have to decide how to relate to everything that is not light, to the vast continent of the half trivial, half necessary arrangements of which the quotidian consists. Should they ignore it, or attempt to secede from it? But then what they have to say will be rejected as nothing more than lofty rhetoric, as something artificial, as unworldly in the sense of unreal. They will be labeled “reactionary” or, even worse, boring. Anyway, aren’t they to some degree made from the same dross that they are trying to overcome, to transcend? 

    And yet if they attach too much importance to it, if they become mesmerized by what is given, by the empirical, then the sheer weight of the banality of existing conditions might crush them, flatten them to nothingness. The dross, right. But let us be fair about modernity: it has plenty of good things as well. It has given us, among other things, democracy and electricity (to paraphrase Lenin). Any honest attitude toward modernity must be extremely complex. Modernity, for better and worse, is the air we breathe. What is problematic for some artists and thinkers is modernity’s anti-metaphysical stance, its claim that we live in a post-religious world. Yet there are also artists and thinkers who applaud modernity precisely for its secularism and materialism, like the well-known French poet who visited Krakow and during a public discussion of the respective situations of French poetry and Polish poetry said this: “I admire many things in present-day Polish poetry, but there is one thing that makes me uneasy — you Polish poets still struggle with God, whereas we decided a long time ago that all that is totally childish.”

    To be sure, they — the anti-moderns, as Antoine Compagnon calls them — may also become too bitter and angry, so that their critique of the modern world can go too far and turn into an empty gesture of rejection. In his afterword to a collection of essays by Gerhard Nebel — the German conservative thinker, an outsider, once a social-democrat, always an anti-Nazi, after World War II a marginal figure in the intellectual landscape of the Bundesrepublik, a connoisseur of ancient Greek literature, someone who saw dealing with die Archaik as one of the remedies against the grayness of the modern world — Sebastian Kleinschmidt presents such a case. He admires the many merits of Nebel’s writing, his vivid emotions, his intolerance of any routine, of any Banausentum or life lived far away from the appeal of the Muses, his passionate search for the real as opposed to the merely actual — but he is skeptical of Nebel’s overall dismissal of modern civilization, since it is too sweeping to be persuasive, too lacking in nuances and distinctions. Perhaps we can put the problem this way: there is no negotiation involved, no exchange, no spiritual diplomacy.

    When coping with modernity, with those aspects of it which insist on curbing or denying our metaphysical hunger, we must be not only as brave as Hector but also as cunning as Ulysses. We have to negotiate. We need to borrow from modernity a lot: since we encounter it every day, how could we avoid being fed and even shaped by it? The very verb “to negotiate” is a good example of the complexity of the situation. It comes from negotium, from the negation of otium. Otium is the Latin word for leisure, but for contemplation too. Thus the verb “to negotiate” denotes a worldly activity that tacitly presupposes the primacy of unworldly activities (because the negation comes second, after the affirmation).

    In French, le négoce means commerce, business. We can add to it all the noise of the market and the parliament. When we negotiate, we have no otium. But it is also possible to negotiate in order to save some of the otium. We can negate otium for a while but only in order to return to it a bit later, once it has been saved from destruction. As I say, we must be cunning. 

    By the way, the notion of otium that gave birth to the verb “to negotiate” is not a marginal category, something that belongs only to the annals of academia, to books covered by dust. For the Ancients it was a central notion and a central activity, the beginning and the end of wisdom. And even now it plays an important role in a debate in which the values of modernity are being pondered: those who have problems with the new shape of our civilization accuse it of having killed otium, of having produced an infinity of new noises and activities which contribute to the end of leisure, to the extermination of contemplation. 

    But can we discuss Mahler’s music along with poetic texts by, say, Yeats and Eliot, along with the other manifestoes of modernism? Talking about music in a way that makes it seem like philosophy or a philosophical novel, a kind of Zauberberg for piano and violin, is certainly flawed. Questions are methodically articulated in philosophy and, though never fully answered, they wander from one generation to another, from the Greeks to our contemporaries. Does art need such questions? Does music need them? The first impulse is to say no, art has nothing to do with this sort of intellectual inquiry. Isn’t pure contemplation, separated from any rational discourse, the unique element of art, both painting and music, and perhaps poetry as well? 

    But maybe pure contemplation does not need to be so pure. We do not know exactly how it works (another question!), but we do know that art always takes on some coloring from its historic time, from the epoch in which it is created. Art obviously has a social history, and earthly circumstances. And yet impure contemplation is still contemplation. Let us listen for a minute to the words of a famous painter, an experienced practitioner — to Balthus in his conversations with Alain Vircondelet, which were conducted in the last years of the painter’s life:

    Modern painting hasn’t really understood that painting’s sublime, ultimate purpose — if it has one — is to be the tool or passageway to answering the world’s most daunting questions that haven’t been fathomed. The Great Book of the Universe remains impenetrable and painting is one of its possible keys. That’s why it is indubitably religious, and therefore spiritual. Through painting, I revisit the course of time and history, at an unknown time, original in the true sense of the word. That is, something newly born. Working allows me to be present on the first day, in an extreme, solitary adventure laden with all of past history.

    How fascinating: a great painter tells us that in his work he used not only his eye and his hand but also his reason, his philosophical mind; that when he painted he felt the presence of great questions. Even more: he tells us that the pressure of these questions was not inconsequential, that it led him to spirituality. We know that Mahler, in a letter to Bruno Walter, also mentioned the presence of great questions and described his state of mind while being in contact with the element of music in this way: “When I hear music, even when I am conducting, I often hear a clear answer to all my questions — I experience clarity and certainty.”

    Certainly, the questions that sit around a painter or a composer like pensive cats are very different from those which besiege a philosopher. Do they require a response? Here is one more authority: in a note serving as a preface to the publication of four of his letters about Nietzsche, Valéry remarked that “Nietzsche stirred up the combativeness of my mind and the intoxicating pleasure of quick answers which I have always savored a little too much.” The irony of it: “the intoxicating pleasure of quick answers” in a thinker who, as we know, was so proud of his philosophizing with a hammer. Of course, this one sentence comprises in a nutshell the entire judgment that the mature Valéry passed on Nietzsche — the early temptation and the later rejection of such a degree of “the combativeness of the spirit.” And it confirms our intuition: the questions that accompany art, painting, music, and poetry cannot be answered in a way similar to debates in philosophy seminars, and yet they are an invisible and inaudible part of every major artistic exertion.

    In a way, Mahler’s doubleness of approach seems completely obvious; the brass and the strings attend each other, and need each other, in the complex patterns of his symphonies. I have read that in his time he was accused by many critics of triviality in his music. They claimed that his symphonies lacked the dignity of Beethoven’s symphonies, the depth of great German music. What they ferociously attacked as trivial is probably the thing that I admire so much in Mahler’s music — the presence of the other side of our world, the inclusion of its commonness and its coarseness, of the urban peripheries, of village fairs, of the brass — the quotation of provincial life, of public parades and military marches, almost like in Nino Rota’s scores for Fellini. Very few among Mahler’s contemporaries were able to see the virtue of it.

    The charge of triviality also had anti-Semitic undertones and followed in the footsteps of Wagner’s accusation, in his “Judaism in Music,” that Jewish composers were not able to develop a deep connection with the soul of the people, and were limited to the world of the city only, gliding slickly on the surface. Jewish composers apparently could not hear the song of the earth, argued such critics. How wonderful, then, that Mahler triumphed in his own Song of the Earth! Jewish composers were accused — among the many sins of which they were accused — of introducing modern elements into their music. Never mind that one of the principal modernizers of Western music was Wagner himself. 

    I have yet to understand why Mahler has for so long, from the very beginning, been so overwhelmingly important for me, so utterly central to the evolution of my soul. Once, in speaking with some American friends, I asked them who “made” them, in the sense of a master, a teacher, un maître à penser, because I wanted to tell them that Gustav Mahler made me. It was an exaggeration, I know, and a bit precious. I had other masters as well. And yet my statement was not false. Did it have to do only with the sonorities of his symphonies, with the newness of his music, the unexpected contrasts and astonishing passages swinging from the lyric to the sardonic? Was it the formal side alone? For many years I resisted the temptation to translate my deep emotional bond to his music — the deep consonance between Mahler’s work and my own search in the domain of poetry — into intellectual terms, maybe fearing that too much light shed on it would diminish its grip on my imagination. I still hold this superstitious view, but I also suspect that there may be some larger intellectual benefit to be gained from an exploration of my obsession.

    For everyone who has a passionate interest in art and in ideas, sooner or later a problem arises. When we look for truth and try to be honest, when we try as a matter of principle to avoid dogmatism and any sort of petrification, any blind commitment to this or that worldview, we are, it seems, necessarily condemned to deal with shards, with fragments, with pieces that do not constitute any whole — even if, consciously or not, we strive for the impossible “whole.” But then if we also harbor a love for art — and it is not at all unusual to have these two passions combined in a single individual — a strange tension appears: in art we deal with forms which, by definition, cannot be totally fragmentary. To be sure, at least since the Romantic moment we have been exposed to fragments, and accustomed to fracture, in all kinds of artistic enterprises, from music and poetry to painting — but even these fragments tend to acquire a shape. If we juxtapose them with the “truth fragments,” with Wittgensteinian scraps of philosophical results, an integrated pattern is created by virtue of some little embellishment, by a sleight of hand; a magician is at work who tends to forget the search for truth because the possibility of a form, a more or less perfect form, suddenly attracts him more strongly than the shapelessness of a purely intellectual assessment. 

    These two dissimilar but related hunts, one for truth, one for form, are not unlike husky dogs pulling a sled in two slightly different directions: they are sometimes able to achieve an almost-harmony. The sled fitfully moves forward, but at other times the competing pressures threaten to endanger the entire expedition. So it is with our mental hunts and journeys, which forever hesitate between a form that will allow us to forget the rather uncomfortable sharpness of truth and a gesturing toward truth that may make us forget the thrill of beauty and the urge to create, at least for the time being.

    This brings us back to Mahler. The doubleness in his music that I have described may be understood as reflecting the ambiguity of the double search for truth and form. Mahler was a God-seeker who recognized the ambivalence of such a quest in art. He was torn between the search for the voluptuousness of beauty and the search for the exactness of truth.

    Hartmut Lange, a German writer living in Berlin, a master of short prose, told me once that Mahler’s Song of the Earth, which he listens to all the time and adores in a radical way, “is God.” I was taken aback. The deification of this almost-symphony, which I also ardently admire, made me feel uneasy. But I find it more than interesting that this great music can be associated with, and even called, God. This suggests a quasi-religious aspect of the music, and even a sober secularist cannot escape at times placing the work within the circle nearing the sacred.

    Among the many approaches to the sacred we may distinguish two: one which consists in searching, in a quest, and is conducted in a climate of uncertainty and even doubt, and another which proclaims a kind of sureness, a positive certainty, a eureka-like feeling that what was sought has been found. In our tormented and skeptical time it is not easy to find examples of such a positive and even arrogant attitude, at least not within serious culture. Among the great modern poets and writers only a few were blessed with certainty. Even the great Pascal, who lived so much earlier, had his doubts. Gustav Mahler belongs to the seekers, not the finders. The quest is his element, and doubt is always near.

    It is true for both poetry and music: whenever one approaches an important work, one is much more outspoken when it comes to discussing the elements within it that will yield to the intellectual or even dialectical categories that the reader or listener cherishes. The other ingredients, especially those that represent pure lyricism and thus are at the very heart of the work in question, are hardly graspable, at least in words. What can we say? It is beautiful, it pierces my soul, or some other platitude of the sort. Or we can just sigh to signal our delight. Sighing, though, is not enough; it is too inarticulate, and in print it evaporates altogether. This is the misery of writing about art: the very center of it remains almost totally ineffable, and what can be rationally described is rather a frame than the substance itself. 

    A frame that enters into dialogue with its period, with its cultural and historical environment, can be much better described than the substance of a symphony or a painting. The nucleus of a work, or of an artist’s output, is less historical, less marked by the sediments of time, and therefore mysterious. It is also more personal, more private. This is certainly the case with Mahler’s music, whose very core consists of those lyric movements, those endless ostinati that we find everywhere, first in his songs, in Lieder eines fahrenden Gesellen and the other lieder, then in his symphonies, and supremely in their adagios, and then finally in the unsurpassable Lied von der Erde. And the Ninth Symphony! I don’t have in mind only the final Adagio but also the first movement, the Andante comodo, which displays an incredible vivacity and, at the same time, creates an unprecedentedly rich musical idiom — a masterful musical portrayal of what it means to be alive, with all the quick changes and stubborn dramas, the resentments and the raptures, that constitute the exquisite and weary workshop of the mind and the heart.

    But when we celebrate the lyric sections, the sometimes simple melodies, and the long ostinati, let us not forget all the intoxicating marches, the half sardonic, half triumphant marches that originated in a small Moravian town but then crossed the equator and reached the antipodes. These marches give Mahler’s music its rhythm, its vigor, its muscle. There is nothing wan in Mahler’s compositions, nothing pale on the order of, say, Puvis de Chavannes; instead they display, even in their most tender and aching passages, an irreversible vitality. The marches propel the music and give it its movement, its strolls and dances and strides. The “vulgar” marches convey the mood of a constant progression, maybe even of a “pilgrim’s progress.” Nothing ever stagnates in Mahler’s compositions; they are on the move all the time.

    It’s unbecoming to disagree with someone who was a great Mahler connoisseur and also contributed enormously to the propagation of his work, but it is hard to accept Leonard Bernstein’s observation that the funeral marches in Mahler’s symphonies are a musical image of grief for the Jewish God whom the composer abandoned. The problem is not only that there is scant biographical evidence for such an interpretation. More importantly, the marches are more than Bernstein says they are. They represent no single emotion. Instead they oscillate between mourning and bliss and thus stand (or walk or dance) high above any firm monocausal meaning.

    In the Song of the Earth, it is the sixth and last movement, Der Abschied, the Farewell, that crowns Mahler’s entire work. Musicologists tell us that its beauty consists mainly in the combination of a lyrical melodic line with the rich chromaticism of the orchestra. But obviously such an observation can barely render justice to the unforgettable charm of this sensual music which unwillingly bids farewell to the earth; we hear in this work the tired yet ecstatic voice of the composer who knew how little life was left to him. Perhaps only in Rilke’s Duino Elegies can we find an example of a similar seriousness in embracing our fate, an instance of a great artist finally abolishing any clear distinction between sadness and joy.

    There is a fine poem written in the early 1980s by the Swedish poet and novelist Lars Gustafsson. It is called “The Stillness of the World Before Bach” and it caught the attention of many readers. Here is part of it:

    There must have been a world before
    the Trio Sonata in D, a world before the A minor partita,
    but what kind of a world?
    A Europe of vast empty spaces, unresounding,
    everywhere unawakened instruments,
    where the Musical Offering, the Well-Tempered Clavier
    never passed across the keys.
    Isolated churches
    where the soprano line of the Passion
    never in helpless love twined round
    the gentler movements of the flute […]

    [translated into English by Philip Martin]

    Of course there were many voices and many composers before Bach, and not at all “a Europe of vast empty spaces.” What would Palestrina, Gabrieli, and Monteverdi say? What would the monks say who created and developed Gregorian chant? Still, in Gustafsson’s poem we immediately recognize some deeper truth. I imagine that in a similar poem in which Gustav Mahler would replace Johann Sebastian Bach, the poet would describe not “a Europe of vast empty spaces” but rather a Europe of cities, great and small ones, of empty Sunday streets, of empty parks, of waiting rooms.

    The Mahler gesture resembles in some respect the Bach achievement, but it is very different too. Bach was a genius of synthesis, who appeared after centuries of the development of Western art and on this fertile soil built a great edifice of music. There is less synthetic energy in Mahler’s creation; the significance of his work seems to reside in its spiritual implication. Mahler, more than any of his contemporaries, tries to graft onto this lay world of ours a religious striving, to convey a higher meaning to a largely meaningless environment without ever forgetting or concealing the obvious features of a secular age.

    The Sludge

    I was never more hated than when I tried to be honest….
    I’ve never been more loved and appreciated than when I tried
    to “justify” and affirm someone’s mistaken beliefs; or when
    I tried to give my friends the incorrect, absurd answers they
    wished to hear. In my presence they could talk and agree with
    themselves, the world was nailed down, and they loved it.
    They received a feeling of security.

    RALPH ELLISON, INVISIBLE MAN

    One Friday afternoon, in a carpeted alcove off the main sanctuary of my school, a Jewish school in the suburbs of Philadelphia, my class collected in a circle as we did every week. A young, liberally perfumed Israeli woman in a tight turtleneck sweater read to us from a textbook about the exodus from Egypt. I asked her why our ancestors had been enslaved to begin with, and then wondered aloud whether it was because only former slaves can appreciate freedom. I remember the feeling of the idea forming in my very young mind, and the struggle to articulate it. Clumsily, with a child’s vocabulary, I suggested to my teacher that Jewish political life began with emancipation, and that this origin ensured that gratitude to God would be the foundation of our national identity. Could that have been God’s motivation? I don’t remember her answer, only her mild bemusement, and my impression that she did not have the philosophical tools or the inclination to engage with the question. I was left to wonder on my own about the nature of slavery, the distant memories that undergird identity, and God’s will; without a teacher, without a framework. I was by myself with these questions. 

    Of course, we were not gathered in that schoolchildren’s circle to study philosophy. We were studying the Biblical tale not in order to theorize about the nature of slavery and freedom, or to acquire a larger sense of Jewish history, but because it was expected of us, and every other grade in the school, this and every week since the school’s founding, to study the weekly portion of the Torah, because that is what Jewish students in a Jewish school of that denomination do. I had mistaken a social activity for an intellectual one. The norms of a community demanded this conversation of us, because otherwise the community would be suspect. People would whisper that graduates of our school lacked the capacity for full belonging within their particular Jewish group, because we had failed to receive the proper training in membership. The overarching objective of our education was initiation. The prayers that we were taught to say before and after eating, and upon waking up in the morning, and going to the bathroom, and seeing a rainbow, and on myriad other quotidian occasions, served the same purpose. These were not theological practices; we were not taught to consider the might and creative power of the God whom we were thanking — the meanings of what we recited, the ideas that lay beneath the words. We uttered all those sanctifying words because it was what our school’s permutation of the Jewish tradition taught Jews to do. We were performing, not pondering. 

    Divine commandments were the sources and accoutrements of our liturgies and rituals. But we lingered much longer over the choreography than over the divinity. The substance of our identity was rules, which included the recitation of certain formulas for certain concepts and customs. And our knowledge of the rules, how or whether we obeyed them, would signal what sort of Jews we were. The primary purpose of this system was to provide talismans that we could use to signal membership. In the context of my religious education, the meaning of the symbols was less important than how I presented them. Badges were more central than beliefs. The content of the badges — the symbols and all the concomitant intellectual complications — was left alone. Marinating within that culture inculcated in me an almost mystical reverence for my religion and for its God because it placed them in a realm outside of reason. I could not interrogate them: holiness is incommensurate with reason. Without the indelible experience of that schooling in anti-intellectualism, the beauties and intoxicants of tradition would be inaccessible to me. Even now, when I witness expressions of fine religious faith, I am capable of recognizing and honoring them because of that early training.

    The anti-intellectualism had another unwitting effect: the indifference of my community to the cerebral and non-communal dimensions of the way we lived meant that I could develop my own relationship with them. Since they were unconcerned with the aspects of religious life that most kindled me, I was free to discover them independently. They didn’t care what I thought, so I set out to think. In this manner I began to acquaint myself with fundamental human questions, to feel my way around and develop the rudiments of ideas about morality, slavery, love, and forgiveness. My academic syllabi were rife with references to these themes, but they were rarely discussed directly. They were like so many paintings on the wall: we would walk by them a hundred times a day and never stop and look. As children we became comfortable in their presence, but we did not exactly study them together, so I studied them alone, without the commentaries that would harden them into a catechism.

    In a certain ironic sense, I was lucky. When someone is taught to think about fundamental human questions within a group, her conception of those themes will be shaped by the group. The goal of that sort of group study, perhaps not overtly articulated but always at work, would be to initiate her into a particular system of particular people, to provide her with a ready-made attitude and a handy worldview, to train her to think and speak in the jargon of that worldview, and to signal membership within the company of those who espouse it.

    If language is a condition of our thoughts, it is also a source of their corruption. Thinking outside a language may be impossible, but thought may take place in a variety of vocabularies, and the unexamined vocabularies, the ones that we receive in tidy and dogmatic packages, pose a great danger to clear and critical thinking. My good fortune was that I was not socialized philosophically. My religious tradition was not presented to me as a philosophical tradition. I was not inducted into a full and finished vernacular that would dictate or manipulate how I would think. And I was young enough not to have become so sensitive to political or cultural etiquettes that they would inhibit or mitigate independent reflection and study. The space in my head into which I retreated to think was built and outfitted mainly by me, or so it felt; and there, in that detached and unassisted space, I became accustomed to the looming awareness that these themes were too complicated for me to really understand (an awareness which provoked an ineradicable distrust for communal ideological certainties). Yet this did not diminish my desire to spend time there. My relationship with my burgeoning ideas felt privileged, the way a child feels playing with plundered high heels or lipstick without the context to understand the social significations that those instruments may one day carry. If I misunderstood them, if they baffled me, there was no reason to be embarrassed. My sense of possibility was large and exciting, because it was unburdened by the adult awareness that convictions have social consequences by which they may then be judged. 

    My limited field of human experience — the people I knew, the fictional and historical figures to whom I had been introduced — comprised all the materials with which I could conduct my solitary musings. I studied the rhythms and tendencies of human interactions. I watched the way that other people responded to each other, the way they held themselves when they were alone or in society. This stock of knowledge informed how I thought people in general do and ought to behave. (My theory of slavery and emancipation was a product of this discipline: for example, I noted that I got anxious for recess when in school but bored by endless freedom on the weekend or vacation. We appreciate freedom when we are enslaved: is that what Scripture wanted me to understand? Well, that was consistent with my experience.) My inquiries were catalyzed and sustained by pure curiosity about human beings and in retrospect they seem to have been relatively untainted by my community’s biases. Perhaps I am idealizing my beginnings, but I really do have the memory of an open mind and a pretty level playing field. Like the adolescent heroines in Rohmer’s films, I genuinely wanted to know how people are so I could figure out how I should be.

    The effects of this solitary and informal mental life were permanent. Having developed the space in my head independent of a received blueprint, my intellectual methods would always be fundamentally unsocialized. Despite the external pressures, I have never successfully unlearned these attitudes. I don’t doubt that there were many influences from my surroundings, from my community and my culture, that I was absorbing without recognizing them, but still I felt significantly on my own and, as I say, lucky. But I was also quite lonely. The loneliness intensified as I got older and my family became more religious. The high school that I attended was much more traditional than my earlier schools had been. There were more rules, endless esoteric rituals and cultural habits that I had to learn in order to convince myself and others that I was one of them, that I belonged there. I failed often. There was so much that I didn’t know, and, more to the point, there was something about the weather around me that perpetually exposed my difference. No matter how hard I tried to remake myself into a member, to dismantle and rebuild the space in my head, everyone could sense that the indoctrination was not taking. I recited the script with a foreign accent. 

    In a flagrant, chronic, and no doubt annoying manifestation of otherness, I would badger my teachers and peers for reasons and explanations. Why were we — I was a “we” now – obeying all these rules? I was not in open revolt: I sensed that our tradition was rich and I was eager to plumb the treasures that I had been bequeathed. But it seemed a gross dereliction to obey the laws without considering their purpose. My intentions were innocent, perhaps even virtuous, but my questions were discomfiting anyway. Even now I often recall a particularly representative afternoon. A group of girls in my grade were discussing the practice called shmirat negiah, the strict observance of physical distance between the sexes, which prohibits men and women who are not related from touching one another. I wondered: Why had the rule been written to begin with? When did Jews begin to enforce it? What kind of male-female dynamic did it seek to cultivate? Did such emphatic chasteness in preparation for marriage help or harm that union? These were reasonable questions, except that in a context of orthodoxy they could be received as subversive. A girl I admired — a paragon of membership — complained that the practice made her awkward and scared of men, and that she could not understand why her mother enforced it. “Why don’t you just ask your mother why she thinks you ought to do it?” I finally asked. “Because,” she sighed, “she’ll just tell me that I have to because that is what Jews do.” My mind recoiled. Why on earth would a mother shirk the opportunity (and the responsibility) to help her child grapple with such an important question? Why wouldn’t she consider the law itself a catalyst for conversations about such primary themes? Yet even as I asked myself these questions, I knew the answer. Membership mattered more than meaning.

    But surely that attitude did not govern all human communities. This could not be all there was. Somewhere, I assumed, there were institutions in which people directly addressed the ideas I wondered about on my own. Somewhere there were groups in which the exploration of meaning was an essential feature of membership. In the secular world, which I naively called “the real world,” I imagined intellectual camaraderie would be easier to find. Surely secular people, when they talk about justice, sex, mercy, and virtue, must be interested in seriously engaging those themes. In the real world, surely, there would be no orthodoxies, and people would have no reason to incessantly analyze one another’s behaviors in order to grant or deny them legitimacy. They would not spread petty rumors about neighbors failing to uphold the code or refuse to eat at the tables of those who were not exactly like them, as the worst members of my origin bubble did. They would not, forgive me, cancel each other.

    Of course I was wrong. As it turns out, the secular world also has liturgies, dogmas, ostracisms, and bans. It, too, hallows conformity. It has heretics, and it even has gods: they just don’t call them that. In college I discovered the temples of the progressives, the liberals, the conservatives, and more. Each has a vernacular of its own, composed of dialects and rituals which serve to establish membership, welcome members, and turn away outsiders. In this realm of proud secularity, my religious upbringing proved unexpectedly useful. It had prepared me to identify the mechanisms of group power, and the cruel drama of deviance and its consequences. (What is cancellation, if not excommunication?) It turned out that all too often in the real world, the open world, the democratic world, the enlightened world, when people talk about fundamental human questions they are far more interested in signaling membership and allegiance than in developing honest answers to them.

    It is true that many of these questions are hard to answer. The intensity with which people hold convictions belies their complexity. Independent and critical reasoning is not for the faint of heart, and the length and difficulty of the search may eventually make skeptics or cynics of them. It is much simpler to memorize a script, and to establish a quasi-mystical allegiance to one’s politics. Holiness is incommensurate with reason, remember. Still, the demands of a nuanced politics are not, I think, why people are reluctant to wrestle with ideas on their own. There are advantages to wholesale worldviews and closed systems. They provide something even more alluring than conviction: solidarity. They are a cure not only for perplexity but also for loneliness. A group with which to rehearse shared dogmas, and to style oneself in accordance with the aesthetic that those dogmas evoke: this is not a small thing. Thus the answer to a philosophical or moral question becomes…community. We choose our philosophy on the basis of our sociology. This is a category mistake — and the rule by which we live.

    In a different world, most people would readily admit ignorance or doubt about political or cultural subjects the same way that my young peer would have had no reason to refrain from hugging friends of the opposite gender if Jewish custom did not forbid it. If their group ignored the subject, so would they. Most would not be ashamed of their confusion because intellectual confusion is not a common fear. But isolation is. We dread marginality more than we dread error. After all, the social costs of idiosyncrasy or independence are high. We fear finding ourselves at our screens, watching others retweet or like or share one another’s posts without a cohort of our own in which to do the same. Who does not wish to be a part of a whole? (Identity politics is the current name for this cozy mode of discourse.) In my experience, when most people talk about politics, they are largely motivated by this concern, which compromises the integrity of these conversations. They disguise a social discourse as an intellectual discourse.

    I call this phony discourse the sludge. The sludge is intellectual and political kitsch. It is a shared mental condition in which all the work of thinking has already been done for us. It redirects attention away from fundamentals by converting them into accessories, into proofs of identity, into certificates of membership.

    In a sludge-infected world, in our world, if someone were to say, “that fascist presides over a hegemonic patriarchy,” her primary purpose would be to communicate to her interlocutor that she is woke, trustworthy, an insider, an adept, a spokesperson, an agent of a particular ideology, proficient in its jargon. She would also be indicating the denomination of progressivism to which she subscribes, thus erecting the ideological boundaries for the conversation. If someone else were to say, of the same person, that he is a “cosmopolitan” or a “globalist” or a “snowflake” she would be doing the same thing in a different vernacular. (They would both use the terms “liberal” and “neoliberal” as slurs, probably without a firm sense of what either one means.) In the context of these two conversations, whether or not the individual in question is a snowflake or a fascist is as good as irrelevant. The subject of the conversation is just an occasion for manifesting group solidarity. Righteousness is an accoutrement of the code. In fact, asking either person to justify the assumptions inherent in her statement would be as irregular as asking me to justify my faith in God after witnessing me thank Him for the apple I am about to eat. She would answer with her equivalent of “that’s just what Jews do.” In both these cases, belonging is prior to belief. 

    The effect of sludge-infected language is that quite often the focal point of debates about politics or philosophy is not at all the literal subject at hand. Members are conditioned to present as if they care about the substance of a particular ideology. Learning to present as if you care about something is very different from learning to actually care about something. Caring is difficult, it is a complicated and time-consuming capacity which requires discipline, openness, and analysis. This is not a trivial point. Imagine a sludge-infected romantic relationship (or just look around you) — if, instead of taking a close and patient interest in her lover’s needs, a woman simply asked herself, “What are the kinds of things that people who are in love do?,” and having done those things, considered herself well acquitted of these duties and therefore in love. She may tell him that she loves him, and she may be loving or supportive in a generic kind of way, but she will not really know him. Details about his inner life, about his insecurities and his demons, will not interest her. Romantic success, for her, would be to appear from the outside as if they have created a successful partnership. She will have treated love programmatically, in accordance with the expectations of her social context. Who her lover is when he is not playing the role she has assigned to him will remain mysterious. When tragedy strikes, they will be forced to recognize that they do not know or trust each other.

    Sludge-infected politics are similarly behavioral and unsettling. Practitioners exploit opportunities for genuine expressions of devotion as occasions to signal membership. Consider the effect of the sludge on antiracism. Suppose we were taught to present as antiracists rather than to seriously consider the imperatives of antiracism (or, again, just look around you). Antiracism (like feminism, like Zionism, like socialism, like isms generally) is difficult to cultivate and strengthen. It requires work and must be consciously developed. It is the result of many individual experiences and sacrifices, highs and lows, of sustained and thoughtful interest and introspection. If we consider ourselves acquitted of our responsibility to antiracism merely by posting #handsupdontshoot at regular intervals on social media, perhaps garnering a host of likes and followers, the duties of an honest and reflective antiracism will remain unacknowledged (and the sentiment to which that slogan refers will be cheapened). Our antiracism would be not internal but external, not philosophical but stylistic.

    If a person is a dedicated antiracist, over the years she will come to better appreciate the enormity of the battle against racism. She will develop the minute concerns and sensitivities of a veteran. She will realize that the world is not made up only of friends and enemies. She will know that sometimes, in order to do good, one must work alongside unlikely allies, and that purists are incapable of effecting sustainable change. The very language she uses to discuss her mission will be informed by this knowledge. Indeed, it would strike her as shabby and disloyal to regurgitate common slogans when speaking about the specific, discomfiting realities of which she has intimate knowledge and which she is serious about mitigating. She will choose more precise and shaded words, her own words, careful words. The novice will listen to her and think, “I would never have thought about it that way.” If, by contrast, a person is motivated by the pressure to appear as a loyal soldier, she will never gain this wisdom. Her concerns will be only about the rituals, the liturgies, and the catechisms of a particular politics, however just the cause. Outsiders will recognize her language from Twitter or Facebook or other digitized watering holes, and of course they will ratify it, but she will have gained all that she ever really sought: admiration and affirmation.

    In this manner, movements that purport to exist in service to certain values may perpetuate a status quo in which those values, demanding and taxing, are named but never seriously treated. We ignore them, and pretend — together, as a community — that we are not ignoring them. Every time a self-proclaimed “n-ist” presents as an “n-ist,” every time a tweet or a post racks up a hundred likes in service to that presentation, she can tell herself she has fulfilled the responsibilities of her “n-ism” and so she will not feel further pressure to do so. 

    Consider two examples. First, a college student with two thousand followers on Instagram who attends every Black Lives Matter protest armed with placards, and who posts regularly about white privilege and the guilty conscience of white America. Suppose this woman’s antiracism manifests itself primarily as a crippling guilt in the face of systemic inequity from which she benefits: her service to antiracism is not nonexistent, or merely “performative,” since she does force her followers to think about uncomfortable subjects (though it is quite likely that her followers already agree with her, but never mind), and she does contribute to the increasing awareness that these injustices must be named and reckoned with now.

    It is good that our college student marched. But compare her to a white septuagenarian who has moved into an increasingly gentrifying neighborhood, who is well off and even a member of the American elite, who has the cell phone numbers of more than a few important people. She has never once felt guilty for having been born into power and privilege. She is not a marcher. Now imagine that this woman, out of mere decency, involves herself in the everyday lives of her black neighbors (something which most people like her fail to do). She is who they turn to when forced to confront a system which she can manipulate for them, which they cannot navigate without her. She is the one they call when, say, one of their sons is unjustly arrested (again), or when the school board threatens to cut the district’s budget (again), because they trust that she will work tirelessly on their behalf. She learns over time, through direct experience, about the blisters and lacerations of racism, and about how to preempt and treat them. Owing to her skin color and her tax bracket, she, like our college student, profits from systemic inequity, but, unlike our college student, she takes regular and concrete actions to help the disadvantaged. Her actions are moral but not ideological. She is not a tourist in the cause and the cause is not a flex of her identity. Yet she is regularly in the trenches and she is alleviating hardship. 

    Which of these women has more ardently and effectively fought against racism? I have no objection to activism, quite the contrary, but it must be constantly vigilant against declining into the sludge. (Of course neither the good neighbor nor the tweeting marcher is engaged, strictly speaking, in politics; at the very least they both must also vote.) Sludge-like discourse is not a new phenomenon, of course — prior to the mass revulsion at the murder of George Floyd there was the convulsion known as #MeToo, which exposed some terrible abuses and established some necessary adjustments but was mired in the sludge and the culture of holy rage. And there is another historical revolution to consider: in all the centuries of thought distorted by community, there has never been a greater ally and amplifier of this phenomenon than the new technology. It is uncannily ideal for such shallowness and such conformism, and the best place to go to prove your purity. Owing to it, the sludge has become unprecedentedly manic and unprecedentedly ubiquitous. For all its reputation as an engine for loneliness and isolation, the internet is in fact the perfect technology of the herd. Consider Twitter, the infinitely metastasizing home of the memberships and the mobs. For demagogues and bigots and liars and inciters it has solved once and for all the old problem of the transmission and distribution of opinion. The echo chambers of the righteous progressives and the righteous reactionaries exist side by side in splendid defiance of one another, drunk on themselves, on their likes, retweets, shares, and followers (the latter a disarmingly candid appellation). All these echo chambers — these braying threads — are structurally identical. Authority is granted to those with the highest numbers. The xenophobic “influencer” with the most followers is granted power for precisely the same reason, and according to the same authority, as the justice warrior with the most followers. And followers are won according to the same laws in all realms: those who are proficient in the vernacular, who can convince others that they are full members, that they understand the code and its implications best, they are the ones to whom the like-minded flock. The priests of one temple wrathfully say, “You are sexist” and those of another wrathfully say “You are un-American” in the same way members of my old community would wrathfully say, “You are a sinner.” It all means the same thing: get out.

    The sludge does not govern all discourse in America, but a horrifying amount of our “national conversation” is canned. And instead of discussing actual injustices we have endless conversations about how to discuss such things. What can be said and what cannot be said? Why talk about slavery when you can talk about the 1619 Project? Why talk about the nuances and ambiguities endemic to any sexual encounter when you can talk about #MeToo? Why complicate the question for yourself when you can join the gang? Every time we choose one of these options over the other, we demonstrate what kind of knowledge matters to us most.

    And one of the most pernicious effects of this degradation of our discourse occurs in our private lives — in personal relationships. Increasingly in conversations with friends I recognize a thickening boundary, a forcefield that repels us from the highly charged subject of our discussion. We bump up against it and decide not to go there, where integrity and trust would take us. At the point of impact, when honesty collides with membership and shrinks away, I sometimes feel as if I am being pushed back not just from the subject matter but also from the friend herself. She begins to speak in a pastiche of platitudes borrowed from the newsletters clogging her (and my) inbox. I don’t seem to be talking to her anymore, I can’t get through to her own thoughts, to her own perspective — which, I stubbornly insist, lies somewhere beneath the slogans and the shorthands. All too often I find myself following suit. Neither one of us is willing to express our respective curiosities and anxieties on matters related to politics. We just bat the keywords around and pretend we are really in dialogue with each other. He declares that the world will end if Biden is elected, she declares that the world will end if Trump is elected, and I am expected not to ask “Why?” Instead I am being invited to join him or to join her, and the more hysterically, the better.

    Once this parameter, this border wall, has been erected, taking it down would require a troublesome break from social convention. One of us would have to be disruptive, even impolite, to pull us out of the sludge-slinging which prohibits intellectual and verbal independence. And so usually we carry on within those boundaries, interacting as representatives of a cohort or a movement, not as intellectually diligent citizens with a sense of our own ignorance and an appetite for what the other thinks. We become paranoid about discursive limits. Ever present in our conversation is the danger that if one of us deviates from the etiquette, the other will accuse her of being offensive, or worse. The wages of candor are now very high. We have made our discourse too brutal because we are too delicate.

    So we obey the rules in which we have trained ourselves, and look for safety in numbers. We invoke the authority of dogma, hearsay, and cliché. We substitute popularity for truth. We quote statistics like gospel, without the faintest sense of their veracity, as if numbers can settle moral questions. We denounce the character of people we have not met simply because others — in a book group, a Twitter thread, a newspaper column, or a mob — say they are no good. The actual interpretation of concepts such as climate change or race or interventionism is less significant than the affiliations that they denote. And when the conversation is over, we are where we were when it began, left to shibboleths and confirmed, as Lionel Trilling once complained about an earlier debasement, in our sense of our own righteousness. But this must not be the purpose of conversation, public or private. It is disgraceful to treat intellectual equals as if they cannot be trusted with our doubts. It is wrong to celebrate freedom of thought and freedom of speech and then think and speak unfreely. “Polarization” is just another name for this heated charade. In an open society, in American society, one should not be made to feel like a dissident for speaking one’s own mind.

    Abolition and American Origins

    The turbulent politics of the present moment have reached far back into American history. Not for the first time, the very character of the ideals expressed in the Declaration of Independence and the Constitution has been thrown into question by the hideous reality of slavery, long before and then during the founding era and for eighty years thereafter; and then by slavery’s legacy. In this accounting, slavery appears not as an institution central to American history but as that history’s essence, the system of white supremacy and economic oligarchy upon which everything else in this country has been built, right down to the inequalities and injustices of today.

    More than forty years ago, when a similar bleak pessimism was in the air, the pioneering African American historian Benjamin Quarles remarked on that pessimism’s distortions. The history of American slavery could never be properly grasped, Quarles wrote, “without careful attention to a concomitant development and influence — the crusade against it,” a crusade, he made clear, that commenced before the American Revolution. Quarles understood that examining slavery’s oppression without also examining the anti-slavery movement’s resistance to it simplifies and coarsens our history, which in turn coarsens our own politics and culture. “The anti-slavery leaders and their organizations tell us much about slavery,” he insisted — and, no less importantly, “they tell us something about our character as a nation.”

    If we are to speak about the nation’s origins, we must get the origins right. As we continue to wrestle with the brutal and soul-destroying power of racism in our society, it is essential that we recognize the mixed and mottled history upon which our sense of our country must rest. In judging a society, how do we responsibly assess its struggle against evil alongside the evil against which it struggles? With what combination of outrage and pride, alienation and honor, should we define our feelings about America?

    On November 5, 1819, Elias Boudinot, the former president of the Continental Congress, ex-U.S. Congressman, and past director of the U.S. Mint, wrote to former President James Madison, enclosing a copy of the proceedings of a meeting held a week earlier in Trenton, New Jersey, opposing the admission of Missouri to the Union as a slave state. The crisis over Missouri — which would lead to the famous Missouri Compromise the following year — had begun in the House of Representatives in February, but Congress had been out of session for months with virtually no sign of popular concern. In late summer, Boudinot, who was 79 and crippled by gout, mustered the strength to help organize a modest protest gathering in his hometown of Burlington, long a center of anti-slavery. The far larger follow-up meeting in Trenton was truly impressive, a “great Assemblage of persons” that included the governor of New Jersey and most of the state legislature. The main speaker, the Pennsylvania Congressman Joseph Hopkinson, who was also a member of the Pennsylvania Abolition Society, had backed the House amendment that touched off the crisis, and his speech in Trenton, according to one report, “rivetted the attention of every auditor.” Boudinot, too ill to travel to the state capital, agreed nevertheless to chair a committee of correspondence that wrote to dozens of prominent men, including ex-President Madison, seeking their support. 

    If Madison ever responded to Boudinot’s entreaty, the letter has not survived, but no matter: Madison’s correspondence with another anti-slavery advocate made clear that he was not about to support checking the future of slavery in Missouri. Boudinot’s and the committee’s efforts did, however, meet with approval from antislavery notables such as John Jay. They also galvanized a multitude of anti-Missouri meetings all across the northern states, pressuring Congress to hold fast on restricting slavery’s spread. “It seems to have run like a flaming fire through our middle States and causes great anxiety,” Boudinot wrote to his nephew at the end of November. The proslavery St. Louis Enquirer complained two months later that the agitation begun in Burlington had reached “every dog-hole town and blacksmith’s village in the northern states.” The protests, the largest outpouring of mass antislavery opinion to that point in American history, were effective: by December, according to the New Hampshire political leader William Plumer, it had become “political suicide” for any free-state officeholder “to tolerate slavery beyond its present limits.”

    Apart from indicating the scope and the fervor of popular antislavery opinion well before the rise of William Lloyd Garrison, two elements in this story connect in important ways to the larger history of the antislavery movement in the United States, one element looking forward from 1819, the other looking backward. Of continuing future importance was the breadth of the movement’s abolitionist politics, as announced in the circular of the Trenton mass meeting. Although it aimed, in this battle, simply to halt the extension of slavery, the anti-Missouri movement’s true aim, the circular announced, was nothing less than the complete destruction of slavery in the United States. “The abolition of slavery in this country,” it proclaimed, was one of “the anxious and ardent desires of the just and humane citizens of the United States.” It was not just a matter of requiring that Missouri enter as a free state: by blocking human bondage from “every other new state that may hereafter be admitted into the Union,” it would be only a matter of time before American slavery was eradicated. Just as important, the abolitionists took pains to explain that restricting slavery in this way fell within the ambit of Congress’ powers, “in full accordance with the principles of the Constitution.” Here lay the elements of the antislavery constitutionalism — asserting congressional authority over slavery in places under its jurisdiction — that would evolve, over the ensuing thirty-five years, into the Republican Party’s program to place slavery, as Abraham Lincoln put it, “in the course of ultimate extinction.” 

    The second connection, looking backward, was embodied by Elias Boudinot. Some historians have linked Boudinot’s antislavery enthusiasm in 1819 to his Federalist politics; more persuasive accounts see it as a natural outgrowth of a deeply religious humanitarianism that had led him, after his retirement from politics and government, to help found the American Bible Society and become a champion of American Indians. The most recent comprehensive study of the Missouri crisis depicts him as something of a throwback, “the quintessential antiegalitarian patrician Federalist” with a pious humanitarian streak who had lingered long enough to play a part in the commencement of the nation’s crisis over slavery.

    In fact, Boudinot had already had a long career not only as an antislavery advocate but also as an antislavery politician. He first threw himself seriously into antislavery politics in 1774 when, as a member of the colonial assembly, he worked closely with his Quaker colleague and abolitionist leader Samuel Allinson in ultimately unsuccessful efforts to hasten total abolition in New Jersey. In 1786, Boudinot joined with another antislavery politician, Joseph Bloomfield, in founding the New Jersey Society for Promoting the Abolition of Slavery; and after several years of indifferent activity, the Society presented a gradual emancipation plan that Bloomfield, elected New Jersey’s governor in 1803, signed into law the following year. Boudinot, meanwhile, was elected to the first U.S. Congress in 1789, where he denounced slavery as an offence against the Declaration of Independence and “the uniform tenor of the Gospel.” In all, if the antislavery arguments of the 1850s dated back to the Missouri crisis, then the antislavery politics that brought about that crisis dated back to the Revolutionary era.

    These two connections — the history of the antislavery constitutionalism that surfaced in the Missouri crisis and the history of antislavery politics dating back to the Revolution — deserve an important place in our account of the nation’s origins. I have argued, in a recent book, that by refusing to recognize the legitimacy of property in man in national law, the Federal Convention in 1787 left open ground upon which antislavery politics later developed at the national as well as the state level. Those politics emerged, to be sure, out of the local struggles that dated back before the American Revolution. But the ratification of the Constitution, even with that document’s notorious compromises over slavery, left room for the rise of antislavery politics on the national level. And the origins of those politics, as I wish to make clear here, lay in the efforts by antislavery agitators and their allies in Congress, beginning in the very first Congress, to find in the Constitution the authority whereby the national government could abolish slavery outright or, at the very least, hasten its abolition.

    These national antislavery politics, it needs emphasizing, developed by fits and starts, and only began to gather lasting strength in the 1840s. The abolitionists enjoyed just a few significant successes at the national level during the twenty years following the Constitution’s ratification, and they endured some important defeats. These were some of the leanest years in the history of antislavery politics. But that the abolitionists won anything at all, let alone anything significant, contradicts the conventional view that southern slaveholders thoroughly dominated national politics in the early republic. The abolitionists did occasionally prevail; and just as important, in doing so they discovered and began to refine the principles and stratagems of antislavery constitutionalism that would guide antislavery politics through to the Missouri crisis and then, further refined, to the Civil War.

    Reviewing the early history of these abolitionist politics — running from the birth of the federal government in 1789 until the abolition of American involvement in the Atlantic slave trade in 1807 — is part of a broader re-evaluation currently underway of what Manisha Sinha has called “the first wave” of abolitionist activity that lasted from the Revolutionary era through the 1820s. Scholarship by a rising generation of historians, including Sarah Gronningsater, Paul J. Polgar, and Nicholas P. Wood, as well as Sinha herself, has begun to revise completely the history of antislavery in this period. They have more or less demolished, for example, the once dominant view of northern emancipation as a grudging and even conservative undertaking, led by polite gentlemen unwilling to take their antislavery too far. When completed, the work of these scholars and others will, I am confident, become the basis for a new narrative for the history not just of antislavery but of American politics from the Revolution to the Civil War. But there is a lot of work left to do.

    Prior to the 1750s, there was very little in the way of antislavery activity among white Americans, with the exception of the Quakers, and it took even the Quakers several decades of struggle among themselves before they turned misgivings about slavery into formal instructions to abandon the institution. Amid an extraordinary moral rupture at mid-century, wider antislavery activity began in earnest. Initially, reformers emphasized limited public efforts to change private behavior, relying on moral suasion to hasten manumissions, but soon enough some turned to politics in more forceful ways. In 1766 and 1767, Boston instructed its representatives in the colonial assembly to push for the total eradication of slavery. In 1773, a Quaker-led campaign against the slave trade, captained by Anthony Benezet, the greatest antislavery agitator of the time, swept through the middle colonies and touched New England; and in that same year several Massachusetts towns petitioned the assembly to abolish the slave trade and initiate gradual emancipation. Black abolitionists, including Felix Holbrook and Prince Hall in Massachusetts, initiated their own petition drives, supplementing the freedom suits that would kill slavery in Massachusetts outright in the mid-1780s. Bills for the gradual abolition of slavery were debated in New Jersey in 1775 and in Connecticut in 1777; Vermonters approved the first written constitution ever to ban adult slavery in 1777; and by 1780 ascendant radical reformers in Pennsylvania led by George Bryan prepared to enact the first gradual emancipation law in history.

    By then, political abolitionists had begun organizing their own institutions. On April 14, 1775 — five days before the battles of Lexington and Concord — a group consisting chiefly of Quakers formed the “Society for the Relief of Free Negroes Unlawfully Held in Bondage,” the first society with antislavery aims anywhere in the world. Although the Revolution soon disrupted the group, it reorganized in 1784 as the Pennsylvania Society for Promoting the Abolition of Slavery; three years later, the society named Benjamin Franklin — conspicuously a non-Quaker — as its president. In 1785, the New-York Manumission Society appeared, dedicated to the same basic goals. By 1790, two more states, Rhode Island and Connecticut, had approved gradual emancipation; slavery had been ended in Massachusetts by judicial decree in 1783 and had crumbled in New Hampshire; and at least six more abolitionist societies had formed from Rhode Island as far south as Virginia (where, in 1785, an abolition law was debated to supplement a widened manumission law enacted in 1782). In 1794, the state societies confederated as the American Convention for Promoting the Abolition of Slavery and Improving the Condition of the African Race.

    Abolitionist politics at the national level would await the framing and ratification of the Federal Constitution in 1787-1788. Since the Articles of Confederation had afforded the national government no authority over national commerce, let alone either slavery or the Atlantic slave trade, national abolitionist politics barely existed. The one exceptional effort came in 1783, when a small Quaker delegation from the Philadelphia Yearly Meeting delivered to the Confederation Congress, then sitting in temporary exile in Princeton, a petition signed by some five hundred Quakers, asking in vain for a prohibition of the Atlantic trade. With the calling of the Federal Convention in 1787, though, both of the then-existing abolitionist societies, in Philadelphia and New York, mobilized to send petitions. Benjamin Franklin, a delegate to the convention as well as president of the Pennsylvania Abolition Society, decided on tactical grounds against presenting his group’s forceful memorial opposing the Atlantic slave trade, while the New-York Manumission Society failed to complete its broader antislavery draft before learning that slavery as such would not be debated at the convention.

    To comprehend the national abolitionist politics that followed these developments requires a closer look at the Constitution’s paradoxes and contradictions concerning slavery. None of the framers’ compromises over slavery that many historians cite as the heart of the supposedly proslavery Constitution were nearly as powerful in protecting slavery as an assumption that was there from the start: that whatever else it could do, the federal government would be powerless to interfere with slavery in the states where it existed — a doctrine that became known as the federal consensus. This assumption, far more than the three-fifths clause or the Atlantic slave trade clause or the fugitive slave clause or anything else, was the basis of the slaveholders’ confidence that the Constitution had enshrined human bondage. But if the federal government could not abolish slavery outright, then how might it be done, short of hoping that the slaveholders of South Carolina and Georgia would suddenly see the light — a prospect that the South Carolinians and Georgians made clear was not in the offing anytime soon? Once the abolitionists had launched the campaign for emancipation in the North, this would be their great conundrum — but they seized upon it immediately, with actions as bold as their demands. In doing so, they fostered a convergence of radical agitation and congressional politics that would have enduring if as yet unforeseen repercussions.   

    Far from discouraging abolitionist activity, the ratification of the Constitution, even with its notorious compromises over slavery, bolstered it. Above all, the framers’ granting to the new national government, over furious southern objections, the authority to abolish the nation’s Atlantic slave trade, even with a twenty-year delay, struck many and probably most abolitionists and their political allies as a major blow for freedom. This should not be surprising: as historians note too rarely, it was the first serious blow against the international slave trade undertaken anywhere in the Atlantic world in the name of a national government; indeed, the American example, preceded by the truly inspiring antislavery agitation led by Anthony Benezet, encouraged the rise of the British movement to end the Atlantic trade, formally organized in 1787. Some leading American abolitionists described the Constitution as nothing less than, in the words of the framer James Wilson, “the foundation for banishing slavery out of this country.” Ending the trade had long been considered the vital first step toward eradicating slavery itself; and it seemed at the least highly probable that, as soon as 1808 arrived, Congress would do so. More immediately, though, members of the Pennsylvania Abolition Society wanted to see if Congress would entertain extending its constitutional authority beyond the slave trade provision.

    The first great confrontation over slavery in national politics was a famous but still largely misunderstood conflict in the House of Representatives during the First Congress’ second session in New York, the nation’s temporary capital, in 1790. Through a friendly congressman, the Pennsylvania Abolition Society presented a petition to the House of Representatives, above the signature of its aging president Franklin, bidding the representatives to “step to the very verge of the powers vested in you” and to abolish slavery itself, not simply the Atlantic slave trade. (At the request of John Pemberton of the PAS, two groups of Quakers had already sent milder petitions referring only to the trade.) Paying no attention to the federal consensus, the PAS petition specifically cited the preamble of the Constitution that empowered the new government to “promote the general Welfare and secure the blessings of Liberty to ourselves and our Posterity,” which, the petitioners contended, authorized far-reaching if unspecified congressional action against slavery. Without telling Congress exactly what to do, they bade the representatives to look beyond the federal consensus and find ways to attack slavery — to the extent, quite possibly, of disregarding that consensus entirely.

    A fierce on-and-off debate over the next three months ended with Congress affirming the federal consensus as well as the ban on congressional abolition of the Atlantic trade until 1808. The outcome is often portrayed fatalistically as a crushing defeat for the abolitionists, sealing the immunity of slavery in the new republic while calling into question the rights of abolitionists even to petition the Congress — an effort undertaken, in one historian’s estimation, by naïve and “psychologically vulnerable” reformers, unprepared “for the secular interest politics of a modern nation.”

    In fact, although the petition (along with the two others from regional Quaker meetings) did not gain the sweeping reforms it sought, it was decidedly not a failure. For one thing, the mobilization behind it, far from weak-kneed, was the first auspicious political protest of any kind to be directed at the new national government. Strikingly modern in strategy and tactics, the campaign blended insider maneuvering and hard-headed direct appeals to members of Congress with popular propagandizing and political theater of a kind associated with the protest movements of much later decades. It was spearheaded by a delegation of eleven Quaker lobbyists from Philadelphia, including John Pemberton and Warner Mifflin, who were certainly the opposite of naïve and vulnerable. As a consequence, the congressional deliberations over the petitions took a surprisingly radical turn, and in the end the effort secured important political as well as practical gains.

    Lower South slaveholders reacted with predictable fury as soon as congressmen friendly to the abolitionists introduced the petitions on the floor of the House. The slaveholders’ diatribes asserted that the constitutional ban on congressional abolition of the Atlantic slave trade until 1808 meant that the Constitution barred any federal interference with slavery whatsoever. Given the federal consensus, meanwhile, the slaveholders called the petitions unconstitutional on their face and demanded they be rejected without further debate. But despite the inflation of their numbers in the House by the three-fifths clause, the proslavery forces were badly outnumbered. (“Alass — how weak a resistance against the whole house,” one resigned South Carolina congressman wrote.) By a vote of 43 to 11, the House approved sending the radical petitions to a special committee for consideration.

    Working hand-in-hand with members of the special committee, the abolitionists immediately supplied them with a small library of abolitionist writings, while arranging, through the Speaker of the House, an ally, to distribute additional abolitionist propaganda to the rest of the chamber. The Quaker lobbyists then advised the committee on its report behind the scenes, sharing drafts and submitting their own suggestions while backing up the PAS petition’s claim that the “General Welfare” section of the Constitution’s preamble gave Congress some unspecified powers over slavery. The committee narrowly turned aside that suggestion — by a single vote, John Pemberton reported — and agreed that Congress could not ban the Atlantic slave trade before 1808. Yet it also asserted, contrary to lower South protests, that the federal government could regulate the trade as it saw fit at any time. More portentously, the members included wording asserting that the Constitution empowered Congress to abolish slavery outright after 1808 — making the special committee’s report perhaps the most radical official document on slavery approved by any congressional entity before the Civil War.

    When the report reached the House, the abolitionists swung into action as both agitators and what today we would call lobbyists. Quakers crowded the House gallery to witness the debate, their presence in Quaker gray suits and broad-brimmed black hats inciting and unnerving the southerners. Outside the hall, the abolitionists pursued individual congressmen right down to their lodging houses and taverns and eating places to make their case. Mifflin began a letter-writing campaign, addressed both to individual congressmen and to the House at large. The abolitionists also arranged with allies in the New-York Manumission Society to have a full record of the House debates printed along with antislavery articles in the New York Daily Advertiser, as well as to distribute pamphlets vividly describing the horrors of the slave trade.  

    Finally, the House affirmed Congress’ powerlessness over slavery where it existed and over the Atlantic trade before 1808, and a revised report removed the select committee’s language about abolishing slavery itself after 1808. Yet the outcome was hardly a one-sided triumph for the proslavery southerners. The lower South failed utterly in its initial effort to dismiss the petitions without debate. Evidently, contrary to the slaveholders, Congress might well have some authority over slavery worth debating. In the course of arguing that point, moreover, several House members had affirmed that, short of abolishing slavery outright, Congress might restrict slavery in various ways quite apart from the slave trade, including, James Madison remarked, banning slavery from the national territories, where, he declared, “Congress have certainly the power to regulate slavery.” And over howls from lower South slaveholders, the final report affirmed that Congress could legislate over specific matters connected to the Atlantic trade before 1808 — issues over which, as we shall see, the abolitionists would agitate successfully. In all, the federal consensus stood, but at the same time the House majority repulsed the proslavery forces and sided with the abolitionists on the question of whether slavery was entirely beyond federal authority.

    Over the ensuing decade, the abolitionists, far from discouraged, redoubled their national efforts, despite some serious setbacks. The Southwest Territory — what would become the state of Tennessee — was organized in 1790 with slavery permitted, and with little debate. A coterie of antislavery congressmen could not stave off passage of the Fugitive Slave Act of 1793. Five years later, a spirited antislavery effort to bar slavery from Mississippi Territory was defeated by a wide margin.

    And yet the abolitionists had reason to remain optimistic. At the state level, the New York legislature, under intense abolitionist pressure, finally passed a gradual emancipation law in 1799, and New Jersey followed five years later, completing the northern “first emancipation.” In part as a response to the Fugitive Slave Act, the American Convention of Abolition Societies was up and running in 1794. There were various signs, from a proliferation of freedom suits in Virginia to the spread of antislavery opinion in Kentucky and Tennessee, that the upper South was seriously questioning slavery. In national politics, antislavery congressmen, numbering about a dozen and led by a few northerners who worked closely with the abolitionists, made good in 1794 on the victory wrung from the abolitionist petition debates four years earlier, passing a law that outlawed the use of any American port or shipyard for constructing or outfitting any ship to be used for the importing of slaves.

    Five years later the Reverend Absalom Jones, a prominent abolitionist and mainstay of Philadelphia’s free black community, helped lead an even more propitious effort. Late in 1799, a group of seventy free men of color in Philadelphia, headed by Jones, sent yet another petition to the House of Representatives. The drafters of the petition, as Nicholas Wood has shown, were John Drinker and John Parrish, prominent local Quaker abolitionists who had long worked closely with Jones and other black abolitionists; the signers included members of various black congregations, including Jones’ St. Thomas African Episcopal Church, the majority of them unable to sign their names. 

    The petitioners asked for revisions of the laws governing the Atlantic slave trade as well as the Fugitive Slave Law of 1793. But they also went further, as far as the PAS petitioners had in 1790, pressing for — as the abolitionist congressman Nicholas Waln observed when he introduced the petition to the House — “the adoption of such measures as shall in due course emancipate the whole of their brethren from their present situation.” Stating that they “cannot but address you as Guardians of our Civil Rights, and Patrons of equal and National Liberty,” the petitioners expressed hope that the House members 

    will view the subject in an impartial, unprejudiced light. — We do not ask for the immediate emancipation of all, knowing that the degraded State of many and their want of education, would greatly disqualify for such a change; yet humbly desire you may exert every means in your power to undo the heavy burdens, and prepare the way for the oppressed to go free, that every yoke may be broken.

    As if brushing aside the House’s decision in 1790, the abolitionists, citing once again the Constitution’s preamble, wanted Congress to probe once more the document’s antislavery potential. The idea that Congress had untapped antislavery powers was emerging as a core abolitionist argument. And, though the sources are silent, this portion of the petition may have also had tactical purposes. In 1790, the defeat of grand claims about emancipation proved the prelude to the House affirming Congress’ authority over more specific issues connected to slavery. Roughly the same thing would happen this time.

    Southern slaveholders and their New England allies reacted with predictable wrath. John Rutledge, Jr. of South Carolina thanked God that Africans were held in slavery, then railed against the “new-fangled French philosophy of liberty and equality” — he was talking about Thomas Jefferson and his supporters — that was abroad in the land. Rutledge’s fellow Federalist, the notorious Atlantic slave trader John Brown of Rhode Island, attacked the petition’s effort to restrain American participation in the trade, while another New England Federalist, Harrison Gray Otis, sneered that most of the petitioners were illiterate and thus unable to understand what they had endorsed, and that receiving their memorial would mischievously “teach them the art of assembling together, debating, and the like.”  

    The next day, the House considered a resolution condemning those portions of the petition “which invite Congress to legislate upon subjects from which the General Government is precluded by the Constitution.” The resolution passed 85 to 1, a crushing repudiation of the idea that Congress possessed implied powers to interfere directly with slavery where it already existed. Even the abolitionist congressman who presented the free blacks’ petition ended up voting with the majority.

    But that was only part of the story. The core of antislavery Northerners fiercely rebutted the proslavery outbursts. George Thacher, a Massachusetts Federalist and longtime antislavery champion in the House, repudiated the racist attacks on the petitioners, upheld the right of constituents to a redress of grievances regardless of their color, and condemned racial slavery as “a cancer of immense magnitude, that would some time destroy the body politic, except a proper legislation should prevent the evil.” Moreover, once the condemnation resolution predictably passed — Thacher’s was the sole vote in opposition — the House was free to act on the petitioners’ more specific demands, which it swiftly did, sending the petition to committee — thereby, among other things, affirming the right of free blacks to petition Congress.

    The committee assigned to consider the petition sympathized with its section on the fugitive slave law — free blacks, its report contended, were “entitled to freedom & Protection” — but the slaveholders and their allies prevailed on that issue on jurisdictional grounds. On the slave trade, however, Congress took action. After a heated debate, the House, with the concurrence of the Senate, approved by a wide margin the Slave Trade Act of 1800, banning even indirect involvement by Americans with the shipping of Africans for sale in any foreign country while also authorizing naval vessels to seize ships that were in violation. Besides expanding enforcement of the restrictive law enacted six years earlier, the new law reinforced expectations that the Atlantic slave trade to the United States would be entirely abolished at the earliest possible date in 1808.

    The scale of this antislavery victory should not be exaggerated — indeed, three years later South Carolina would reopen its own slave trade with a vengeance — but neither should it be scanted. Most immediately, within a year, under the new law’s provisions, the man-of-war U.S.S. Ganges seized two illegal slave schooners off the coast of Cuba and discovered more than one hundred and thirty African captives, men, women, and children, in chains, starving and naked; once freed, the Africans obtained apprenticeships and indentures through the Pennsylvania Abolition Society. The free black petition debate also marked a high point in the efforts by the antislavery congressmen, first to restrict and regulate the Atlantic slave trade prior to its abolition, and then to reform and restrict the Fugitive Slave Law.

    More broadly, that same small but resolute group took up new antislavery battles and established an antislavery presence that from time to time became an antislavery majority. This was not just the agitation of an elite. It must be emphasized that the congressmen acted in coordination with dense interregional as well as interracial networks of antislavery activists, organized in state abolition societies, churches and church committees, mutual aid societies, fraternal groups, and more. With such popular backing, year after year, antislavery congressmen voiced defiantly antiracist as well as antislavery sentiments on the floor of the House, exploring the Constitution in search of antislavery meanings and trying to find in it whatever powers the federal government could use to limit slavery’s expansion and hasten its eventual eradication. Some of their successes were defensive, as when they defeated efforts to augment the Fugitive Slave Act, to otherwise restrict the rights of free blacks, and to repeal the Northwest Ordinance’s ban on slavery in Illinois and Indiana. But the antislavery forces in Congress could be aggressive as well.

    In 1804, once again bidden by abolitionist petitions, the Senate approved a provision that would have effectively shut the domestic slave trade out of the entire Louisiana Territory, obtained from France a year before, while the House, stunningly, passed a bill that banned outright further introduction of slavery into the territory. The House provision failed to gain approval from the Senate, and the efforts to keep slavery out of Louisiana proved futile, but the passing success was a signal that the antislavery presence in Congress had grown since 1790. Fittingly, the effort in the House was led by a sharp-witted and acid-tongued member from New Jersey named James Sloan, a Jeffersonian Republican who had cut his political teeth as a member of the New Jersey Abolition Society and as its delegate to the American Convention. A permanent goad to the southern slaveholders, including those in his own party, Sloan would cause an uproar in the House in 1805 by proposing a plan for gradual emancipation in the District of Columbia — yet another effort to find places in the Constitution giving the federal government the authority to attack slavery.  

    Finally, in 1807, at the earliest date stipulated by the Constitution, Congress approved the abolition of the Atlantic slave trade to the United States. With the bill supported by most of the large Virginia delegation, whose slaveholders stood to benefit, the outcome was a foregone conclusion, but the antislavery members had to beat back several efforts to soften the law, including one proposal by the states-rights dogmatist John Randolph which in effect would have recognized slaves as property in national law. “Hail! Hail, glorious day,” the New York black abolitionist minister Peter Williams, Jr., an ally of the New-York Manumission Society, exclaimed at the city’s celebration.

    This high point in the politics of early American abolitionism would also prove a turning point. Although national agitation continued, there was a noticeable decline in enthusiasm in the ranks, at least outside Pennsylvania, once New York and New Jersey had completed their emancipation laws. A powerful racist backlash instigated by the Haitian Revolution and then by reactions to northern emancipation jolted the existing abolitionist societies and paved the way for the emergence of the American Colonization Society. Just as their British counterparts perfected the massive petition campaigns required to shake Parliament into abolishing Britain’s Atlantic slave trade, also achieved in 1807, the American movement began to falter. Above all, the dramatic shift in the Southern economy that came with the introduction of the cotton gin in 1793 and the consequent renaissance of plantation slavery dramatically changed the terms of antislavery politics, dispelling forever the original abolitionist hope that the closing of the Atlantic trade would doom American slavery.

    Northern antislavery opinion did rebound after 1815 and reached a political flashpoint during the Missouri crisis of 1819-1820. But the abolitionist organizations, including the American Convention, although still alive and active, were becoming less of a factor in guiding events in Congress than they had been at the end of the eighteenth century. By now, with the expansion of mass mainstream party politics, popular mobilizations in the form of an impromptu Free Missouri movement did more to embolden antislavery congressmen than did the abolitionists’ continued memorials, petitions, and lobbying efforts. And then, in the wake of the Missouri crisis, shaken mainstream politicians sealed what amounted to a bipartisan consensus to prevent slavery from ever again entering into national political debates. With national politics seemingly closed to antislavery agitation, the old Quaker abolitionist strategy of working directly with sympathetic officeholders and political leaders began to look feeble.

    But the fight had been irreversibly joined. The established abolitionist movement’s strategies left an important legacy on which later antislavery political movements would build. Even as the early abolitionist movement sputtered out, it played a part in shaping abolitionism’s future. In forming as sophisticated a political movement as they did, the early abolitionists created a practical model for organized political agitation in the new republic, antedating the political parties that arose thereafter. Although the effectiveness of that model declined after 1800 or so, it never disappeared; and elements of it would remain essential to later abolitionist politics, including the transformation of abolitionist petitioning into monster popular campaigns, along the lines that British abolitionists had pioneered after 1787. 

    The legacy was even more important with respect to antislavery ideology and strategy. Just as the initial impetus of the early abolitionists, dating back to 1775, had been to politicize antislavery sentiment in order to make direct claims on government, so the abolitionists of the early republic perpetuated the idea that politics was the only sure means to achieve slavery’s eradication. In national politics, after the ratification of the Constitution, that meant, above all, advancing antislavery interpretations of the framers’ work. Although the most expansive ideas about Congress’ authority over slavery met with ever firmer resistance, the idea that Congress possessed numerous implicit or indirect powers to hasten slavery’s demise remained.

    Consider again the petition from the free men of color of Philadelphia in 1799. In addition to asking Congress to find the authority to abolish slavery, the petition included its own innovative antislavery interpretation of the Constitution to demonstrate that the Fugitive Slave Law was unconstitutional: as “no mention is made of Black people or Slaves” in the Constitution, the petition observed, it followed that “if the Bill of Rights or the declaration of Congress are of any validity,” then all men “may partake of the Liberties and unalienable Rights therein held forth.” The assertion got nowhere, but it had been made, and as long as abolitionists kept alive a basic optimism about the Constitution’s antislavery potential, they would sustain their belief that political efforts, and not moral suasion alone, would bring the complete abolition of American slavery.

    This optimism peaked again during the Missouri crisis, when abolitionists seized upon federal control of the territories and the admission of new states as an instrument to commence slavery’s abolition. The optimism persisted through the 1820s, even as the colonization movement flourished and even as mainstream political leaders built a new system of national politics based on two opposed intersectional national parties — a party system deliberately designed to keep antislavery agitation at the margins. In 1821, a sometime colonizationist, the pioneering abolitionist editor Benjamin Lundy, offered a comprehensive seven-point plan to abolish slavery under the Constitution that began with banning slavery in the national territories and abolishing the domestic slave trade. Four years later, Lundy joined with the abolitionist and political economist Daniel Raymond in trying to establish an antislavery political party in Maryland. After that failed, Lundy persuaded the American Convention to pick up the dropped thread of James Sloan’s earlier agitation in the House and pressure Congress to use its authority to abolish slavery and the slave trade in the District of Columbia. He then had the idea of mounting a mass petition campaign to support the demand; and in 1828, working in coordination with a Pennsylvania Abolition Society member, Congressman Charles Miner, who had announced his intention to work for abolition in the district, he forced the issue to the floor of the House. Younger PAS members warmed to the campaign and kept it going; so would, somewhat ironically in retrospect, the young editor whom Lundy later picked up as his assistant and brought into the abolitionist cause, none other than William Lloyd Garrison.

    The optimism would be badly battered in the 1830s and 1840s. Some members of a new generation of radical abolitionists, led by Garrison, would conclude that there was no hope of achieving abolition and equality in a political system attached to a proslavery U.S. Constitution — a “covenant with death” and “agreement with hell,” in Garrison’s famous condemnation. Only moral suasion backed with militant protest, Garrison declared, would advance the cause; moral purification would have to precede political action. Taking the long view, this represented as much a regression as an advance, back to the anti-political stance of the more pious of the Quaker abolitionists in the 1750s and 1760s. Garrison’s absolutist high-mindedness forthrightly but perversely lifted the cause above the grimy necessities of actual politics. 

    Yet for all of Garrison’s fiery and intrepid polemics, he and his followers were a minority inside the abolitionist movement, increasingly so after 1840. The abolitionist majority never relinquished the idea, passed on from the first-wave abolitionists, that Congress, by acting wherever it could against slavery, would hasten slavery’s destruction. Inside Congress, meanwhile, a luminary with antislavery convictions but no previous antislavery record, John Quincy Adams, led a small group of colleagues in a guerrilla war against the gag rule and finally prevailed in 1844. Adams, the ex-president turned congressman, was a singular figure in American politics, unlike any before or since; and the 1840s were not the 1820s or the 1790s. But Adams, who came to work closely with abolitionists, in his way reprised the roles of George Thacher, James Sloan, and Charles Miner, becoming the face of antislavery inside the Capitol — “the acutest, the astutest, the archest enemy of slavery that ever existed,” in the view of his fiercely proslavery Virginia rival Henry A. Wise.

    By the time he collapsed and died on the floor of the House in 1848, opposing the American war with Mexico, Adams had also helped turn antislavery politics back toward issues concerning federal power over slavery in the territories — the very issues that, within a decade, led to the formation of the Republican Party. The abolitionists’ search for the constitutional means to attack slavery, begun in 1790, culminated in the agitation over Kansas, the convulsions that followed the Dred Scott decision in 1857, and everything else that led to the Civil War. All of which is a vast and complicated story, finally connecting the antislavery politics of Anthony Benezet and Benjamin Franklin with those of Frederick Douglass and Abraham Lincoln. The important point, in the consideration of American origins, is that the early American abolitionists, audacious in their own time, formulated the essentials of a political abolitionism that, however beleaguered and often outdone, announced its presence, won some victories, and made its mark in the national as well as state politics of the early republic. It was not least owing to this constitutive achievement of American democracy that in the relatively brief span of fifty years, some of them very violent, slavery would be brought to its knees.

    Which brings us back to Benjamin Quarles’ observations about the concomitant development of American slavery and American antislavery. The struggle for justice is always contemporaneous with injustice, quite obviously, and the power of injustice to provoke a hostile response is one of the edifying lessons of human life. Once joined, that struggle forever shapes both sides: there is no understanding the growth of pro-slavery politics, leading to the treason of secession, without reference to the growth of anti-slavery politics, just as anti-slavery politics makes no sense absent pro-slavery politics. But the history of anti-slavery in America, even during its most difficult periods, is not merely a matter of edification. It is also a practical necessity, a foundation for political action. It presents contemporary anti-racism with a tradition from which it can draw its ideas and its tools. It is a barrier against despair, and a refreshment of our sense of American possibility. The struggle against slavery was hard and long, and it was won. The struggle against racism is harder and longer, and it has not yet been won. But as our history shows, it has certainly not been lost.