Some Possible Grounds for Hope

    I don’t see how we get out of this. There is nothing truer that can be said of this time. It is a perverse measure of its truth that we have been inundated with books and bromides that purport to show the opposite, that have hit upon the way out, the solutions, or better, the solution, the formulas for the miracle, all the how’s and all the why’s. How can so many people understand so much and so immediately, when so many of our torments are so unfamiliar? Isn’t anybody stunned into silence anymore?

    So many words, so many numbers, so many “frames.” They are fortifying, I guess, and we certainly need strength. Let everyone come forward in the dark with their light. But I don’t see how we get out of this, not yet.

    The empty streets of the covid nights are so candid in their desolation. They are thronged with the people who are not there. They provide a peculiar serenity, in which one can be alone with one’s fear, and take it for a walk.

    Philosophers since Seneca have known that fear and hope are twins. They are alternative ways of interpreting the opacity of the future. 

    If hope were rational, it would be redundant. Hope picks up where reason leaves off, like changing guides at the frontier. Hope is the best we can do with uncertainty. It is an image of happiness that cannot quite be dismissed as an illusion. If it cannot be proven, neither can it be disproven. Its enchantment lies in its cognitive limitation. It comes to an end with knowledge. 

    One of the characteristic errors of the American debate is to mistake the homiletical for the analytical — preaching for teaching. The objective of moral and social thought is not uplift. And as every religious person knows, castigation, too, can be experienced as uplift. It warms the heart to be told that we are all sinners, doesn’t it? Drop a coin in the charity box on the way out, you miserable excuse for finitude, and recover your contentment. It was never really damaged anyway. Of course this high-level complacency is abundantly found among the secular as well. They, too, like a warm sensation of their own shortcomings, as long as you do not overdo it. They, too, are lifted up by the sound of sermons, as in the editorial “must”: “We must restore trust.” Yes, we must!

    For many years I travelled around the country, like an itinerant preacher, chastising American Jews for their ignorance of Hebrew, which is their language even if they cannot speak it. I was received cordially almost everywhere I went. But I became suspicious of this cordiality: after all, I had come to discomfit them. And on the occasions when I did discomfit them — as when, after one of those lectures, a woman came up to me and testily said, “Sir, that was a wonderful presentation, but I did not feel affirmed!” — I smiled politely and triumphantly. (Actually, what I said to the woman was this: “Madam, I did not come all this way to affirm you.”) But those occasions were rare. The futility of my efforts was owed to the tragicomic fact that feeling bad makes some people feel good. Criticism assures them of their meaningfulness, which is really all they seek.

    “I don’t see how we get out of this.” Thank you for your honesty. It is not nearly as disagreeable as our circumstances.

    If hope and history ever rhyme, in accordance with the poet’s wishes, it will be a soft rhyme, a weak rhyme, a half-rhyme. 

    I don’t see how we get out of this. The country is poisoned. There is contempt everywhere; contempt and certainty. There are also wonderful people doing wonderful things for the weak and the needy and the scorned — a national plenitude of local kindnesses; but all these practices of solidarity have not yet altered the character of our politics and our culture, or banished our furies. Not just yet. The rampaging passions — otherwise known as populism — have not yet exhausted themselves. Perhaps it is just a matter of patience, except that patience is in ideological disrepute and was long ago retired by our technology.

    The greater the suffering, the greater the dream of redemption. An apocalyptic is a man in extreme pain. He can imagine only an extreme cure. He is not concerned that he may cause pain to end pain. He hurts that much. But must the magnitude of the cure always be commensurate with the magnitude of the pain? What if there are cases in which the only genuine relief is gradual relief? This is insulting to the sufferer, who expects his view of his suffering to be definitive. Yet our compassion, our love, does not require that we agree with him. A person in pain knows only one thing, but he will be saved with the help of people who know more things. For example: a person in pain hates time, which is abolished by the immediacy of his torments. He lives (to borrow Robert Lowell’s piercing word) momently. A person in pain experiences time as an eternity. (In this way he resembles a person in ecstasy.) But time may be his ally, insofar as it is the only condition of his healing. Recovering from pain is a way of returning from eternity to time. Or, more practically, of taking concrete and steady and reasoned steps.

    Of course there are sufferers who do not have time on their side. When we discover this about physical ills, we call it tragedy. But we have no right to invoke tragedy about social ills. The tragic sense connotes a certain helplessness about circumstances, or more precisely, about other people’s circumstances. It promotes resignation. But whereas it may be legitimate for me to resign myself to my troubles, it is not legitimate for me to resign myself to your troubles. I can surrender myself, but I cannot surrender you.

    To approach injustice from the standpoint of tragedy has the effect of relaxing the will and shrinking the sense of agency, and even of usurping ethics with aesthetics. How do you fight tragedy?

    Was slavery tragic? In retrospect, yes. But in its time, no. In its time it was odious and disgusting and abominable. In its time it demanded resistance and abolition. Only evils of the past are tragic. The evils amid which we live are challenges — occasions of responsibility. Tragedy is precisely what we are charged to preempt.

    Was the catastrophe in Syria tragic? Only because nobody stopped it.

    “Interventionism” is now a dirty word. But it signifies more than a controversy — well, I wish it were still a controversy — about foreign affairs. Who ever did the right thing without intervening? Ethical action is always an intrusion, a refusal to leave a situation as one found it. Morality is a theory of meddling. What is intervention if not the Biblical injunction not to stand idly before the spilled blood of another? I do not recall any mention of costs and benefits in the verse. A government, of course, needs more than the Bible, more than high principle, to guide its actions. But does power exist only for the perpetration of evil? What about the costs and benefits of doing nothing? Or shall we acquiesce in the deformities of the world, except when there is money to be made?

    “But it’s complicated”: the streets of the capital, the corridors of power that masquerade as the corridors of powerlessness when it suits them, echo with those allegedly extenuating words. It is always smart to say that a problem is complicated. As if it is the duty of government to pursue justice only when it is not complicated.

    Tragedy, remember, is designed, in its most influential definition, to excite “pity and fear” so as to bring about “the proper purgation” of those emotions. It is a performance that exercises certain feelings so as to annul them. Never mind that those feelings may be put to good use outside the theater. Tragedy is an entertainment.

    Catharsis is the enemy of action. It leaves one spent and sated. It is the orgasm of conscience. I wondered about the relation of catharsis to politics as I joined the protests at Black Lives Matter Plaza. I was not worried about “performativity,” since the public expression of opposition is an essential element of opposition. I was worried about the problem of spiritual stamina, about the durability of the energy in the streets, about the overestimation of excitement, about the preference for the adventure of protest over its pedantic translation into policy. The politics of the streets can make do with catharsis. We will see.

    Concrete and steady and reasoned steps taken patiently and resolutely over time for the purpose of mitigating and eliminating the sufferings of others: in a word, liberalism. 

    The most widespread cliché of our time is “polarization.” Everyone laments it, and many scholars and commentators regard it as the most dire of our ills. It has provided work for a generation of social scientists. That we are living in an age of spectacular social division is undeniable, and the excesses of this discord are sometimes lunatic and criminal. But a little intellectual pressure needs to be put on this obsession with our lack of harmony. Is it worse than covid, or discrimination, or poverty? Of course not. There are those who argue that it will be impossible to address those monumental wounds in our society unless we overcome polarization. Barack Obama squandered the first two years of his presidency, when he had a majority in both houses of Congress, on lyrical exhortations to bipartisanship. But there is nothing freakish, or surprising, or un-American, about partisanship, even extreme partisanship. It is the stuff of which politics is made. But then one must take politics seriously — more, one must think highly of politics, and even revere it, and recognize that its ruthlessness is not inconsistent with its nobility; which is to say, one must come to value power.

    The words “value” and “power” look strange together, don’t they? The juxtaposition certainly makes many liberals uncomfortable. They have been mildly embarrassed about power for many decades, probably since Vietnam. But if you are not serious about power you are not serious about change.

    If despair is born of powerlessness, then power is a reason for hope. It sounds harsh and unlovely, but there is no other way to protect human dignity and its political home, which is democracy.

    Political ideas are not poems. They do not exist to deepen our grasp of reality. Their objective is to modify reality. For this reason, political thinkers may be held accountable for the consequences of their thoughts. Anyone who lacks the stomach for consequences should stick with poetry. (For the purpose of a rich life, however, it beats politics.)

    When the mad and beautiful Phil Ochs was asked for his verdict on the 1960s, he replied: “They won the war, but we had the best songs.”

    Polarization is one of the effects of partisanship and partisanship is one of the effects of human association.

    To acknowledge reality without becoming complicit in it. To correct the world without destroying it. Those were the accomplishments of James Madison. His genius, and it was nothing less, was for being an optimist and a pessimist, an idealist and a realist, at the same time. He got the balance right, while the globe is littered with the ruins of political experiments that got it wrong. The equilibrium was revolutionary, especially on the question of the place of conflict in human affairs.

    A revolution of equilibrium: the American innovation.

    A reading from The Federalist Papers, 10. Please rise.

    The latent causes of faction are thus sown in the nature of man; and we see them everywhere brought into different degrees of activity, according to the different circumstances of civil society. A zeal for different opinions concerning religion, concerning government, and many other points, as well of speculation as of practice; an attachment to different leaders ambitiously contending for pre-eminence and power; or to persons of other descriptions whose fortunes have been interesting to the human passions, have, in turn, divided mankind into parties, inflamed them with mutual animosity, and rendered them much more disposed to vex and oppress each other than to co-operate for their common good. So strong is this propensity of mankind to fall into mutual animosities, that where no substantial occasion presents itself, the most frivolous and fanciful distinctions have been sufficient to kindle their unfriendly passions and excite their most violent conflicts. But the most common and durable source of factions has been the various and unequal distribution of property. Those who hold and those who are without property have ever formed distinct interests in society. The regulation of these various and interfering interests forms the principal task of modern legislation, and involves the spirit of party and faction in the necessary and ordinary operations of the government.

    Here endeth the reading.

    So be of good cheer: it was always nasty. To borrow the famous phrase of Madison’s successor in the formulation of the American philosophy, the better angels of our nature are not the only angels of our nature. The American system was constructed on the assumption that conflict is ineradicable. The foretold conflicts concern both principles and interests, and the expectation is that they will be brutal. “The causes of faction cannot be removed,” is Madison’s conclusion. Out of this dourness he designed a democracy.

    It should be added that the conflicts that constitute a permanent feature of society are not — as we, in our psychologizing habits, often prefer to think of them — misunderstandings. 

    There is no clarification, no revision of language, that will make them vanish. A misunderstanding is an apparent conflict, a temporary conflict. It can be resolved with some exploration and some patience, and an apology. But a contradiction between worldviews cannot be resolved; it can only be respected, and then managed. And if the opinions are sincerely and thoughtfully held, neither side has anything to apologize for.

    Error is a form of innocence. There are many worse things in life than being wrong. (This is the courtesy that Americans seem no longer able to extend to each other.)

    Respect is more valuable, and more arduous, than reconciliation.

    The alternative to “polarization” is not consensus. There will be no consensus. Madison already warned against “giving to every citizen the same opinions, the same passions, and the same interests.” In the American tradition there is no fantasy of unanimity. Social agreement is not our eschaton. The American hypothesis is that consensus is not necessary for cooperation, that social agreement is not necessary for social peace. 

    The horror of uniformity is the democratic idea itself.

    In his painstaking attempt to describe an “overlapping consensus” for a democratic system that must accept “the fact of pluralism,” John Rawls admitted that “we do not, of course, assume that an overlapping consensus is always possible, given the doctrines currently existing in any democratic society.” It is a bleak moment in his heroically optimistic enterprise. I think it passes too swiftly. He was a philosopher and he insisted upon a philosophical conception of justice, and for this reason he dismissed what he called a “mere modus vivendi.” He accused Madison (and Hobbes and Locke and Hume and Kant) of philosophical failure for contenting themselves with the ideal of compromise between interests. Rawls thought that such a purely improvisational system is too fragile. Indeed it is; but it may be the finest we can do — one fragile compromise after another fragile compromise until the end of time. The problem is not only that we are not a nation of philosophers; it is also that in a pluralist society there is nothing “mere” about a modus vivendi. Madison should not be treated as the first transactionalist. It is dangerous to delegitimate compromise philosophically. Indeed, many unphilosophical activities hide philosophical principles and teach philosophical lessons. There are worse failures than theorylessness.

    I am always a little shocked, and pleasantly so, by the Founders’ ease about interests. They were unembarrassed by human partiality. And from the grubby they rose to the sublime.

    The United States Constitution is the greatest tribute to, and the greatest rebuke of, Hobbes. 

    A philosophy and a system of government that proposes to accept the collisions of society and leave the cacophony alone is a prescription for tough-mindedness. Or more accurately, tough-mindedness in the cause of the tender mercies. We are called upon to be not only sensitive but also effective.

    Too many worriers about “polarization” are so sentimental, so nostalgic, so exquisite in their sensitivity to the injuries of democratic combat, so anxious that taking a side might be a human failure. Yet an open society is a rough society. Polemic is one of the central methods of persuasion. “Deliberative democracy” is not the work of professors, even if it is the invention of professors.  

    We are a society that makes a cult out of honesty and then wants to be protected from it. 

    In an open society, inoffensiveness may be a delinquency of citizenship.

    Democracy is wasted on the timorous. The emboldening of ordinary men and women is its very purpose.

    A reading from The Social Contract, Book I. Please remain seated.

    Properly understood, all of these clauses [of the social contract] come down to a single one, namely, the total alienation of each associate, with all his rights, to the whole community… Instantly, in place of the private person of each contracting party, this act of association produces a moral and collective body, composed of as many members as there are voices in the assembly, which receives from this same act its unity, its common self, its life, and its will… For if the opposition of private interests made the establishment of societies necessary, it is the agreement of these same interests that made it possible… Either the will is general or it is not. It is the will of the people as a body, or of only a part… There is often a great difference between the will of all and the general will. The latter considers only the common interest; the former considers private interest, and is only a sum of private wills… In order for the general will to be well expressed, it is therefore important that there be no partial society in the State…

    Rousseau adds a footnote: “In order for a will to be general, it is not always necessary for it to be unanimous, but it is necessary that all votes be counted.” Not always! There is here a dream of social and political seamlessness, which is achieved by the dissolution of the individual in the community, the collectivity, the state. It was appropriate that the animadversion about unanimity, the mild concession to the stubbornness of difference, be a footnote, because in the holistic ethos of Rousseau’s state it really is just a footnote. These passages, and the notorious remark, also in Book I, that “whoever refuses to obey the general will shall be constrained to do so by the entire body, which means only that he will be forced to be free,” provoked a renowned historian to describe Rousseau’s ideal as “totalitarian democracy.” 

    He aspires to a perfect union, but we aspire to a “more perfect union.” The difference between democracy and totalitarianism is the difference between the belief in perfectibility and the belief in perfection. (I do not concur that Rousseau was a totalitarian, exactly; but his democracy repels me. I am an American.) He holds that the individual must “alienate” his rights, but we hold that the individual’s rights are “inalienable.” If you wish to understand the philosophical and political excruciations that France has endured in the wake of the murder of Samuel Paty, may his memory be a blessing, you could do worse than begin with the distinction between these notions of alienation and alienability.

    There they do not wish to recognize difference. Here we wish to recognize nothing else. Or so it sometimes seems.

    Is a nation a community? The communitarians among us would like to think so. It is certainly the case that a sub-national idea of community would leave us a state of states, a community of communities, a bubble of bubbles, a collection of monocultures paradoxically justified by multiculturalism. This would amount to a degradation of the pluralist promise, according to which we can live together and apart. In order to cohere as a nation, we must extend ourselves beyond our particularities, beyond our cloisterings. A homogeneous nation has no need of universalism, but a heterogeneous nation is proof of its beauty.

    Of course there is no such thing as a homogeneous nation. It was one of the necessary fictions of nationalism, and minorities have been paying dearly for it ever since. There is always someone unlike ourselves within our borders, and even if there were only one such person, he or she would still be the test of our decency. (And he or she may think it is me.) 

    Perhaps a nation should not be a community. Perhaps it is enough that it is a nation.

    In 1813, in a case in New York called People v. Philips, which considered the question of whether a Catholic priest could be forced to provide information that was obtained in the confessional, a lawyer named William Sampson told this to the court: “Every citizen here is in his own country. To the protestant it is a protestant country; to the catholic, a catholic country; and the jew, if he pleases, may establish in it his New Jerusalem.” An epochal declaration, a genuine liberation from the Old World. But what he described is both a blessing and a curse. It is pluralism carried to the limits of psychosis. For even if we are all of those countries, we are not any of those countries. We are a whole that does not devour its parts, but we are still a whole.

    Who in his right mind would wish to live only among his own? Give me goyim, please! Traditions wither in isolation. Only the infirm of identity seek more of themselves.

    It is the stupendous irony of a multiethnic society that it exposes the limitations of particularism. 

    In 1966, the brilliant Jewish historian Gerson D. Cohen gave a commencement address at the Hebrew Teachers College in Boston that he called, with a hint of wickedness, “The Blessing of Assimilation in Jewish History.” Reading it now, when a soft kind of separatism is enjoying a new prestige, is exhilarating. “A frank appraisal of the periods in which Judaism flourished will indicate that not only did a certain amount of assimilation and acculturation not impede Jewish continuity and creativity, but that in a profound sense, this assimilation and acculturation was a stimulus to original thinking and expression, a source of renewed vitality.” Our borders give us our shape, but their porousness contributes to our substance. A border is not a wall, it is the opposite of a wall, and the confusion of a border with a wall is a prescription for social and cultural disaster. 

    In the name of authenticity, people imprison themselves. And when they do so, these loyal sons and daughters, they usually insult their ancestors, who were less afraid of influences.

    The recent history of American society can be told as a story about the vicissitudes of the idea of integration.

    Differences are not discrepancies, except from the haughty standpoint of somebody else’s norm. They do not have to be brought into line. But we are not wanting in arguments for difference. Everybody screams their difference, which makes them all so tediously alike.

    Permeability ought to be a source of pride in mature individuals and mature societies. 

    A possible ground for hope: the individual. In a country in which people are masterfully manipulated by disinformation and demagoguery, in an electorate that increasingly consists of mobs and herds and gangs, in a society in which citizens are encouraged to seek intellectual strength in numbers, it is past time to remind ourselves of the dignities and the powers of the ordinary man and woman, of the autonomy of adults, of the ability of individuals to think for themselves and rise above the pernicious nonsense that their individuation is what ails them. 

    The religious extol the uniqueness of souls, the secular extol the uniqueness of selves. In this way they issue the elevating challenge that their integralist currents, religious and secular, retract.

    You cannot take your country back until you take your mind back. 

     

    I used to like bowling alone. Not always, but sometimes. Anyway, there is nothing like company to make you feel lonely. Loneliness is a social emotion.

    Individualism is a far larger dispensation than egotism, which is not to be confused with it. Egotism is a debasement of individualism, in the way that selfishness is a debasement of selfhood. The problem of individual self-love is as nothing compared to the problem of collective self-love. 

    The moral superiority of the community to the individual seems dubious to me. Belonging does not insulate anybody from transgression. Worse, there are depredations that we commit together that we would not commit alone. The haters among us, the killers among us, they may be members and they may be loners. They may speak for themselves and they may speak for their group. And communities may be kind or cruel. It’s a wash. The human heart is busy everywhere.

    In Hebrew, the root for “hope” is the same root for “gather together,” as in “Let the waters under the heaven be gathered together to one place.” As the authoritative concordance notes, sperare and congregatio. I have often pondered this mysterious etymology. It suggests that hope is premised upon the end of a dispersal. But what has been dispersed that must be brought together — the community or the individual? If it is the former, if united we stand and divided we fall, then hope is to be found in the reconstitution of community. If it is the latter, then the dispersed self is what bars the way to hope, and the reconstitution of the self will confer the sought-after encouragement. I am reminded of a work of clinical psychology that appeared in the 1960s called The Psychology of Hope, which concluded with a chapter on “the therapy of hope.” In his account of what he calls “a therapeutic tour de force,” the author describes a clinician who “explicitly and deliberately employed communication of his high expectations of patients as a therapeutic procedure.”

    The critics of individualism, the whole army of them, propound a doctrine of demoralization. They have no faith in the actual person, or worse, they detest her. This is uncharitable, and also inaccurate about human capabilities. Given the irreversible fact of individuation, it can be spiritually damaging. 

    There is another option: that divided we stand. Madison’s motto!

    The Covid-19 virus came along to illustrate what genuine isolation is. Monads in masks now yearn nostalgically for their allegedly atomized life before the pestilence. They miss all the communal meetings and social minglings that were said to have been lost. Except of course the political ones, which have all thrown epidemiological caution to the winds.

    In a period of national emergency, and the Trump years were such a period, the ubiquity of politics, its penetration into the deepest recesses of life, its saturation of experience, is understandable. If you believe that your cause or your country is in peril, you will become a sentry and a soldier. There is integrity to such an intensity of commitment, though the question of whether your analysis is correct, whether reality warrants your panic and your politicization, is an important one. But liberals and conservatives both used to believe, as an axiom of both their worldviews, in the limits of politics, in fair weather or foul. Then foul weather arrived and their wisdom collapsed.

    Instead of decrying “polarization” and dreaming of the disappearance of division, we might turn our attention to the overpoliticization of human existence in America. There is no longer any domain of life from which politics is barred. People who deplore the destruction of privacy by Silicon Valley acquiesce in the destruction of privacy by politics. Perhaps the one prepared them for the other, and softened them up for the tyranny of publicity and the public. People who engage in politics for the defense of dignity acquiesce in the destruction of dignity that attends the destruction of privacy. 

    The first casualty of our overpoliticization was our culture, just about all of it. Art is now politics by other means, full stop. What fools we are to rob ourselves of what we do not have enough of, and for the sake of what we have too much of.

    “All art is political,” says Lin-Manuel Miranda. Bullshit.

    The most chilling instance of our overpoliticization, of course, is the ideological repudiation of science. When told by the government that their lives were in danger, millions of Americans said only, don’t tread on me. There is no longer any Archimedean point outside these political self-definitions.

    As for the progressive bedroom, and the infiltration of intimacy by political standards for sexual behavior: make love, not history.

    What would a post-“polarized” America look like? I have a visionary inkling. It would consist of men and women who are not only who they vote for and not only who they agree with. They would hold political convictions and defend them, but they would be known also, and mainly, by other beliefs. They would accept the political dissonance but make themselves a little deaf to it, out of respect and for the sake of comity. They would have friends whose views they despise. They would not look forward to family gatherings as an occasion for gladiatorial combat about the issues of the day. They would give up their erotic relationship to anger, and to rectitude. They would renounce their appetites for last battles and last judgements. They would refuse to let even their own extremely correct views interfere with the fullness of living. They would march, and then they would come home. They would mobilize, and then repair to those realms in which mobilization is beside the point. They would not display their politics as proof of their goodness, because they would take note of the good people on the other side. (There are sides, of course, where no goodness can be found, but they are not many.) They would forgive.

    Joy in the struggle for justice: outside the contested epiphanies of mysticism, is there a more astonishing spiritual accomplishment? It is joy in the face of misery, after all; joy amid injustice, but deployed against it. When I watch films of the civil rights movement of the 1960s, I am always dumbfounded by the joy, which somehow never got in the way of strategy. What powers of soul! 

    In ancient Greece there was a sect of philosophers known as the Elpistikoi: the Hope-ists, or in another translation, the Hopefulists. We know nothing about them. They are mentioned only once, in Plutarch, in a discussion about “whether the sea or the land affords better food.” According to a certain Symmachus, “they believe that what is most essential to life is hoping, on the grounds that when hope is not present to make it pleasant, then life is unbearable.” Or in another translation: “in the absence of hope and without its seasoning life is unendurable.” Seasoning, indeed: Symmachus compares hope to salt. This is a utilitarian case for hope, which is undeniable, because in the absence of any verification we cling to hope entirely for its effects. But it is also more: for the hopefulist, life was not bearable or unbearable, but unbearable or pleasant. The hopefulist does not wish only to make it through the night. He wants a pleasant morning, too, and pleasant days.

    Is hope a pleasure? I suppose it depends on what one fears. There may be terrors that hope cannot dispel. Or does hope rise to match them in scale? Hopelessness, in any event, appears when ignorance has passed. Ignorance is the soil of hope, which may be a chapter of its own in the legend about ignorance and bliss. 

    Not so, say the economists, whose subject is now the whole of life. Hope, they say, is an assessment of probabilities. But the more the probabilities are known, the less need there is for hope. If the probabilities could be entirely known, we would all be enlightened and hopeless. I am not sure I like the sound of that. But hope is not an assessment. It is a prayer — perhaps the only prayer that the godless, too, can pray.   

    Symmachus, lying in the Ionian sun, picking at salted delicacies, voluptuously hoping. 

    And back here, in the winter wastes, two possible grounds of hope: a new vaccine and a new president. We are not yet at the end.

    A Memory

    A sickness came over me
    whose origins were never determined
    though it became more and more difficult
    to sustain the pretense of normalcy,
    of good health or joy in existence —
    Gradually I wanted only to be with those like myself;
    I sought them out as best I could
    which was no easy matter
    since they were all disguised or in hiding.
    But eventually I did find some companions
    and in that period I would sometimes walk
    with one or another by the side of the river,
    speaking again with a frankness I had nearly forgotten —
    and yet, more often we were silent, preferring
    the river over anything we could say —
    on either bank, the tall marsh grass blew
    calmly, continuously, in the autumn wind.
    And it seemed to me I remembered this place
    from my childhood, though
    there was no river in my childhood,
    only houses and lawns. So perhaps
    I was going back to that time
    before my childhood, to oblivion, maybe
    it was that river I remembered.

    Trash

    General consensus in our home
    was candy or soda would kill us,

    or else rot our constitutions in some
    larger, metaphysical sense. Body & soul,

    to cite the old wisdom. In protest,
    my big sister & I would sneak the stuff

    through customs whenever we could:
    Swedish Fish & ginger beer, Kit-Kats,

    Mary Janes & Malta lining the sides
    of each pocket like the contraband

    spoils they were, smallest joys,
    our solitary arms

    in this war against the invisible
    wall our parents built to bar

    the world of dreams. Now that
    we are older, the mystery is all

    but gone. We were poor. Teeth
    cost. In the end, it was the same

    as any worthwhile piece
    of ancient lore: love obscured

    by law, our clumsy hands
    demanding heaven, forgetting

    the bounty in our bellies, the miracles
    our mother made from Jiffy mix

    & cans of salmon, all the pain
    we never knew we never knew

    held there, against our will,
    in the citadel of her care.

    Reparation

    How are you feeling is always your opening question
    & you know me. I always take it the wrong way
    when you say it like that.
    I hear you asking for damage reports, the autobiography
    of this pile of brown rubble bumbling on
    about his father’s beauty, this chasm splitting
    the voice in his unkempt head & the one
    which enters the realm of the living.
    You are good to me, & this kindness, I think, is not reducible
    to our plainly economic relation, the yellow carbon
    receipt at the end of each session a reminder
    that we aren’t just girls
    in the park catching up, estimating the cost
    of our high school errors.
    I never call you my analyst, because
    that makes me sound like a body
    of work, some extended meditation
    approaching theory, if only asymptotically.
    Anyways. I’m alright today. I remembered
    to eat breakfast, & went for a run uptown.
    I gave myself credit for trying to change.
    Something in me awakened, today,
    ready for liftoff. It sang.

    The Hatboro Blues

    To the memory of friends 

    The first thing I remember thinking about what we now call “the opioid crisis” is that it was making everything really boring. It was 2010, I was in eleventh grade and at a house party about which I had been excited all week. I had with me a wingman in the form of my buddy Curt, and a fresh pack of smokes, and — please don’t think less of me — 750 milliliters of Absolut blueberry vodka. In short, all that was needed for a good night.

    And yet the party was a bust. It seemed that every third kid was “dipped out,” as we called those in drug-induced comas, lit cigarettes still dangling from their lips. Even the terrible rap music wasn’t enough to wake them. Nobody was fighting, nobody was fornicating, nobody was doing much of anything. There was nothing about this sorry shindig that set it apart from many others just like it which were still to come, but it sticks in my mind now for a melancholy reason: It was the point at which I realized that something was very wrong.

    What follows is not some hardcore Requiem for a Dream kind of yarn. Different movies apply. My high school experience was plenty Dazed and Confused, but with shades of Trainspotting and maybe a flash of Drugstore Cowboy. It was like The Breakfast Club, if Claire had carried Percocet in her purse and the dope in Bender’s locker had been white, not green. This is a story about how a kid who enters high school as a Led Zeppelin-loving pothead can leave four years later with a needle sticking out of his arm. (Or not leave at all.) It is a tale of a town and a generation held hostage by Purdue Pharma — the story of every place on the edge of a big East Coast city flushed with cheap heroin and prescription pills in the mid-to-late aughts. Maybe you already know how it goes.

    Fifteen miles north of Philadelphia’s City Hall sits Hatboro. It is a majority-white town with an average per capita income of $35,000 per year. A set of train tracks bisecting the town can shoot you into the city in a few minutes and for a couple of bucks. My elementary school, Crooked Billet, was named after a Revolutionary-era battle that took place on its grounds on May 1, 1778. Every year on that day kids don tricorn hats and sing songs about America. The town is part of a larger school district encompassing a neighboring township called Horsham, which gets much wealthier as it creeps closer to Philadelphia’s Main Line. In high school, some kids lived in McMansions and drove new cars, others took the bus. The public schools were good.

    I was raised, along with a younger brother and sister, by a single mom who worked as a hairdresser and a waitress. I spent every other weekend with my father, who lived in the next town over and founded a tree and landscaping company and later worked in real estate. We qualified for the free lunch program at school, and some years were tougher than others, but we were not poor and always had everything we needed. One week every summer was spent on vacation in Wildwood, New Jersey. I began my career as a busboy in an Italian restaurant when I was fourteen and kept the job all through high school. Later I became the first person in my family to go to college.

    It started off as your regular suburban experience, innocent enough. I smoked my first cigarette on the same day as my first toke of pot, in the last week of eighth grade. The cigarette was a Marlboro Red, provided by a friend’s older sister who everyone thought was hot. (Regrettably, I smoke them to this day.) Weekends were spent with my three best friends, guzzling Canadian whisky lifted ever-so-gently from a parent’s liquor cabinet and chain-smoking in various parking lots. We were long-haired little gremlins who liked to venture into the city for Warped Tour, Ozzfest, and Marilyn Manson. We loved Cypress Hill and named my friend’s $45 bong “King Zulu.” We hated the rich fucks (that was our term of art for them) who wouldn’t shut up about tie-dying their shirts for the next Dave Matthews concert.

    Sandwiched between a scrap-metal yard and the Revolutionary-era battleground turned elementary school were the aforementioned train tracks and a pathetic patch of mud and trees we called “the woods.” It was to us what the country club was to that other Pennsylvanian, John O’Hara: a place to get soused and settle scores. A few yards down the tracks lived a homeless Vietnam veteran whom we’d christened “the Bum.” He would walk with us to a local bar to buy forty-ounce bottles of beer — usually Olde English or Steel Reserve — in exchange for a couple of bucks. (Bars in Pennsylvania sell beer-to-go, and many of them still allow you to smoke inside.) My best friend at the time was legendary for being able to down an entire forty in under sixty seconds. We played a clever game called “Edward Fortyhands,” in homage to the Tim Burton movie, in which a forty-ounce bottle would be duct-taped to each hand and use of both your mitts would not be regained until the bottles were emptied. A guy named James at the local Hess gas station would sell us cigarettes underage and one woman who operated the McDonald’s drive-thru traded Newports for dollar-menu items. The world was our malt liquor-soaked oyster.

    Another hangout was a place we called “Chronic Bay.” (We were heavily into Dr. Dre’s “The Chronic” back then.) It was a pond-sized storm drainage ditch located behind a sewage processing plant and an abandoned Sam’s Club that was shielded from view by a tree line. It smelled, literally, like shit, but it was the perfect place to smoke weed and drink forties undetected. Our soundtrack at the time included lots of Sublime, Biggie Smalls, and some tragically awful emo albums. Most of my friends were skaters who loved to watch “Baker 3” on repeat. Those were the carefree days when everything felt like a party, the days before pregnancies and overdoses. Nobody was dying, or making their mom sad, or falling asleep behind the wheel, or stealing from their grandparents, or going to jail.

    People used to talk a lot about pot as a “gateway drug,” but I think about what came next in terms of floodgate drugs: the floodgates of an over-prescribed society opened, and suddenly drugs were everywhere. Some people would learn where or how to draw the line, but others could not see it; and crossing it became a death sentence. After booze and weed we all started to play around with prescription pills in a way that was always getting ratcheted up. It started light, with Klonopin (“K-pins”), and then Xanax.

    The first time I took Xanax was in a McDonald’s parking lot. I took both of the two-milligram “bars” my friend Sam plopped in my hand, felt pretty damn loose, and then my memory disappeared.

    Most of my friends liked to eat pills, some more than others. In the first month of eleventh grade, in 2009, a black comedy called Jennifer’s Body, starring a salacious Megan Fox as a demonic succubus, came out in theaters. A friend named Becky piled us into her Honda Accord for a trip to the movies. Most kids sneak candy or soda into the movie theater. Our clandestine appetites were different. We popped Klonopin and smuggled into the theater a backpack stocked with “Four Loko,” the fruity malt liquor concoction that contained so much caffeine that its manufacturer was later forced by the FDA to tweak its recipe, because people were dropping dead after drinking it. Why would anyone pay money to see a movie in this state? Most of us were passed out before the credits rolled. But that’s just how we rolled. Everything seemed like an occasion to get “fucked up,” even standardized testing. Before the PSATs, Sam ate so many Xanax “bars” that halfway through the test he dropped his sharpened number 2 pencil and told the proctor that if she didn’t let him out of the classroom he was going to vomit all over her. (She let him out.)

    Sharon was a year older than me and lived in the neighborhood. The year her mother was sent to jail, Sharon’s house became our free-for-all party pad and experimentation fort. Sharon’s scratchy baritone made for the perfect imitation mom-voice, so she could supply an alibi to any anxious parent inquiring about their child’s whereabouts. It always worked, including on my own mother. One night at Sharon’s we couldn’t get our paws on any preferred substances, and so Collin, our friend with the stickiest fingers, had a brainstorm: He would go to the home of a girl he was seeing and raid her parents’ medicine cabinet. After he came back with a bottle of what we thought was pharmaceutical-grade sleeping medication, we decided to divvy up the bottle, pop all the pills at once, wash them down with forties, and have a contest to see who could stay awake the longest. Fingers were crossed that we would be rewarded with hallucinations. But things went awry and it was only later, after consulting our handy-dandy Pillfinder (“Worried about some capsules found in your teenager’s room? Not sure about those leftover pills still in the bathroom cabinet? There’s a good chance that our Pill Identification Wizard (Pill Finder) can help you match the imprint, size, shape, or color and lead you to the detailed description in our drug database”) that we realized the Seroquel we had ingested was not knock-off Ambien but an antipsychotic medication used to treat schizophrenia. Oh well.

    Meanwhile, all the regular stuff associated with teenage development continued apace. I had some bad haircuts, kept decent grades, and rarely missed a day of work at the restaurant. (There was that one time, when Collin, Sam, and I each ate an eighth of magic mushrooms at midnight, went out to play in a state-of-emergency blizzard, and I missed a brunch shift the next morning. Otherwise I was a model employee and my bosses loved me.) I was the same bookish kid I had always been, devouring every Harry Potter and Lord of the Rings book in the library. I shared a room with my little brother. I hung a Pulp Fiction poster on my wall and bought CDs at the mall. I lost my virginity. I got my permit and then my license. My father bought me a 1999 Nissan Maxima with 190,000 miles on it for $2,000 and taught me how to drive a stick shift.

    Wheels meant freedom and access — to fine things, like trips to the shore, but to trouble, too. Now that our group was mobile, all my friends suddenly became two-bit drug dealers. Usually they had only an ounce or less of pot to peddle, but sometimes more. I held a pound of weed for the first time when a friend asked me to drive to nearby Norristown to pick it up and stash it in the trunk of my car. (Incentive: “I’ll fill your gas tank and smoke you up on the way.”) Most days after school my Maxima was transformed into a roving dispensary of marijuana and other delights. One night I decided to vacuum the thing and install some new air fresheners. Miraculously, the next day the school announced a surprise search of the grounds by the police and their drug-sniffing dogs. Midway through science class a principal knocked on the door and beckoned for me. The whole classroom shifted to watch as I traipsed out, fate unknown. We walked down the hall in silence and approached the exit to the parking lot, where a sortie of my buddies — who didn’t know I had just wiped “the whip,” as we called the car — had congregated with looks of abject terror on their faces to watch the pooches encircle my lemony-scented ride. Even though it had been cleaned, the dogs couldn’t help but stop on their adventure through the school’s parking lot. You can imagine the dismay of the principal and the officers upon finding nothing harder than a pack of cigarettes and some “Rohto Arctic” eye drops inside. As I say, a miracle.

    One friend, high on something or other, crashed his car through a storefront on the town’s Main Street. Later, after a new facade was constructed, we joked that he had merely given the place a free facelift. (No one was seriously injured.) Another time I was cruising around with my friend Ethan when a drug dealer named Pete got in touch. For reasons that now seem inexplicable, we thought Pete was cool and that his imprimatur meant something. At the time he was dating Diana, a beautiful brunette and a real Calamity Jane who had flitted in and out of our crew since the early days of eighth-grade summer, when she would never turn up any place without a Gatorade bottle full of vodka and a pack of Newport 100s. So when she dialed me up to say that Pete had an $800 bag of cocaine from which a modest profit could be made, and did I want to move it for him, I had to take a minute to think about it. Ethan and I both looked at each other and blithely shrugged, but my gut told me it was maybe a bad idea to become a coke dealer. Besides, I had a job already, a real one. I said I was honored but politely declined and hung up the phone.

    Then Ethan’s cell started to ring — it was Diana. He said yes, dropped out of school the next week, and started selling the pile of white powder, gram by gram. This posed two problems for the rest of us: We liked coke and we had no self-control. By the time the weekend rolled around, half the bag had disappeared up our little noses. Even worse, Ethan’s mother found the rest under his bed, freaked out and flushed it. We dodged Pete for as long as possible, and then he turned up on Ethan’s front lawn with a couple goons and baseball bats. Poor Ethan’s parents were left with no choice but to call the cops. Pete eventually backed off, but Ethan’s credit around town was pretty low afterward and there were more than a few parties to which we couldn’t bring him.

    Drugs beget drugs and things begin to blur. The halcyon days of fat blunts and warm beer in the woods were firmly in the rearview. Movie shorthand again: if the ninth and tenth grades were Fast Times at Ridgemont High, junior and senior year were more like Valley of the Dolls, all the Spicolis turned to fiendish Neely O’Haras. And it was not just my raggedy clique that was gobbling pills like Pac-Man. The vicissitudes of the lacrosse team and the Richie Rich kids from up the way seemed to mirror our own. Next came Percocet, an opiate, and therefore in the same drug family as heroin. “Perc 10s” and blueish “Perc 30s” could be crushed up and snorted. Luckily for me, I disliked the way Percocet made me feel. I didn’t enjoy the stomach pains, the itches, the bouts of narcolepsy — or the feeling that I was an actual drug user as opposed to a dumb kid having fun.

    When you are a teenager, it is of course easy to make bad choices, because you feel invincible. Maybe the worst decision one could make in pilltown was to try OxyContin. You can have fun, as we all did, with Klonopin, coke, Xanax, Percocet, Ecstasy, and tabs of acid, but there is usually no coming back from OxyContin. A seventeen-year-old doesn’t stand a chance. Adults who are prescribed it for legitimate reasons barely stand a chance. OxyContin’s not a drug that one can “dabble” in. It is synthetic heroin in pill form manufactured by a gigantic pharmaceutical corporation, and in Hatboro it wasn’t hard to find 40 milligram doses of it — “OC 40s” for short, or the double dosage “OC 80s.” Ingested orally, OxyContin is meant to mete out pain relief over a number of hours, but the “extended release” could be circumvented for an instantaneous high by crushing and then snorting the pills.

    In 2010, when I was in eleventh grade, Purdue Pharma tweaked its production so that the pills could no longer be crushed. It was like trying to plug a sinkhole with a wine cork. (Studies would later argue that this tweak only pushed people more quickly to heroin.) By then we all knew someone who was a full-blown “jawn head,” as we called those addicted to OC’s. Maybe it was the kid next to you in homeroom who stopped showing up to school. Maybe it was a friend from the grade above. Maybe it was an older sibling. There was a stupid rap song called “OxyCotton” extolling the joys of OC’s and it became a kind of unofficial anthem of my high school, Hatboro-Horsham High School, now nicknamed “Heroin High.” The song was a menacing joint by an otherwise obscure rapper named Lil Wyte. One verse, rapped by Lord Infamous, went like this:

    Scarecrow, scarecrow what’s that you’re popping

    A powerful pill they call Oxycontin

    But it’s so tiny and it catch you dragging

    Haven’t you heard big things come in small packages

    I prefer the oranges with the black OC

    Take two and you cannot move up out your seat

    Some people melt ‘em down in a needle and shoot ‘em up

    But I pop ‘em with Seroquel like glue, I am stuck

    This was hardly just a street drug, though. With so many people’s parents being over-prescribed opiates, nabbing pills out of a medicine cabinet became my generation’s version of raiding the liquor cabinet. In this way one of my earliest friends, Danny, got hooked. He lived two streets over and was in the grade above me. We’d known each other since we were in diapers. “In the beginning it was fun, there’s no two ways about it,” he now recalls. “If it wasn’t fun, we wouldn’t have done it. I don’t know if that was the only way we knew how to have fun or if we just took it to another level. Kids in different parts of the country will drink and party and take it to a certain level and there’s nothing else readily available so it fizzles out. Around here, it’s like you partied and then you met older kids and the older kids were doing this, and then, somehow — peer pressure, wanting to fit in and be cool — you somehow got into that.” The way he said it, “somehow” was another word for inevitably.

    I never touched the stuff, not because I was smarter than anyone else, I was just more of a wimp. I was already trepidatious owing to some unpleasant experiences with Percocet, and OxyContin seemed genuinely frightening. By now the kind of havoc that the drug could unleash was everywhere apparent, and snuffing the fun out of house parties was just the start. An older brother type with whom I had worked at the restaurant since the day I was hired was no longer funny, smart, or cool: He was a confirmed and abject jawn head, a zombie. It was heartbreaking to watch someone’s personality dim and die before he was even old enough to vote. You had to look out for your own, and my best buddies and I made a pact that, no matter how far we pushed our partying, we would stay away from OC’s. Still, everything was being warped around us. Even our mood music morphed from metal, grunge, and ’90s hip hop into the real hood stuff coming out of North Philly at the time, mix tapes about “trapping” and being “on the block” and pushing drugs 365 24/7 rain or shine. I hate to sound like Tipper Gore, but I believe that the music, if it did not directly influence us, at least reflected the spiraling and trashy subculture of an ostensibly nice town littered with drug baggies.

    Hatboro is just across the city line and a thirty-minute drive from the open air drug markets of North Philly, known as “the badlands.” That is where all the heroin comes from once it is pulled from the docks and flooded through the streets. OxyContin is expensive, but a $10 “stamp bag” of heroin does the trick just as well. And so before long, in a kind of irreversible entailment, all the jawn heads devolved into dope heads, actual heroin addicts. Ground zero for dope was — and still is — an intersection called “K & A,” where Kensington and Allegheny Avenues meet in the Kensington neighborhood. The streets that spiderweb out from that junction are an addict’s bazaar, a warren of narrow blocks in which dealers sit on porches shouting out their merchandise to passersby. You don’t even have to know someone to collect. When cops roll down the block, the dealers simply retreat back inside. This is the hellish district in which suburban mothers go looking for their heroin-addicted children, bringing them peanut butter and jelly sandwiches or a new coat if they can’t coax them to come home. Half the kids on those streets are from towns just like mine.

    I started hanging out in the city more when Becky — she who had driven us to the movies to see Jennifer’s Body — began dating Matt. He was a year older, out of school, and living in a one-bedroom apartment on Rising Sun Avenue, about a fifteen-minute drive from the open air drug markets. Now drugs were more attainable than ever. A new cast of shady characters floated into our orbit and the old ones just got shadier. One night at Matt’s I pawned some of the Xbox 360 games I had received for Christmas to purchase a bag of ecstasy pills that turned out to be cut with methamphetamines. The red pills emblazoned with stars and the green ones imprinted with palm trees kept me, Sam, and Collin up all night — Sam vomited every hour on the hour and we pondered bringing him to the emergency room — and sent us into horrible withdrawal the next morning. It was the worst I had ever felt in all my short life. The kid who sold us the dirty E-pills, also named Matt, had his newborn baby with him that night. I can still remember Matt fishing for a Newport in his pocket while handing me his baby and saying “Here, you look like you’re good with kids.” That Matt is dead now. When I bumped into the baby’s mother at a bar last year, we didn’t even bother mentioning that fact. It was the order of things. The other Matt became an addict and a father and then, last I heard, got clean. Becky has two rugrats herself and just sent out wedding invitations.

    Until then, the city had always loomed large in our suburban imaginations as the place where we would spend the best nights of our lives. We used to head into the city to see our favorite bands at the Electric Factory or the Theater of the Living Arts on South Street. It was where the best cheesesteaks were, and the Italian market, and the Flyers and Melrose Diner. It was the home of magic. But then going to “the city” meant dipping into a dangerous neighborhood for drugs — a different kind of home for a different kind of magic. We were slowly being blasted. It was on another night at Matt’s that my own sense of invincibility was finally shattered. After polishing off a bottle of vodka we took a drive to K & A for some more provisions. I parked the car while Matt walked up the block. He came back empty handed, but with two cops in tow. They pulled up next to my Maxima, yanked us out, slapped handcuffs on our wrists, and searched my car. There was nothing to find, but one cop grabbed my red Verizon enV3 flip phone, turned to me and asked, “Who am I calling, Mom or Dad?” I thought for a second and then gulped, “Dad.”

    The cop left a voicemail on my father’s phone, gripped me up and spat, “Now go back to the suburbs and stick to smoking your fucking grass, white boy.” When I got home, my father was nothing but rage. He yelled so loud I can still remember the foundations of the house shaking. I try to imagine what the voicemail said: “Hey, we’ve got your loser son down here trying to buy narcotics in a neighborhood where people are shot in broad daylight. Where did you think he was, the mall?” When I reflect on that episode now, what is most shocking to me is the blatant and incontrovertible white privilege. Here we were, teenagers drinking and driving and looking for drugs, a menace to ourselves and to anyone who might encounter us, and my interaction with the police amounted not to a rap sheet or a bullet but to parental concern and an actual slap on the wrist.

    For me, the alarm had sounded. What on earth was I doing in North Philly or with people like Matt? I really harbored no desire to destroy myself. I really was hungry for life. Despair was never my affliction, so why was I acting as if it was? And so I stopped going to the city and cut out everything except pot and booze — a renunciation which, given the habits of most of my friends, was practically monastic. The fact that I had been scared straightish did not mean that anyone else was. The opposite was the case. Things were getting worse. Rehab stints at the local clinic, court-mandated or otherwise, became a rite of passage for hard partiers. This meant that Suboxone, a drug just as powerful as heroin that is used to wean one off it, entered an already bleak picture. One day after school I watched as Ethan and Curt split one tiny Suboxone pill, letting it overpower them to the point that they could barely walk or keep from vomiting. Hard drugs were no longer the realm of upperclassmen, either. When Curt’s parents went out of town, we threw a party at his place and were deeply unsettled to discover a fifteen-year-old freshman girl snorting lines of heroin in the upstairs bathroom. We were the moralists! It was an odd sensation for us to be clutching our pearls at the ripe old age of eighteen, but that episode shocked even us.

    My story is coming to its end. In the years after I graduated, the bill for a class of kids hooked on heroin came due. One of the first people with whom we ever smoked weed in eighth grade overdosed and died. So did the kid who used to sell it to us. Two of the most beloved girls in town, lifelong friends who grew up on the same block as each other, both overdosed and died. Danny overdosed a number of times (he was even found turning purple on the floor of a Rite Aid bathroom once), and against all odds he is now sober. (To this day his mother carries two forms of Narcan in her purse because you never know.) Diana, who was dating the drug dealer Pete, descended further into addiction, stole from friends, and fell off the map altogether. One day last year I received a frantic Facebook message from her mother, who was reaching out to Diana’s old school friends for any clues as to her whereabouts. She finally turned up a few months ago newly sober, and posted a long status on Facebook about how, at her lowest, she had picked up a meth addiction, weighed less than ninety pounds, and was hearing voices. Her ex-boyfriend Pete lost his little brother to dope. The list of the lost goes on. And not only of the young. Some of the parents were just as addicted as their children. My mom’s ex-boyfriend, who was like a stepfather to me during the years when I was in middle school, became an addict and is now dead. The man she dated when I was in eleventh grade ended up addicted to opiates. As for any judgment about the quality of anyone’s parenting: I have come to believe that no level of awareness about the danger could have prevented it. You can keep a close eye on your child, but when drugs are ubiquitous, when they are a central feature of social life, when the surrounding culture confers prestige upon them, the best you can do is cross your fingers and pray.

    A whole vocabulary has sprung up to convey the shared experience of addiction, a vernacular of the carnage. When I go home and visit with old friends, there is always a grim roll call conducted over beers. “When was the last time anyone heard from her?” “Oh, I heard she’s still really bad.” There is a lot of sorrowful shaking of heads. Another one I’ve heard often and with nonchalance: “So, guess who’s a dopehead nowadays?” Social media has become a surreal forum for this conversation, too. Facebook newsfeeds are so peppered with remembrances and R.I.P. posts that you might not even pause while scrolling past one. Many of them include poorly cropped angel wings or some variant of “Heaven just gained another angel,” a phrase so anodyne and overused I consider it Hatboro’s version of a Hallmark card. These are the clichés of social destruction. In the years since I graduated, heroin has been largely edged out by fentanyl, a synthetic opioid that is much easier to overdose on than your garden-variety dope. Meth, which was never around in my picaresque youth, has found a big market in the suburbs, too.

    The crisis is in your face everywhere you go. It is the driver next to you at a stoplight falling asleep at the wheel. It is the dopehead in line in front of you at the 7-Eleven, or the grieving mother of one of your school chums standing behind you. To whom should we turn? God, perhaps; but look at His record. The government, perhaps; but look at its record.

    To confront the addiction of the despairing produces its own variety of despair. Along with some of my closest friends from back then, I marvel that we made it out when so many of our comrades did not. Melancholy permeates my town. And it is never really over. One of those friends recently became a cause for concern among our circle after he was fired for dipping out at work, just the way we did at house parties in eleventh grade. He is not returning anyone’s calls, and word is that he has stopped paying some of his debts. It beggars belief: opiates now, after everything we remember? But we are too sober to delude ourselves about what is possible in our town, and in other towns. We have seen this movie before.

    Note: The names in this essay have been changed out of respect for the privacy of its subjects.

    Steadying

    For some time now it has felt like history is itself the pandemic. In our country and elsewhere, it has been in overdrive, teeming with evils, flush with collapses, abounding in fear and rage, a wounding contest between the sense of an ending and the sense of a beginning, between inertia and momentum, with all the terribilities of ages of transition. What is going has not yet gone and what is coming has not yet come. We have become connoisseurs of convulsion. At sea is our new sea.

    For better and for worse, axioms and assumptions are dying everywhere around us. Such vertiginous hours always come with both clarities and confusions — there is no promise of illumination. The guidance we need in our circumstances will not be provided by the circumstances themselves: they are too many and too contradictory and too volatile, with passion increasingly unconstrained and power increasingly unconstrained. As the sense of injustice grows, injustice seems to keep pace with it. There is a piercing sensation of flux, of uncontrollable effects and unmanageable consequences. The masks on our faces are emblems of an entire era of vulnerability. The most important thing, therefore, is that we keep our heads. A disequilibrium of history demands an equilibrium of the mind. Steadiness in the midst of turbulence is not complicity with the existing order. It is precisely in such binges of history that we must teach ourselves to sort through the true and the false, the good and the bad, the continuities and the discontinuities, the right statues and the wrong statues, the humane and the utopian.

    Everything will be different: this is a ubiquitous sentiment. In all our upheavals — social and epidemiological — so much seems to be wrong and so much seems to be slipping away that one may be forgiven for enjoying a fantasy of total change. All these horrors, all these outrages, all these marches, and the world stays the same? So the first thing that needs to be said in the effort to keep our heads is that everything never changes. More, the idea that everything will change usually plays into the hands of those who want nothing to change. The cycle of revolution and reaction has never been the most effective engine of progress. Nothing suits the interests of the old regime like utopianism. The thirst for change will not be slaked by the cheap whiskey of apocalyptic thinking. The only certain outcome of the apocalyptic temper is catharsis, and one way of describing the decline of our politics in recent decades is that it has increasingly become a politics of catharsis, in which crisis is met mainly by emotion. (Populism is just mass emotionalism, and the emotions are often ugly ones.) Apocalypse is not an analysis, it is the death of analysis. It sets the stage only for salvation, but salvation must never become a political goal. This is especially true in a democratic society, where the only saviors are, alas, ourselves.

    Thus it is that the struggle against injustice imposes upon us a paradoxical psychology: it demands both impatience and patience. Impatience about injustice, patience about justice. This is hard to do. It looks too much like, and in many cases it may well be, complacence. It is certainly difficult to preach incrementalism to the injured. So why not be impatient about justice, too? There are historical and practical reasons why not. History is stained by tales of instantaneous justice, by the consequences of the rush to perfection, by the victims of the victims. The ethical calculus of means and ends is never teleologically suspended, if just causes are to remain just. Nor is it a quantitative calculus: when I first studied the modern history of the Jews I drew a variety of conclusions from the Dreyfus affair, and one of them, which was an important moment in my moral education, was that Zola and his comrades appropriately threw an entire country into crisis for the sake of one man. Similarly, due process is not a legal formality, a procedural exercise that slows the way to a satisfying climax; it is the very honor of a liberal society.

    More concretely, the establishment of justice involves not only revisions in opinions but also revisions in institutions. A dreary point! But anyone who denies the institutional dimension, in all its exasperating machinery, is not serious about the change. Paroxysms, unlike laws, vanish. This was the year in which the campaign for racial justice found support in virtually all the sectors of American society, with the exception of the White House — an unprecedented national epiphany that cannot be dismissed as “performative,” because culture matters; but the road from protest to policy is long and winding. It is not a betrayal of the ideal of social justice to tread carefully and tenaciously, with a mastery of the scruples and the methods that would make a reform defensible and durable. Tenacity is what patience looks like in the middle of a struggle.

    I will give an example of the complicated nature of the mentality of change. One of the consequences of recent social movements in America — #MeToo (which came also to my door, with its lesson and its recklessness) and Black Lives Matter — has been to reveal how poorly we understand each other. Or more precisely, they have exposed the extent to which the failure to understand others may be owed to the failure to understand oneself — the limitations of one’s own standpoint, the comfortable assumption that one appears to others as one wishes to appear to them, or to oneself. This is nonsense, though sometimes you learn it the hard way. There are limits to our epistemological jurisdiction. The failure to observe these limits is solipsism, and we all begin as solipsists, awaiting correction by social experience.

    Our epistemological jurisdiction stops at the encounter with another person. She is another epistemological kingdom, not more perfect but certainly different, with something important to add, and a perceptual contribution to make. I may like to think that I am what I present myself to be, but I am also what she sees me to be, because she sees me as I cannot, or will not, see myself. I am never in control of my self-representation and never complete in my self-awareness. We always show more of ourselves than we think we do, which is why we may learn from the responses of others. We spill beyond our intentions and our conceits, and what we gain from this overflow is criticism.

    But criticism, too, must be assessed critically – there is no exemption. The enlightenment that one acquires from the judgments of others is owed only to their accuracy. It is certainly not warranted by the belief that a person’s identity or socio-economic position or experience of hardship confers an absolute authority, a special relationship to truth, a vatic privilege. What a simple world it would be if pain were a sufficient guarantee of credibility. But it is not – indeed, the opposite is the case, pain is myopic and sees chiefly itself, which is one of the reasons it hurts. Finally we are all left with the modesty of our grasp. No whole classes of people are right and no whole classes of people are wrong.

    The ineradicability of ambiguity from human relations, the ignorance of ourselves that accompanies our ignorance of others, the whole fallible heap, creates an urgent need for tolerance and, more strenuously, for forgiveness. Historians will record that in the early decades of the twenty-first century we became an unforgiving society, a society of furies, a society in search of guilt and shame, a society of sanctimonies and “struggle sessions” American-style. They will admire our awakening to prejudice but lament the sometimes prejudicial ways in which we acted on our progressive realizations. In this respect America should become more Christian. (There, I said it.) For all our elaborate culture of self-knowledge, for all the hectoring articulateness of our identity vocabularies, we are still, each of us, our own blind spots. We should welcome every person we meet as a small blow against blindness.

    The partiality of perspective: this is the great teaching of the contemporary tumult. The problem is that we have not only begun to acknowledge our partiality, and the partiality of others, we have also begun to revere it, and this is a mistake. We are gagging on all our roots. If pain does not provide access to truth, neither does particularity. The worship of particularism is one of the great impediments to social justice, and in its exhilarating way it coarsens us all. In our moral and social thinking, our obsession with otherness has concealed that the foundation of moral and social action is sameness. The “other” is exotic, but there is nothing exotic about the homeless man on the street: he is the same as me, a human being, except that he is hungry and I am not. The difference in our circumstances is not a difference in our definition. When I hand him a few dollars I am not extending myself toward an alien being; I am practicing species solidarity. I am not discovering his humanity; I am responding to it. I am acting, in other words, universally, and none of the social problems that afflict us will be solved unless we recover the universalist standpoint that sees beyond the visible divisions, and is not trapped in, or enraptured by, the specificities of our tribes. Pluralism secures the right to turn inward, but it also broaches the duty to turn outward. By surrounding us with other partialities it legitimates our own partiality, but it also reveals that there is more to the world than what is merely ours.

    A great deal has been written in recent years about the discovery of our commonplace biases and the techniques for overcoming them. Much of this literature is psychological, but some of it is political, and its aim is to confine us proudly within our limits and call them wonderful. In the name of authenticity, we are instructed that the partiality of our perspective is all we will ever have, and that the aspiration to impartiality is an aspiration to power, or a justification of power. Every view is a view from somewhere. Nobody escapes his or her position. We are all marooned in our respective glories. Objectivity, according to this advanced opinion, is an epistemological plot of the elites.

    This inculcates a kind of localist arrogance that is fully the match of the globalist kind. Such “perspectivism” was one of Nietzsche’s lasting provocations, and in American philosophy it was ringingly championed by Richard Rorty, who was the only man I have ever known to use the word “ethnocentrism” positively. He denounced objectivity in favor of solidarity, and his children are everywhere, in all the movements; and a similar war on truth flourishes, for less sophisticated reasons, also in the offices of prime ministers and presidents. The outlook for intelligence, as Paul Valéry used to say in an earlier era of confusion and peril, is not heartening. Truth in America is a refugee, an undocumented immigrant. Philosophers and political operatives have joined together to proclaim the fictive nature of fact. About this there is no “polarization.” It is not only policy over which we differ: we differ also over the description of reality. (And even if science is not all we need to know, is there any plainer measure of stupidity than the mockery of science?)

    All these communitarianisms of the mind are absurd. If all one can express with one’s beliefs is solidarity with one’s community, then how is it possible to disagree with one’s community, and what is the origin of dissent? If it is impossible for people of different backgrounds, or classes, or races, or genders, to understand each other, why are they disappointed or angry when they are not understood? If people who are white or male or rich cannot claim to comprehend people who are black or female or poor, how can people who are black or female or poor claim to comprehend people who are white or male or rich? Of course the world does not work this way, according to this Empedoclean epistemology, for which like can only know like. The startling reality – it is one of the tremendous features of human existence – is that, within societies and among societies, across nations and cultures, we manage to be intelligible to one another. If you don’t get it, you can get it. As a strategy for thwarting human communication, Babel was a bust.

    This everyday mental commerce, this regular passage through these permeable frontiers, sometimes needs the assistance of translation, and always needs the assistance of imagination, but it proves that the inherited perspectives may be enlarged and that the despair of a greater commonality is a self-inflicted wound. Perfect objectivity may never be attained, but that is no excuse to act like merry peasants. “Positional objectivity,” as Amartya Sen has described the only plausible mitigation of our parochialism, will get us very far. Moreover, chafing against one’s limits is a condition of ethical sensitivity: if I were to be content with what my own life has taught me, I could not recognize sufferings which I have not lived and against which I have a responsibility to act. All that I need to know I cannot learn in my town, even if I can learn a great deal there. We have moral obligations in unfamiliar situations.

    I am not a woman and so I must imagine rape. I am not a black man and so I must imagine chokeholds. I am not a Syrian and so I must imagine that charnel house. I am not a Uighur and so I must imagine those camps. (But I am a Jew and so I expect others to extend the same imaginative respect to the fate of my people.) If victims were the only ones who understood oppression, who would help them? Often they insist that they must help themselves, which is correct, and evidence of their irreducible dignity, but there are limits to what they can do, and their “auto-emancipation” does not absolve the rest of us from the work of their emancipation. This work involves shaking ourselves loose from the mental dullness that is the product of our distance. As Judith Shklar once observed, “it will always be easier to see misfortune rather than injustice in the afflictions of others.”

    Objectivity, in other words, is the sturdiest ground of justice, and the despisers of objectivity are playing with fire. Feelings are a reedy basis for reform. After all, the other side also has feelings – which is how we wound up with the revolting solipsist in the Oval Office. In a democratic society, reform comes about by means of persuasion, and the feelings of others may not do the trick. I may not feel what you feel. I will not be convinced that you are right by the fervor of your feeling that you are right. I need reasons to agree with you, that is, appeals to principles, to rational accounts of preferences, to terms and values larger than each of us which, unlike feelings, we may share.

    Without objectivity, without the practice of detachment that makes genuine deliberation possible, without tearing ourselves away from ourselves, justice in our society will mean only what the majority, or the crowd, or the media (all of them fickle) want it to mean. We will gag on our roots. We will continue to despise each other, some scorning the weak and others scorning the strong. Our system of disagreement will continue to be degraded into a system of umbrage, in which a dissenting opinion may be dismissed as “tone-deaf.” Empathy, where it exists, will be remorselessly selective and most often reserved for one’s own kind. (Down with himpathy! Up with herpathy!) We will remain stalled in our excitability. But none of the questions that we are asking as a society can be answered with a scream or a scowl.

    Some of what I have written here will please progressives. Some of it will please conservatives. I call it liberalism.

    “When the facts change, I change my mind. What do you do, sir?” Legend attributes that swaggering pronouncement to Keynes, and it has become the canonical formulation of the anti-dogmatic mentality, the credo of the open and empirical mind. It has always irritated me, and not because I have a complaint about the admiration for factuality. These days the facts are the front lines in the battle for reason in America. The power of the state has been pitted against them.

    Keynes was an economist, and I have no doubt that the relation that he posits between facts and opinions is entirely appropriate for purposes of administration – say, setting an interest rate. As conditions change, policies must be adjusted. Only a fool would think otherwise. If you are not fascinated by the question of what works, stay away from government. (Or join up, because these days nothing gets done.) Practicality is always reactive; its timeline is short. Pragmatism waits on the news. There is even a current in modern American thought for which democracy is itself an exercise in unceasing pragmatism, in trial and error unto the generations. Its definitive statement can be found in the conclusion to Holmes’ renowned dissent in Abrams in 1919. Immediately following his famous observation that “the best test of truth is the power of the thought to get itself accepted in the competition of the market,” which was an important moment in the infiltration of the non-economic spheres of American life by the vocabulary of economics, Holmes went on to declare about the Constitution that “it is an experiment, as life is an experiment.” Whatever the merits of such a philosophy of existence, the sense of the provisional championed by Holmes is admirable for the mental patience that it imparts, and for its revulsion from absolutism.

    Yet Keynes’ statement seems to be reaching for more than a merely managerial responsiveness. It appears to be making a more general claim about the dependence of beliefs on facts. There are many kinds of belief, of course. But there are some kinds of belief that do not originate in the facts, that are not hostage to changes in the facts, that exist prior to the facts and provide the framework within which the facts are understood and assessed. I cannot agree that moral opinions and philosophical opinions, if indeed Keynes had such opinions in mind when he made his remark, require such a tight association with fact. Even the belief that beliefs must be based in facts cannot be based on facts. There are views I hold about right and wrong, about the individual and the group, about ethical obligation, about the duties and the limits of power, about the nature of truth, about the nature of beauty, and about spiritual meanings that will not be revised by the morning paper, whatever it brings. Before tomorrow’s bad news, I already know that the world is an unkind place and that there are a variety of ways to interpret its cruelty, and I have, to the best of my abilities, in ways that I can explain, already chosen an interpretation.

    It is possible, over time and by means of careful reflection, taking your experience into account but not only your experience, to arrive at a view of life, a worldview, and to hold it continuously, through thick and thin, regardless of who the president is, without embarrassment at the steadfastness with which you maintain it, so long as you give reasons and present them for critical examination. There is no shame in intellectual constancy. It is nothing like dogmatism, if it is thoughtful. And the caprices of external events, even when they are cataclysmic, need not throw one into philosophical crisis. Especially in times of cataclysm, one should aspire to what Rebecca West called “an unsurpriseable mind.”

    I remember a conference, not long after the earthquake of 2016, where I was holding forth on the characteristics of populism. When it came time for questions, an acquaintance of mine, a fiendishly intelligent woman with a saturnine look on her face, a distinguished international civil servant, raised her hand. “After what just happened,” she asked, “how should we revise our views?” It was not the first time that I heard this question in the aftermath of the Trump ascendancy. I disliked the question. It represented a fundamental misunderstanding about the formation of belief. We should not revise our views, I replied. The election did not prove that our views are wrong. It proved only that our views are unpopular. (And the well-named popular vote did not prove even that.) All that a poll can establish is the popularity of a belief, its distribution across a population. It has no bearing whatever upon its substance. What we believe may be wrong, but not because many people disagree with us. This is precisely the problem with Holmes’ idea of verification, with his contention that truth will be established in the competition in the market: success in the market has nothing to do with truth. The interminable history of human illusion shows that the “marketplace of ideas” is like every other marketplace. It reflects only appetites and interests; it is easily manipulated; it is quantitative.

    I may have been a little sharp in my reply to the questioner. My disrespect for her notion of intellectual flexibility must have showed. Politicians, of course, must evaluate ideas politically, but this was not an exchange about politicians. A losing side may need to revise its tactics, but beliefs are not tactics. There is nothing illegitimate or disqualifying about a minority position. A democracy, indeed, should be judged by how it treats its minorities, not least its intellectual minorities. There is honor in minority life. There is honor also in defeat, if one stands for something more than victory. If you stand for principle and you lose, you are equipped to fight again. Sometimes there is good company in the wilderness. In wondering whether defeat should inspire second thoughts about first things, my rattled interlocutor was skirting the problem known as the tyranny of the majority, which was long ago identified as one of the supreme abuses of democracy. When I assured her that the results of the election did not constitute a refutation of her views, I did not mean to lull her into a feeling of righteousness about what she – and I – believed. I wished only to draw a line between disappointment and crippling doubt.

    Here is what I do, sir. When the facts change, I interpret the facts according to the methods and the assumptions in which I have the most intellectual confidence. If I can vouch for the integrity of those methods and assumptions, which in my case are liberal methods and assumptions, I will be reluctant to give them up – especially in a dizzying world, where the people with moorings will be better able to explain and to lead. I recognize that moorings come in many forms – evil, too, comes with intellectual frameworks; but those frameworks will be most effectively challenged and repudiated by those who have a different one of their own. As for the facts, I am all for them; but I am not sure they can do all the work that needs to be done. Will bigotry be vanquished by data? A hatred cannot be dispelled for being non-factual. Sooner or later we have to engage at the level of moral and philosophical principle. We must make ourselves competent in kinds of discourse that are not only empirical. We must not forget how to believe.

    This journal begins its life in a time of breakdown and bewilderment, of arousal and expectancy. It is called Liberties because of all the splendid echoes of the word – liberty, liberal, liberate, liberality, even libertarian, even libertine. (The question of the place of pleasure in human life is one of the fundamental questions.) It is both a grave word and a joyous word. The plural is a tribute to the plurality of freedoms that we enjoy as a matter of right, and also to the plurality of freedoms that the citizens of a growing number of countries are being ruthlessly denied. Above all, it is meant to announce that, in this universe of fascists and commissars, the objective of these pages will be, by argument and by example, in politics and in culture, the rehabilitation of liberalism.

    The slander of liberalism is one of the spectacular idiocies of our age. The errors and the failures of the liberal order, at home and abroad, need to be acknowledged, but they do not need to be exaggerated. The pride of liberals deserves to be much greater than their guilt. A glance at history abundantly demonstrates this, as the issues of this journal will explain. But the historical events that provoked the social, economic, and moral achievements of the liberal order have receded in time, and the experience of time itself has been accelerated, so that historical memory can no longer be relied upon for the work of explanation and nothing is obvious anymore. The work of explanation, guided by reason and humaneness and the study of the past, needs to start again. There is nothing nostalgic about such a project. The restoration of liberal ideas and practices – a social equality based not on venerations of identity but on universal principles; an economic equality based not on a delusion of dirigisme but upon a rigorous regulation of capitalism; a faith in government as one of the great creations of human civilization and the protector of the weak against the strong; an affirmation of American power in the world because of the good that American power can do in the world – is entirely forward-looking. To curse liberalism is to curse the future.

    It is no longer trite or tautological to say that a democracy is a place that behaves democratically. Within our democracy, and within other democracies, there are many leaders and movements who behave undemocratically or anti-democratically – who view democracy expediently, as an instrument for the acquisition of power and nothing more. For this reason, the philosophical grounds and political benefits of democracy also need to be re-clarified. In 1938, on a lecture tour of the United States, Thomas Mann observed to his American audiences that democracy “should put aside the habit of taking itself for granted, of self-forgetfulness. It should use this wholly unexpected situation – the fact, namely, that it has again become problematical – to renew and rejuvenate itself by again becoming aware of itself.” He was speaking, of course, with the ruefulness of his German experience. Our situation is not as bleak and bitter, but an authoritarian temper is flourishing in our midst too, in the West Wing and the streets and the media and the platforms. We, too, have become self-forgetful. “No,” Mann told the crowds from coast to coast, “America needs no instruction in the things that concern democracy…Europe has had much to learn from America as to the nature of democracy. It was your American statesmen and poets such as Lincoln and Whitman who proclaimed to the world democratic thought and feeling, and the democratic way of life, in imperishable words.” It is bruising to read those sentences. We no longer offer such instruction to the world, or even care about the condition of freedom beyond our own borders.

    The question of how to live is more than the question of how to vote. The liberal idea was never just a political idea. It is, more generally, a grand belief in human capacity, and in the obligation – exclusive to no group and no tradition – to cultivate it. When Henry James wrote about “the liberal heart,” he meant a large heart, a generous heart, a receptive heart, an expansive heart, an unconforming heart, a heart animated by a wide variety of human expressions. Such an ideal of heartfulness pertains not only to politics but also to culture. The war against callousness cannot be won without the resources of culture. There is no more lasting education in human sympathy than an exposure to literature and the arts.

    The dwindling position of the humanities in American society is one of its most catastrophic developments. This journal, an independent journal, will take a side in this struggle. It will champion sensibility as well as controversy, and attend to culture with the same ardor with which it attends to politics. But it will refrain from aligning cultural criticism with political criticism, in grateful awareness of the multiplicity of the realms in which we lead our lives, and in awareness also of the insidious history of the synchronization of culture with politics. Pardon the counter-revolutionary thinking, but culture must never become politics by other means. Of course this is precisely what culture is becoming, thanks not least to the zealous synchronizers at the New York Times. (And at The New Yorker, which is what PM would have been if it had the money.) The autonomy of art threatens nobody and enriches everybody. The social and political origins of artists vitiate the freedom of art about as much as the social and political origins of thinkers vitiate the freedom of thought. When art is weaponized, it is compromised. Racial justice does not require the racialization of all things. And culture harbors no dream of consensus. An aversion to controversy is an aversion to culture, just as it is an aversion to democracy.

    Not least because it will appear only four times a year, this journal will not be in the business of rapid response to the emergencies and the imbecilities with which we are currently inundated. We will crusade, but slowly. There is a deeper reason for this counter-cultural pace. It is that the investigation into bigger ideas and larger causes takes time. If the sorting out of our intellectual pandemonium should not be conceived under the aspect of eternity, neither should it be conceived under the aspect of the news cycle. American journalists have brilliantly responded to an assault on their integrity and their legitimacy with a golden age of investigative journalism, but they cannot be expected to do more: the exposure of lies in a regime of untruth is as exhausting as it is essential. (How many synonyms are there for “madman”?) So in these pages we will be indifferent to the chyrons. There will be no quick takes and immediate reactions and emotional outbursts, nothing driven by velocity or by brevity. At this journal we are betting on what used to be called the common reader, who would rather reflect than belong and asks of our intellectual life more than a choice between orthodoxies. We are not persuaded that it is a losing bet. With a melancholy sense of the fragility of what we cherish, and with a bestirring sense of how much injustice there is in the country and the world, we wish to bring an old intellectual calling into a new era and see what together we can learn. Nothing quickens the mind like hope.

    Plagues

    Consider the plague. I mean the actual, literal, bubonic plague, the disease caused by the bacterium Yersinia pestis. In this pestilential season the subject has been impossible to avoid, because so many people are calling coronavirus “plague” — even though, as pandemics go, they have almost nothing in common. Plague has an astonishingly high fatality rate — between 50% and 80% of its victims die — but is rarely transmitted directly from person to person, traveling instead through the bites of infected fleas. Covid-19, by contrast, is much more contagious but significantly less fatal. And there are other distinctions. While the plague comes with painful, swollen tumors, running sores, and putrid secretions, coronavirus leaves no visible marks on the body. Most victims will survive it. Some might never even know they had it.

    There has also been plenty of talk about Ebola and AIDS and influenza and what all of them have to tell us about the present crisis. (I have no intention of interpreting the present crisis.) But plague has retained a special hold on the imagination. To Thomas Dekker, the Elizabethan hack pamphleteer, it was simply “the sicknesse,” a disease with “a Preheminence above all others…none being able to match it for Violence, Strength, Incertainty, Suttlety, Catching, Universality, and Desolation.” The Black Death is still the most deadly pandemic in recorded history. At its height, between 1348 and 1351, the disease may have killed half the population of Eurasia. It has only two close rivals for sheer mortality: the Spanish influenza of 1918-1919 and the smallpox pandemic brought to the Americas by Europeans after 1492. Both events caused untold human suffering, but neither left behind the same long history of written records. That was because the plague kept coming back. Its periodic recurrences swept through Europe with devastating regularity until the 1770s, and continued to ravage the Ottoman Empire into the 1850s. For almost five centuries, it was not unusual for cities to lose a quarter of their population in a year.

    So when Asiatic cholera spread to Europe in the 1830s, a century after the last plague outbreak, it was swiftly termed “the new plague.” Newspapers from 1918 proclaimed that influenza was “just like a plague of olden times.” Yellow Fever was called “the American plague” when it struck Philadelphia in 1793, and early coverage of AIDS in the 1980s demonized its victims by calling it “the gay plague.” Like coronavirus, none of these diseases are particularly similar to bubonic plague. They have different symptoms, causes, biological agents, and epidemiologies. What they share is a particular social profile: all are epidemic diseases of unusual suddenness and severity. They take populations by surprise. Cholera was the most feared disease of the nineteenth century, not the more deadly and more familiar tuberculosis. Endemic childhood illnesses killed more people than the plague before the invention of vaccination, but they did not inspire nearly the same terror. Fear of plague is not just about death or pain: more fundamentally, it is the fear of not knowing what comes next.

    Unsurprisingly, plague literature is currently having a moment. Publishers have announced a flood of upcoming books about the coronavirus experience. Recent months have seen rising sales of everything from Boccaccio’s Decameron to Dean Koontz’s The Eyes of Darkness (a novel about a fictional bioweapon called the Wuhan-400 virus). Camus’ The Plague is a best-seller in Italy and Korea; Penguin is currently issuing a reprint. For a couple of days in March, Defoe’s A Journal of the Plague Year was actually sold out on Amazon.

    Defoe might not be the best-selling plague author of the moment (though it’s close), but he has almost certainly been the most reviewed. After all, the Journal is the original plague novel, and arguably the only genuine historical narrative of the lot. By reading Defoe, we can tell ourselves a story about what really happened in 1665, when the Great Plague swept through London — and by extension, what has really happened to us now. In just a few days I read that it “speaks clearly to our time,” offers “some useful perspective on our current crisis,” and gives an “eerie play-by-play” of recent events. And at times, reading the Journal really did give me an uncomfortable sense of familiarity. Vague rumors of the plague reach London. The threat is discussed, then dismissed. The government waffles. Deaths start to mount through the winter of 1664 and the spring of 1665. By the time quarantines are established, schools closed, and public events banned, it’s too late to prevent the worst. There is flight, uncertainty, panic, and lots of hoarding. Grocery shopping is perilous — careful vendors make sure never to touch their customers and keep jars of vinegar on hand to sanitize coins. Quack doctors peddle toxic “cures” and citizens obsess over mortality statistics. Everyone is constantly terrified and also somehow really bored.

    And then there is the famous ending:
    A dreadful plague in London was
    In the year sixty-five,
    Which swept an hundred thousand souls
    Away; yet I alive!

    I suspect that this is the true appeal of plague literature: the narrator always survives to tell the story. The glimpses of the present that we find in Defoe or Camus or Manzoni have a kind of talismanic effect, somewhere between a mirror and a security blanket. The more similarities we find — and judging by the current spate of writing about plague literature, there are always a great many “striking parallels” — the easier it is to tell ourselves that things will play out the same way. This, too, shall pass. My copy of the Journal is only 192 pages long and at the end of it the outbreak is over.

    There is nothing wrong with seeking this kind of comfort, but it does make me wonder: what is hiding behind the reassuring promise of human universals? If you read a lot of plague novels, you will notice that they tend to hit similar beats. The threat is dismissed, things get worse, quarantines are imposed, city-dwellers flee, the rule of law breaks down, we learn a very valuable lesson about man’s inhumanity to man and emerge on the other side not unscathed but wiser. Another advantage of fiction over reality is that everything occurs for a reason. Epidemics create a natural backdrop for extreme heroism or extreme selfishness. The disease itself, an inhuman killer that turns fellow-survivors into existential threats, naturally lends itself to allegorical interpretation. Plague is a divine punishment (Defoe) or a parable for totalitarianism (Camus). If we expand the genre a little, it is the inevitability of mortality (Edgar Allan Poe), a device to pare civilization down to stark moral binaries (Stephen King), or whatever it is Thomas Mann is doing in The Magic Mountain — it is anything at all, that is, except a real disease. By treating fiction as a window into the past, we substitute a particular author’s attempt to make meaning out of meaninglessness for the full, complicated, messy range of responses which every outbreak has inspired.

    A Journal of the Plague Year is a particularly strong object lesson in the creative and purposeful appropriation of history. Defoe was five years old in 1665, too young to remember the epidemic in much detail. He wrote the book almost sixty years later, in response to an outbreak of the plague in Marseilles. Then as now, it was a good time for plague writing: 50,000 of the city’s 90,000 inhabitants had perished, and fears were high that the disease would cross the channel. Parliament issued new quarantine laws. Public fasts were proclaimed. The book was an instant success. Defoe paints a truly apocalyptic picture of London in the grip of the worst outbreak in its history: mass hysteria, corpses rotting in the streets, infants smuggled out of infected houses by desperate parents, the agonized screams of the dying in an unnaturally quiet city. Above it all, there is the omnipresent fear that an incidental touch or stray breath from a seemingly healthy person could spread the contagion.

    Critics have spent the better part of the past three hundred years debating just how accurate this portrait really is. Defoe liked to mix fact and fiction. Just four years earlier, he had published Robinson Crusoe as an authentic travelogue (it sold thousands of copies). The Journal also purports to be a factual account, “written by a Citizen who Continued All the While in London.” When the book was published in 1722, the great plague was still within living memory, and Defoe’s account rang true enough that his contemporaries largely accepted it as fact. His pseudonymous narrator, H.F., freely cites real mortality statistics, veiled or overt references to historical figures, and anecdotes found in genuine accounts of the plague year. Few scholars would go as far as his most peevish defender, Watson Nicholson, who asserted in 1919 that “there is not one single statement in the Journal, pertinent to the history of the Great Plague in London, that has not been verified” — but there is no denying that Defoe did his research.

    At the same time, Defoe’s concerns in the novel have at least as much to do with the present as the past. In the first place, horror sells. Defoe, who ghost-wrote the memoirs of a notorious thief to sell at his execution, was well aware of the commercial value of ghoulishness. He also had definite opinions about public health legislation. Defoe was a vocal advocate of the government’s new and highly unpopular maritime quarantine laws, which included an embargo on trade with plague-stricken countries. In the Journal, he portrays the similar restrictions put in place in 1665 as necessary life-saving measures. True, he acknowledges, they are costly and inconvenient — but that hardly seems relevant in the face of his catastrophic account of the alternative.

    While in favor of maritime quarantine, Defoe was one of a growing number of critics in the seventeenth and early eighteenth centuries who opposed the practice of imprisoning whole families in their homes at the first sign of infection. Some of the book’s most bone-chilling anecdotes are devoted to this “cruel and Unchristian” practice, which increased death tolls, he argued, by shutting up the healthy with the sick, and was in any case ineffective, since the plague was most contagious before its symptoms were evident. (Notably, household quarantine was not one of the provisions adopted in the controversial Quarantine Act of 1721. Here, too, H.F.’s recommendations for containing the disease support the tottering Whig government.)

    Defoe’s account of London in 1665 reflects the particular political conditions of London in 1722, but it also draws on a much older tradition of English Protestant plague writing.

    By 1665, plague was a very familiar occurrence. “It was a Received Notion amongst the Common People that the Plague visited England once in Twenty Years, as if after a certain Interval, by some inevitable Necessity it must return again,” wrote Nathaniel Hodges, one of the few physicians to remain in London during the Great Plague. In fact, its recurrences were even more frequent: an elderly Londoner in 1665 would have witnessed seven plague outbreaks in his or her lifetime, and only one interval of more than two decades without a visitation.

    The plague inspired unequaled terror, accompanied by intense religious fervor. Since it was universally accepted that the disease was a manifestation of divine vengeance, plagues made for powerful rhetorical tools in sectarian disputes. Under Queen Mary, plague was the consequence of Protestantism; when Queen Elizabeth restored the Anglican church, it was blamed on Catholics. Nonconformists were especially well-placed to take advantage of the revivals which nearly always accompanied outbreaks. Thomas Vincent, a Puritan minister who continued to preach in London through the worst months of 1665, noted that his sermons had never been so well-attended: “If you ever saw a drowning man catch at a rope, you may guess how eagerly many people did catch at the Word, when they were ready to be overwhelmed.” It didn’t hurt that Puritanism stressed emotional piety with an emphasis on sin, punishment, and predestination — all popular themes during outbreaks of a horrific disease that seemed to strike at the virtuous and the wicked indiscriminately.

    For Anglican and Nonconformist ministers alike, the plague was an opportunity to frighten a very receptive audience back into God’s good graces. Their grotesque eyewitness accounts and graphic descriptions of the suffering of plague victims warned readers of the consequences if they failed to repent. Defoe was raised a Calvinist and once intended to pursue a career as a minister. His stock of metaphors, anecdotes, and moral tales recalls the preachers and pamphleteers of earlier outbreaks. Like them, the Journal features lengthy excurses on the plight of the poor, the corruption and hypocrisy of the court, the benefits of piety and charity, and the grisly details of what a bubo really looks like up close. Defoe waxes especially poetic on the stench a bubo emits while being lanced.

    The authors of these materials were quite willing to exaggerate certain details in the interest of leading their readers to religion. In reality, the Great Plague subsided gradually, with deaths returning to pre-plague levels by February 1666. Defoe, in one of his few outright falsehoods, has the plague end abruptly: “In the middle of their distress, when the condition of the city of London was so truly calamitous, just then it pleased God … to disarm this enemy.” This sudden reprieve cannot be attributed to medicine, public health, or anything but “the secret invisible hand of Him that had at first sent this disease as a judgement upon us.” This is where Defoe drops the pretense that he is writing a history book. His words are a warning to the reader: beware. Quarantine laws are all to the good, but if you do not repent, nothing on earth can save you.

    The Great Plague provoked just as much apocalyptic preaching as any other outbreak, but intense religiosity was not the only or even the dominant response. Indeed, the biggest difference between Defoe’s Journal and the diaries of actual plague survivors is how much less the plague features in them. When we consider the scope of the disaster — 100,000 dead, large-scale quarantines, the total cessation of public life — it is hard to imagine how anyone who lived through it could think about anything else. Remarkably, they could and they did. “It is true we have gone through great melancholy because of the great plague,” wrote Samuel Pepys, the least inhibited diarist in seventeenth-century England, but “I have never lived so merrily (besides that I never got so much) as I have done this plague time.”

    The Journal picks up in September 1664, with the first rumors of an attack of the plague in Holland. Pepys doesn’t mention the plague at all until the end of April 1665, and then drops the subject entirely for another month. By summer, the traditional peak of the plague season, the epidemic had grown impossible to ignore. John Evelyn, another diarist, first brings up the plague in his entry for July 16: “There died of the plague in London this week 1,100; and in the week following, above 2,000. Two houses were shut up in our parish.” Both men shared Defoe’s interest in mortality statistics. The numbers punctuate Evelyn’s diary for the next few months: “Died this week in London, 4,000.” “There perished this week 5,000.” “Came home, there perishing near 10,000 poor creatures weekly.” But between them, life goes on. Evelyn goes about his business as a commissioner for the care of sick and wounded sailors and prisoners of war. (Unsurprisingly, he is very busy.) He pays social calls. His wife gives birth to a daughter. The plague clearly weighed on his mind, but Evelyn treats it matter-of-factly. The disease is frightening, inconvenient, and a nuisance at work, but it is not the end of the world.

    Throughout the months of August and September, Pepys manages to fit a regular diet of plague-related anxiety in and around more important topics such as food, sex, and earning large quantities of money. He worked as a naval administrator, and the Anglo-Dutch war provided good opportunities for business. In his diary, Pepys is equally assiduous in recording plague mortality, monetary gains, and the “very many fine journys, entertainments and great company” which he consistently manages to provide for himself. The frequent, intense, and jarring juxtaposition of life and death makes for a bizarre reading experience. In a typical entry, Pepys enjoys a venison pasty with some business associates, complains of a mild cold, spends a pleasant evening with his family, and remarks that fatalities have jumped by almost 2,000, bringing this week’s total to 6,000 — though the true number is probably higher.

    It’s not that Pepys is insensitive to the suffering around him —  in fact, he seems keenly aware of it. He records his grief at the deaths of friends and servants, his own fears, the dismal mood in the city. At the same time, he seems to possess a preternatural ability to experience everything fully, from existential dread to a particularly good breakfast. For him, the greatest disaster in living memory is just another part of life. In his entry for September 3, which I can’t help but quote at length, Pepys describes his morning toilette: “Up; and put on my coloured silk suit very fine, and my new periwigg, bought a good while since, but durst not wear, because the plague was in Westminster when I bought it; and it is a wonder what will be the fashion after the plague is done, as to periwiggs, for nobody will dare to buy any haire, for fear of the infection, that it had been cut off of the heads of people dead of the plague.” What indeed will the plague do to periwiggs? The question is so delightfully specific. Nobody but a fashion-conscious seventeenth-century Londoner could possibly think to ask it. In its concreteness it sticks in my mind more than any given passage in Defoe, or any observation about the universal effects of epidemics. Here the disease is human-scale, an event in a particular place and a particular time, a cause of small vanities as well as mass tragedies.

    The specific has more sticking power than the general — which is another reason we look to Defoe. The Great Plague of London seems so familiar to modern readers not because there is some fundamental human response to outbreaks of infectious diseases, but because the reactions it inspired were so different from those of the medieval outbreaks that came before it. Everything from enforced isolation to widespread fear of infection and attempts to understand the plague’s progress was a relatively new development. The practice of quarantine emerged in northern Italian city states in the aftermath of the Black Death, along with systematic methods of state surveillance, recorded death tallies, and dedicated plague hospitals. This apparatus of plague regulation diffused gradually throughout Europe. By the turn of the seventeenth century, England had official mortality statistics and punitive sanctions to enforce home quarantine.

    The outbreak of 1665 marked another transition. Rather than an unpredictable act of providence, the plague became a predictable act of providence: while still a manifestation of divine punishment, it was carried out through natural means and could be discussed in detached and objective terms. (This development also began in Italy, but there is no great English-language novel of the plague in sixteenth-century Milan.) The Great Plague was the first outbreak in which the discourse of naturalism prevailed, and medical treatises on plague outnumbered religious ones. This medical literature included recipe books of cures and prophylactics, lengthy volumes on the nature of the disease, and theoretical debates carried out in pamphlets and broadsides. While medical writers all acknowledged God as the “first cause” of the epidemic, they established a clear separation between religious and naturalistic inquiry.

    It is tempting for the modern reader, looking back on the past with the benefit of hindsight and germ theory, to treat religious etiologies of plague as a response to a lack of available medical explanations. In fact, early modern Londoners had no shortage of naturalistic causes to choose from. A list by Gideon Harvey, a Dutch-born and Cambridge-educated member of the Royal College of Physicians, includes “great Inundations, Stinks of Rivers, unburied Carcases, Mortality of Cattel, Withering of Trees, Extinction of Plants, an extraordinary multiplication of Froggs, Toads, Mice, Flies, or other Insects and Reptils, a moist and moderate Winter, a warm and moist Spring and Summer, fiery Meteors, as falling Stars, Comets, fiery Pillars, Lightnings, &c. A ready putrefaction of Meats, speedy Moulding of Bread, briefness of the Small Pox and Measles, &c.” Other proposed sources of the plague included rotten mutton, imported carpets, and a particular dog in Amsterdam. 

    William Boghurst, an apothecary who remained in London during the plague, took a cynical view of these lengthy traditional lists: “because they would bee sure to hitt the nayle, they have named all the likely occasions they could think of.” Noticing that most of the commonly listed causes related to dirt or rot, he traced the origin of the plague to corrupt particles lurking in the earth. Like many others, his theory combined the two dominant explanatory frameworks for disease in Early Modern Europe. The classical explanation, derived from the Greek physician Galen, connected plagues and other infectious diseases to miasma, or poisonous effusions from rotting organic matter. The more modern contagionist view held that the plague could be transferred invisibly from person to person. Boghurst believed that outbreaks began when miasmas rose from disturbed earth, and quickly spread through contagion. In a similar vein, Harvey wrote that “the Plague is a most Malignant and Contagious Feaver, caused through Pestilential Miasms.”

    The fear of contagion drove Londoners to measures that even Boghurst considered excessive. He complained of the extreme lengths to which his patients would go to avoid even incidental contact: “for example, what care was taken about letters. Some would sift them in a sieve, some wash them first in water and then dry them at the fire, some air them at the top of a house, or an hedge, or pole, two or three days before they opened them … some would not receive them but on a long pole.” He was right — though he had no way of knowing it — that the plague bacterium does not live for very long on paper. But frightened citizens were eager to implement the mass of medical knowledge suddenly made available to them.

    As we have seen, this enthusiasm for information had a statistical bent. The city of London started to publish weekly bills of mortality during the outbreak of 1592. During times of plague, Londoners enthusiastically read, reprinted, and circulated the bills, which they used to track the progress of the disease from parish to parish. In 1662, John Graunt published the first statistical analysis of the data in his Natural and Political Observations Made upon the Bills of Mortality. Graunt argued that the number of deaths which the bills attributed to the plague during past outbreaks was inaccurate, and speculated that reporting was less than reliable. When the plague struck again in 1665, many Londoners adopted a similarly critical attitude to the reported death rates, suggesting that some groups (Quakers, the poor) might be undercounted, or that fatalities from other diseases were being reported as plague deaths.

    The weekly bills gave rise to one of the weirder genres of English plague publishing: the “Lord Have Mercy” broadside, named for the title which nearly all of them shared. These documents, which were reprinted almost identically in each outbreak, usually included a prayer, a woodcut, some remedies, maybe a poem, and mortality statistics from six or seven previous visitations. Examples from 1665 typically featured data from 1592, 1603, 1625, 1630, 1636, 1637, and the current week. They also included pre-printed headings for the next few weeks or months for the reader to fill in as the epidemic wore on.

    For anyone who has checked the numbers, again, just to see if they have changed, it is not hard to imagine what people got out of this practice. But the historical data is harder to interpret. Knowing how many people died in 1636 is not particularly useful in 1665. Why did Londoners want this? And why did they want it again and again in exactly the same form? Of course, this is the central question of plague literature in general. When historians discuss it, they tend to use phrases like “conventional and derivative” or “a vast and repetitive outpouring.” It is, famously, boring.

    In outbreak after outbreak, plague tracts featured the same assortment of prayers, cures, and exhortations to repent. They also shared the same stories. Some served as cautionary tales: a wealthy man refuses to assist a plague victim and immediately falls ill. Another is struck down after boasting about his own safety. Premature interment is a common theme. One of Defoe’s anecdotes concerns a drunk piper who passes out in the streets and is loaded onto a dead-cart, only to wake up just as he is about to be buried. In another variant of the story, he is tossed into a plague pit and terrifies the sexton the next morning by calling out from the grave. In yet another, he is thrown out of a tavern for fear that his dead-sleep will be mistaken for actual death and the whole establishment will be declared infected.

    The same tale appears in the memoirs of Sir John Reresby, a bona fide survivor of the plague of 1665, who certainly believed it to be both true and current. “It was usual for People to drop down in the Streets as they went about their Business,” he reports, and it may well have been — but the tale of the drunk piper also appears in plague tracts from 1636 and 1603. Repeated over decades or even centuries, these stories imposed a kind of narrative order on outbreaks. The residents of an infected city knew what to expect when the plague came. They were so familiar with the cultural scripts that they began to see them everywhere.

    The extent to which first-person plague narratives draw on earlier accounts makes it difficult to tease out the subjective experience of individual survivors. “To a degree, interpretations and responses to plague were copied and taught, not reinvented and coined afresh whenever plague occurred,” the historian Paul Slack has observed. When Samuel Pepys and John Evelyn talk about grass growing in the streets of London in 1665, they are quoting Paul the Deacon a thousand years earlier (whether they know it or not), and nearly everybody is citing Thucydides nearly all of the time. His account of the plague in Athens in 430 B.C. is the source of innumerable plague tropes, from the image of bodies lying unburied in the streets to the moral lesson that the disease brings about the collapse of social order. As with Sir John Reresby, there is no reason to believe that later chroniclers used these commonplaces intentionally to mislead. Expectations have a powerful ability to shape perception. Through them, the disease is tamed, familiarized, and given meaning.

    We are among the first human beings for whom the experience of a disease outbreak so severe and wide-ranging is outside of living memory. Our generation has inherited no familiar stock of coronavirus parables; no script that tells us exactly why we are suffering; no sheets of mortality statistics with an empty space left over for next time. Our fascination with plague literature is a sign that some things never change: this desire to tell and retell stories puts us in the company of every other set of survivors in recorded history. The instinct to impart structure and purpose to a fundamentally purposeless crisis might be the only truly universal response to life in a pandemic. That we should feel it so strongly is all the more remarkable in a society as blissfully and unprecedentedly pandemic-free as the developed world was at the beginning of the twenty-first century. But no more: now we have narrative resources of our own, stories of contagion and endurance and recovery, to bequeath to the vulnerable who come after us. When faced with the unimaginable, we did what we have always done: look back.

    The Doctrine of Hate

    Julius Margolin was born in 1900 in Pinsk. After studying philosophy in Germany in the 1920s he moved to Poland with his family, where he became active in Revisionist Zionism and published a Yiddish book on poetry. From there he and his family moved to Palestine. For economic reasons, Margolin returned to Poland in 1936, where he was trapped by the Nazi invasion, and was eventually imprisoned in Soviet labor camps. In July 1945 he was released and made his way back to Tel Aviv, where he wrote a pioneering memoir of the Gulag and died in 1971. The full text of Journey into the Land of the Zeks and Back was not published in his lifetime.

    After my release from Maxik’s hospital, having had an opportunity to rest, and armed with certification as an invalid, I returned to the camp regime. In Kruglitsa, a certified invalid with a higher education has a wealth of possibilities. You can choose: assist the work supervisor in compiling the lists of personnel in the brigades; work in the Cultural-Educational Sector (KVCh); or be an orderly in the barrack. Until a prisoner is taken off the official work register, he will not be sent to such unproductive work. The place for a healthy, able person is in the forest or field, where hands and shoulders are needed. The work boss will not allow an able-bodied worker to have an office or service job. An invalid is another matter. Whatever he is able and willing to do without being obliged to do so is a pure gain for the state.

    At first, I was amused at the accessibility of work from which I had been barred as a third-category worker. When they found out that Margolin had been deactivated, people immediately invited me to work in various places, and I succumbed to temptation. An invalid is allotted the first level food ration and 400 grams of bread. By working, I received the second level and 500 grams.

    For an entire month, I tried various places. After a ten-week stay in the hospital, it was pleasant to be occupied and to be listed in a job. After a month, however, I came to feel that I had been deactivated for a reason. I lacked strength. The job with the work supervisor dragged on until late at night. Work at the KVCh entailed being in motion all day, making the rounds of the barracks, rising before reveille. As a worker in the Cultural-Educational Sector, I had to get up an hour before everyone else: by the time the brigades went out to work, I had to list on the huge board at the gate the percentage of the norm that each brigade had fulfilled the previous day.

    A worker calculated these norms in the headquarters at night and, before going to sleep, he left the list for me in a desk drawer in the office. The camp was still sleeping, the dawn reddened behind the barracks, and the guards were dozing on the corner watchtowers, when I would climb with difficulty onto a stool that I had placed in front of the giant chart and begin writing in chalk on the blackened board the figures for the twenty brigades.

    This work bored me. The thought that as an invalid I was not obliged to endure this misery gave me no rest. I had been an invalid for an entire month and had not yet utilized the blessed right to do nothing; I had not taken advantage of my marvelous, unbelievable freedom. In the middle of the summer in 1943, I declared a grand vacation. At the same time, it represented a great fast: 400 grams of bread and a watery soup. It was June. Blue and yellow flowers bloomed in the flowerbeds in front of the headquarters; under the windows of the infirmary, the medical workers had planted potatoes and tobacco. In the morning, the patients crawled out to the sun and lay on the grass in their underwear or sunned themselves in the area around the barracks. When I went by, barefoot, in my mousy gray jacket without a belt, fastened by one wooden button near the collar, they shouted to me: “Margolin, you’re still alive? We thought you were gone already!”

    Without stopping, I went on to the farthest corner of the camp territory. I had a blanket, a little pencil, and paper. There was lots of paper: in the past month, I had hoarded a respectable amount. I even had a little bottle of ink from my work in the KVCh. I would take a rest from people, the camp, work, and eternal fear. I lay on my back, watching the clouds float above Kruglitsa. A year earlier, I had worked in the bathhouse and ran into the forest for raspberries. Amazingly, then I was able to carry three hundred buckets of water a day. That year depleted me. Now there were no raspberries, but neither did I have to drag water buckets. I was satisfied; it was a profound rest.

    In the summer of 1943, a storm raged over Kursk, and Soviet communiqués spoke of gigantic battles, as if all the blood receded from this great country and flowed to the single effort in that one spot. One hardly saw healthy males in Kruglitsa. Women guarded the prisoners and conducted the brigades to work. Gavrilyuk, who the past summer had been a Stakhanovite wagoner, now, like me, had been retired from work, and women prisoners worked as wagon drivers in camp. Women, like reservists, went to the first line of work. We knew from the newspapers that, throughout the country, women were working as tractor drivers, in factories, and in the fields. The free men held the battle front while the male prisoners in the camp melted like snow in the spring sun and descended under the ground. I knew that in another year I would be weaker than I was at present. If the war dragged on, I would die and not even know how it ends. Out of pure curiosity, I wanted to make it to the end of the war.

    That summer, my first grand interlude as an invalid, I wrote “The Doctrine of Hate.” That summer I was preoccupied with thoughts about hate. Lying in the grass behind the last infirmary, I returned to the topic from day to day and turned out chapter after chapter. I experienced a profound and pure enjoyment from the very process of thought and from the awareness that this thinking was outside-the-camp, normal, free thought, despite my current conditions and despite the barbed wire fence and guards. This was “pure art.” There was no one to whom I could show it or who could read what I was writing, and I felt pleasure from the very activity of formulating my thoughts, and as the work advanced I also felt proud that to a certain degree I was prevailing over hatred, was able to grasp it, and to subject it to the court of Reason.

    This subject was dictated by my life. What I had endured and seen around me was a true revelation of hate. In my previous life, I only heard or read about it, but I never encountered it personally. Neither racial nor party hatred had crossed the threshold of my peaceful home. In camp, for the first time, I heard the word “kike” directed at me, felt that someone wanted me to perish, saw victims of hate around me, and witnessed its organized apparatus. In camp, I, too, for the first time learned to hate.

    Now it was time for me to elaborate all this material theoretically. How simple it would be to go away from the haters to that bright kingdom of warmth and humanity in which I, unawares, lived before the Holocaust. It is natural for a person to live among those who love and are loved by him, not among enemies and haters. But this was not my fate. Nor was I able to resist hatred actively. The only thing that remained free in me was thought; only by thought could I respond. There was nothing else I could do but try to understand the force that wanted to destroy me.

    I was less interested in the psychology of individual hatred than in its social function, its spiritual and historical meaning. I saw hatred as a weapon or as a fact of contemporary culture.

    The most important thing, with which I began, was the dialectic of hate. Hatred is what unites people while dividing them. The link via hate is one of the strongest in history. Souls come together in hate like the bodies of wrestlers — they seek each other like wrestlers in a fight. You cannot understand hate as pure negation, because if we merely do not love or do not want something, we simply walk away from it and try to eliminate the unnecessary and unpleasant from our life. There was something in my hatred of the camp system that forced me to think about it, and I knew that my hatred would not let me forget it even when I got out of here. Hate arises in conditions when we cannot escape. Hate is a matter of proximity. Personal, class, or national hatred — it is always between cohabitants, between neighbors, between Montague and Capulet, over borderline and frontier.

    The paradox of hate is that it leaves us in spiritual proximity to that which we hate until, ultimately, there arises rapprochement and similarity. Sometimes, the hate itself turns out to be merely a concealed fear of what attracts us, as in Catullus’ poem Odi et amo, as in Hamsun’s “duel of the sexes,” as in a lackey’s hatred for the lord, and finally, in antisemitism of the maniacal type, when people cannot do without the Jews. Here is an acute example. Adolf Nowaczyński, a talented Polish writer, was a malicious hater of everything Jewish. When he approached old age, he took off for Palestine to see things with his own eyes, and it turned out that he felt quite good in Tel Aviv. This man’s life would have been empty without Jews. If they had not existed, he would have had to invent them, and ultimately that is what he did all his life. There is hatred toward fascism and even hatred of communism that derives from a certain moral closeness and, in any case, leads toward it over time. We cannot hate what is absolutely incomprehensible and alien. The incomprehensible arouses fear. Hatred, however, needs an intimate knowledge and multiplies it, and it endlessly forces us to take an interest in what we detest.

    This was the paradox of hatred that I examined from all sides while lying in the sun in the corner of the camp yard. Hatred was not only before me — it was also inside me. In me, however, it was different from the hatred against which my entire being rebelled. It thus was necessary to differentiate the various forms of hatred, in order to distinguish between the hatred that was inside me and the hatred that was odious and evil to me.

    I began by identifying some bogus and altered forms, the pseudo-hatred that only obscures the essence of the matter. I saw that things which did not belong, or which bore only an external resemblance to hatred, paraded under its label. Away with counterfeits!

    First: juvenile hatred, odium infantile. Children are capable of the most fierce, frantic hatred, but that is only “ersatz,” not serious. Juvenile hatred is a momentary reaction, an acting out. It boils up in an instant and passes without leaving a trace; it rises and bursts like a soap bubble. In essence, it is an outburst, a fit of emotional distress. This is precisely the reason why, in its mass manifestation, by virtue of its qualities of easy arousal, easy manageability, and evanescence, it is particularly suitable for the purposes of cold-blooded producers of this hatred and inciters, who always mobilize it in the masses when it is necessary to stimulate them to an extraordinary effort, to struggle in the name of changing goals. Hatred goes to the masses, flows along the channels of calculated propaganda, but it is all on the surface; it has neither depth nor stability. Left to itself, it dies out or unexpectedly changes direction, as in 1917, when the masses, filled by Tsarist governments with pogromist and front-line hatred, turned against the government itself. The savage hatred of the incited mass, like fuel in a car, turns the wheels of the military machine, but the ones at the steering wheel are calm and cool.

    Ripe, mature hatred does not have the nature of a momentary reaction; it is a person’s automatic, internally determined and stable position. It does not exhaust itself in one ferocious outburst but gnaws at a person’s entire life and lurks behind all his manifestations and deeds. Psychologically it is manifested in a thousand ways. From open hostility to blind nonrecognition, all shades of dislike, malice, vengefulness, cunning and envy, mockery, lies, and slander form the vestments of hatred, but it is not linked exclusively with any one of them. There is no specific feeling of hatred; in its extreme form, it ceases to need any kind of “expression.”

    A child’s hatred is expressed in screaming, foot stamping, and biting. The hatred of a savage, which is the same as a child’s hatred, elementary, bestial fury, is expressed in a pogrom, in broken skulls and bloodletting. There is, however, mature hatred that is expressed only in a polite smile and courteous bow. Perfect hatred is Ribbentrop in Moscow, kissing the hands of commissars’ wives, or Molotov, smiling at the press conference. We adults have learned to suppress and regulate manifestations of our hatred like a radio receiver, turning it off and on like a light switch. Our hatred is a potential force; therefore, it can be polite and calm, without external manifestations, but woe to the one who shakes an enemy’s extended hand and walks along with him.

    The second form of pseudo-hatred is odium intellectuale: the hatred of scientists, philosophers, and humanists — it is the hatred of those incapable of hating, the academic hatred of intellectuals, which was introduced as an antidote and placed as a lightning rod against barbarism. This vegetarian, literary hatred would have us hate abstract concepts — not an evil person but the evil in man, not the sinner, but sin. This hatred unceasingly exposes vices and fallacies, mistakes and deviations against which we are ordered to fight. This theoretical hatred completely fences itself off from the practical. Unfortunately, the street does not understand these fine distinctions: mass hatred recognizes only that enemy whose head one can break.

    Humanism in its essence cannot oppose hatred. We know of two attempts in the history of culture to eliminate hatred from human relations: “nonresistance to evil” and the view that the end does not justify immoral means. Passive resistance to evil, however, invariably switches to active resistance against the bearers of evil, and the question of “ends and means,” with its artificial division of the indivisible, remains intractable so long as we do not know what specific means are being used for precisely what goals. Historically, butchers and murderers invariably used abstract, theoretical hatred for their own purposes, expertly contriving to turn every intellectual product into a weapon of mass murder and unlimited slaughter.

    Christ drove the money changers out of the Temple. His successors excommunicated the heretics from the church and lit the bonfires of the Inquisition, up to Torquemada and that papal legate who, upon suppressing the Albigensian heresy, said, “Kill all of them; God will recognize his own.” The Encyclopédistes and Rousseau hated vice and believed in the triumph of virtue. The French Revolution introduced the guillotine. Marx started with the liquidation of classes and of exploitation in human relations. His followers turned Marxism into a formula of mass terror, when a “class” is destroyed not as an economic category but as millions of living, innocent people. “Kill them all; history itself will revive what it needs.” The process contains a tragically inevitable progression, and, unavoidably, the warrior-humanist becomes a captive of an alien element, as in the case of Maxim Gorky in the role of a Kremlin dignitary. The teachers either capitulate in the face of the conclusions that the pupils derive from their lessons or perish in prison or on the scaffold.

    Odium intellectuale, the theoretical hatred of scholars, thus either fails to achieve its goal or leads to results that are diametrically opposite to the original intention. Luther throws an inkpot at the devil. The devil turns the philosopher’s ink into blood and a sea of tears.

    The third form of hate that I isolated in my analysis is odium nationale, the well-meaning hatred of those who take up arms in order to halt the force of evil. Evidently, there was never a dark force that did not try to pass itself off as just and worthy. Evidently, we have no other means of distinguishing between good and evil than by Reason and Experience, which teach us to recognize the essence of phenomena from their manifestations and consequences. There is, thus, a hatred that is rational and transparent in all its manifestations. It is clear to us why and when it arises. Its logical basis is at the same time the reason for its conditional nature, as it disappears along with the causes that evoked it. This hatred is so secondary and reactive that we can easily designate it as counter-hatred. We do not need it intrinsically, but when an enemy imposes it upon us, we do not fear to take up the challenge, and we know that there are things in the world that are worth fighting against — the passion and force of survival which do not yield to the enemy’s force and passion but have nothing in common with them in their inner essence.

    Having thus carefully differentiated the historically present forms of pseudo-hatred — mass-juvenile and intellectual-abstract, and the rational counter-hatred of the warrior — I approached the eyeless monster that at the time of my imprisonment had spread over all of Europe.

    Unlike the superficially emotional, infantile hatred of the crowd, the theorizing hatred of the intellectual, and the sober, clear conviction of the defenders of humankind, there is a force of primal and pure hatred, active despite its blindness, and blind despite its initiative, and the more active the less causally provoked. It fears only the light of day. Reason is its natural enemy. 

    Haters of the world are united in their negation of freedom of the intellect. The mark of Cain by which one can recognize genuine hate is scorn of free thought, rejection of the intellect. For Hitlerism, free thought is “a Jewish invention”; for the Inquisition, it is a mortal sin; for the ideologues of communism, it is counterrevolution and bourgeois prejudice. Every basis for such hate is imaginary and pseudo-rational. It is therefore natural that the people who established forced-labor camps in Russia simultaneously eradicated freedom of discussion and the right of independent investigation there. 

    In a pure, undiluted form, hatred is self-affirmation via another’s suffering. People become haters not because their surrounding reality forces them to that. There is no sufficient basis for hatred in the external world. There is nothing in the world that could justify the annihilation of flourishing life and proud freedom undertaken by Hitler, the fires of the Inquisition, or the prisons and pogroms and the camp hell of the Gestapo and the NKVD.

    There is a pyramid of hate, higher than the Palace [of the Soviets] that is being constructed in Moscow at the cost of hundreds of millions while people are dying of starvation in the camps. At the base of this pyramid are people similar to children, wild savages, like the one who hit me with a board on the road to Onufrievka, or the SS man who shot my elderly mother on the day the Pinsk ghetto was liquidated. These people rape, destroy, and murder, but tomorrow they themselves will be the most mild and obedient and will serve the new masters and believe the opposite of what they believed yesterday, and others — just like them — will come to their homes to murder and rape. Above these people stand others who teach them and entrust them to do what they do. Above them are still others, who engage in ideology and theoretical generalizations, and those embellishers, who service the hatred, deck it out, put it to music, and dress it in beautiful words.

    Ultimately, however, at the very top of the pyramid stands a person who needs all this: the incarnation of hatred. This is the organizer, the mastermind, the engineer and the chief mechanic. He has assembled all the threads in his hands, all the subterranean streams and scattered drops of hatred; he gave it direction, a historic impetus and scope. At his signal, armies cross borders, party congresses adopt resolutions, entire peoples are exterminated, and thousands of camps are erected. And he may be kind and sweet: he may have six children as Goebbels did or a “golden heart” like Dzerzhinsky’s, an artistic nature like Nero’s or Hitler’s, and the Gorkys and Barbusses will not stop slobbering over him. He, however, decreed that somewhere people must suffer. He executed them in his mind when no one yet knew about his existence. Even then he needed this.

    This brings up a central question in the doctrine of hate: What is the makeup of a person, a society, an epoch if naked hatred has become such a necessity for them, if the senseless tormenting of their victims becomes a necessary condition of their own existence? It is not at all easy to answer this question if one does not adduce the familiar so-called arguments that the German people “were defending themselves against the Jews,” that the Inquisition was “saving souls,” or that Stalin is re-educating and reforming “backward and criminal elements” with the help of the camps. This is obvious nonsense. Of course, I in no way harmed the Germans or needed a Stalinist re-education, but even if that had been the case, it would not justify the gas chambers or turning millions of people into slaves. Germany did not need the gas chambers; the Russian people did not need the camps. But they are truly necessary for the big and little Hitlers and Himmlers, Lenins and Stalins, of the world. What, indeed, is going on?

    One must clearly recognize that the people holding the keys of power are fully aware of and admire the extent of the avalanche of human and inhuman suffering that seems like an elemental misfortune to us little people. Those people are responsible for its existence every minute and second. They have started it and control it, and it exists not because of their ignorance or impotence but precisely because they know well what they are doing, and they are doing precisely what meets their needs. Only a dull, wooden German lacking imagination, such as Himmler, needed to visit Auschwitz in person in order to look through a little window of the gas chamber to see how hundreds of young Jewish girls choked to death, girls who had been specially dispatched to execution that day for that purpose. People of the Kremlin do not need to observe personally; they have statistics about the camp death toll.

    There is no answer to why this is necessary other than to analyze the known pathological peculiarities of human nature. There is no rational, “economic,” or other explanation of hatred. The logic of hatred is the logic of madness.

    That man [Stalin] hates: He cannot do without this attitude to people; without it, he suffocates. Hate is the oxygen that he breathes. Taking hatred away from him would leave him destitute.

    That man hates, which means that some kind of inner weakness develops into hate, the result of some organic problem. Some kind of lack, defect, or unhappiness may remain within the bounds of his sense of self, but it may also spread to his social milieu and be transmitted to other people. There are wounded people, vulnerable classes, ready to turn into breeding grounds of collective hate. There are situations when people, groups, or societies are unable or unwilling to look truth in the face.

    In Vienna, young Hitler discovered that the Jews are responsible for depriving him and the German people of their deserved place in the sun. This is preposterous but, indisputably, this man started with some feeling of pain; he was deeply hurt. Had he wanted the truth, he would have found a real cause, but the truth was too much for him to bear. He therefore began to search for external guilty parties. Here the mechanism of hate begins to operate. The real pain turns into an imagined insult. An enemy and offender must be found. 

    The need for an enemy is radically different from the need for a struggle that is characteristic of every strong person. Strong people seek an arena, an outlet for strength. The hater seeks offenders to accuse. On the one hand, the need for a struggle engenders courage and initiative. On the other, the need to deal with a cunning enemy engenders aggressiveness and malice. The offender is always nearby. If he is not visible, that means he is in disguise and must be unmasked.

    All haters are great unmaskers. Instead of a mask, however, they tear off live skin, the true nature, and they replace reality with a creation of their inflamed fantasy. Hatred starts with an imaginary unmasking and ends with real flaying, not in theory but in practice.

    The analysis of our epoch given by Marx and developed by Lenin crossed all bounds of a reasonable interpretation of reality. Pseudo-rational theory turned into a Procrustean bed that did not accommodate real life. It is sufficient to compare the tirades of Mein Kampf with Lenin’s passionate polemics and his thunderous charges against capitalism to sense their psychological affinity. It is the language of hate, not of objective research. We can learn as much about reality from Marxist-Leninist scholastics as we can from the Protocols of the Elders of Zion.

    Every hatred that reworks pain into insult carries out “transference,” in the language of modern psychoanalysis. The source of the pain is internal but we transfer it to the outside. Others are to blame when things go wrong for us, when our plans do not succeed and our hopes are crushed. We thus find an outlet, a relief, but only an illusory one. Hate acquires an address — a false one. Revenge, dictated by hate, misses the mark, like a letter sent to an incorrect address. Hatred engenders a constantly hungry vengefulness. 

    An imagined or real insult becomes a pretext for hateful acts if a person has a need and desire to hate. Sooner or later, this need will be expressed in aggression. Even when there is a real, objective grievance, behind the aggression stands a murderous force, which derives from a hopeless attempt to build one’s own cursed existence on the misfortune and death of those around.

    In order to find support in the external world, this deadly force needs to falsify it. The world is not suitable as is. It is literally true that Streicher and Goebbels could not hate Jews because they did not know them at all. If they had known this people with true, live knowledge, this hatred could not have developed. Their hatred related to that distorted, deformed notion of the Jewish people that they themselves had created and that was dictated by their need to hate. In the institutions of the National Socialist Party, in the Erfurt Institute, there were enormous piles of material about the Jewish people, but the thousands of pieces served them only to create a monstrous mosaic of slander. 

    In the same way, the people who sent me to this camp did not know me. Their hatred consisted precisely of their not wanting to know me and not having hesitated to turn my life and face into a screen onto which to project an NKVD film: “A threat to society, a lawbreaker. Henceforth this person will be not what he thought he was but what we want him to be and what we shall make of him.” In order to erase my existence as they did, one had to harbor a great, formidable hatred of humanity.

    Until we uproot this hatred, it will not stop slandering people and their real impulses, will not cease circling around us, seeking out our every weakness, mistake, and sin, which are numerous, not in order to understand us and to help us but in order to blame us for its own thirst for cruelty and blood.

    Pathological hate reflects the primal instinct of the rapacious beast who knows that he can appease his agonizing hunger by the warm blood of another. Millennia of cultural development infinitely distanced and complicated this instinct by pseudo-rational sophistry and self-deception. Human rapaciousness exceeded that of the beasts, differing from it in that it manifested itself under senseless pretexts in the name of imaginary goals. The struggle against hatred is thus not limited by humankind’s biological nature but encompasses all the specifically inhuman, the perversions, and the lies that comprise the anomaly of highly developed culture and cannot be destroyed until its existence becomes common knowledge. 

    Free and perspicacious people someday will destroy hatred and create a world where no one will need to hate or oppose hatred. The human striving for freedom is incompatible with hate. Without going into complex definitions of freedom, one can agree that as it develops, freedom will steadfastly expel lies and hatred not only from the human heart but also from human relationships and the social order. Opposition to lies and hatred is thus already the first manifestation of human freedom.

    Having finished my investigation of hatred with this proud phrase, I turned over onto my back and looked around: I was lying in a meadow, on green grass at the end of the camp. The forbidden zone started five steps away and a tall palisade with barbed wire spread around. Several prisoners swarmed in the forbidden zone; they were cutting the grass and digging up earth. Under the windows of the hospital kitchen, a line of medics with buckets for soup and kasha was forming.

    I had written in the most minuscule hand, and I erased all dangerous hints. I read it with the eyes of the security operative: it was an “antifascist” document written by a stranger, but it was not blatantly counterrevolutionary. Understandably, there was not a word about Soviet reality in this manuscript. I had to keep in mind that it could be taken away at any moment in a search.

    But I had pity on my manuscript. There was no chance of hiding a work of that size for a long time in the camp. Suddenly, I had a fantastic idea. I got up and went to the KVCh, where two girls were sitting at two tables. 

    “What do you want?”

    “This is what I want,” I said slowly. “I have a manuscript of about a hundred pages. … I am an academic and wrote something in my specialty. In the barrack, you know, it’s dangerous. They’ll tear it up to use for rolling cigarettes. I want to give it to the KVCh for safekeeping. When I leave here, you’ll return it to me.”

    The girl was taken aback. She and her friend looked at me in dull astonishment, suspiciously, as at someone abnormal. In the end, she went to the phone and asked the guardhouse to connect her to the security supervisor.

    “Comrade supervisor, someone came here, brought a manuscript, and asks that we take it for safekeeping. He says that he is a scientific worker.”

    She repeated this several times over the telephone, then she turned to me:

    “Your name?” 

    I gave it.

    The girl conveyed my name, listened to the answer, and hung up the receiver. “The supervisor said,” she turned to me, hardly keeping back laughter, “‘Let him throw his manuscript into the outhouse.’”

    The Wonder of Terrence Malick

    The best American film of 2019, A Hidden Life, was little seen, and nominated for nothing. Why be surprised? Or assume that our pictures deserve awards any more than the clouds and the trees? Try to understand how movies may aspire to a culture that regards Oscars, eager audiences, and fame as relics of our childhood. The ponderous gravity of The Irishman and its reiterated gangster fantasy, the smug evasiveness of Once Upon a Time … in Hollywood, were signs that old movie habits were defunct. Parasite was no better or worse than cute opportunism. It was a wow without an echo. Whereas A Hidden Life was like a desert, known about in theory but ignored or avoided. I use that term advisedly, for Malick is a student who knows deserts are not dull or empty. They are places that can grow the tree of life as well as any forest. Simply in asking what is hidden here, Terrence Malick was leading us to ponder: what should a movie be?

    He had never volunteered for conventional schemes of ranking. His creative personality can seem masked or obscure, but his reticence is portentous too, and it belongs to no one else. Had he really taught philosophy at M.I.T. while doing a draft for Dirty Harry? Please say yes: we so want our auteurs to be outlaws. His self-effacement, his famous “elusiveness,” was often seen as genius. Yet some early admirers felt he had “gone away” in the twenty-first century, or migrated beyond common reach. People regarded his private and idiosyncratic work as egotism, no matter how beautiful it might be. Some were disinclined even to try A Hidden Life after the inert monuments that had preceded it. But it was — I say it again — the best American film of 2019, a masterpiece, and it invited us to try and place Malick, and to ponder if our “map” was part of the problem. To put it mildly, A Hidden Life does not seem American (or even Austrian, though it was set and filmed in Austria). It is occurring in cultural memory as a sign of what we might have been.

    There was never a pressing reason to make up our minds about Malick. He was casual, yet lofty; he might be an artist instead of a regular American moviemaker in an age when it was reckoned that tough pros (like Hawks and Hitchcock) made the best pictures. Thus he began with two unwaveringly idiosyncratic films — Badlands in 1973 and Days of Heaven in 1978. He took in their awed reception and then stopped dead for twenty years, and let his reputation become an enigma. Did he really prefer not to appear with his movies, or give helpful interviews, so that he could be free to pursue ornithology and insect life? Was he unpersuaded by careerist plans, or cleaning up in the manner of Spielberg or Lucas? In never winning an Oscar, he has made that statuette seem a stooge.

    It has always been hard to work out his intentions. Going on the titles, Badlands could be a perilous vacation, while Days of Heaven might promise transcendence. In the first, across the empty spaces of the Dakotas and Montana, Kit Carruthers found his daft halcyon moment of aimlessness while being taken for James Dean, while in the latter, in its gathering of rueful magic hours, we encountered a broken family where a guy was shot dead, his girl was thinking of being a hooker to survive, and the kid sister was left alone with her mannered poetry (like Emily Dickinson voiced by Bonnie Parker). In its locusts and fire, and a screwdriver thrust in the farmer’s fragile chest, Days of Heaven spoke to the ordeal of frontier people in 1916 going mad, skimming stones at entropy, or posing for the pictures in Wisconsin Death Trip (published by Michael Lesy in the year Badlands opened). The two films together said that America was an inadvertently gorgeous place full of terrors. 

    Those early films were filled with love and zest for drab characters buried in the hinterland yet nursing some elemental wonder. But decades later, in 2012, To the Wonder felt like a crushing title for a film that had lost touch with ordinary poetry. Its women were models fleeing from Vogue. Whereas Sissy Spacek as Holly in Badlands (twenty-four yet doing fifteen without strain or condescension) was somehow routine as well as brilliant. Her unwitting insights hovered at the brink of pretension, but any doubt we had was lost in captivation for this orphan who packed vivid party dresses for her violent spree into emptiness. This was after Kit had shot her father dead, not just because dad didn’t approve of a garbage collector outlaw going with his Holly, but because he hadn’t the patience to listen to the rambling solo that was so folksy and menacing — “Oh, I’ve got some things to say,” Kit promised. “Guess I’m lucky that way.”

    And Holly did feel wonder for this vagrant actor. It was there in the flat adoration that Spacek offered him. She slapped his face for killing Dad, but then went along with him, too matter of fact to pause over spelling out love, but utterly transported by this signal for young getaway. Badlands was akin to Bonnie and Clyde, but you felt that Kit and Holly were in a marriage they did not know how to express. And they were sustained by Malick’s amused affection. He was close to patronizing his couple, maybe, making them babes in the woods in a surreal frame, but he felt their romance as much as he was moved by sunsets and the childish tree houses that they built. They were savage yet humdrum, and Kit’s killings were as arbitrary or impulsive as his funny chat. Yes, he was psychotic and headed for the electric chair, but the sweet interior desolation of their America understood them and treated them kindly. When Kit was captured at last, the horde of cops, sheriffs, and soldiers recognized that he was a cockeyed hero, the escapee they had dreamed of.

    One can love Bonnie and Clyde, but that love story is self-conscious about its lust for fame; it really was the heartfelt cry of Beatty and Dunaway and a generation yearning to be known. Badlands, by contrast, is so casual or inconsequential, and so appreciative of a wider span of history, the one we call oblivion. It has a notion that vagrancy and lyricism were responses to the heart of it all, the vast stretch of land where badness is as implicit as beauty. Bonnie and Clyde do not notice where they are, but Kit and Holly are specks in an emptiness as infinite as breathing. It’s only now, in retrospect, that the film seems so intoxicated with talk and its futile liberty, when Malick was headed towards a sadder future in which his stunned characters said less and less, and sometimes became so reduced to half-stifled sighs you wished they’d shut up. That early Malick loved loose talk. Compared with the directors of the early 1970s he was like a muttering Bartleby alone in a crew of insistent press-agented Ahabs.

    This leaves you wondering how few authentic outsiders American film has permitted.

    Malick was thirty-four when Days of Heaven opened, the son of a geologist, born in a small town in Illinois, of Assyrian and Lebanese descent. He graduated from Harvard, and then went on to Oxford as a Rhodes scholar, without getting any degree there. The general estimate is that he was brilliant, as witness his published translation of Heidegger’s The Essence of Reasons. But who has read that book, or is in a position to judge the translation? So it’s part of the uncertain myth that includes our wondering over whether Malick has had private money. Or some lack of ordinary need for it. How has he seemed so unprofessional?

    He is credited with the script for Pocket Money (1972), a Stuart Rosenberg film with Paul Newman and Lee Marvin that is more odd than striking. But it led to Badlands, for which he had some money from the young producer Edward Pressman, from the computer pioneer Max Palevsky, and from a few people he knew. All of which meant it wasn’t a regular production like other films of 1973 — The Exorcist, Mean Streets, The Sting, American Graffiti, The Way We Were. Caught between two parts of The Godfather, it didn’t seem to hear them or know them. Badlands may have cost $300,000. Warner Brothers bought the picture and released it: it closed the New York Film Festival in 1973, and if it perplexed audiences, there was a sense that something rare and insolent had passed by. Badlands didn’t care what we felt: suspense and attention were mere horizons in its desert, not luxury hotels. It was an American product, but it had a more hushed European ambition. You could label it a Western if you were ready to agree that Hollywood, born and sited in the West, never knew or cared where it was.

    Some tried to see Badlands as a slice of picaresque life. We knew it was derived from a real case, a minor outrage on the remote prairie. In 1957-1958, in Nebraska mostly, the nineteen-year-old Charles Starkweather had killed ten people with fourteen-year-old Caril Ann Fugate as his companion. Fugate actually served seventeen years in prison, no matter that in the movie she says she married the son of her lawyer. (That was a prettification or a kind of irony.) And there was more real grounding in the steady assertion that Martin Sheen’s Kit was a lookalike for James Dean and therefore rooted in popular culture. Kit and Holly dance to Mickey and Sylvia singing “Love is Strange,” from 1956.

    Strange was only half of it. In 1973, the feeling that sex was at hand on the screen was still pressing. As Kit took off with Holly, it was natural to think they would soon be fucking. Malick allowed an offhand obligation to that craze — $300,000 carried some box office responsibility — but he was too unimpressed or philosophical to get excited about it. Married three times by now, he doesn’t do much sex on screen. “Did it go the way it’s supposed to?” Holly asks Kit about their unseen coupling. “Is that all there is to it? Well, I’m glad it’s over.” All said without prejudice to their affinity or their being together.

    The absent-minded talk meant more than the way Sissy Spacek secured the top button on her dress “afterwards.” After all, her character was fifteen and he was twenty-five. Yet they were both children in Malick’s art. And then, like kids, they lost interest in their adventure, even in sex, the sacrament in so many pictures of the 1970s. The novelty of Badlands was its instinct that life was boring or insignificant. And that was asserted amid a culture where movies had to be exciting, urgent, and “important.”

    Malick knew that “importance” was bogus. Or he had his eye on a different order of significance. And other truths and differences were inescapable in his film: that no runaway kids had the temerity or the rhythm for talking the way these two did; that stranger than “Love is Strange” was the way Carl Orff and Erik Satie played in their summer as warnings against “realism.” The people in the film were not just movie characters, they were shapes in a mythology. A similar thing happened in Days of Heaven with its triangle of attractions, where Richard Gere, Brooke Adams, and Sam Shepard seemed unduly pretty for the Texas Panhandle. Malick had narrative problems on that picture which he solved or settled by summoning the voice of Linda Manz’s kid sister — a laconic, unsentimental, yet dreamy observer of all the melodrama. (The voice was sister to Holly, too.) She was part of the family, but her voiceover let us feel the narrative was already buried in the past, and nothing to fret over. Life itself was being placed as an old movie.

    Days of Heaven was extreme in its visualization: it included a plague of locusts, which was an epic of cinematography and weird special effects, involving showers of peanut shells and characters walking backwards. But the quandary of the Brooke Adams character, in love with two men, both unlikely in the long term, was the closest Malick had come to novelistic drama. I still feel for Shepard’s farmer, a rich man at a loss with feelings, though Malick had the sense to save the reticent Shepard from “acting.” Instead he was simply photographed, as gaunt and theoretical as his great house planted on the endless prairie. Just as he was shy of sex, so Malick the director was hesitant over what the world called story.

    No great American director has carried himself with such indifference as to whether he was being seen, let alone understood.  To see Malick’s work has always been a way of recognizing that the obvious means of doing cinema — appealing stories with likeable actors that move us and make money — was not settled in his mind. I think that is one reason why he stopped for twenty years — just to remain his own man, and not to yield to the habit of eccentric beauty in case it became studied, precious, or crushingly important.

    Thus, in 1998, The Thin Red Line seemed to believe in a new kind of authenticity and directness. Wasn’t it “a war movie”? Didn’t it make more money than Malick has ever known? Wasn’t it about killing the enemy, that blessed certainty that films provide for fearful guys? It offered Guadalcanal in 1942, and it came from a James Jones novel, the writer of From Here to Eternity, which for someone of Malick’s age really was the Pacific War, despite being short on combat and going no farther west than Hawaii. The Thin Red Line is the infantry, landing on an island, and reckoning to take its peaks and destroy the enemy. It is a man’s world that male audiences might relax with. There are only fragmentary glimpses of women left at home — a rapturous shot of an occupied dress on a summer swing, something that would become an emblem of happiness in Malick’s work.

    But nothing competes with the ferocity of Colonel Tall, played by Nick Nolte in the most intense performance in a Malick picture, as a commander whose orders were abandoned and denied. That is not how war films are supposed to work: no one ever challenged John Wayne at Iwo Jima or Lee Marvin in The Big Red One. But Malick’s thin red line is less conventional or reliable. It finds its example in the wistful instinct to desert on the part of a common soldier, Private Witt (Jim Caviezel). For Jones, Witt was an extension of the brave victim Prewitt whom Montgomery Clift played in From Here to Eternity, but for Malick the lonely private is another version of Bartleby, who gives himself up finally not just in heroism but in almost yielding to hesitation.

    Maybe this was once a regular combat picture, to be set beside the work of Sam Fuller or Anthony Mann. But not for long: inscape pushes battle aside for a contemplation of tropical grasses tossing in the wind, insect life, parrots and snakes, intruded on for a moment by war but not really altered by it. Malick has an Emersonian gift for regarding human affairs from the standpoint of nature. It is in the perpetuity of nature that Malick perceives the strangeness, and the humbling, in Earth’s helpless duration. This war prepares us for the bizarre little dinosaurs in The Tree of Life, and the unnerving perspective in which we observe or suffer the earnestness of Sean Penn in that film.

    That touches on a fascinating atmosphere attached to Malick and his blithe treatment of stars. In his long absence from the screen, the glowing characters in those first two films seemed to attract actors, as if to say it might be them, too. He seemed as desirable for them as Woody Allen — and sometimes with a similar diminution of felt human reality. He must have been flattered that so many stars wanted to work for him; he may have forgotten how far he had excelled with newcomers or unknowns. Still, I found it disconcerting when John Travolta or George Clooney suddenly turned up in spiffy, tailored glory in The Thin Red Line, and one had the feeling with The Tree of Life that Sean Penn was vexed, as if to say, “Doesn’t Terry know I’m Sean Penn, so that I deserve motivation, pay-off, and some scenes, instead of just wandering around?” Led to believe he was central to The Thin Red Line, Adrien Brody was dismayed to find he had only a few minutes in the finished film.

    Was this just an experimenter discovering that his film could remain in eternal post-production? Or was it also a creeping indifference to ordinary human story? Was it an approach that really required novices or new faces? How could big American entertainments function in this way? How was Malick able to command other people’s money on projects that sometimes seemed accidental or random, on productions that had several different cuts and running times? He seemed increasingly indecisive and fond of that uncertainty, as if it were a proof of integrity. Was he making particular films, or had the process of filming and its inquiry become his preoccupation? How improvisational a moviemaker is he? And what were we to make of its end products — or was “the end” a sentimental destination mocked by the unshakable calm of duration? How could anyone get away with The Thin Red Line costing $90 million and earning back only a touch more? I could make a case for The Thin Red Line as Malick’s best film and the most intellectually probing of them all. But “best” misses so many points. To shoot it, Malick had gone to the jungles of northern Queensland and even the Solomon Islands. The weapons and the uniforms seemed correct, but the hallowed genre of war movie was perched on the lip of aestheticism and absurdity and surrealism.

    As a world traveler and a naturalist — his nature films are certainly among his most marvelous achievements — Malick was especially sensitive to terrain. For The New World, in 2005, he went to the real sites, the swampy locations, of early settlement in Virginia. He researched or concocted a language such as the natives might have spoken. His tale of John Smith, Pocahontas, and John Rolfe has many enthusiasts for its attempt to recreate a time so new then and so ancient now. This was also a historical prelude to the wildernesses in Badlands and Days of Heaven. It might even begin to amount to a history of America.

    I had worries about the film, and I have never lost them. Its Pocahontas was exquisite and iconic, even if the picture tried to revive her Powhatan language. But the actress, Q’orianka Kilcher, was also part German, part Peruvian, raised in Hawaii, a singer, a dancer, a stunt performer, a princess of modernity, with evident benefit of cosmetics and a gymnasium. Whereas Sissy Spacek in Badlands had a dusty, closed face credible for the look of a kid from Fort Dupree in South Dakota in the 1950s, uneducated, indefatigably unradiant, born bored, more ready for junk food than primetime fiction. That background was what made Holly so absorbing, and it was Kilcher’s emphatic beauty that shifted The New World away from urgency or naturalism. It was as if Angelina Jolie or Joan Crawford were pretending to be the Indian maiden.

    In a way, Pocahontas was the first adult female in Malick’s work, but was that a warning sign that maybe he didn’t fathom grown-up women once they had got past the wry baby talk that makes the first two films so endearing? The New World did not really have much caring for Native Americans, for women, or for the challenge of Europeans determined to take charge of any viable Virginia. It was a film that opted for the picturesque over history, whereas Badlands and Days of Heaven lived on a wish to inhabit and understand America in the unruly first half of the twentieth century as a wilderness succumbing to sentimentality. But the picturesque has always been a drug in cinema, and it had been lurking there in the magic hours in Days of Heaven.

    There was a gap of six years before the pivotal The Tree of Life, perhaps Malick’s most controversial film. Here was a genuinely astonishing picture, ambitious enough to range from intimacy to infinity. In so many ways, it was an eclipsing of most current ideas of what a movie might be. At one level, it was entirely mundane, the portrait of two parents and their three sons in a small town in Texas in the 1950s. For Brad Pitt (a co-producer on the project), the father was a landmark role in which he allowed his iconic status to open up as a blunt, stubborn, unenlightened man of the 50s. Jessica Chastain was the mother, and she was placid but eternal — she was doing her pale-faced best, but surely her part deserved more substance to match not just Pitt but the wondrous vitality of the boys (Hunter McCracken, Finnegan Williams, Michael Koeth, and Tye Sheridan).

    All his working life, Malick has excelled with the topic of children at play, and as emerging forces who jostle family order. Don’t forget how in his first two pictures adult actors were asked to play child-like characters. The family scenes in The Tree of Life are captivating and affirming with a power that is all the more remarkable because the subject of the film is the family’s grief at the death of one of these children. The Tree of Life insists that the death of a child is a cosmic event. Not long after the young man’s death is announced, and before the story of the family is told in flashback, there is an unforgettable yet pretentious passage shot with almost terrifying vividness from nature — the bottom of the sea, the fires of a volcano, the reaches of space — accompanied by religious music. With an epigraph from Job, the real subject may be sublimity itself.

    No one had ever seen a film quite like it. Reactions were very mixed. The picture won the Palme d’Or at Cannes; it had many rave reviews; it did reasonable business. There were those who felt its perilous edging into pretension and a sweeping universality in which the movie vitality of the family succumbed to the melancholy of grazing dinosaurs who had never been moviegoers. But there were more viewers who recognized an exciting challenge to their assumptions. The Tree of Life prompted a lot of people in the arts and letters to revise their ideas about what a movie might be. Pass over its narrative situation: this was a film to be measured with Mahler’s ruminations on the universe or with the transcendent effects of a room full of Rothkos.

    And then Malick seemed to get lost again. He veered away from the moving austerity of Days of Heaven to a toniness more suited to fashion magazines. There was widespread disquiet about his direction, owing to the modish affectation in To the Wonder (2012), Knight of Cups (2015) and Song to Song (2017). From a great director, these seemed confoundingly hollow films that almost left one nostalgic for the time when Malick worked less.

    Ironically, To the Wonder is the one film for which he has owned up to an autobiographical impulse. It grew out of hesitation over his third and fourth wives, presented in the movie as Olga Kurylenko and Rachel McAdams, two unquestioned beauties. McAdams delivers as good a performance as Brooke Adams in Days of Heaven, but there are moments where her character’s frustrations could be interpreted as the actress’ distress over poorly written material. Malick was now running scared of his ear for artful, quirky talk. But the women in To the Wonder are betrayed by the worst example of Malick’s uninterested stars. Ben Affleck is the guy here, allegedly an “environmental inspector.” That gestural job allows some moody depictions of wasteland and some enervated ecstasy over the tides around Mont-Saint-Michel in France. Yet the situation feels the more posed and hollow because of Affleck’s urge to do as little as possible. His hero is without emotional energy; he deserves his two women as little as male models earn their expensive threads in fashion spreads. The film’s clothes are credited to the excellent Jacqueline West, but they adorn a fatuous adoration of affluence.

    West was part of Malick’s modern team: the film’s producer was Sarah Green; the engraved photography was by the extraordinary Emmanuel Lubezki; the production design was from Jack Fisk still, who had held that role since Badlands, where he met and then married Sissy Spacek; the aching music was by Hanan Townshend in a glib pastiche of symphonic movie music — it was so much less playful or spirited than the score for Badlands. The only notable crew absentee was Billy Weber, who has been the editor on many Malick pictures. To the Wonder is said to have earned $2.8 million at the box office, and it’s hard to believe it cost less than $20 million. If that sounds like a misbegotten venture, wait till you struggle through it and then wonder what let Malick make another film in the same clouded spirit, Knight of Cups. And then another: Song to Song, the ultimate gallery of beautiful stars, supposedly about the music world of Austin, which came off semi-abstract no matter that Malick had lived there for years.

    Any sense of experience and vitality seemed to be ebbing away. Was he experimenting, or improvising, or what? The several loyalists involved, as well as those players who were filmed but then abandoned, might say it was a privilege to be associated with Terry. I long to hear some deflating rejoinders to that from Kit Carruthers. There was a wit once in Malick that had now gone missing. I say this because a great director deserves to be tested by his own standards, which in Malick’s case are uncommonly high. Even with the more adventurous Christian Bale as its forlorn male lead — a jaded movie screenwriter — Knight of Cups is yet more stultifyingly beautiful and Tarot-esque, with a placid harem of women (from Cate Blanchett to Isabel Lucas, from Imogen Poots to Natalie Portman), all so immediately desirable that they do not bother to be awake. Richard Brody said it was “an instant classic,” which only showed how far “instant” and “classic” had become invalid concepts. The film earned a touch over $1 million, and it had disdain for any audience. It was a monument to a preposterous cinephilia and to a talent that seemed in danger of losing itself.

    Those are harsh words, but I choose them carefully, after repeated viewings, and in the confidence that Badlands, Days of Heaven and The Thin Red Line are true wonders. The Terrence Malick of early 2019, passing seventy-five, was not a sure thing. And then he retired all doubt about his direction and released his fourth great film; and surely four is enough for any pantheon.

    Malick had been contemplating A Hidden Life and the historical incident upon which it is based for a few years. In 1943, Franz Jagerstatter was executed in Berlin for refusing to take an oath of loyalty to Adolf Hitler. He was a humble farmer high in the mountains of northern Austria, where he lived with his wife, his three daughters, his sister-in-law, and his mother. They were valued members of a small community and worked endlessly hard to sustain their meager living. They were devout Catholics, and Franz had done his military service without thinking too much about it. His farm and his village are surrounded by breathtaking natural beauty, and Malick lingers long over the fields and the peaks and the clouds in a way that teaches us that even Nazism is ephemeral.

    The film has few long speeches in which Jagerstatter spells out his reluctance to honor the Nazi code. He is more instinctive than articulate. He knows the fate he is tempting; he understands the burden that it will put upon his wife and children; he appreciates that he could take the oath quietly and then do non-combatant service. It is not that he understands the war fully or the extent of Nazi crimes. He is not a deliberate or reasoned objector. But just as he feels the practical truths in his steep fields and in the lives of his animals, and just as he is utterly loyal to his wife, so he believes that the oath of allegiance will go against his grain. He does not show a moral philosophy so much as a moral sense. He cannot make the compromise with an evasive form of words.

    There is no heavy hint in A Hidden Life of addressing how Americans in our era might withhold their own allegiance to a leader. But the film rests on a feeling that such cues are not needed for an alert audience living in the large world. We are living in a time that will have its own Jagerstatters. That is part of the narrative confidence that has not existed in Malick since Days of Heaven. It amounts to an unsettling detachment: he shares the righteousness of Jagerstatter, but he does not make a fuss about his heroism. In the long term of those steep Alps and their infinite grasslands, how much does it matter? Do the cattle on the farm know less, or are they as close to natural destiny as the farmer’s children?

    That may sound heretical for so high-minded a picture. And there is no escaping — the final passages are shattering — how Jagerstatter is brutalized and then put to death by the Nazi torturers and executioners. The Catholic Church would beatify him one day, and Malick has taken three hours to tell what happened, but the film has no inkling of saintliness or a cause that could protect it. The farmer’s wife, rendered by Valerie Pachner as sharp and uningratiating, does not need to agree with her man, or even to understand him. People are alike but not the same, even under intense pressure. No one could doubt Malick’s respect for Jagerstatter, and August Diehl is Teutonically tall, blond, and good-looking in the part. But he is not especially thoughtful; his doubts over the oath are more like a limp than a self-consciously upright attitude. Certainly the old Hollywood scheme of a right thing waiting and needing to be done leaves Malick unmoved; he would prefer to be a patient onlooker, a diligent chronicler, attentive and touched, but more rapt than ardent, and still consumed by wonder.

    Malick has admitted that he got into the habit of working without a script (or a pressing situation), so that he often filmed whatever came into his head. But he seems to have learned how far that liberty had led him astray. So A Hidden Life has as cogent a situation as those in Badlands and Days of Heaven. That does not mean those three films are tidy or complacent about their pieces clicking together. They are all as open to spontaneity and chance as The Thin Red Line. But just as it is trite and misleading to say that The Thin Red Line was a film about war, so A Hidden Life feels what its title claims: the existence of an inwardness that need not be vulgarized by captions or “big scenes.” The film concludes with the famous last paragraph of Middlemarch, about the profound significance of “hidden lives” and “unvisited tombs.” Yes, this is what a movie, a heartbreaking work, might be for today. As for its relative neglect, just recall the wistful look on the dinosaur faces in The Tree of Life.

    We can do our best, we can make beauty and find wisdom, without any prospect of being saved from oblivion.

    Owed To The Tardigrade

    Some of these microscopic invertebrates shrug off temperatures
    of minus 272 Celsius, one degree warmer than absolute zero.
    Other species can endure powerful radiation and the vacuum of space.
    In 2007, the European Space Agency sent 3,000 animals
    into low Earth orbit, where the tardigrades survived
    for 12 days on the outside of the capsule.

    The Washington Post, “These Animals can survive until the end
    of the Earth, astrophysicists say”

    O, littlest un-killable one. Expert
    death-delayer, master abstracter

    of imperceptible flesh. We praise
    your commitment to breath.

    Your well-known penchant
    for flexing on microbiologists,

    confounding those who seek
    to test your limits using ever more

    abominable methods: ejection
    into the vacuum of space, casting

    your smooth, half-millimeter frame
    into an active volcano, desiccation

    on a Sunday afternoon, when the game
    is on, & so many of us are likewise made

    sluggish in our gait, bound to the couch
    by simpler joys. Slow-stepper, you were

    called, by men who caught first
    glimpse of your eight paws walking

    through baubles of rain. Water bear.
    Moss piglet. All more or less worthy

    mantles, but I watch you slink
    through the boundless clarity

    of a single droplet & think
    your mettle ineffable, cannot

    shake my adoration
    for the way you hold fast

    to that which is so swiftly
    torn from all else living,

    what you abide in order
    to stay here among the flailing

    & misery-stricken, the glimpse
    you grant into limitless

    persistence, tenacity
    under unthinkable odds,

    endlessness enfleshed
    & given indissoluble form.

    A Democratic Jewish State, How and Why

    The question of whether Israel can be a democratic Jewish state, a liberal Jewish state, is the most important question with which the country must wrestle, and it can have no answer until we arrive at an understanding of what a Jewish state is. A great deal of pessimism is in the air. Many people attach to the adjective “Jewish” ultra-nationalistic and theocratic meanings, and then make the argument that a Jewish democratic state is a contradiction in terms, an impossibility. On the left and on the right, among the elites and the masses, people are giving up on the idea that both elements, the particular and the universal, may co-exist equally and prominently in the identity of the state. This way of thinking is partly responsible for the recent convulsions in Israeli politics, for the zealotry and the despair that run through it. Yet it is an erroneous and unfruitful way of thinking. It rigs the outcome of this life-and-death discussion with a tendentious and dogmatic conception of Judaism and Jewishness.  

    There is another way, a better way, to arrive at an answer to this urgent and wrenching question. Let us begin by asking a different one, a hypothetical one. Let us imagine the problem in a place that is not Israel or Palestine. Could a Catalan state, if it were to secede from Spain, be a democratic Catalan state, a liberal Catalan state? Catalan nationalism is a powerful force, and many Catalans wish to establish an independent state of their own with Barcelona as its capital, based on their claim that they constitute a distinct ethnocultural group that deserves the right to self-determination. Though recent developments in Spain have shown that the establishment of an independent Catalan state is far from becoming a reality in the near future, let us nonetheless consider what it might look like. In this future state — as in other European nation-states, such as Denmark, Finland, Norway, Germany, the Czech Republic, and others that have a language and state symbols that express an affinity to the dominant national culture — the Catalan language would be the official language, the state symbols would be linked to the Catalan majority, the official calendar would be shaped in relation to Christianity and to events in Catalan history, and the public education of Catalans would ensure the vitality and the continuity of Catalan culture, transmitting it to the next generation. Revenues from taxation would be distributed solely among Catalan citizens and not across Spain, and the foreign policy of the Catalan state would reflect the interests of the ethnocultural majority of the state. It is very probable that Catalunya’s immigration policy, like that of all contemporary European and Scandinavian states, would attempt to safeguard the Catalan majority in its sovereign territory.

    It is important to note that these aspects of a Catalan state would not reflect anything unusual in the modern political history of the West. The Norwegians, for example, demanded all these characteristics of statehood in 1905, when they seceded from Sweden (under threat of war), since they saw themselves as a separate national group. In the matter of identity, Catalunya, like Norway, would not be a neutral state in any meaningful fashion, and there is no reason that it should be a neutral state. Members of the Catalan group deserve a right to self-determination, which includes a sovereign territory inhabited by a Catalan majority in which a Catalan cultural public space is created and the culture of the majority is expressed. 

    But this is not all we would need to know about a Catalan nation-state that purports to be a democracy. The test of the question of whether Catalunya, or any other state, is democratic is not dependent upon whether it is neutral with respect to identity. Its moral and political quality, its decency, its liberalness, will be judged instead by two other criteria. The first is whether its character as a nation-state results in discriminatory policies towards the political, economic, and cultural rights of the non-Catalan minorities that reside within it. The second is whether Catalunya would support granting the same right of self-determination to other national communities, such as the Basques. Adhering to these two principles is what distinguishes democratic nation-states from fascist ones. 

    Ultra-nationalist states are sovereign entities in which the national character serves as a justification for depriving minorities of political, economic, and cultural rights. In the shift to ultra-nationalism that we are witnessing around the world today, such states also attack and undermine the institutions that aim at protecting minorities — the independent judiciary, the free press, and NGOs dedicated to human and minority rights. In addition, ultra-nationalist states do not support granting rights of self-determination to nations that reside within them or next to them. They generally claim that no such nations exist, or that the ethnic groups that call themselves a nation do not deserve the right to self-determination.

    The legitimacy of Israel as a nation-state should be judged just as we would judge any other nation-state, according to these two principles. If, in the name of the Jewish character of the state, the Arab minority in Israel is deprived of its rights, the very legitimacy of the State of Israel as a Jewish nation-state will be damaged. Discrimination in the distribution of state resources in infrastructure, education, and land, and the refusal to recognize new Arab cities and villages in the State of Israel, threaten to transform it from a democratic nation-state into an ultra-nationalist state. Such a threat to the democratic character of the state is posed also by recent legislative attempts (which fortunately have failed) to demand a loyalty oath solely from Israel's Arab citizens. The threat is heightened by a political plan put forth by elements of the Israeli radical right, which, in a future agreement with the Palestinians, would deny Israeli citizenship to Israeli Arabs, by virtue of a territorial exchange that would include their villages in the territory of a future Palestinian state. This is to act as if the Israeli citizenship of the Arabs of Israel is not a basic right, but a conditional gift granted to them by the Jewish nation-state — a gift that can be rescinded to suit the interests of Jewish nationalism. The Nation-State law that was passed by the Israeli parliament in 2018, which formulates the national identity of the country in exclusively Jewish terms, is an occasion for profound alarm, in particular in its glaring omission of an explicit commitment to the equality of all Israeli citizens, Jews and Arabs alike. Such a commitment to the equality of all citizens was enshrined in Israel's Declaration of Independence, the founding document that to this day contains the noblest expression of the vision of Israel as Jewish and democratic. The commitment to the equality of all citizens might be legally and judicially ensured in relation to other basic laws in Israel's legal system, yet its striking absence from this latest official articulation of the character of the state is yet another marker of the drift to ultra-nationalism. 

    The structural discrimination manifested in these examples constitutes an unjustified bias against the Arab citizens of Israel. It also serves to undermine the very legitimacy of the Jewish state. A Jewish nation-state can and must grant full equality to its Arab citizens in all the realms in which it has failed to do so until now. It must recognize them as a national cultural minority, with Arabic as a second official language of the state and the Islamic calendar as an officially recognized calendar. The public educational system must be devoted, among other goals, to the continuity of the Arab cultural traditions of Israel’s citizens. 

    In the recent elections held in Israel, three within a single year, the participation of the Arab citizens of Israel in the vote increased by 50%, reaching very close to the percentage of the vote among Jewish citizens. This is a wonderful and encouraging sign of the greater integration of the Arab population in larger Israeli politics. As a result the Joint List, the Israeli Arab party, which encompasses different ideological and political streams in the Arab community of Israel, increased its seats in Israel’s Knesset from ten to fifteen — an extraordinary achievement. But its positive impact was undone by the disgraceful failure of the left and center to form a government with the Joint List on the grounds that a government that rests on the Arab vote is unacceptable. Thus was lost an historic opportunity to integrate the Arab minority as an equal partner in sharing governmental power.  

    As is true of all other legitimate democratic nation-states, the second condition that Israel must maintain is the recognition of the right of the Palestinian nation to self-determination in Gaza and the West Bank — the same right that Jews have rightly demanded for themselves. The denial of such a right, and the settlement policy that aims at creating conditions in which the realization of such a right becomes impossible, similarly damage the legitimacy of Israel as a Jewish nation-state. The Trump plan for peace includes, among its other problematic aspects, the annexation of the Jordan Valley to the state of Israel, which would constitute yet another significant impediment to the possibility of a two-state solution. If any Israeli government includes such an annexation in its plans, it will also create de facto conditions that will undermine the possibility of a Jewish democratic state in the future. 

    It is important to stress that the fulfillment of the first condition — equal rights to minorities — is completely within Israel's power. Discrimination against citizens of your own country is always a self-inflicted wound. The second condition, by contrast, the recognition of the Palestinian right to self-determination, is not exclusively in the hands of Israel. The conditions of its realization are much more complicated. It depends to a significant degree upon Palestinians' willingness to live side by side with the State of Israel in peace and security. The situation with regard to the possibility of such co-existence is difficult and murky and discouraging on the Palestinian side — and yet Israel must nevertheless make clear its recognition of the Palestinian right to self-determination, not least for the simple reason that achieving it will lend legitimacy to Israel's own claim to the same right.  

    If democracy and decency do not require cultural neutrality from a nation-state, then how should the identity of the majority be recognized in such a state without vitiating its liberal principles? There are four ways, I believe, that the Jewish nature of the State of Israel should be expressed. The first is to recognize the State of Israel as the realization of the Jewish national right to self-determination. In this era, when the meaning of Zionism is mangled and distorted in so many quarters, it is important to recognize what Zionism incontrovertibly is: a national liberation movement aimed at extracting a people from the historic humiliation of dependence on others in defining their fate. That remains its central meaning. Zionism gave one of the world’s oldest peoples, the Jewish people, the political, military, and economic ability to define themselves and defend themselves.  

    The most fundamental feature of Israel as a Jewish state resides, therefore, in its responsibility for the fate of the Jewish people as a whole. If the responsibility of the State of Israel were confined only to its citizens, it would have been only an Israeli state. In light of this responsibility to a people, it has the right and the duty to use the state’s powers to defend Jews who are victimized because they are Jews.

    The second feature that defines Israel as a Jewish state is the Law of Return. This law, which was established in 1950, and is intimately connected to the first feature of national self-determination, proclaims that all Jews, wherever they are, have a right to citizenship in the State of Israel, and can make the State of Israel their home if they so desire. The State of Israel was created to prevent situations — plentiful in Jewish history — in which Jews seeking refuge knock on the doors of countries that have no interest in receiving them. For the same reason, Palestinian refugees in the Arab states ought to have immediate access to citizenship in the state of Palestine when it is established.

    Yet the justification of the Law of Return does not rest exclusively on conditions of duress. If national groups have a right to self-determination — the right to establish a sovereign realm where they constitute the majority of the population, and where their culture develops and thrives — it would be odd not to allow Jews or Palestinians a right of citizenship in their national territory. It is also important to emphasize that the Law of Return is legitimate only if accompanied by other tracks of naturalization. If the Law of Return were the only way of acquiring Israeli citizenship, its exclusively national character would harm the rights of minorities and immigrants who are not members of the ethnic majority. Safeguarding the ethnocultural majority in any state is always severely constrained by the rights of minorities. Thus the transfer of populations, or the stripping of citizenship by the transfer of territory to another state, are illegitimate means of preserving a majority. It is crucial, therefore, that other forms of naturalization exist as a matter of state policy, including granting citizenship to foreign workers whose children were born and grew up in Israel, and to men and women who married Israeli citizens.

    The third expression of the Jewishness of the State of Israel relates to various aspects of its public sphere, such as its state symbols, its official language, and its calendar. These symbolic institutions are derived from Jewish cultural and historical symbols, including the menorah and the Star of David; Hebrew is the official language; Israel’s public calendar is shaped according to the Jewish calendar; and the Sabbath and Jewish holidays are official days of rest. Yet a democratic state demands more. The public expression of the majority culture must go along with granting official status to the minority cultures of the state, including Arabic as the second official language of the state of Israel, and recognizing the Islamic calendar in relation to the Arab minority. Again, official symbols and practices that have an affinity to the majority culture exist in many Western states: in Sweden, Finland, Norway, Britain, Switzerland and Greece, the cross is displayed on the national flag. In all those cases, the presence of state symbols that are connected to the religion and culture of the majority does not undermine the state’s democratic and liberal nature. In many of those states, however, there are powerful political forces that wish to limit democracy to the dominant ethnicity. The historical challenge in these multiethnic and volatile societies — and Israel also faces this challenge — is to prevent the self-expression of the majority from constraining or destroying the self-expression of the minority. 

    The fourth essential feature of a democratic nation-state, and the most important one, relates to public education. In the State of Israel, as a Jewish state, the public system of education is committed to the continuity and reproduction of Jewish cultures. I emphasize Jewish cultures in the plural, since Jews embrace very different conceptions of the nature of Jewish life and the meaning of Jewish education. In its commitment to Jewish cultures, the State of Israel is not different from many modern states whose public education transmits a unique cultural identity. In France, Descartes, Voltaire, and Rousseau are taught, and in Germany they teach Goethe, Schiller, and Heine. The history, the literature, the language, and sometimes the religion of different communities are preserved and reproduced by the system of public education, which includes students of many ethnic origins. Jews who happen to be German, American, or French citizens and wish to transmit their tradition to their children must resort to private means to provide them with a Jewish education. In Israel, as in other modern states (though not in the United States), such an education should be supported by state funds. This commitment does not contradict — rather, it requires — public funding for education that, alongside the public education system, insures the continuity of the other traditions represented in the population of the state, the Islamic and Christian cultures of the Arab minority in Israel. The culture of a minority has as much right to recognition by the state as the culture of the majority. 

    There are voices that maintain that the only way to secure Israel’s democratic nature is to eliminate its Jewish national character and turn it into a state of all its citizens, or a bi-national state. This sounds perfectly democratic, but it would defeat one of the central purposes of both national communities. In this territory there are two groups that possess a strong national consciousness — Jews and Palestinians; and there is no reason not to grant each of them the right of self-determination that they deserve. Moreover, a state of all its citizens in the area between the Jordan River and the Mediterranean Sea would, in fact, be an Arab nation-state with a large Jewish minority. It would become a place of exile for the Jewish minority. Historical experience in this region, where national rights and civil liberties are regularly trampled, suggests that Greater Palestine would be one of the harshest of all Jewish exiles.

    Honoring the status of the Arab citizens of Israel and espousing the establishment of a Palestinian state ought not to focus on — and does not require — the impossible and unjust annulment of the Jewish character of the State of Israel. It should focus instead on the effort to create full and rich equality for the Arab minority in Israel, and on the possibility of establishing a Palestinian nation-state alongside the state of Israel.

    In a Jewish state, the adjective “Jewish” carries within it another crucial challenge to liberal democracy, which is not tied to its national content but to its religious implications. This Jewish character, or the religious meaning of the adjective “Jewish,” might harm the freedom of religion in the state. Indeed, some currents in Israeli Judaism — and some religiously inspired ideological and political trends in the Jewish population of Israel — constitute a powerful and complex challenge to Israeli liberalism. Some voices assert that the Jewish identity of the state justifies granting the weight of civil law to Jewish law, and the use of the coercive machinery of the state for the religious purposes of the dominant community. 

    But a Jewish state conceived in this way could not be democratic in any recognizable manner, for two reasons: it would harm both the religious freedom of its citizens and the religious pluralism of the communities that constitute it. The attempt to “Judaize” the state through religious legislation, above and beyond the four features mentioned above, would undermine Israel’s commitment to liberalism and destroy some of its most fundamental founding principles. It would take back the pluralism that was explicitly and stirringly guaranteed in Israel’s Declaration of Independence. 

    Since the nineteenth century, Jews have been deeply divided about the meaning of Jewish identity and their loyalty to Jewish law. Jews celebrate the Sabbath in a variety of ways. They disagree ferociously about basic religious questions, including the nature of marriage and divorce. Any attempt to use the power of the state to adjudicate these deep divisions would do inestimable damage to freedom of religion and freedom from religion. In this case it would be the freedoms of Jews that would be violated. 

    The role of the state is not to compel a person to keep the Sabbath or to compel her to desecrate it. The state must, instead, guarantee that every person has the right to behave on the Sabbath as she sees fit, as long as she grants the same right to individuals and communities who live alongside her. All attempts at Judaizing the state through religious legislation — such as the law prohibiting the selling of bread in public during Passover, or the law prohibiting the raising of pigs — are deeply in error, since it is the obligation of a liberal democratic state to allow its citizens to decide these matters autonomously, as they see fit.

    The Sabbath, like other Jewish holidays, ought to be part of the official calendar of Israel as a Jewish state. A shared calendar, with Islamic and Christian holidays on it too, is an essential feature of the life of a state, and it enables a kind of division of cultural and spiritual labor, a pluralist form of cooperation among its citizens. If state institutions do not function during the Sabbath, it is not only because we would like religious citizens to be able to take equal part in the running of those institutions, but also because Israel ought to respect the Jewish calendar. The same applies as well to factories and businesses that must be shuttered during days of rest. 

    Such a policy, moreover, should be supported not for religious reasons, but owing to secular concerns about fairness. First, it allows equal opportunity to workers and owners who wish to observe the Sabbath. Historically, in the various Jewish exiles, the observance of the Sabbath sometimes caused Jews a great deal of economic hardship owing to the advantage that it conferred upon competitors who did not observe the same day of rest. In a Jewish state, Jews who observe the Sabbath ought to be free from such an economic sacrifice. The second reason for closing businesses and factories on the Sabbath concerns the rights of workers. The institution of the Sabbath is more widespread than most Jews know, and it is consistent with universal ethical considerations. Constraining the tyranny of the market over individual and family life by guaranteeing a weekly day of rest for workers and owners is common in European states which, in accordance with the Christian calendar, enforce the closing of businesses on Sunday. In a similar spirit, factories, malls, stores, and businesses ought to be closed during the Sabbath in a Jewish state — but art centers, theaters, museums, and restaurants should continue to function, so that Israeli Jews may choose their own way of enjoying the day of lovely respite.

    The abolition of the coercive power of the state in matters of religion should be applied as well to the primary domain of religious legislation in Israel: divorce and marriage. The monopoly granted to rabbinical courts in issues of divorce and marriage must finally be terminated. It is an outrageous violation of the democratic and liberal ethos of the state. Alongside religious marriage, Israel must recognize civil marriage. Such a reform would allow a couple that cannot marry according to Jewish law to exercise their basic right to form a family. It would also recognize the legitimate beliefs of many men and women who do not wish to submit to the rabbinical court, which is often patriarchal in its rulings and financially discriminates against women in divorce agreements.

     The claim of some religious representatives that establishing civil marriage would cause a rift among Jews, since members of the Orthodox Jewish community would not be able to marry Jews who did not divorce according to rabbinical procedure, is not persuasive. Many Jews all over the world marry and divorce outside the Orthodox community, and this is de facto the case in Israel as well, since many Israelis obtain civil marriages outside Israel, or live together without marrying under the jurisdiction of the rabbinate. The establishment of two tracks of marriage and divorce, religious and secular-civil, would not create division, which already exists in any case, but it would remove the legal wrong caused to Israelis who cannot practice their right to marry within Jewish law, and it would liberate those who aspire to gender equality from the grip of the rabbinical courts.

    I should confess that my analysis of the place of religion in Israel does not rest exclusively upon my liberal commitments. It is grounded also in my concern for the quality of Jewish life in Israel. Religious legislation has had a devastating impact on Jewish culture and creativity in Israel. The great temptation to settle the debate over modern Jewish identity through the coercive mechanism of the state justifiably alienates major segments of the Israeli public from Jewish tradition, which comes to be perceived by many Israelis as threatening their way of life. The deepening of alienation from the tradition, and its slow transformation into hostility, suggests that the more Jewish the laws of Israel become, the less Jewish the citizens of Israel become. 

    The Israeli parliament is not the place to decide the nature of the Sabbath, or which Jewish denomination is the authentic representation of Judaism, or who is a legitimate rabbi. Such controversies have corrupted the legislature, creating cynical political calculations in which religious matters have served as political payoffs to maintain government coalitions. The unavoidable debate on Jewish culture and religion must move from parliament to civil society. The nature of Jewish life in Israel must be determined by individuals and communities who will themselves decide how to lead their lives without interference from the state. For instance, there is no law in Israel prohibiting private transportation during the sacred day of Yom Kippur, yet the sanctity of the day is generally observed without any coercion. Wresting Judaism from the control of the politicians will unleash creative forces for Jewish renewal and allow for new ways of refreshing the tradition and extending its appeal. 

    Among the precious and time-honored institutions of Judaism which have been corrupted by the state is the rabbinate. The methods used for nominating and choosing the chief rabbis, and the rabbis of cities and neighborhoods, demonstrate that the rabbinate has turned into a vulgar patronage system, used by politicians to distribute jobs to their supporters. In many places, there is no affinity between the state-appointed rabbis and their residents. It is urgently in the interest of both Judaism and Israel that the state rabbinate be abolished.

    I do not support the total separation of religion and state as practiced in the United States. It seems to me that the model of some European countries is better suited to Israel. The establishment of synagogues and the nomination of rabbis ought to be at least partially supported by public funds, in the same way that museums, community centers, and other cultural activities are supported by the state. But this funding should be distributed in accordance with the communities’ needs and preferences, without allowing for a monopoly of any particular religious denomination over budgets and positions. Each community should choose its own rabbi according to its own religious orientation, as was the practice of Jewish communities for generations. And these same protections of freedom of religion must be granted to Muslim and Christian communities of Israel.

    Israel can and should be defined as a Jewish state, where the Jewish people exercises its incontrovertible right to self-determination; where every Jew, wherever he or she lives, has a homeland; where the public space, the language, and the calendar have a Jewish character; and where public education allows for the continuity and flourishing of Jewish cultures. These features do not at all undermine the democratic nature of the state, so long as Israel's cultural and religious minorities are also granted equal and official recognition and protection, including state funding of Muslim and Christian public education systems, and the recognition of Arabic as a second official language of the state and the Muslim and Christian calendar as state calendars. In this sense, there is nothing contradictory or paradoxical about the idea of a Jewish democratic state. 

    The pessimism is premature. These essential principles can be reconciled and realized. Yet there are significant limits to such an experiment that must be vigilantly respected. Any attempt to “Judaize” the state of Israel beyond those limits would transform it into an undemocratic nation-state, and compromise its liberal nature, and undo its founders’ magnificent vision, and damage the creative Jewish renewal that may emerge from the great debate about modern Jewish identity. The tough question is not whether a Jewish state can be both democratic and liberal, but rather what kind of Jewish state we wish to have.

    Dark Genies, Dark Horizons: The Riddle of Addiction

    In 2014, Anthony Bourdain’s CNN show, Parts Unknown, traveled to Massachusetts. He visited his old haunts from 1972, when he had spent a high school summer working in a Provincetown restaurant, the now-shuttered Flagship on the tip of Cape Cod. “This is where I started washing dishes … where I started having pretensions of culinary grandeur,” Bourdain said in a wistful voiceover. For the swarthy, rail-thin dishwasher-turned-cook, Provincetown was a “wonderland” bursting with sexual freedom, drugs, music, and “a joy that only came from an absolute certainty that you were invincible.” Forty years later, he was visiting the old Lobster Pot restaurant, cameras in tow, to share Portuguese kale soup with the man who still ran the place.

    Bourdain enjoyed a lot of drugs in the summer of 1972. He had already acquired a “taste for chemicals,” as he put it. The menu included marijuana, Quaaludes, cocaine, LSD, psilocybin mushrooms, Seconal, Tuinal, speed, and codeine. When he moved to the Lower East Side of New York to cook professionally in 1980, the young chef, then 24, bought his first bag of heroin on the corner of Bowery and Rivington. Seven years later he managed to quit the drug cold turkey, but he spent several more years chasing crack cocaine. “I should have died in my twenties,” Bourdain told a journalist for Biography.

    By the time of his visit to Provincetown in 2014, a wave of painkillers had already washed over parts of Massachusetts and a new tide of heroin was rolling in. Bourdain wanted to see it for himself and traveled northwest to Greenfield, a gutted mill town that was a hub of opioid addiction. In a barebones meeting room, he joined a weekly recovery support group. Everyone sat in a circle sharing war stories, and when Bourdain’s turn came he searched for words to describe his attraction to heroin. “It’s like something was missing in me,” he said, “whether it was a self-image situation, whether it was a character flaw. There was some dark genie inside me that I very much hesitate to call a disease that led me to dope.”

    A dark genie: I liked the metaphor. I am a physician, yet I, too, am hesitant to call addiction a disease. While I am not the only skeptic in my field, I am certainly outnumbered by doctors, addiction professionals, treatment advocates, and researchers who do consider addiction a disease. Some go an extra step, calling addiction a brain disease. In my view, that is a step too far, confining addiction to the biological realm when we know how sprawling a phenomenon it truly is. I was reminded of the shortcomings of medicalizing addiction soon after I arrived in Ironton, Ohio, where, as the only psychiatrist in town, I was asked whether I thought addiction was “really a disease.”

    In September 2018, I set out for Rust Belt Appalachia from Washington, D.C., where I am a scholar at a think tank and was, at the time, a part-time psychiatrist at a local methadone clinic. My plan was to spend a year as a doctor-within-borders in Ironton, Ohio, a town of almost eleven thousand people in an area hit hard by the opioid crisis. Ironton sits at the southernmost tip of the state, where the Ohio River forks to create a tri-state hub that includes Ashland, Kentucky and Huntington, West Virginia. Huntington drew national attention in August 2016, when twenty-eight people overdosed on opioids within four hours, two of them fatally.

    I landed in Ironton, the seat of Lawrence County, by luck. For some time I had hoped to work in a medically underserved area in Appalachia. Although I felt I had a grasp on urban opioid addiction from my many years of work in methadone clinics in Washington, D.C., I was less informed about the rural areas. So I asked a colleague with extensive Ohio connections to present my offer of clinical assistance to local leaders. The first taker was the director of the Ironton-Lawrence County Community Action Organization, or CAO, an agency whose roots extend to President Johnson’s War on Poverty. The CAO operated several health clinics.

    Ironton has a glorious past. Every grandparent in town remembers hearing first-person accounts of a period, stretching from before the Civil War to the turn of the century, when Ironton was one of the nation’s largest producers of pig iron. “For more than a century, the sun over Ironton warred for its place in the sky with ashy charcoal smoke,” according to the Ironton Tribune. “In its heyday in the mid-nineteenth century there were forty-five [iron] furnaces belching out heat, filth, and prosperity for Lawrence County.” After World War II, Ironton was a thriving producer of iron castings, molds used mainly by automakers. Other plants pumped out aluminum, chemicals, and fertilizer. The riverfront was a forest of smokestacks. High school graduates were assured good-paying if labor-intensive jobs, and most mothers stayed home with the kids. The middle class was vibrant.

    But then the economy began to realign. Two major Ironton employers, Allied Signal and Alpha Portland Cement, closed facilities in the late 1960s, beginning a wave of lay-offs and plant closings. The 1970s were a time of oil shocks emanating from turmoil in the Middle East. Inflation was high and Japanese and German car makers waged fierce competition with American manufacturers. As more Ironton companies downsized and then disappeared, the pool of living wage jobs contracted, and skilled workers moved out to seek work elsewhere. At the same time, the social fabric began to unravel. Domestic order broke down, welfare and disability rolls grew, substance use escalated. Most high school kids with a shot at a future pursued it elsewhere, and the place was left with a population dominated by older folks and younger addicts.

    Ironton continues to struggle. Drug use, now virtually normalized, is in its third, sometimes fourth, generation. Almost everyone is at least one degree of separation away from someone who has overdosed. Although precise rates of drug involvement are hard to come by, one quarter to one third is by far the most common answer I hear when I ask sources for their best estimate of people dealing with a “drug problem of any kind.” Alluding to the paucity of hope and opportunity, one of my patients told me that “you have to eradicate the want — why people want to use — or you will always have drug problems.”

    When Pam Monceaux, an employment coordinator in town, asked me whether I thought addiction was “really a disease,” she was thinking about her own daughter. Christal Monceaux grew up in New Orleans with her middle-class parents and a younger sister, and started using heroin and cocaine when she was nineteen. Pam blamed the boyfriend. “Brad sucked her in. Finally, she dumped him, went to rehab and did well, but a few months later took him back and the cycle began all over again.” Eventually Christal’s younger sister, who had moved to Nashville with her husband, persuaded her to leave New Orleans and join them. Pam, a serene woman who had over a decade’s time to put her daughter’s ordeal into perspective, said that relocating — or the “geographic cure,” as it is sometimes called — worked for Christal. A new setting and new friends allowed her to relinquish drugs. She got married, had children, and lived in a $400,000 house. The happy ending was cut short by Christal’s death at the age of forty-two of a heart attack. “If she could kick it for good when she was away from Brad and then when she moved to Nashville, how is that a disease?” Pam asked in her soft Louisiana drawl. “If I had breast cancer, I’d have it in New Orleans and in Nashville.”

    Unlike Christal, Ann Anderson’s daughter had not left drugs behind for good. So, at age 66, Ann and her husband were raising their granddaughter, Jenna. Ann, who worked for my landlord, was bubbly, energetic, and, curiously, sounded as if she were raised in the deep South. The welcome basket she put together for me when I arrived, full of dish towels, potholders, and candies, foretold the generosity that she would show me all year. Ann makes it to every one of Jenna’s basketball games. Jenna’s mom lives in Missouri and has been on and off heroin for years. “I love my daughter, but every time she relapsed, she made a decision to do it,” said Ann, matter-of-factly, but not without sympathy. “And each time she got clean she decided that too.”

    Another colleague, Lisa Wilhelm, formed her opinions about addiction based on her experience with patients. Lisa was a seen-it-all nurse with whom I had worked at the Family Medical Center located across Highway 52 from the Country Hearth, a drug den that passed itself off as a motel. She did not ask for my opinion about addiction; she told me hers. “I think it is a choice. And I’ll devote myself to anyone who made that choice and now wants to make better ones,” Lisa said. “But it’s not a disease, I don’t think.”

    Then there was Sharon Daniels, the director of Head Start. Sharon managed programs for drug-using mothers of newborns and toddlers. “I see opportunities our women have to make a different choice,” she said. She is not pushing a naive “just say no” agenda, nor is she looking for an excuse to purge addicted moms from the rolls. This trim grandmother with bright blue eyes and year-round Christmas lights in a corner of her office is wholly devoted to helping her clients and their babies. But she thinks that the term disease “ignores too much about the real world of addiction. If we call it a disease, then it takes away from their need to learn from it.”

    Before coming to Ironton, I had never been asked what I thought about addiction by the counselors at the methadone clinic at which I worked in Washington. I am not sure why. Perhaps abstractions are not relevant when you are busy helping patients make step-wise improvements. Maybe the staff already knew what I would say. On those rare occasions when a student or a non-medical colleague asked me, generally sotto voce, if addiction were really a disease, my response was this: “Well, what are my choices?” If the alternatives to the disease label were “criminal act,” “sin,” or “moral depravity,” then I had little choice but to say that addiction was a disease. So, if a crusty old sheriff looking to justify his punitive lock-’em-up ways asked me if addiction were a disease, I would say, “Why yes, sir, it is.”

    But Pam, Ann, Lisa, and Sharon had no concealed motives. They were genuinely interested in the question of addiction. And they were fed up with the false choice routinely thrust upon them in state-sponsored addiction workshops and trainings: either endorse addicts as sick people in need of care or as bad actors deserving of punishment. With such ground rules, no one can have a good faith conversation about addiction. Between the poles of diseased and depraved is an expansive middle ground of experience and wisdom that can help explain why millions use opioids to excess and why their problem can be so difficult to treat. The opioid epidemic’s dark gift may be that it compels us to become more perceptive about why there is an epidemic. The first step is understanding addiction.

    Most people know addiction when they see it. Those in its grip pursue drugs despite the damage done to their wellbeing and often to the lives of others. Users claim, with all sincerity, that they are unable to stop. This is true enough. Yet these accounts tell us little about what drives addiction, about its animating causal core — and the answer to those questions has been contested for over a century. In the mid-1980s the Harvard psychologist Howard J. Shaffer proclaimed that the field of addiction had been in a century-long state of “conceptual chaos.” And not much has changed. For behaviorists, addiction is a “disorder of choice” wherein users weigh benefits against risks and eventually quit when the ratio shifts toward the side of risk. For some philosophers, it is a “disorder of appetite.” Psychologists of a certain theoretical stripe regard it as a “developmental” problem reflecting failures of maturity, including poor self-control, an inability to delay gratification, and an absence of a stable sense of self. Sociologists emphasize the influence of peers, the draw of marginal groups and identification with them, and responses to poverty or alienation. Psychotherapists stress the user’s attempt at “self-medication” to allay the pain of traumatic memories, depression, rage, and so on. The American Society of Addiction Medicine calls addiction “a primary, chronic disease of brain reward, motivation, memory and related circuitry.” For the formerly addicted neuroscientist Marc Lewis, author of Memoirs of an Addicted Brain, addiction is a “disorder of learning,” a powerful habit governed by anticipation, focused attention, and behavior, “much like falling in love.”

    None of these explanations best captures addiction, but together they reinforce an important truth. Addiction is powered by multiple intersecting causes — biological, psychological, social, and cultural. Depending upon the individual, the influence of one or more of these dimensions may be more or less potent. Why, then, look for a single cause for a complicated problem, or prefer one cause above all the others? At every one of those levels, we can find causal elements that contribute to excessive and repeated drug use, as well as to strategies that can help bring the behavior under control. Yet today the “brain disease” model is the dominant interpretation of addiction.

    I happened to have been present at a key moment in the branding of addiction as a brain disease. The venue was the second annual “Constituent Conference” convened in the fall of 1995 by the National Institute on Drug Abuse, or NIDA, which is part of the National Institutes of Health. More than one hundred substance-abuse experts and federal grant recipients had gathered in Chantilly, Virginia, for updates and discussions on drug research and treatment. A big item on the agenda set by the NIDA’s director, Alan Leshner, was whether the assembled group thought the agency should declare drug addiction a disease of the brain. Most people in the room — all of whom, incidentally, relied heavily on NIDA funding for their professional survival — said yes. Two years later Leshner officially introduced the concept in the journal Science: “That addiction is tied to changes in brain structure and function is what makes it, fundamentally, a brain disease.”

    Since then, NIDA’s concept of addiction as a brain disease has penetrated the far reaches of the addiction universe. The model is a staple of medical school education and drug counselor training and even figures in the anti-drug lectures given to high-school students. Rehab patients learn that they have a chronic brain disease. Drug czars under Presidents Bill Clinton, George W. Bush, and Barack Obama have all endorsed the brain-disease framework at one time or another. Featured in a major documentary on HBO, on talk shows and Law and Order, and on the covers of Time and Newsweek, the brain-disease model has become dogma — and like all articles of faith, it is typically believed without question.

    Writing in the New England Journal of Medicine in 2016, a trio of NIH- and NIDA-funded scientists speculated that the “brain disease model continues to be questioned” because the science is still incomplete — or, as they put it, because “the aberrant, impulsive, and compulsive behaviors that are characteristic of addiction have not been clearly tied to neurobiology.” Alas, no. Unclear linkages between actions and neurobiology have nothing to do with it. Tightening those linkages will certainly be welcome scientific progress — but it will not make addiction a brain disease. After all, if explaining how addiction operates at the level of neurons and brain circuits is enough to make addiction a brain disease, then it is arguably many other things, too: a personality disease, a motivational disease, a social disease, and so on. The brain is bathed in culture and circumstance. And so I ask again: why promote one level of analysis above all of the others?

    Of course, those brain changes are real. How could they not be? Brain changes accompany any experience. The simple act of reading this sentence has already induced changes in your brain. Heroin, cocaine, alcohol, and other substances alter neural circuits, particularly those that mediate pleasure, motivation, memory, inhibition, and planning. But the crucial question regarding addiction is not whether brain changes take place. It is whether those brain changes obliterate the capacity to make decisions. The answer to that question is no. People who are addicted can respond to carrots and sticks, incentives and sanctions. They have the capacity to make different decisions when the stakes change. There is a great deal of evidence to substantiate faith in the agency of addicts. Acknowledging it is not tantamount to blaming the victim; it is, much more positively, a recognition of their potential.

    The brain-disease model diverts attention from these truths. It implies that neurobiology is necessarily the most important and useful level of analysis for understanding and treating addiction. Drugs “hijack” the reward system in the brain, and the patient is the hostage. According to the psychiatrist and neuroscientist Nora Volkow, who is currently the head of NIDA, “a person’s brain is no longer able to produce something needed for our functioning and that healthy people take for granted, free will.” Addiction disrupts the frontal cortex, which acts as “the brakes,” she told a radio audience, so that “even if I choose to stop, I am not going to be able to.” Volkow deploys Technicolor brain scans to bolster claims of hijacked and brakeless brains.

    Rhetorically, the scans make her point. Scientifically, they do not. Instead they generate a sense of “neuro-realism” — a term coined by Eric Racine, a bioethicist at the Montreal Clinical Research Institute, to describe the powerful intuition that brain-based information is somehow more genuine or valid than is non-brain-based information. In truth, however, there are limits to what we can infer from scans. They do not allow us, for example, to distinguish irresistible impulses from those that were not resisted, at least not at this stage of the technology. Indeed, if neurobiology is so fateful, how does any addict ever quit? Is it helpful to tell a struggling person that she has no hope of putting on the brakes? It may indeed seem hopeless to the person caught in the vortex of use, but then our job as clinicians is to make quitting and sustained recovery seem both desirable and achievable to them.

    We start doing this in small ways, by taking advantage of the fact that even the subjective experience of addiction is malleable. As Jon Elster points out in Strong Feelings: Emotions, Addiction, and Human Behavior, the craving for a drug can be triggered by the mere belief that it is available. An urge becomes overpowering when a person believes it is irrepressible. Accordingly, cognitive behavioral therapy is designed precisely to help people understand how to manipulate their environment and their beliefs to serve their interests. They may learn to insulate themselves from people, places, and circumstances associated with drug use; to identify emotional states associated with longing for drugs and to divert attention from the craving when it occurs. These are exercises in stabilization. Sometimes they are fortified with anti-addiction medications. Only when stabilized can patients embark on the ambitious journey of rebuilding themselves, their relationships, and their futures.

    I have criticized the brain disease model in practically every lecture I have given on this wrenching subject. I have been relentless, I admit. I tell fellow addiction professionals and trainees that medicalization encourages unwarranted optimism regarding pharmaceutical cures and oversells the need for professional help. I explain that we err in calling addiction a “chronic” condition when it typically remits in early adulthood. I emphasize to colleagues who spend their professional lives working with lab rats and caged monkeys that the brain-disease story gives short shrift to the reality that substances serve a purpose in the lives of humans. And I proselytize that the brain changes induced by alcohol and drugs, no matter how meticulously scientists have mapped their starry neurons and sweeping fibers, need not spell destiny for the user.

    Yet despite my strong aversion to characterizing addiction as a problem caused primarily by brain dysfunction, I genuinely appreciate the good ends that the proponents of the brain model have sought to reach. They hoped that “brain disease,” with its intimation of medical gravitas and neuroscientific determinism, would defuse accusations of flawed character or weak will. By moving addiction into the medical realm, they could move it out of the punitive realm. And if addicts are understood to suffer from a brain disease, their plight will more likely garner government and public sympathy than if they were seen as people simply behaving badly. But does it work that way? Research consistently shows that depictions of behavioral problems as biological, genetic, or “brain” problems actually elicit a greater desire for social distance from afflicted individuals and stoke pessimism about the effectiveness of treatment among the public and addicted individuals themselves.

    Evidence suggests that addicted individuals are less likely to recover if they believe that they suffer from a chronic disease, rather than from an unhealthy habit. More radically, there is a grounded argument to be made for feelings of shame, despite shame’s bad reputation in therapeutic circles. “Shame is highly motivating,” observes the philosopher Owen Flanagan, who once struggled mightily with alcohol and cocaine; “it expresses the verdict that one is living in a way that fails one’s own survey as well as that of the community upon whose judgment self-respect is legitimately based.” But under what conditions do feelings of shame end up prodding people into correcting their course, as opposed to making matters worse by fueling continued consumption to mute the pain of shameful feelings? The psychologists Colin Leach and Atilla Cidam uncovered a plausible answer. They conducted a massive review of studies on shame (not linked to addiction per se) and approaches to failure, and found that when people perceive that damage is manageable, and even reversible, shame can act as a spur to amend self-inflicted harm. They underscored what clinicians have long known: only when patients are helped to feel competent — “self-efficacious” is the technical term — can they begin to create new worlds for themselves.

    Thinking critically about the disease idea is important for conceptual clarity. But a clinician must be pragmatic, and if a patient wants to think of addiction as a disease I do not try to persuade them otherwise. Yet I do ask one thing of them: to be realistic about the kind of disease it is. Despite popular rhetoric, addiction is not a “disease like any other.” It differs in at least two important ways. First, individuals suffering from addiction can respond to foreseeable consequences, while individuals with conventional diseases cannot. Second, this “disease” is driven by a powerful emotional logic.

    In 1988, Michael Botticelli, who would go on to become President Obama’s second drug czar over two decades later, was charged with drunk driving on the Massachusetts Turnpike. A judge gave him the choice of going to jail or participating in a treatment program. Botticelli made a decision: he went to a church basement for help, joined Alcoholics Anonymous, and quit drinking. Yet on CBS’ 60 Minutes he contradicted his own story when he drew an analogy between having cancer and being addicted. “We don’t expect people with cancer to stop having cancer,” he said. But the analogy is flawed. No amount of reward or punishment, technically called “contingency,” can alter the course of cancer. Imagine threatening to impose a penalty on a brain cancer victim if her vision or speech continued to worsen, or offering a million dollars if she could stay well. It would have no impact and it would be cruel. Or consider Alzheimer’s, which is a true brain disease, true insofar as the pathology originates in derangements of brain structure and physiology. If one held a gun to the head of a person addicted to alcohol and threatened to shoot her if she consumed another drink, or offered her a million dollars if she desisted, she could comply with this demand — and the odds are high that she would. In contrast, threatening to shoot an Alzheimer’s victim if her memory further deteriorated (or promising a reward if it improved) would be pointless.

    The classic example of the power of contingency is the experience of American soldiers in Vietnam. In the early 1970s, military physicians in Vietnam estimated that between 10 percent and 25 percent of enlisted Army men were addicted to the high-grade heroin and opium of Southeast Asia. Deaths from overdosing soared. Spurred by fears that newly discharged veterans would ignite an outbreak of heroin use in American cities, President Richard Nixon commanded the military to begin drug testing. In June 1971, the White House announced that no soldier would be allowed to board a plane home unless he passed a urine test. Those who failed could go to an Army-sponsored detoxification program before they were re-tested.

    The plan worked. Most GIs stopped using narcotics as word of the new directive spread, and most of the minority who were initially prevented from going home produced clean samples when given a second chance. Only 12 percent of the soldiers who were dependent on opiate narcotics in Vietnam became re-addicted to heroin at some point in the three years after their return to the United States. Whereas heroin helped soldiers endure wartime’s alternating bouts of boredom and terror, most were safe once they were stateside. At home, they had different obligations and available rewards, such as their families, jobs, friends, sports, and hobbies. Many GIs had needed heroin to cool the hot anger they felt at being sent to fight for the losing side by commanders they did not respect. Once home, their rage subsided to some extent. Also, heroin use was no longer normalized as it was overseas. At home, heroin possession was a crime and the drug was harder and more dangerous to obtain. As civilian life took precedence, the allure of heroin faded.

    We know the value of “contingencies.” Hundreds of studies attest to the power of carrots and sticks in shaping the behavior of addicted individuals. Carl Hart, a neuroscientist at Columbia University, has shown that when people are given a good enough reason to refuse drugs, such as cash, they respond. He ran the following experiment: he recruited addicted individuals who had no particular interest in quitting, but who were willing to stay in a hospital research ward for two weeks for testing. Each day Hart offered them a sample dose of either crack cocaine or methamphetamine, depending upon the drug they used regularly. Later in the day, the subjects were given a choice between the same amount of drugs, a voucher for $5 of store merchandise, or $5 cash. They collected their reward upon discharge two weeks later. The majority of subjects chose the $5 voucher or cash when offered small doses of the drug, but they chose the drug when they were offered a higher dose. Then Hart increased the value of the reward to $20, and his subjects chose the money every time.

    One of my patients, I will call her Samantha, had been using OxyContin since 2011 when she was working in the kitchen at Little Caesar’s in downtown Ironton. The 20 mg pills belonged to her grandmother, whose breast cancer had spread to her spine. Samantha visited her grandma after work, watched TV with her, and went through the mail. She would also remove three or four pills per day from the massive bottle kept by the fancy hospital bed that Samantha’s brother moved into the living room. When Samantha’s grandmother died in 2016, so did the pill supply. “I just couldn’t bring myself to do heroin, and, anyway, I had no money for drugs,” Samantha said.

    When the pills were almost gone, Samantha drove to an old friend’s house, hoping that the friend would give her a few Oxy’s in exchange for walking Snappy, her arthritic chihuahua. “My friend wasn’t home, but her creepy boyfriend Dave answered the door and told me he’d give me some Oxy’s if I gave him a blow job.” Samantha was feeling the warning signs of withdrawal — jitteriness, crampy stomach, sweaty underarms. Desperate to avoid full-blown withdrawal, she gave a minute’s thought to the proposition. “Then I felt revolted and I said no way and drove straight here because I knew I could start buprenorphine the same day,” she said.

    What of Samantha’s “hijacked” brain? When she stood before Dave, her brain was on fire. Her neurons were screaming for oxycodone. Yet in the midst of this neurochemical storm, at peak obsession with drugs, Samantha’s revulsion broke through, leading her to apply the “brakes” and come to our program. None of this means that giving up drugs is easy. But it does mean that an “addicted brain” is capable of making a decision to quit and of acting on it.

    On Tuesday nights, I co-ran group therapy with a wise social worker named John Hurley. In one group session, spurred by a patient sharing that he decided to come to treatment after spending some time in jail, the patients went around the room reciting what brought them to the clinic. Without exception, they said that they felt pressured by forces inside or outside themselves.

    “I couldn’t stand myself.”

    “My wife was going to leave me.”

    “My kids were taken away.”

    “My boss is giving me one more chance.”

    “I can’t bear to keep letting my kids down.”

    “I got Hep C.”

    “I didn’t want to violate my probation.”

    Ultimatums like these were often the best thing that happened to our patients. For other addicts, the looming consequences proved so powerful that they were able to quit without any professional help at all.

    The psychologist Gene Heyman at Boston College found that most people addicted to illegal drugs stopped using by about age thirty. John F. Kelly’s team at Massachusetts General Hospital found that forty-six percent of people grappling with drugs and alcohol had resolved their drug problems on their own. Carlos Blanco and his colleagues at Columbia University used a major national database to examine trends in prescription drug problems. Almost all individuals who abused or were addicted to prescription opioids also, at some point in their lives, had a mental disorder, an alcohol or drug problem, or both. Yet roughly half of them were in remission five years later. Given low rates of drug treatment, it is safe to say that the majority of remissions took place without professional help.

    These findings may seem surprising to, of all people, medical professionals. Yet it is well known to medical sociologists that physicians tend to succumb to the “clinicians’ illusion,” a habit of generalizing from the sickest subset of patients to the overall population of people with a diagnosable condition. This caveat applies across the medical spectrum. Not all people with diabetes, for example, have brittle blood sugars — but they will represent a disproportionate share of the endocrinologist’s case load. A clinician might wrongly, if rationally, assume that most addicts behave like the recalcitrant ones who keep stumbling through the emergency room doors. Most do not. Granted, not everyone can stop an addiction on their own, but the very fact that it can be done underscores the reality of improvement powered by will alone: a pathway to recovery rarely available to those with conventional illness.

    The second major difference between addiction and garden- variety disease is that addiction is driven by powerful feelings. Ask an alcoholic why she drinks or an addict why he uses drugs and you might hear about the pacifying effect of whisky and heroin on daunting hardship, unremitting self-persecution, yawning emptiness, or harrowing memories. Ask a patient with Parkinson’s disease, a classic brain disease, why he developed the neurological disorder and you will get a blank stare. Parkinson’s is a condition that strikes, unbidden, at the central nervous system; the patient does not consciously collude in bringing it about. Excessive use of a drug, by contrast, serves some kind of need, an inner pain to be soothed, a rage to be suppressed. It is a response to some sort of suffering.

    Memoirs offer portals into the drama of addiction. One of my favorites is Straight Life, by the master alto saxophonist Art Pepper. Self-taught on the instrument by the age of thirteen, Pepper endured a childhood of psychological brutality at the hands of a sadistic alcoholic father, an icicle of a grandmother, and an alcoholic mother who was fourteen years old when he was born and who did not hide her numerous attempts to abort him. “To no avail,” he writes. “I was born. She lost.” What preoccupied him as a child was “wanting to be loved and trying to figure out why other people were loved and I wasn’t.” Pepper’s self-loathing bubbled like acid in his veins. “I’d talk to myself and say how rotten I was,” he wrote. “Why do people hate you? Why are you alone?” At 23, after years of alcohol and pot, he sniffed his first line of heroin through a rolled-up dollar bill and the dark genie dissolved. He saw himself in the mirror. “I looked like an angel,” he marveled. “It was like looking into a whole universe of joy and happiness and contentment.”

    From that moment on, Pepper said, he would “trade misery for total happiness… I would be a junkie…I will die a junkie.” Indeed, he became a “lifelong dope addict of truly Satanic fuck-it-all grandeur,” in the words of his passionate admirer, the critic and scholar Terry Castle. He was in and out of prison on possession charges. Pepper lived without heroin for a number of years after attending Synanon, a drug-rehabilitation center in California, from 1969 to 1972, and was treated with methadone for a period in the mid-1970s. Eventually, though, he returned to drugs, mainly massive amounts of amphetamine, and died from a stroke in 1982. He was 56.

    Addicts can appear to have everything: a good education, job prospects, people who love them, a nice home. They can be people who “are believed to have known no poverty except that of their own life-force,” to borrow the words of Joan Didion, and yet suffer greatly. The malaise is internal. Or they can be in dire circumstances, immiserated by their lives, moving through a dense miasma. “There was nothing for me here,” said one patient whose child was killed in a car accident, whose husband cheated on her, and who was trapped in her job as a maid in a rundown motel with an abusive boss. OxyContin made her “not care.” She reminded me of Lou Reed’s song “Heroin”:

    Wow, that heroin is in my blood
    And the blood is in my head
    Yeah, thank God that I’m good as dead
    Oooh, thank your God that I’m not aware
    And thank God that I just don’t care

    Pharmacologists have long classified opioid drugs as euphoriants, inducers of pleasure, described often as a feeling of a melting maternal embrace, but they could just as easily be called obliviants. According to the late Harvard psychiatrist Norman Zinberg, oblivion seekers yearned “to escape from lives that seem unbearable and hopeless.” Thomas De Quincey, in Confessions of an English Opium Eater, which appeared in 1821, praised opium for keeping him “aloof from the uproar of life.” Many centuries before him, Homer had likely referred to it in the Odyssey when he wrote that “no one who drank it deeply…could let a tear roll down his cheeks that day, not even if his mother should die, his father die, not even if right before his eyes some enemy brought down a brother or darling son with a sharp bronze blade.” When the Hollywood screenwriter Jerry Stahl surveyed his life in 1995 in his memoir Permanent Midnight, he concluded that “everything, bad or good, boils back to the decade on the needle, and the years before that imbibing everything from cocaine to Romilar, pot to percs, LSD to liquid meth and a pharmacy in between: a lifetime spent altering the single niggling fact that to be alive means being conscious.” Drugs helped him to attain “the soothing hiss of oblivion.”

    According to ancient myth, Morpheus, the god of dreams, slept in a cave strewn with poppy seeds. Through the cave flowed the river Lethe, known as the river of forgetfulness, also called the river of oblivion. The dead imbibed those waters to forget their mortal days. Unencumbered by memory, they floated free from the aching sadness and discomforts of life. The mythological dead share a kinship with opioid addicts, oblivion-seekers, and all their reality-manipulating cousins. The difference, mercifully, is that actual people can “un-drink” the numbing waters. Aletheia, truth, is a negation of lethe, the Greek word for forgetting. Recovery from addiction is a kind of unforgetting, an attempt to live in greater awareness and purpose, a disavowal of oblivion.

    Addiction is a cruel paradox. What starts out making life more tolerable can eventually make it ruinous. “A man may take to drink because he feels himself a failure,” said Orwell, “but then fail all the more completely because he drinks.” The balm is a poison. Drugs that ease the pain also end up prolonging it, bringing new excruciations — guilt and grief over damage to one’s self, one’s family, one’s future — and thus fresh reason to continue. The cycle of use keeps turning. Ambivalence is thus a hallmark of late-stage addiction. The philosopher Harry Frankfurt speaks of the “unwilling addict” who finds himself “hating” his addiction and “struggling desperately…against its thrust.” This desperate struggle is what Samuel Taylor Coleridge, himself an opium addict, called “a species of madness” in which the user is torn between his current, anguished self who seeks instant solace and a future self who longs for emancipation from drugs. This explains why the odds of treatment dropout are high — over half after six months, on average. The syringe of Damocles, as Jerry Stahl described the vulnerability to relapse, dangles always above their heads. Many addicts do not even take advantage of treatment when it is offered, reluctant to give up their short-term salvation. They fear facing life “unmedicated” or cannot seem to find a reason for doing so. My friend Zach Rhoads, now a teacher in Burlington, Vermont, used heroin for five years beginning in his early twenties and struggled fiercely to quit. “I had to convince myself that such effort was worth the trouble,” he said.

    Thomas De Quincey consumed prodigious amounts of opium dissolved in alcohol and pronounced the drug a “panacea for all human woes.” For Anthony Bourdain, heroin and cocaine were panaceas, defenses against the dark genie that eventually rose up and strangled him to death in 2018. But not all addicts have a dark genie lurking inside them. Some seek a panacea for problems that crush them from the outside, tribulations of financial woes and family strain, crises of faith and purpose. In the modern opioid ordeal, these are Americans “dying of a broken heart,” in Bill Clinton’s fine words. “They’re the people that were raised to believe the American Dream would be theirs if they worked hard and their children will have a chance to do better — and their dreams were dashed disproportionally to the population as a whole.” He was gesturing toward whites between the ages of 45 and 54 who lack college degrees — a cohort whose life expectancy at birth had been falling since 1999. They succumbed to “deaths of despair,” a term coined by the economists Anne Case and Angus Deaton in 2015, brought on by suicide, alcoholism (specifically, liver disease), and drug overdoses. Overdoses account for the lion’s share. Falling wages and the loss of good jobs have “devastated the white working class,” the economists write, and “weakened the basic institutions of working-class life, including marriage, churchgoing, and community.”

    Looking far into the future, what so many of these low-income, undereducated whites see are dark horizons. When communal conditions are dire and drugs are easy to get, epidemics can blossom. I call this dark horizon addiction. Just as dark genie addiction is a symptom of an embattled soul, dark horizon addiction reflects communities or other concentrations of people whose prospects are dim and whose members feel doomed. In Ironton, clouds started to gather on the horizon in the late 1960s. Cracks appeared in the town’s economic foundation, setting off its slow but steady collapse.

    Epidemics of dark horizon addiction have appeared under all earthly skies at one time or another. The London gin “craze” of the first half of the eighteenth century, for example, was linked to poverty, social unrest, and overcrowding. According to the historian Jessica Warner, the average adult in 1700 drank slightly more than a third of a gallon of cheap spirits over the course of a year; by 1729 it was slightly more than 1.3 gallons per capita, and it hit 2.2 gallons in 1743. A century later, consumption had declined, yet gin was still “a great vice in England,” according to Charles Dickens. “Until you improve the homes of the poor, or persuade a half-famished wretch not to seek relief in the temporary oblivion of his own misery,” he wrote in the 1830s, “gin-shops will increase in number and splendor.”

    During and after the American Civil War, thousands of men needed morphine and opium to bear the agony of physical wounds. In his Medical Essays, the physician Oliver Wendell Holmes, Sr., a harsh critic of medication, excepted opium as the one medicine “which the Creator himself seems to prescribe.” The applications of opium extended to medicating grief. “Anguished and hopeless wives and mothers, made so by the slaughter of those who were dearest to them, have found, many of them, temporary relief from their sufferings in opium,” Horace B. Day, an opium addict himself, recorded in The Opium Habit in 1868. In the South, the spiritual dislocation was especially profound, no doubt explaining, to a significant degree, why whites in the postbellum South had higher rates of opiate addiction than did those in the North — and also, notably, one reason why southern blacks had a lower rate of opiate addiction, according to the historian David T. Courtwright. “Confederate defeat was for most of them an occasion of rejoicing rather than profound depression.”

    A similar dynamic was seen when Russia’s long-standing problem with vodka exploded during the political instability and economic uncertainty of the post-Communist era. The majority of men drank up to five bottles a week in the early 1990s. Back home, heroin was a symptom of ghetto life for millions of impoverished and hopeless Hispanics and blacks in the 1960s and 1970s, followed by crack among blacks in the mid-1980s. The rapid decline of manufacturing jobs for inner-city men, writes the historian David Farber in his recent book Crack, “helps explain the large market of poor people, disproportionately African Americans, who would find crack a balm for their troubled, insecure, and often desperate lives.”

    Children raised by dark horizon parents often bear a double burden. Not only do they suffer from growing up with defeated people in defeated places where opportunities are stunted and boredom is crushing, but they are often also casualties of their parents’ and their grandparents’ addictions. One of my patients, Jennifer, described herself as a “third generation junky.” Patches of acne clung to her cheeks, making her look younger than thirty. Her maternal grandmother managed well enough with an ornery husband who drank too much on weekends until he lost his job at a local casting plant in the 1970s and became a full-fledged alcoholic, bitter, aimless, and abusive to his wife. The grandmother worked cleaning motel rooms and began staying out late, using pills and weed. Jennifer’s mother, Ann, was the youngest in a household that had devolved into havoc.

    When Ann was sixteen, Jennifer was born. Not one reliable adult was around. “No one really cared if I went to school,” Jennifer recalls. No one urged her to succeed or expressed confidence in her. “I learned that when something bothered you, you got high.” Her mother was aloof, Jennifer said, except for the stretch when they were both in jail at the same time: Jennifer was nineteen, her mother forty-two. “My mother was assigned to be the chaperone for my group of inmates,” Jennifer recalled. “She did my laundry and saved me extra food in jail. It was the only time she acted like a mom towards me.” Children raised in such homes are greatly disadvantaged. The absence of a steady protector in their lives often derails their developing capacity for tolerating frustration and disappointment, controlling impulses, and delaying gratification. They have difficulty trusting others and forming rewarding connections, and they often see themselves as damaged and worthless. When the adults around them do not want to work regularly, children cannot imbibe the habits of routine, reliability, and dependability. At worst, the cycle repeats itself, inflicting wounds across generations and communities as their collective disenchantment with the future mounts. Sociologists call this “downward social drift.”

    The germ theory of addiction: that is my term for one of the popular if misbegotten narratives of how the opioid crisis started. It holds that the epidemic has been driven almost entirely by supply — a surfeit not of bacteria or viruses, but of pills. “Ask your doctor how prescription pills can lead to heroin abuse,” blared massive billboards from the Partnership for a Drug-Free New Jersey that I saw a few years ago. Around that time, senators proposed a bill that would have limited physician prescribing. “Opioid addiction and abuse is commonly happening to those being treated for acute pain, such as a broken bone or wisdom tooth extraction,” is how they justified the legislation.

    Not so. The majority of prescription pill casualties were never patients in pain who had been prescribed medication by their physicians. Instead, they were mostly individuals who were already involved with drugs or alcohol. Yes, some actual patients did develop pill problems, but generally they had a history of drug or alcohol abuse or were suffering from concurrent psychiatric problems or emotional distress. It is also true, of course, that drug marketers were too aggressive at times and that too many physicians overprescribed, sometimes out of inexperience, other times out of convenience, and in some cases out of greed.

    As extra pills began accumulating in rivulets, merging with pills obtained from pharmacy robberies, doctor shopping, and prescription forgeries, a river of analgesia ran through various communities. But even with an ample supply, you cannot “catch” addiction. There must be demand — not for addiction, per se, but for its vehicle. My year in Ironton showed me that the deep story of drug epidemics goes well beyond public health and medicine. Those disciplines, while essential to management, will not help us to understand why particular people and places succumb. It is the life stories of individuals and, in the case of epidemics, the life stories of places, that reveal the origins. Addiction is a variety of human experience, and it must be studied with all the many methods and approaches with which we study human experience.

    Dark genies can be exorcised and dark horizons can be brightened. It is arduous work, but unless we recognize all the reasons for its difficulty, unless we reckon with the ambiguity and the elusiveness and the multiplicity of addiction’s causes, unless we come to understand why addicts go to such lengths to continue maiming themselves with drugs — compelled by dark genies, dark horizons, or both — their odds of lasting recovery are slim, as are the odds of preventing and reversing drug crises. The complexity of addiction is nothing other than the complexity of life.