The Exclamation Point

    For Tom at seventy in Zion

    Sergio Sierra was born in Rome in the winter of 1923. When he was twenty-six years old he received rabbinical ordination, after which he assumed a rabbinical post in Bologna, where he assisted in the reconstruction of the shattered Jewish community. The embers of history’s wildfires had not yet cooled. The great synagogue in Bologna, built by a well-known local architect in a sensitively adapted Art Nouveau style, had been destroyed in a bombing raid in 1943, and it was not until a decade later, and under Sierra’s supervision, that its restoration would be complete. Sierra served in Bologna until 1959, when he left to take up a prominent pulpit in Turin, and to direct its rabbinical college. His talents were not only clerical. The community rabbi also published erudite papers in scholarly journals on modern and medieval themes in Jewish literature. He produced a translation into Italian of Rashi’s commentary on Exodus, and a translation of Bahya ibn Paquda’s eleventh-century masterpiece Hovot Ha’Levavot, or The Duties of the Heart, a monument of Jewish reason and piety, and a translation of Keter Malkhut, or The Kingly Crown, an epic philosophical prayer in rhymed verse by the eleventh-century poet Solomon ibn Gabirol, who has been beloved by readers of Hebrew for a millennium. He also produced a critical edition of one of the most curious works of medieval Jewish literature, the Hebrew translation of Boethius’ Consolation of Philosophy by an early fifteenth-century Jew in Perpignan named Azariah ben Yosef ibn Abba Mari, also known as Bonafus Bonfil Astruc, who fled to Italy from persecution in southern France and was one of the very last figures of the golden age of Provençal Judaism. Sierra was yet another of the many rabbinical figures in the annals of Judaism who managed to combine a pastoral calling with an intellectual one, leadership with learning. He served in Turin until 1985, and in 1992 he moved to Jerusalem, where he died in 2009.

    In Rome, immediately after the war, when he was twenty-two, Sergio Sierra bought a book, a small book, about a hundred pages long, and small in format too. It was a translation into Hebrew of Theodor Herzl’s The Jewish State. It had been published a year earlier in Tel Aviv under the auspices of the Department of Youth Affairs of the Zionist Federation, and the fine translation was by Asher Barash, a distinguished Hebrew writer and editor who came to Palestine from Galicia in 1914 and became a founding father of Israeli literature. (He was also the author of perhaps the first Hebrew work on literary theory, which is a terrible responsibility to bear.) When the young man acquired the volume, he proudly stamped his name and address all over it, including on the dust jacket, right above the canonical image of Herzl with his weirdly Assyrian beard. I know all this because I discovered the book in the basement of an antiquarian bookshop in Jerusalem. No doubt Sierra’s heirs had rid themselves of his library. We latecomers fill our shelves from the philistinism of the sons and the daughters.

    But I did not acquire this book — I did not seize on it — for bibliophilic reasons. Herzl in Hebrew is not hard to find. And at the time I had no idea who Sergio Sierra was. What happened was that I opened the book and was shaken to my core. On the front endpaper Sierra had inscribed his name in Italian and in Hebrew, Sierra Yosef Sergio, in a fine hand, and next to his signatures he recorded the date of his purchase. “1945,” he wrote — but not just that; he also underlined “1945” — but not just that; next to “1945” he also added an exclamation point. “1945!”

    The exclamation point undid me. The entirety of a man’s spirit and the entirety of a people’s spirit was in it. It denoted astonishment: we are still here. It denoted ferocity: we really do intend to exist. It denoted resolution: we are still in our fight for the mastery of our fate. It denoted vitality: even now we are strong. It denoted politics: a state will be ours. And it denoted incredulity: could it really be that on the morrow of the greatest catastrophe in the history of these intimates of catastrophe there appeared this book in this language from that city and that land, and it made its winding way to the trembling hands of this ashen Jew? The young man was a believer, and the exclamation point was, for him, a punctuation of providence. I see no metaphysics in it myself, but I do not have the effrontery to deny the intimation of the miraculous that it granted in the rubble.

    In all the years that I have been its custodian, I regularly turn to the little volume to behold the exclamation point. The sight of it fortifies me and refreshes my purposes. Emphasis is one of the central activities of identity. We are known by our emphases. The emphatic gesture of Sergio Sierra in Rome in 1945 has an elevating effect. And also an emboldening one, because we have reached a troubling point in time, less than a hundred years after he penned his inscription, in which the exclamation point must be defended. Was its truth obvious in 1945? It is obvious no more. The despisers of the principles for which it stands are growing in number, not least among Jews. Where there once was an exclamation point, there is now a question mark.

    The story that I have just told would be dismissed by Omri Boehm as “Zionist Holocaust messianism.” Boehm is the most recent in a long line of critics of Zionism who have attributed its success to the Holocaust, when in fact the structures of the Israeli state were built before the Shoah, and the only “upside” of the catastrophe for Israel was to buy it a few guilty decades of sympathy in the world. Boehm’s formulation is incoherent: there are indeed messianists among those who call themselves Zionists, but Zionism represented a repudiation of Jewish messianism, which it disavowed in favor of a new conception of Jewish agency in history. And there is no such thing as “Holocaust messianism,” which is a vaguely obscene phrase. In Boehm’s view, the utility of the Holocaust for the Jews — we are again skirting obscenity: imagine a serious discussion of the utility of slavery for black people — is that “the Holocaust remains opaque to reason and stands outside of normal politics.” He explains: “Emerging from this ahistorical transcendent mystery, Israel remains beyond universalist politics and moral critique.”

    To establish this claim, he devotes many pages in his book Haifa Republic: A Democratic Future for Israel, a rich document of the new thinking about the Israeli-Palestinian conflict, to Elie Wiesel’s theological response to the Holocaust, which he associates with the writer’s lifelong disinclination to criticize Israeli policies. This is nonsense. No, not the bit about Wiesel’s apologetic attitude toward the Jewish state, whose existence he (and survivors like him, and the children of those survivors) could never quite treat as a natural fact of world history. It is true that the large-souled Wiesel disliked controversy and might have played a more helpful role in those moments in Israel’s history when its conduct was worthy of “moral critique.” There were times when I beseeched him, futilely, to do so, to teach, to castigate, to clarify, because many Jews were confused and angered by certain Israeli actions; and once I came to him with a bundle of Biblical and rabbinical texts — he knew them all, of course — that taught the obligation to criticize one’s own. About these matters we agreed to disagree, because we detected in each other the same over-arching love.

    But Boehm is hiding behind Wiesel so as to propagate a caricature of Zionism and its ethos. The response to the Holocaust in the Jewish world was not primarily theological, even if the catastrophe did crush many frameworks of explanation. (For whom is suffering of such magnitude not a mystery?) There was certainly nothing “ahistorical” or “transcendent” about the nationalism that roused the Jewish people and established a state. Zionism and the state that it created were among the greatest triumphs of secularization in the modern era, a resounding historical and philosophical rupture, a genuine discontinuity, even if the social authority of the Chief Rabbinate of Israel is a hideous anachronism that should be abolished right after tomorrow morning’s prayers. Zionism, and Israel, is assuredly not “opaque to reason,” though a sickening amount of unreason now flourishes within it. Jewish nationalism proceeded not only through settlement or force; from its very inception it grew in reasoning, in argumentation, in persuasion, addressed to both Jews and non-Jews. Nor does it stand “outside of normal politics.” It is a cauldron of normal politics, whatever one’s view of the current brew. Its founding documents are an application of “universalist” values to a particular people — a tense enterprise, but an admirable one. The Jewish state is not an occult entity and it is not immune from criticism. No state and no movement and no person is exempt from the duty of self-justification, especially about the treatment of others. Like generations of Israel’s critics before him, Boehm complains that criticism is forbidden. Meanwhile whole careers are made out of it.

    I have never met a Zionist who would not have preferred to persevere in the nationalist cause for another hundred years and Auschwitz not happen. Indeed, the tragic element in the relationship between the Holocaust and Israel is that the chronological order of extermination and statehood was not the reverse of what it was. If Zionism had accomplished statehood — which was not its common goal until 1942, owing to the excruciating events in Europe — a decade earlier, millions of Jews — millions of human beings — might have been saved. Can we all agree to react unambivalently to this fantasy, to share this regret? I wonder. We are witnessing instead an outpouring of lofty regret that Israel was created at all. This nasty ruefulness is owed to certain views of the Palestinian problem, not all of which are entirely incorrect. Fifty-three years of occupation, with no end in sight, is a miserable reality, violations of rights and laws are commonplace, heartlessness abounds, and the frustration is overwhelming. A feeling of despair about the plausibility of a two-state solution, an Israeli state and a Palestinian state, is everywhere, and it suits the cynical and irresponsibly short-term interests of both the Israeli government and the Palestinian Authority. There are no heroes in the leaderships of this conflict.

    Now the feeling has been turned into an idea, known as the “one-state solution.” In such a state, as Boehm describes it, Jews and Arabs in the territory between the Jordan River and the Mediterranean Sea would each enjoy “a sub-sovereign political autonomy with a constitutional federative structure.” He has written his book to call for the end of Jewish sovereignty and the invention of a Switzerland in the Levant. The dissolution of a state — an actually existing state, not an ideological or political hypothesis — is a high price to pay for a release from exasperation. If this were medicine, we would call it quackery and demand an investigation. “At some point,” Boehm declares, “one must admit that the two-state dream has faded into a two-state illusion.” Nowhere does he prove that this is really so. He shows only that the two-state solution will be difficult to enact, and that the political will to enact it is now lacking. Is this not true also of the one-state solution, which upon inspection may be no solution at all? But I am getting ahead of myself.

    The pertinence of the Holocaust to an understanding of Israel is not as a demagogic shield against disagreement, or as a sanctuary for yet another cult of victimization. Nothing large or lasting was ever built on self-pity. In this context the Holocaust stands for a particular lesson about Jewish history, and Zionism is nothing if not a conclusion drawn from Jewish history, or more precisely from the experience of the Jews in the exile. I do not mean to say that the exile was one long death camp. The exile was not completely (in the famous word of a great historian who hated the notion) lachrymose, not at all. But it was lachrymose enough. The frequency of Jewish wretchedness in the exile, of discrimination and oppression and violence, urgently broaches the question of Jewish security and insecurity. Jews in the exile were sometimes happy and sometimes unhappy, and they attained to extraordinary heights of thought and literature and spirituality, but they were generally unsafe. There were no havens. They lived in perennial vulnerability and permanent subordination.

    For some writers and scholars, from George Steiner to Daniel Boyarin, the powerlessness of the Jews was morally glamorous, and their “subaltern status” was the condition of their cultural refinement; and it is certainly true that if the Jews over many centuries did not commit certain crimes, it was partly because they lacked the power to do so. But powerlessness does not confer purity. It confers pain and death. There is no virtue in vulnerability. The Holocaust was the worst that happened to the Jews in exile, but — contrary to Jewish theologians and historians who numinously insist upon its “uniqueness” — it differed from earlier persecutions in degree but not in kind. The Nazis innovated new methods for old evils. The helplessness of the Jews in Europe in the 1930s and 1940s is unbearable to contemplate, but so is their helplessness in Ukraine in 1648, and in the Iberian peninsula between 1391 and 1497, and in Germany in 1298, and in the Rhineland in 1096, and in many other times and places that are too numerous to list here. They are plentifully documented in the annals of Jewish tears. There are many differences between these specific events, which it is the duty of historians to mark, but they should not obscure a certain bleak political commonality.

    Sooner or later Jews were going to see that for their own good they needed to acquire politics and power. But nowhere in Boehm’s book, or in any other espousal of the “one-state solution” that I have seen, is there a shred of interest in the question of Jewish security. In one passage Boehm writes sneeringly that “Israel is designed to protect Jewish ethnicity, Jewish blood,” as if the Zionist insistence upon self-defense is racist. (This slander reminds me of the awful charge that is sometimes made against the motto that “black lives matter.”) No, brother; not Jewish blood, Jewish bodies. They need to be protected, not only on the grounds of their human rights, which even people with armies possess, but also if they are to reconcile with their neighbors. Anyway, the power of the State of Israel was not developed for purposes of conquest and expansion, even if that power, like all military power, has sometimes been abused.

    The military strength of the Jewish state is an ethically and empirically warranted dispensation. Self-defense is a corollary of self-emancipation, or “auto-emancipation,” which was the founding axiom of Jewish nationalism. Is it really necessary to be reminded that Israel has vicious and lethal enemies, and that a large portion of the enmity that it has encountered has been provoked not by its actions but by its existence? There are many ways in which the Holocaust has figured too prominently in contemporary Jewish identity, but it is cheap of Boehm to describe Zionism as “a sort of Angst-based mythical Holocaust messianism.” There is nothing mythical about the suffering, and the Angst is real. Jewish anguish is not the only anguish that counts, of course, but it must not be coldly discounted, especially by people who pride themselves on the exquisiteness of their empathy.

    The internecine Jewish debate about the Israeli-Palestinian conflict is sometimes portrayed as a quarrel between those who care about Israeli security, the hawks, and those who care about Israeli morality, the doves; but in truth security is itself a moral duty. I wish the rulers of Gaza would grasp this. Safety, too, is a right. Ensuring it is one of the primary duties of government. (As long as thousands of rockets are launched into Israel, the Iron Dome system, and the consequently low casualty statistics in Israel, are not an unfair advantage; they are the evidence of a state’s seriousness about protecting its population.) For this reason, there is also nothing ethically scandalous about the insistence that there must be a Jewish majority in Israel. This is not an undemocratic majoritarianism, unless one holds that all majoritarianism is undemocratic. After all, there will be a majority in a one-state entity, too — a Palestinian majority, which does not seem to trouble the proponents of the exciting new idea. For some reason they are confident that a Palestinian majority will fulfill the ideal of equality more scrupulously than a Jewish majority has done. They have not yet disclosed the historical basis for their optimism.

    Until the political borders and the ethnic borders of a polity — state, province, canton, district, whatever — coincide and perfect social homogeneity is achieved within a single political framework, which will never happen, and for the sake of the moral and cultural development of the citizenry should never happen, there will be majorities and minorities, and the supreme responsibility of a diverse polity will always be to regard minorities democratically, equally, in full and active recognition of their rights. A multiethnic state, which is what all states are, whether they know it or not, cannot escape this duty, which will require it, in the name of social peace and common decency, to control the maximalist and authenticist and exclusivist incitements of each of the groups within its borders. This is the case with the State of Israel, and it will be the case, inshallah, with the State of Palestine, and it would be the case even with the State of Isratine, which was the name that Muammar Qaddafi gave to his proposal for a binational federated one-state solution for Israelis and Palestinians. (#StrangeBedfellows.) What will determine the justice of any of these compound political entities will be their determination to deploy the principles and the practices of democracy to resist the temptation of ethnic tyranny. There is nothing anti-democratic about the nation-state. It all depends on the character of its governance.

    The concern about Jewish numbers is nothing more sinister than a concern about Jewish security. The nightmare is simple: it is that one day Jews will need to flee to safety and a majority non-Jewish government will deny them entry. There is nothing paranoid, or even fanciful, about such a scenario. It is not “ethno-nationalism,” it is prudence. One of the more outrageous aspects of the new proposal to abolish the Jewish state is that it is being advanced at precisely the moment when anti-Semitism is dramatically on the rise. The old crisis that Zionism was conceived to address, the primal emergency, is back. The higher rates of Jewish emigration to Israel in recent years have been a classical flight for safety, and the highest numbers of Jewish emigrants over the last decade have come from countries whose Jewish communities have been shaken by anti-Jewish violence. Jews and Jewish institutions are being attacked on the streets of many cities, often in the name of Palestine. When Prime Minister Netanyahu — how nice it is to type those words retrospectively! — told the Jews of France to “come home,” he was speaking uncontroversially from the standpoint of Jewish history. (Uncontroversially, that is, for those who have given up on the pluralistic prospects of France. The other French Jews, the majority of them, the ones who have chosen to remain in the fight for the Enlightenment traditions of the country, are certainly in a good fight.)

    Why aren’t Ilan Halimi and Sarah Halimi household names? Jews are still in need of refuge and Israel is still a refuge for Jews. But anti-Semitism does not particularly interest progressives, because it interferes with too many of their dogmas. Never mind that what happened at the Tree of Life synagogue in Pittsburgh was exactly what happened at Mother Emanuel church in Charleston. The parallel is ideologically inconvenient. Jews, you see, are white (even when they are brown or black), strong, and privileged oppressors, and they are most perfectly exemplified by Israeli soldiers who shoot at Palestinian children. And so the Jews have been stricken from the canonical roster of threatened groups and scorned identities. It is now considered inclusive to commemorate the painful past of every group except ours. We have the unique honor of being disqualified from intersectionality. Who needs Israel, anyway?

    For Boehm, who is an Israeli philosopher in New York, Jewish majoritarianism is another example of the “bad faith” of liberals, their way of concealing their own acquiescence in ethnic hegemony. In his account, there are only two parties to the debate now: those who are for his “federal binational republic” and those who, whether we know it or not, are objectively for ethnic tyranny and ethnic cleansing because we support the existence of a Jewish state. He detests us, the hawkish doves, the democratic statists, the liberals. His hatred of Israeli liberals is tiresome in the old radical way, according to which liberals are progressives who dare not speak their name or reactionaries who dare not speak their name, but they must in either case be destroyed. Of course liberalism is no more a sign of cowardice than progressivism is a sign of courage. (These days the percentage is in progressivism.) And modern history provides abundant evidence of the calamitous consequences of the radical contempt for liberalism: it has regularly assisted in bringing the worst to power.

    In support of his exercise in simplification, Boehm reverently cites that most dubious authority on Jewish and Zionist subjects, Hannah Arendt, who in 1944, after the Zionist Organization of America adopted a resolution calling for a “democratic Jewish commonwealth” in “the whole of Palestine, undivided and undiminished,” grimly proclaimed that Revisionist Zionism was now “victorious.” In her view, Zionism turned into fascism when it desired a state. “Seventy-five years later,” Boehm writes, “we can see that Arendt was dead right about the collapse of Zionism into its hard-right Revisionist interpretation.” Yet a student of history can see no such thing. Only three years later those same rapacious Zionists joyfully agreed to accept only a part of Palestine, very divided and very diminished, for their commonwealth, when they accepted the United Nations resolution of partition. (The Palestinians were the Revisionists.)

    For the haughty Arendt, as for Boehm, all distinctions are erased: everything to the right of the left is the same. I am reminded of a heated argument I once had with Amos Elon when he loudly announced over dinner that he would not vote in the Israeli election of 1992 because there was no difference between Yitzhak Rabin and Yitzhak Shamir. A year after Rabin beat Shamir, the Oslo accords were signed. Boehm, like the Israeli right, gloats over the failure of that breakthrough. We liberals mourn it, its flaws notwithstanding, because we know how difficult it is to accomplish even a little good and to push evil even a small way back, and because the status quo, which Boehm deplores, really is deplorable. But David Grossman and Moshe Halbertal are not standing in the way of its amelioration.

    The argument for the one-state solution comes with a politics of memory. Or more precisely, with a politics of anti-memory. Its proponents contend that the stasis between the Israelis and the Palestinians — no, there is no stasis, the situation constantly worsens — is the result of Jews remembering too much or remembering too little. Naturally Boehm trots out Nietzsche on the abuse of historical consciousness and the inhibiting effects of memory, though he, like everyone else, has events that he chooses to remember and events that he chooses to forget. He twice quotes, and twice misreads, Yosef Hayim Yerushalmi’s haunting question, “Is it possible that the antonym of ‘forgetting’ is not ‘remembering,’ but justice?” Yerushalmi was not, to put it mildly, hostile to memory; he wrote an entire book lamenting the failure of more critical modes of historical awareness to do for a living culture what memory once did. The point of his remark is, quite obviously, to suggest that remembering is a condition of justice, which cannot be advanced by forgetting.

    Boehm, by contrast, calls for forgetting. “It is time to restore a binational Zionism — with a strong notion of equal citizenship in a one-state solution,” he writes. “One way we can do this is by developing an art of forgetting, a politics of remembering to forget the Holocaust and the Nakba in order to undo rather than perpetuate them as the pillars of future politics.” There are two chapters in Boehm’s short book about this summons to forgetfulness, one about the Holocaust and one about the Nakba. Both of them, a little comically, are about the Jews. We remember our disaster too much and their disaster too little. But there are no injunctions to the Palestinians about the selectivity of their own memories, about their interpretations of their own historical narrative and its political uses. Why would there be? They have the past right, because they are the victims, without historical capability, with no good or bad choices that are relevant to the discussion, just the brutalized objects of Jewish representations and troops. The memories of victims are simple and sacred.

    For many decades I used to wander like an itinerant fire-and-brimstone preacher among the Jewish communities of America, chastising my brethren for certain failures that in my view imperiled the future of our people and our culture, and one of those failures is their inability, or their refusal, to understand what the events of 1948-1949, and more generally the raising of a modern society in the sands, meant for the Palestinians. In school, a fine Zionist school, I was taught that the Palestinian refugee problem was created by the flight of Palestinians at the behest of their leaders. Even before Benny Morris settled the question once and for all, I suspected that the explanation was too tidy and too self-exculpatory. I am the son of refugees, and I have always associated refugee status with prior cruelty.

    Moreover, I do not believe in the innocence of states, of any states; even in a just cause, innocent blood is spilled. (About this the pacifists are right.) The war of 1948-1949 was a just war for the creation of a state that had a right to exist and a need to exist, and also a war of self-defense, but massacres and expulsions were perpetrated. We must not lie, especially to ourselves. The mythmaking powers of national feeling are well known, and they must be corrected by historical and ethical accountings. None of this, of course, absolves the Palestinians of their own mythmaking, and certainly not of the many repulsive falsehoods that their leaders have promulgated over the years about the Holocaust and the Jews. Still, as Amnon Raz-Krakotzkin has incontrovertibly observed, “Israel, a state of refugees, was built on the creation of a state of refugeehood.” One has an obligation to become acquainted with the people with whom one will always live. It is past time for Jews to know, and to honor, the Nakba.

    “The art of forgetting” is no more a guarantee of mutual respect than the art of remembering. Atrocities have been the work of people who believe that they are nullifying the past and beginning again, just as they have been the work of people with ancient grievances to avenge. (Sometimes they are the same people.) Collective memory needs to be carefully and morally managed. I want Palestinians to remember the Holocaust and Israelis to remember the Nakba, because otherwise they will not comprehend each other. I want Jews to remember the Holocaust and Palestinians to remember the Nakba, because otherwise they will not comprehend themselves.

    Moreover, the culture of Zionism was already quite practiced in the “art of forgetting.” I do not refer only to its tendentious interpretations of Jewish life in the exile. Revolution requires a sensation of newness, and so it always erases and exaggerates. Thus, to choose a prominent example, in 1942 the Hebrew writer Haim Hazaz wrote a short story — an essay in fictional form, really — called “The Sermon.” It describes a fiery statement given to the central committee of a kibbutz by an otherwise meek member of the community named Yudka, which can almost be translated as “Jew.” “I wish to announce,” he tells his comrades, “that I am opposed to Jewish history.” And more: “I do not respect Jewish history…No, ‘respect’ is not the word. It’s what I said: I’m against it.” And more: “I don’t accept it…Not a single detail. Not a single line, not a single point. Nothing, nothing…None of it!” Also sprach Yudka!

    This program of revolutionary amnesia, this ferocious rejection of inherited ways, this contemptuous avant-gardism, did not work out so well. It was eventually responsible for bitter fissures in Israeli society and culture, and for the slow collapse of Labor Zionism. No arrangements between the peoples in the land will succeed that are based on the denial of the wounds that they seek with those arrangements to heal. The most that we may permit ourselves to dream of is a coexistence of traumas, of haunted communities.

    The meaning of national identity is not only morbid. One of the blandishments of security is the protection of immanent flourishing cultures. It is well known that alongside “political Zionism” there was “cultural Zionism,” though the former also had cultural preferences and the latter also had political preferences. The political preferences of cultural Zionism have played an important part in the argument for a one-state solution. A distinction is drawn between cultural self-determination and political self-determination, between cultural self-determination and statehood. Statehood, it is said, is not required for cultural fulfillment, for Zionist fulfillment, for Jewish fulfillment. There is some truth to this claim, as the early history of Judaism illustrates: it was Rabbi Yohanan ben Zakkai, when he petitioned the Roman commander of the siege of Jerusalem to grant him a small coastal town for the creation of an academy of Torah and never mind the fate of the commonwealth, who made Judaism as mobile as the soul and thereby freed Jewish culture from what might be called the Ozymandian anxiety. We have learned about many cultures from their ruins, but Jewish culture is not one of them.

    So isn’t cultural self-determination Zionism enough? The question has been posed heroically by Peter Beinart, in his timely journey leftwards. If the world were flat, Beinart would have fallen off it a long time ago. He has a reputation for courageous dissent against the Sith Lords of the American Jewish community, which is why he is the darling of Jewish millennials everywhere. “You know,” he once scolded me, “criticism can be an expression of love!” Actually, I did know this. “Yes, it can,” I replied, “but it cannot be the only expression of love.” Beinart is deeply worried that his rejection of a Jewish state — “I No Longer Believe in a Jewish State,” he grandiosely proclaimed in the New York Times — will be mistaken for a rejection of Zionism. He aims for heresy, not apostasy. That is why, for example, he has a strange habit of ornamenting the expression of his views with assurances about his religious observance. One afternoon I found him on CNN screaming at a conservative pro-Likud interlocutor that he worships in an Orthodox synagogue and even leads the prayers there. So what? I presume that many of the worshippers in his synagogue disagree with him, since it is an Orthodox congregation. When they lead the prayers, are they right? Beinart is just shul-washing.

    In order to prove that he is not anti-Zionist or post-Zionist, Beinart must locate a definition of Zionism that will give him cover, and identify a current of Zionism that would be satisfied with a political objective short of Jewish sovereignty. As it happens, the history of Zionism is rife with non-statist conceptions of Jewish self-determination. Neither Herzl (the title of his momentous book notwithstanding), nor Pinsker, nor Ahad Ha’am, nor Jabotinsky (some of the time), nor Berl Katznelson, nor David Ben Gurion (some of the time), nor any of the other titans of Jewish nationalism were animated in their work by the goal of sovereignty — until the Biltmore program of 1942, as I noted earlier. It is worth noting that the Biltmore program made no mention of a “Jewish state,” but called for “ending the problem of Jewish homelessness” by “establish[ing] a Jewish Commonwealth” which “welcomes the economic, agricultural, and national development of the Arab peoples and states.” A terrible hardening!

    Beinart relies heavily for his new thinking — if his thinking seems so fresh, it is because his knowledge is so recent — upon a splendid book called Beyond the Nation-State: The Zionist Political Imagination from Pinsker to Ben-Gurion by the Israeli intellectual historian Dmitry Shumsky, whose previous study of Zionism in early twentieth-century Prague skillfully shed light on the origins of bi-nationalism. With exquisite scholarship, particularly about Pinsker and Jabotinsky, Shumsky shows that statism was a late development in Zionism, which pictured the Jewish homeland to which it aspired in sub-statist or bi-nationalist terms, and mainly as autonomy within a larger political framework. Shumsky’s analysis seems unimpeachable to me, and also to Boehm, who quotes him at enormous length. But Shumsky’s narrative does not quite provide the Zionist alibi that one-staters and “cultural Zionists” such as Boehm and Beinart seek.

    For a start, the broad outlines of Zionist political thought, its evolution from autonomy to sovereignty, have long been familiar. I learned about the bi-nationalist tradition in Arthur Hertzberg’s seminar on Zionism fifty years ago. Shumsky has discovered new trees in an old forest. More importantly, there is a plain historical explanation for the trajectory of the Zionist political imagination. It is that all these figures, all these builders, lived and worked in imperial circumstances. The Ottoman empire and the Hapsburg empire were the contexts in which the idea of a Jewish homeland was first developed. It was conceived in the terms of its time, on the model of those tolerant multi-ethnic entities, in which the civil and cultural autonomy of ethnicities flourished in the absence of political power, which was held exclusively by the imperial authorities. The Nationalitätenstaat, or “state of nationalities,” that inspired Zionist intellectuals and activists was a notion of Austrian Marxists who elevated the benign imperial conditions in which they lived into an ideal. The question of sovereignty, in other words, was moot. Shumsky is perfectly clear about this.

    But the empires are gone now. Is it progressive to be nostalgic for them? When the empires collapsed, sovereignty became possible for the nations that they contained and states were formed, as would happen again later with the end of the British empire. Subordinated peoples began to associate self-determination with political power. The Palestinians — who were subordinated to Arabs before they were subordinated to Israelis — would eventually express a similar desire. The vocabulary of self-determination had changed. And in the case of Zionism, there was another reason for the escalation of its political ambition. Its name was Hitler. Is it really any wonder that in 1942 the Zionists chose statehood? Was a rescue from extermination to be found in autonomy? Would a millet have saved the Jews?

    In the judicious conclusion to his book, Shumsky presents his own assessment of the relevance of pre-statist Zionism to the contemporary predicament. He notes “the ever-increasing pervasiveness of a bi-national existence between the Jordan River and the Mediterranean Sea resulting from the repeated failures of negotiations between Israel and the Palestinians and the constant expansion of the Israeli settlement enterprise beyond the Green Line.” In the light of these discouraging developments, he continues, “one is sorely tempted to pluck the Nationalitätenstaat formula from the Zionist past, to rescue from oblivion the repressed and deliberately forgotten attachment to mainstream Zionism, and to place them squarely on the Israeli and international agenda as the old-new federative alternatives to the apparently no longer viable two-state solution.” This is precisely what Boehm and Beinart and the other high-minded nullifiers of Israel are proposing. But Shumsky is not one of their company. We “would be well-advised to beware of such temptations,” he cautions.

    After all, following many generations of a bloody national conflict, and given that under Israel’s ongoing control over the occupied territories both Israelis and Palestinians continue to live alongside one another in separate institutional constellations, it is by no means certain that an attempt to reapply the binational models that occupied a central position in the political imagination during the Ottoman and British Mandate periods would meet the current “living concerns” of the two peoples…. Zionism’s conceptions of national self-determination were never subject to a single static political model but were rather reformulated at each given point in time in line with changing historical circumstances.

    (I wish the same historical flexibility could be imputed to the leaders of Palestinian nationalism.) Finally Shumsky elects to disobey the title of his own book and arrives at the sober conclusion that “Israel’s political consciousness would do well to embrace the notion of the division of the Land of Israel/Palestine into two nation-states.”

    Boehm’s advocacy of the autonomist option leads him in an unexpected direction — to the admiration of Menachem Begin. In 1977, after Anwar Sadat’s magical visit to Israel, Begin prepared a plan about the Palestinians for the imminent peace negotiations. It was called “Home Rule, for Palestinian Arabs, Residents of Judea, Samaria, and the Gaza District.” It was a surprising proposal. Most surprisingly, perhaps, it insisted that the question of sovereignty be left open. Though “security and public order” in the territories would remain in the hands of Israel, the plan terminated the Israeli military government in the occupied territories and established a Palestinian civil authority, headquartered in Bethlehem, whose officials would be democratically elected and provide “administrative autonomy.” It extended a choice of citizenship, full citizenship, for Palestinians in Israel. Palestinian refugees would be permitted to return “in reasonable numbers,” as determined by a joint Israeli-Palestinian-Jordanian committee. Palestinians in the occupied territories would enjoy “free movement and free economic activity,” including the purchase of land, as would Israelis dwelling there. Boehm affectionately, and ludicrously, declares that Begin’s “autonomy plan” should more properly be called “the one-state program.”

    Boehm blames the scuttling of Begin’s plan on — who else? — Israeli liberals, who, “deferring to the two-state orthodoxy,” denounced it because they suspected that it was the prime minister’s attempt to fob off the problem of the West Bank on an Austro-Hungarian fantasy. Given Begin’s ideological and oratorical record, there were grounds for such a suspicion. I shared it myself, though I hoped ardently that I was wrong. For there was another element in Begin’s plan that encouraged me: “these principles will lend themselves to re-examination after a period of five years,” he stipulated in his famous speech to the Knesset in 1977. I remember thinking, a Palestinian flag could fly over a Palestine in 1982! Of course it did not come to pass, but not because of anything uttered by Amos Oz. The opportunity was squandered because Yasser Arafat refused to consider Menachem Begin’s proposal.

    The Palestinians turned autonomy, “the one-state program,” down. The rais chose instead to give hysterical speeches and interviews about sumud, or steadfastness, in the PLO’s war against the fascist and colonialist enemy. “The Palestinians,” Boehm writes, “seeking sovereignty, rejected it.” But they did not dismiss the plan because they were seeking sovereignty. They dismissed it for a deeper and less reputable reason: they were not prepared to recognize and to respect the being in the world of Israel. And Boehm’s six words about the Palestinian rejection of Begin’s plan are just about all there is to be found in his book about the role of the Palestinians in the story of the infernal stalemate, which is typical of the progressive prejudice in the discourse about the conflict. This manifesto for a state of two nations is about only one of them.

    Why are Boehm and Beinart so confident that in a single state the lions will lie down with the lambs? What do they know about them that the rest of us do not? Boehm calls his utopia the “Haifa republic” in homage to the decades of Arab and Jewish coexistence in the northern coastal city. There “you get a glimpse of what Palestinian-Jewish cohabitation could one day look like.” He is somewhat prettifying the place. Arabs constitute only twenty-five percent of Haifa’s population: the harmony of the city is owed in part to the small size of its minority, which does not threaten its majority. One of the reasons that we reactionary proponents of the two-state solution long for the establishment of a state of Palestine alongside the state of Israel is so that the minority numbers in both states will make neither of the majorities jittery. The jitteriness of majorities promises trouble. Also, Haifa is not quite the idyll that Boehm depicts. It experienced its own share of the awful Arab-Israeli violence of last year. The assumption that a union of Israelis and Palestinians in a single country is an easy prescription for peace is delusional.

    Beinart, who could use a little sumud of his own, peddles different reassurances. In an essay in Jewish Currents, he moistly reports that “a new generation of Palestinian activists has begun advocating one equal state between the Jordan River and the Mediterranean Sea.” He has seen the future and it works. Of course it is no wonder that Palestinians, young or old, would endorse the idea of a single state, because that state, owing to demographic realities, will be Greater Palestine. This is the dishonesty in the argument: to be for one state is to be for a Palestinian state with a Jewish minority. Time, as two-staters have grown hoarse from warning, is not on Israel’s side. Still, I have met a few such young Palestinians and they do indeed represent a break with the immobilism and the illiberalism of the Palestinian establishment. I do not doubt their commitment to the principle of equality, even if I cannot suffice with it. It is certainly a thin reed with which to dismantle a state.

    I used to pin my hopes on new people, on new generations. I have since discovered that all the generations contain all the varieties of human types, and that people change. In 1993, a few days after the handshake at the White House, I met with Nabil Shaath, one of Arafat’s key advisors and an architect of the Israeli-Palestinian rapprochement. He was a worldly man, a rational man, a successful businessman, a longtime member of Fatah who became a minister in the Palestinian Authority. We had become friends, I liked him, and I brought him to lunch at the infamously Zionist magazine where I worked. He arranged for a few Israeli and American Jewish advocates for peace, including myself, to meet with Arafat at his hotel in the evening after the handshake. The chairman, he said, wanted to thank us. (Every negative impression I had of Arafat was confirmed by those few hours on the sofa.) At our meeting after the signing ceremony, Nabil spoke in noble and eloquent terms about Israeli-Palestinian reconciliation, and about building democracy in Palestinian society. He also talked with uncanny moral and historical sensitivity about the Holocaust. I was exhilarated. Then he moved to Ramallah and became one of the worst kleptocrats in the region and issued despicable pronouncements about Israel.

    Equality may be honored, or dishonored, in one state as in two states. What will tell is the prestige of the principle in the respective political cultures. The enumeration of rights “without distinction of religion, race, and sex” in the Israeli Declaration of Independence has not always been realized in the state that it launched, but it furnished the intellectual, legal, and social foundation for the quarrelsome and reformist politics, the persistent critique of inequality, that has characterized the public discourse of that state, rather in the way that the American Declaration of Independence provided the grounds for criticism of certain repugnant passages in the American Constitution; and Israel’s founding document stands as a lasting rebuke to the contemptible “Nation-State Law” that Netanyahu and his xenophobic supporters manufactured a few years ago. There is a struggle taking place in Israel for the values that will define it, and the struggle is by no means lost. And there are resources for it, for the humane side in it, in the Zionist tradition itself. (“Do you, like the medieval inquisition, fail to understand that diversity is life, and that only death is featureless?” Leon Pinsker wrote that golden sentence in 1861 in a short-lived journal called Sion.) But “cultural Zionism” cannot make a contribution to a struggle over the direction of a state that it wishes to obliterate.

    The political culture of the Palestinians, by contrast, has so far been, let us say, a stranger to the Enlightenment. I do not say this to offend, or with glee. The happiness of my people depends in part on the philosophical condition of the Palestinian people. In the internecine Palestinian war between democratizing forces and theocratizing forces, we, I mean Jews and Americans, must unflaggingly support the former. “If you believe in equality, how can you create a state which claims members of a certain race, or certain religion, belong to it more than others?” Beinart asked in an interview. He was right. He was referring, of course, only to Israel. But it is, in fact, a perfectly Israeli question. Beinart flatters himself about his moral fineness. Political Zionists and two-staters are plentifully to be found in the ranks of the egalitarians. Why are his professions of this shared belief any more credible than ours? Is his formula really so spotless, so devoid of dangers? Can one injustice be righted by another injustice? The Palestinians have for a long time asked that piercing question, but some of their tribunes should ask it also of themselves.

    His interviewer aptly inquired why Beinart’s denunciation of sins against equality does not extend also to Muslim countries, which have not exactly covered themselves in glory in this regard; and the question reminded me of my own disgust with the reticence of the left about Syria, Ukraine, Belarus, Myanmar, Hong Kong, Cuba, and the other locations of authoritarian horror. How in good conscience can one march for Palestinians and not for Syrians? “True,” Beinart replied, “there are other countries who violate this principle. In my opinion, they need to be reformed.” Yes, reform them! But he is not demanding that Israel be reformed. He is demanding that Israel be eliminated. Beinart is breaking new ground in progressive politics: he is cancelling an entire country. I know of no other state whose unjust treatment of others has thoughtful people calling for its erasure. There are concentration camps in China.

    The Israeli occupation of the West Bank, which originated in a war for survival, has been transformed by religion and chauvinism into a moral disgrace for a state that calls itself, and largely is, open and lawful; and the unceasing settlement of the West Bank has been perhaps the greatest strategic blunder in Israel’s history — a nuisance from the standpoint of security, and utter madness if the Israelis are ever genuinely to coexist with the Palestinians. The Palestinians deserve security, and dignity, and identity; and the attributes of nationhood, which include political ones. But no amount of sympathy for the Palestinians warrants this amount of antipathy for the Israelis. They, too, deserve security, and dignity, and identity; and the attributes of nationhood, which include political ones. It has long been known that nationalism is an affair of collective subjectivity: one people cannot dictate to another people how they should represent themselves to themselves, or to others.

    Brit Shalom was an association of formidable Jewish intellectuals, founded in Jerusalem in 1925, that advocated a binational state of Jews and Arabs in Palestine. Martin Buber was its most famous member, whom Peter Beinart likes to cite as his precursor. In the end nothing came of it, because it found no Arab interlocutors: it was a conversation that Jews were having with themselves. As Arthur Ruppin, the sociologist who was its chairman, remarked in a letter, “what we can get [from the Arabs] we do not need and what we need we cannot get.” In 1936, in a volume of essays called Jews and Arabs in Palestine, Ben-Gurion made a comment about Brit Shalom that is worth pondering. “We oppose Brit Shalom,” he wrote, “not because of its desire for peace with the Arabs, but because of its attempt to obliterate the Jewish truth, and to hide the Jewish flag as a price for peace.” There was nothing mystical about his statement, though “the Jewish truth” is a locution that can make a liberal squirm. Its secular meaning is simply that people are fulfilled, as individuals and as groups, in their particularity. Peace is gorgeous but contentless. We seek peace, and the security that is the premise of peace, as the setting for self-fulfillment.

    Jewish peoplehood is one of the most ancient facts of recorded history, and Jewish nationalism is the modern interpretation, according to the protocols of modern politics, of Jewish peoplehood. The people called the Jews constitute a primordial and independent element of the world, irreducible to its other elements, converging with them and diverging from them, and through all the convergences and the divergences remaining themselves and no other, changed but the same. Their story is one of the sublime human stories, and it deserves to command the attention and the esteem of all peoples. After they were expelled from their commonwealth and made slaves and subjects and serfs, with self-governance gone, the Jews learned to live in many cultures and many circumstances, and to combine adversity with vitality. To the world they offered a certain vision of God and goodness, but of the world they asked only to be allowed to be themselves. Yet their demand for apartness, their perseverance in their beliefs and their practices, was more than the world was willing to grant, because the insistence upon difference defied the need of other faiths for universal vindication, even by means of coercion. Suffering became a regular feature of the Jewish exile. The suffering never determined the substance of Jewish peoplehood, or of Jewish religion, not even in the worst of times. But the suffering had to stop, and nobody except the Jews themselves was going to stop it. The people that taught the world about the relation between history and redemption chose to act on their own idea, splitting themselves for the sake of saving themselves. Their self-reliance was both a revolution and a restoration. And so one day, long after they were supposed to have disappeared, in the very land from which they had been banished by empire, the wounded and hopeful people raised a flag. 
    Insofar as “the Jewish flag” is the symbol of this saga — of the end of the suffering, of the reversal of the vast misfortune, of the efficacy of the victims against their own victimization, of the failure of misery to crush a civilization, of the beautiful stubbornness of purposeful survival, of the accession to freedom of the perennially unfree — insofar as the Jewish flag represents all this, then it represents more than statehood, it represents the energies and the potencies of a magnificent people, and it cannot be denied. It has flown over war and over peace, over stirring victories and wise compromises and sordid mistakes. Nobody is without guilt. Power will always be a challenge to wisdom. Introspection is another name for self-rule. Pity the people who need to suppress others to become themselves; pity them and resist them.

    Turning in My Card

    “How many Vietnam vets does it take to screw in a light bulb?”
    “I don’t know. How many?”
    “You wouldn’t know. You weren’t there.”

    In the American military, identity is an enduring obsession. Long before debates swirled through cultural institutions about the value of hyphenated American identities or the relative fixity of gender-based pronouns, the American military had already determined that identity supersedes individuality. Within the ranks, the individual means little; he or she exists as a mere accumulation of various organizational identities — your rank, your unit, your specialty — all of which stand in service to the collective. This obliteration of the individual begins in training, on day one, when every new recruit is taught a first lesson: to refer to themselves in the third person. You cease to exist; you have become “this recruit.” And you are taught, among the many profanities you might hear in recruit training, that there is one set of slurs that is most unforgivable of all: I, me, my.

    This doesn’t last forever. I served in the Marines and one of the first privileges the Corps granted me on the completion of training was the privilege to again refer to myself in the first person. Except that I was no longer the same person. I was now 2nd Lieutenant Ackerman; my military identity had eclipsed my civilian one. This new identity placed me firmly within the military hierarchy as a junior officer, and from this position I would over the years further build out my identity — and thus my authority — within the organization. I would pass through training courses that would give me expertise. I would go on deployments that would give me experience. And I would gain in seniority, which would give me rank. When in uniform, I would literally wear my identity. Badges of identity, indeed: eventually these included the captain’s bars on my collar, the gold parachute wings and combat diver badges that showed I had passed through those rigorous training courses, as well as the parade of multicolored ribbons that at a glance established where I had served, if I had seen combat, and whether I had acquitted myself with distinction.

    All these colorful pieces of metal on my uniform served the purpose of immediately establishing my place within a hierarchy. Which is to say, the military obsession with identity is not really an obsession with identity at all; it is an obsession with status and rank. And so it has become in the cultural hierarchy of America, where identitarians invoke an elaborate taxonomy of hyphenations and pronouns with the zealotry of drill instructors. Ostensibly, this new language is designed to celebrate individual difference. In practice it annihilates the individual, fixing each of us firmly within an identity-based hierarchy that serves collective power structures.

    As a combat veteran, I have been the beneficiary of identity-based hierarchies for years. But this was not always the case. In April 2004, I took over my first unit, a forty-man Marine rifle platoon. We were based in Camp Lejeune waiting to deploy to Iraq that June. On a rainy day, when I asked some of my Marines to patrol around the base practicing formations we would soon have to employ in combat, Sergeant Adam Banotai, a super-competent (and at times super-arrogant) twenty-one-year-old squad leader in the platoon, told me that he thought my plan was a waste of time. He had been to combat, and I had not. Even though I outranked him, he sat above me in an invisible moral hierarchy in which combat sits as the ne plus ultra of status.

    I decided to respond to this minor act of insubordination. I brought Sergeant Banotai into my office and had him sign a counseling sheet in which I marked him deficient in “leadership.” I explained that leadership required loyalty both up and down the chain of command. By flagrantly refusing to follow orders he had been disloyal to me and, thus, a bad leader. When I explained that I would place this counseling sheet in his service record, Sergeant Banotai didn’t like it one bit. As he signed, he said, “What the fuck do you know about leading Marines, sir. I was leading Marines while you were still in college.” 

    Fair enough; but we still had to go to war together. Only a few weeks after the counseling sheet incident, out on patrol near Fallujah, my Humvee hit an IED. We were driving parallel to a long canal and I was first in the column of vehicles with Sergeant Banotai sitting a few Humvees back. He later told me that from his perspective I simply vanished in a cloud of dust and smoke. As hunks of shrapnel and earth plunked down into the canal, he was certain that pieces of my body were among the debris, and, in a macabre admission, later told me that he imagined having to fish my joints out of the putrid water. What had happened was that two artillery rounds had gone off right next to my door. Fortunately, the rounds had been dug in too deep, so that their blast fountained upward, over my head, leaving me with dust in my throat and ears ringing but little else. I then jumped out of my Humvee. Whoever had detonated the IED fired a few shots at us as I jogged back to Sergeant Banotai. He and I worked together to coordinate our platoon’s response, in which we searched the area and eventually carried on with our patrol. 

    After that day, everything changed. Our operations ran more smoothly, with no complaints. During off hours, Sergeant Banotai and the other NCOs came by my “hooch” to joke with me. We all got along. Several months — and firefights — later, I asked Sergeant Banotai about that sudden shift in attitude. At first he laughed off my question. When I pressed, he became a bit sheepish, even apologetic. “Well, you got blown up,” he said. “After that we decided that you were okay, that you were one of us.” 

    To this day his words bring to mind a moment in Oliver Stone’s Platoon, in which Charlie Sheen’s character, the doe-eyed new soldier Chris Taylor, after being wounded in his first firefight, returns to his platoon after a brief stay in a field hospital. An experienced soldier named King takes him to an underground bunker. Here the old hands are having a little party. When one of them asks, “What you doin’ in the underworld, Taylor?” King replies on his behalf, “This here ain’t Taylor. Taylor been shot, this man here is Chris, he been resurrected.” At which point, Chris joins their party, smoking dope and singing Smokey Robinson’s “Tracks of My Tears” with the rest of the platoon. It is an incredibly human scene and — call me sentimental — I am moved every time I watch it, as it traces my own experience of rejection followed by acceptance born out of combat. Chris’s experience in the firefight has resurrected him. In the eyes of the group — the platoon — he isn’t the abstraction “Taylor” anymore; his spilt blood has made him “Chris,” an individual.

    For a while, I resented Sergeant Banotai’s response. I was the same person before the IED attack as I was after it, no more or less competent. This need to classify me as “other” because I was not yet a “combat vet” felt capricious, indulgent, condescending, and so against the best interests of the platoon, which needed coherent leadership up and down the chain of command to run smoothly in combat. But of course the metrics of identity are typically arbitrary, and they rarely serve the best interests of the group. Tribal by nature, identity fixates on difference, too often seeking to narrow, as opposed to enlarge, who merits membership in the tribe. Identity is as much, or more, a method of exclusion as of inclusion; it fortifies itself by casting others out.

    There is a famous Bedouin adage, which I first came across in Iraq: “I am against my brother, my brother and I are against my cousin, my cousin and I are against the stranger.” In this remorselessly reductive manner, through the accentuation of differences (as opposed to the assertion of commonalities), one group is pitted against another in perpetuity. In my case, I had power over Sergeant Banotai because I was an officer. In his case, he exercised power over me for a time because he was already a combat veteran. Again, identity is hierarchy, a wolf in sheep’s clothing.

    Eventually we returned from Iraq. Sergeant Banotai left the Marine Corps, shedding that identity, and we shed the attendant hierarchy, the system of rank that had once existed between us. A few years later, when he invited me to serve as a groomsman in his wedding, I was still on active duty. He said I was welcome to wear my uniform, if I wanted. I wore a suit instead. We were now simply friends, resurrected (to use King’s word) out of identity and into individuality, where we remain to this day.

    When he returned from Vietnam, the writer Karl Marlantes went to work at the Pentagon in an anonymous desk job. He had seen the worst of war as a Marine, earning two Purple Hearts in the process. Then, slowly, his actions in Vietnam caught up with him and he received several further commendations, to include the Navy Cross, our nation’s second highest award for valor. In his memoir, What It Is Like To Go To War, Marlantes writes about the experience of earning these medals: “With every ribbon that I added to my chest I could be more special than someone who didn’t have it. Even better, I quickly learned that most people who outranked me, who couldn’t top my rows of ribbons, didn’t feel right chewing me out for minor infractions. I pushed this to the limit.” Marlantes stopped cutting his hair. He grew a mustache that he describes as a “scraggly little thing that made me look like a cornfed Ho Chi Minh.” Eventually, a more senior officer who had also been to Vietnam but “had nowhere near my rows of medals” called Marlantes into his office. “I don’t give a fuck how many medals you’ve got on your chest,” he said. “You look like shit. You’re a fucking disgrace to your uniform and it’s a uniform I’m proud of. Now get out of here and clean up your goddamn act.” 

    Reflecting on the incident in his memoir, Marlantes writes, “I can’t remember the man’s name. If I could, I’d thank him personally. He called my shit.” It takes courage to call someone else’s “shit,” particularly when their externally verifiable identity trumps one’s own. We all know when someone is tossing about identitarian arguments in order to evade the substance of a matter, confidently issuing assertions that cannot stand on their own logic, so that they must instead hoist themselves up on who, in some framework, they are. Typically, these special pleadings are spoken with that tired preamble, “Speaking as a…”, in which the speaker telegraphs their intention to silence dissent through an appeal to identity-based deference, as surely as if they were standing on a golf course shouting “Fore!” down the fairway. As on the golf course, the objective of the intervention is for everyone to get out of the way.

    Rhetorically and psychologically, identity is often wielded as a weapon. Some identities cut sharper than others. I am descended from Ukrainian Jews on one side of my family and Scotch-Irish Texan wildcatters on the other. The world perceives me as a straight white man — a dull blade if I’m hoping to cut with identity. Except that there is one thing that corrects for my disadvantage in the identity sweepstakes and compensates for my dull archaic status: I am a combat veteran. Suddenly my blade is sharp! I am owed deference, and have the standing in the great American identity calculus to shut people up. Late in my military service, I came to understand how my identity accorded me such deference in certain situations, the ability to silence the dissent of those who might disagree with me when discussing, say, our wars in Iraq and Afghanistan.

    Some might argue that this is appropriate, that I have earned it. I don’t think so. The authority of experience certainly counts for something, but should it count for everything? Should only those who have the authority of “lived experience” be entitled to raise their voice on certain issues — on race, on gender, or, as in my case, on the critical issues of war and peace? Is Oliver Stone’s Platoon acceptable because he is a Vietnam veteran, while Francis Ford Coppola’s Apocalypse Now is a work of “cultural appropriation” because he is not? Must one have been to war for one’s opinions about war to matter?

    Consider an obvious example of the interplay between identity and art. The Catcher in the Rye is commonly regarded as a work of adolescent alienation, but I would argue that it is more properly understood as a war novel. J.D. Salinger was a veteran of the Second World War who landed at Utah Beach on D-Day and fought in the Battle of the Bulge and in the Hürtgen forest, and was among the troops that liberated the concentration camps — but generally he did not take on the war in his work, as if he knew that a limit existed as to what he could directly convey. Yet the voice of Holden Caulfield, for which the novel is renowned, is one whose provenance I recognized after returning from my own wars: it is the voice of the cynical veteran to whom everyone is “a phony,” the vet who wants to visit the ducks in Central Park, to recover an innocence that will never return and perhaps never was. Take the novel’s last line: “Don’t ever tell anybody anything. If you do, you start missing everybody.” Those are quintessentially the words of a veteran. And yet Salinger scrupulously, as a matter of authorial intention, chose to omit his experience from his work, engaging with it obliquely. Do the new protocols of identity require that we put it back in?

    There is a philosophical problem here: how can you truly know what someone else’s experience is, or what access points he or she — or they, speaking of ever more recent complex identities — brings to a subject? I am not suggesting that identities are necessarily false, but they are certainly subjective, and we need to think more critically about the authority of subjectivity in our society. A good place to begin such critical self-examination would be to propose that there are no classes of people whom we should believe as such. We must show empathy, and make every effort never to begrudge it or hold it back, but after empathy we must inquire after truth, and evaluate the claims that are made on our conscience. Injury does not confer infallibility. Military veterans have sometimes misremembered the experience of battle, and sometimes even lied about it, and they are not immune, nobody is immune, from correction, from being called on “their shit.”

    The appeal to identity as the dispositive consideration in any debate is anathema to an open liberal society. Yet here we are. I recall reading a column by David Brooks in 2015, when the tide was beginning to rise on identity. His column was framed as a personal letter of appeal to Ta-Nehisi Coates on the occasion of the publication of Coates’s Between the World and Me, a book that not only touches on the black experience in America but also on the American experience itself and the validity of our shared experiment in creating a multicultural democracy. It is a book about its author, but also about all of us. Brooks did not agree with some of Coates’s conclusions, and his disagreement rattled him. “Am I displaying my privilege if I disagree?” he plaintively wrote. “Is my job just to respect your experience and accept your conclusions? Does a white person have standing to respond?” Those timid sentences are a kind of epitaph for free and candid — and respectful and constructive — discussion.

    Marlantes, when reflecting on his own standing as a decorated combat veteran, writes that “In the military I could exercise the power of being automatically respected because of the medals on my chest, not because I had done anything right at the moment to earn that respect. This is pretty nice. It’s also a psychological trap that can stop one’s growth and allow one to get away with just plain bad behavior.” This “psychological trap” is now the trap of our culture, in which identity confers authority; a culture that not only stifles the interest in the individual by reducing him and her to a representative and a spokesperson, but also further isolates those groups whose interests it purports to advance. 

    This has certainly been the case among veterans. Few groups in American life are more fetishized. We are elaborately thanked for our service, allowed to board planes in front of the elderly, and applauded at sporting events. Honoring us has become a secular eucharist. Yet when it comes to the devastating issues that disproportionately affect veterans — homelessness, suicide, political extremism — most people look away. Our insularity, our otherness, has done nothing to lift us up. In fact, it has hurt us. A citizen need only render their deference and then be on their way. Be wary of people who pay fulsome respect to your identity, because what they are actually paying respect to is identity’s twin: victimhood.

     

    The first time someone called me a victim was at a moment when I was very publicly engaging with my identity as a combat veteran. In retrospect, this correlation between identity and victimhood seems obvious, but at the time it was not. A group interested in international relations had invited another Marine and me to give a presentation about Iraq, specifically our “on-the-ground perspective.” This was a little more than a year after I had returned from the war. On that day I wore my olive green “service alpha” uniform with its khaki shirt and tie, and before giving my talk I was generously feted around the room by my hosts. It was a distinguished group and, as we sipped soft drinks and nibbled hors d’oeuvres, I learned that certain of the people in the room held rather senior positions in government, or at least positions many levels above a lieutenant of Marines. 

    The presentation began and my co-panelist and I made remarks, showed photographs from our deployments, and did our best to describe the conditions under which the war was being waged. We answered questions from a moderator, the majority of which focused on the tactics of the war as opposed to its strategic utility. In short, we spoke only as junior officers with combat experience.

    Then, when answering a follow-up question, my co-panelist made a comment about the Sunni tribes in al-Anbar province beginning to organize against al-Qaeda in Iraq. He regarded this as a positive development and contended that the United States needed to fully commit to this effort (which eventually became known as the Sunni Awakening) by “surging” even more troops into the country. He believed the war might soon turn a corner. Suddenly we were not talking tactics anymore. We were talking strategy, and he had veered outside the lane prescribed by his identity as a combat veteran. He was now speaking the language of policy among those who held senior policy positions. This was in 2006, a time when the war in Iraq was becoming extremely unpopular. When the moderator asked whether or not I agreed with my comrade, I said that I did.

    The moderator then solicited the next question. Hands shot up. An older woman went first. She asked how either of us could possibly defend the idea of sending more American troops to Iraq. My co-panelist reiterated his arguments — that al-Qaeda had overplayed its hand, that Sunni fears of Shia dominance in the newly formed Iraqi government created an opportunity that could undermine the insurgency, that it was worth making an effort to salvage the blood and treasure America had already expended. Did counterarguments exist? Of course they did. Did this woman engage us on the basis of those counterarguments? She did not. Her disagreement took an entirely different direction. She explained that we supported a surge because the war had made us victims. 

    “I’m very sorry for what you’ve been through,” she said. “But you are victims of this war. Given your experience, I have a hard time believing you can see the situation in Iraq clearly. Emotionally, you’re too invested.” Having made this declaration, she did not sit down, but remained standing in expectation of an answer. Politely, I explained that I was in no way a victim, that I had volunteered to serve in our wars and had volunteered again (I would soon leave for Afghanistan), that my opinions were rooted in my experience and my understanding of it, and that she was free to disagree with my arguments on their merits — but not on some specious claim that I was a victim of the very experience she had come to hear me discuss, and therefore no longer able to think as an individual. 

    The woman refused to relinquish the microphone until making a final point. Although she appreciated hearing an “on the ground” assessment from a combat veteran and continued to offer the somewhat obsequious respect that my identity commanded, I did not have her permission to repudiate the description of myself as a victim. According to her, the very fact that I refused to view myself as a victim was all the more proof that I was one, that the wars had damaged me. I had been blinded by my time at war to the wrongness of supporting any position except the swift and immediate termination of these wars, regardless of the actual conditions. (Never mind that my support for a surge was itself based on my “on the ground perspective.”) The only position that I could properly derive from my experience was one that coincided with her own.

    This incident has stayed with me not because it was unique — on later occasions I would again be called a victim — but because it was the first time that my relatively new identity as a combat veteran had served to disempower me. Another combat veteran, J.R.R. Tolkien, who fought in the First World War, provides an analogy. In his Lord of the Rings trilogy, Tolkien writes about the Rings of Power, particularly the One Ring, which gives its wearer the ability to see and govern the thoughts of others; but it also slowly erodes the wearer’s vitality. In Tolkien’s trilogy, the humble hobbit Frodo, whom Tolkien modeled on the common British Tommy with whom he served in the trenches, is the only one who can bear the One Ring because of his pureness of heart; but even so it calls to him and haunts him, robbing him of his strength and nearly spelling his demise.

    Identity, even an identity confirmed by a chest full of ribbons, seduces in a similar way. It is a devil’s bargain — not a heaven in which one serves a nobler cause but a hell in which one reigns. When you wear identity, you can feel its power. But in the long run it takes more than it gives, leaving you bereft of your personal difference — a Gollum enslaved to its service.

    To escape this system of doctrinaire social evaluation, we must each disarm. Is it possible to displace those who brandish identity as a cudgel from the center of our culture to its fringes? Now that would be revolutionary. This would involve us no longer deferring to a person’s identity but rather to their individuality, and recognizing that individuality consists in more than the simple accumulation of sub-identities, the sum total of all our group memberships. Veterans occupy an interesting niche in the politics of identity. As a group our struggles are as real as those faced by other groups. The history of civil rights in America is also the story of America’s veterans; and we, too, enjoy inclusion in legislation with special protections, much like those dispensed to racial minorities and other marginalized groups — including equal employment opportunity, access to housing and education, and protections from targeted crimes. The important difference is that we are not born veterans. It is an identity we come to later. We choose it.

    When I was beginning my military career as a college midshipman, my enthusiasm to become a Marine was boundless. I wore my hair according to regulations (though I was not yet required to do so) and during vacations I found a way to volunteer for an internship gophering papers around the Pentagon. In certain ways I must have been insufferable. That is why, I suspect, a recently retired Navy SEAL Commander, who worked a few cubicles down from mine as a civilian, stopped by one morning. He stood well over six feet tall, and had fought in Beirut, Panama, and Desert Storm, before being forced into early retirement due to a parachuting injury. In sum, he was an intimidating fellow who embodied much of what I hoped to become. Until that morning he had never taken much of an interest in me, and with just the two of us in the office, he had my full attention. “Can I give you some advice?” he said. “I’ve served with a lot of Marines. Some good, some bad. Do you know the difference between the good ones and the bad ones?” Sitting straight in my perfectly creased uniform, I didn’t have a clue. And so he told me: “The good ones never forgot who they were before they became Marines. Don’t you forget, either.”

    I tried never to forget that advice, when I went to war but also when I came home. That is what I mean when I say that I’m turning in my card. I am not going to stop being a veteran, any more than someone from a specific racial, ethnic, or gender group will ever stop having the experiences that come with being a part of that group. But I will not allow this single element of my experience, this one personal attribute out of many, to eradicate the core of who I am. I will not play my veteran card in interactions with others, even if it’s a very good card to play. And I will not allow myself to forget who I was before I became a Marine, or any of the identities — there were many — into which I was born. If I ever do forget the true core of who I am, however elusive it sometimes is, the self or soul that lies beneath all ascriptions of identity, then please use another word to describe me: call me lost.

    Writing and Slaughter

    I

    The Thousand Year Reich had come to an end after twelve bloody years.

    The “belated nation,” which had drawn the short straw when it came to dividing up the overseas colonies of the world and so colonized inwards with the expulsion and destruction of the Jews (this was the writer Heiner Müller’s thesis), had become the scourge of the world, a disgrace among nations. Germany’s dream of expanding eastwards, with military villages and farming communities all the way to the Urals and protectorates everywhere, the evil utopia of world domination envisaged by its Führer, was over. It happened so fast that all anyone could do was rub their eyes. Had these Germans lost their minds?

    After the war, the previously hyperactive nation with its vision of world domination turned inwards. Now the Volk ohne Raum, the nation deprived of its longed-for Lebensraum, was to focus instead on the last unspoiled bit of Heimat, or homeland, left to it — the Feldweg, “the field path” or “country path” extolled by Martin Heidegger, the philosopher of the hour, in a short but widely read essay in 1953. Martin Heidegger, forerunner of the eco-movement, secret hero of the Greens? Something abiding had to be found, something tried and tested, unspoiled, something that by its very nature spoke of Heimat and made defeat bearable as a kind of renunciation. Because, as the philosopher declared, “The Renunciation does not take. The Renunciation gives. It gives the inexhaustible power of the Simple. The message makes us feel at home in a long Origin.”

    This was the new program, an ecological manifesto avant la lettre, in a mixture of romanticism and the objectivity of the moment, as only a German could write it. There was the lark on a summer’s morning, the oak tree on the wayside and the roughly hewn bench, on which “occasionally there lay […] some writing or other of the great thinkers, which a young awkwardness attempted to decipher.” And right there was the vision of a world into which those weary of civilization could withdraw from the catastrophe of modernity. Only they, these few, the abiding, will someday be able, through “the gentle might of the field path…, to outlast the gigantic power of atomic energy, which human calculation has artifacted for itself and made into a fetter of its own doing.”

    The “jargon of authenticity” was what another German thinker, Theodor Adorno, called it, in his critique of ideology based on what he described as linguistic atavisms at odds with modern life — a reckoning with Heidegger’s philosophical style. His book by that name was perhaps not the last word on the matter, but it was formative in its polemic. The representatives of the Frankfurt School shelved Heidegger as a problem of linguistic aberration, of anti-modern prose. But the thinking behind Heidegger’s work was not to be got rid of quite so easily; the seminal eco-sound of his philosophy could not be switched off as one switches off the radio. Zeitgeist or not: the debate read like a commentary on the economic miracle, the era of motorway construction and the booming automobile industry with its Volkswagen, Mercedes, and BMWs, all made in Germany. Who was on whose side? Who cared about the objections of the newly emerging discipline of sociology, about Adorno’s critique of language, set against the logic of origin in the words of the ontologist Heidegger, who had become suspect as a teacher because he had praised the Führer in his Rector’s speech at the University of Freiburg in 1933, believing in the Platonic discourse of “tyrannous education”? 

    The fixation on overcoming the density of populated spaces had, once and for all, been driven out of the little men of the master race, the descendants of poets and thinkers, who wanted to rule over the peoples of the world. Their territory — which on a map of Europe of 1939, after Germany had gained Saarland and annexed Sudetenland and subsequently all of Austria, stretched as far as East Prussia on the border with Lithuania — had now shrunk to the potent core that the victorious powers divided among themselves. 

    A yeast dough that could no longer rise, that from now on had to be content with what was left of the burned cake. An area between the North Sea in the West and the Oder River in the East: so little room for such a mighty people. Eighty million people who had to learn their lesson. Several different generations who had to grasp in school, in geography lessons, that this was it, once and for all. No more urge for expansion; all outward movement in terms of territory was at an end.

    Hitler’s last feint had not been credible even for a second: his attempt to present himself as a protector against the oncoming inundation of Bolshevism, the core of his morale-boosting speeches — his view that Mein Kampf was only the expression of a “final argument about the reorganization of Europe.” The survivors of his adventure were left with only one option: to turn back into a smaller space, to turn within, to find diligence and modesty. In this, the Germans were well-practiced: fantasists, born dreamers, for whom, once they had repressed their national feelings of guilt, only the worship of silence remained. The silence after the final bell tolls. “It reaches out even to those who were sacrificed before their time through two world wars,” said Heidegger. What this really meant was silence about one’s own memory of the dead that tried to pass over the millions of deaths of others in silence, too.

    A new nation was thus born, a divided one, lifted from the cracked baptismal font by the victorious powers — with the Marshall Plan affiliation with the West on the one side, and on the other side integration into the Eastern bloc under the control of the Soviet Union — forty years of decreed division. Yet after this Cold War limbo between the former Allies who had been involved in a coalition against Hitler, the Germans came together again in a Europe of dissolved power blocs, and they now needed to discover how Germany would cope in an epoch of intensified globalization, given its precarious position in the middle of the continent.

    The above is how a speaker at the U.N. General Assembly in New York, a German practiced in self-humiliation, might have begun a speech. It could have been me, perhaps. But what am I trying to say with all this? Quite simply, that history, in capital letters, intruded even into my little life one day. I was still at school, on the outskirts of Dresden, when it first dawned on me what I had gotten myself into, without any of my own doing. Beyond the box hedge in front of my childhood home lay an empire that stretched eastwards to the Pacific, to Vladivostok and to Inner Mongolia. Or in the words of Hölderlin, “I am pulled as streams are by the ending of something that stretches away like Asia.” Hölderlin, another outsider, a poet and misunderstood artist who baffled authority and who served first the nation (and later National Socialism) as a heaven-sent source of quotations because he changed the course of German poetry forever.

    Two scenes from my childhood are still vivid. In the first — a winter’s morning in the late 1970s — I pull the garden gate behind me and lift my rucksack onto my shoulder. I am on my way to school when a Russian military convoy made up of Ural-375 troop carriers races past me on Karl-Liebknecht-Strasse, and my gaze is caught by the huge, hulking wheels of the vehicles, the felt coats and steel helmets of the soldiers, and I stand stock still in amazement and forget time.

    What did I know then of Isaac Babel’s Red Cavalry, or Mikhail Sholokhov’s And Quiet Flows the Don, later to be compulsory school reading, or Vasily Grossman’s Life and Fate, a book I read only decades later, long after the Soviet Union disappeared? But the image of those Russian troops imprinted itself on me, and from that moment on I began to understand German history from the point of view of the Russians, who were cavorting here in my hometown, right on the doorstep. In the beginning it was the cartridge cases that we boys found in the surrounding woods, military badges that we bartered with the Russian soldiers at the barracks gates, not two hundred meters from my parents’ house, in exchange for photos of naked women (from the only East German magazine that printed such pictures). It was not until much later that I read Joseph Brodsky, that brilliant Soviet renegade and the first poet of the transition from East to West, whose verses immediately struck home: “In the beginning, there was canned corned beef. More accurately, in the beginning, there was a war. World War II; the siege of my hometown, Leningrad; the great hunger, which claimed more lives than all the bombs, shells, and bullets together. And toward the end of the siege, there was canned corned beef from America.”

    The second scene is a bit more complicated. We are in the middle of a history lesson in the eighth grade, and the teacher, a strict member of the Communist Party, gives me the task of preparing a lecture. Its theme is the Nuremberg Trials. I head to the Saxon State Library, then housed in the city’s largest barracks complex, also containing troop accommodations and weapon depots belonging to the Soviet Army and the NVA, or the GDR’s National People’s Army. The juxtaposition of scholarship and the military life of the “fraternal armies,” as I experienced it on the way to the silence of the reading room, got me thinking. I procured my first library card and set myself up as a permanent fixture among the books, which soon earned me the respect of the staff and years later led them to allow me to view the archive of Victor Klemperer. At that time hardly anyone outside Dresden knew the author of one of the most important diaries of the Nazi era; he was known only as the author of LTI, a groundbreaking study of the infiltration of the German language by the ideas of Nazism. His extraordinary diary was discovered in Germany, via the success of the American edition in the West, after the fall of the Berlin Wall. Only then was it recognized across the world for what it is: a rare chronicle of everyday life of the Nazi era in a city (Dresden), from the perspective of an academic dismissed from his post, a persecuted Jew, who experienced the increasing loss of rights first-hand, to the point where he himself was threatened with deportation. This coincided with the destruction of his hometown, Dresden’s downfall in a firestorm, which saved his life.

    The fruits of my schoolboy work gathering facts on the Nuremberg trials were limited. Still, it had the function of opening the poison cabinets to otherwise censored publications from the West. I was able to study Eugen Kogon, Telford Taylor, Joe Heydecker’s photographs, and the twelve-volume series of writings of the International Military Tribunal, all the monographs available then on the crimes committed in the concentration camps, the ghettos in the East, and behind the fronts. Raul Hilberg’s key work, The Destruction of the European Jews, which appeared in America as early as 1961, was not published in Germany until twenty years later. Claude Lanzmann’s documentary film Shoah was still far beyond all horizons of such “reappraisal.” It was through him that I first learned the Hebrew word for the deflated term “Holocaust.”

    The result, then, was meager compared to the wealth of specialized material available today. But I had been bitten by the bug of historical research, and my work soon began to display worryingly manic features. My lecture to the class did not stop at that one assignment: it kept on expanding, until it took up three whole history lessons — to the great satisfaction of the teacher, who believed my soul had been saved for the anti-fascist cause. In the end I was pretty clued in about the arms industry and the murderous judiciary in the Third Reich, about Hitler’s euthanasia program and the special task forces, the system of forced labor, the extermination camps and the medical experiments performed on inmates.

    I can well imagine what a strange impression a fifteen-year-old must have made lecturing the class on racial politics, Jewish persecution, SS massacres, gas chambers, and the role of German companies in the business of extermination. That was when I first learned with absolute certainty that the firm I.G. Farben supplied Zyklon B, that the Erfurt company Topf & Sons built the crematoria, and that the Allianz AG insurance company insured the barracks of Auschwitz. And even today I can still recite, immediately on waking, the new categories established during the trials, against all precedent of jurisprudence: crimes against humanity; war crimes; and (most astonishingly) crimes against peace. Though often disregarded, the Nuremberg ruling outlawing wars of aggression is valid still today.

    It was then, almost as an aside, that I first heard about the Universal Declaration of Human Rights. But the dour teacher allowed the mention of it to pass without comment. For the Declaration touched on a taboo in the self-image of the other German state, because invoking universal human rights was the last resort of the opposition. The argument as to whether the GDR was an illegitimate state still divides people today. The Nuremberg trials, I explained to my classmates, were not a case of so-called “victors’ justice,” and that, too, remained uncontradicted. Or had it simply been lost in the mass of material? Fired up by my awakening sense of justice, I was able to quote to my class the American chief prosecutor, Justice Robert H. Jackson, without challenge: “That four great nations, flushed with victory and stung with injury, stay the hand of vengeance and voluntarily submit their captive enemies to the judgment of the law is one of the most significant tributes that Power has ever paid to Reason.”

    This was a completely new principle, and it sounded so auspiciously redolent of the values of a liberal order that even today, remembering the scene, I am still astonished by the silence in the classroom. It was as if the nations that held court over the mass-murderers and their superiors had wanted for the first time to test out what Kafka in his diaries had called “jumping out of death row.” And for me, it was as if I had won a small victory when the teacher, changing the habits of a lifetime, graded the lecture with an A+, because she could do nothing else. I still recall her fixing her gaze on me like a strange toad, and me staring straight back. 

    Eleven years later, my first volume of poetry appeared, a year before the Soviet empire began to crumble with the fall of the Berlin Wall. Grauzone morgens, or Mornings in the Grayzone, was its title, a book in which scenes like the ones just described added to the overall panorama, but which was more of a search image — you would not find any of the crucial terms in it. Viewed from today, one can see that it followed the scenography of a Tarkovsky film. I happened to have seen his masterpiece Stalker in a Dresden cinema in the early 1980s, but only later found the key to his enigmatic film parable in Eliot’s The Waste Land: “‘Who is the third who walks always beside you?’ ‘There is always another one walking beside you.’” Only now am I fully aware of the overlapping motifs. Someone had been walking beside me for a long time. An angel? A superior?

    One of my poems from that time reminds me of those days, and of my trips to the Saxon Library. It is called “Accept It.”

    So many days with nothing
    occurring, nothing but those
    brief winter manoeuvres, a few

    mounds of snow in the mornings,
    melted away by evening, and the
    strange moment at the barracks

    was an exotic handbill: this little
    squad of Russian soldiers in
    green felt uniforms, standing in silence

    guarding a bundle of newspapers, and I read
    “коммунист” on top and
    the line came into my mind: “picture

    the wristwatch on Jackson Pollock’s wrist.”

    II

    It is no longer the specter of communism that haunts Europe today. It is the afterimage of authoritarian rule, the dream of right-wing populism among the people, realizable through propaganda and political marketing. All those discredited socialist utopias that vanished with the fall of the Soviet Union have been replaced by backward-looking visions of a strong nation with fortified borders and as self-sufficient an economy as possible. Regressive fantasies stoke the flames of the struggle for majorities. Aggressive discourses of power have long since captured command positions here and there in Europe and America. The theatrical spectacle starring real estate tycoon T, the Twitter King Ubu Roi in the White House, finally came to a close, but it showed where things might be heading in this new millennium.

    “Retrotopia” was what the sociologist Zygmunt Bauman called it in his last, posthumously published, book, his critique of a world of nationalist politics that leads of necessity to a restricted sense of nationhood, to trade wars and the rearmament of the nuclear powers — in sum, to the increasing violence to be seen in all spheres, not least in language. Bauman knew what he was talking about: as a Jew he had experienced political violence early on, and he too became an émigré in his thinking. It is hardly an accident that, after much back and forth between East and West, Bauman, who had been born in Poznan in Poland, finally fetched up in England and died there having found a haven for the time being.

    The fixation on the past identified in Bauman’s book rests on the need for security for the many uprooted people in Western societies. For a large number of these people, freedom is simply overwhelming: the freedom of the individual as well as that of capital, which dissolves all ties and thus threatens the very basis of their existence. The discomfort with culture goes hand in hand with a transfiguration of the past. The “little people,” Bauman explains, want a return to the tribal fire, as if they had not failed more than once in that undertaking. In times of globalization and the migration that comes with it, phenomena that are experienced as the destabilization and dissolution of local and family life, and in times of growing economic inequality and terrorism permeating many of the precincts of everyday life, the fevered visions arise from what Bauman calls “the lost/stolen/abandoned, but undead past, instead of being tied to the not-yet-born and so inexistent future.”

    But what kind of past is that?
    Past that does not pass by
    Past that has been managed
    Past that has not been managed.
    Memory, work of mourning,
    So many pasts that each one of us
    caught up in the whirl remembers differently,
    But it is supposed to be just one
    the good one, easily told,
    ideally the one where everything was fine
    Without the masses of the dead
    Without those who went under the wheels,
    the murdered, those who died miserably
    for whom there was no past
    and often not even a grave
    ‘Doesn’t a breath of the air that pervaded earlier days caress us as well?’

    That was the idea of history: in any given situation it is the last stage of time handed down and passed on, or the most recent episode of the Great Narrative in the form of a television series. It is the reassuring intermediate state in which the lives of the living rest on the works of the dead, and everything that has been passed down is at the disposal of the most recent.

    An extreme form of Fascism: the destruction of the past in the course of the struggle for the survival of the fittest, the apotheosis of vitalism, pure technocracy riding roughshod over the interests of human beings. The commemoration only of one’s own dead, to whom monuments and mausoleums are erected. (The march against the Weimar Republic began at the Feldherrnhalle in Munich, the site of Hitler’s putsch in 1923 and during the Hitler years a funerary monument to his dead Nazi comrades.) When the hour of victory struck, the “leaders” of the new Europe came together before the new temples for the commemorative tryst, hands covering their privates. Fascism: “One reason it has a chance is that, in the name of progress, its opponents treat it as a historical norm,” observed Walter Benjamin. “The current amazement that the things we are experiencing are still possible in the twentieth century is not a philosophical amazement.”

    We should not, we are told, be tempted into drawing historical comparisons to understand what is happening today. We need not look to old models of explanation, nor does the reference to the emergence of National Socialism help us. But it may be pertinent to reflect on a few characteristics of classical fascism so as to rule out the possibility that we are dealing with specters of its return, or derivatives of them — new chemical compounds that could be produced out of old elements. 

    It is certainly not wrong to see Fascism as a politics of dynamism. (Its synonym, after all, was “the Movement,” as in “Munich, the capital of the movement.”) In addition to its original history of violence, it was a manipulation of the masses powered by the most modern technological means, especially communications technology — today we would call it a marketing strategy. Opponents were deported to the concentration camps; potential comrades were brainwashed by Goebbels’ propaganda; the Jews were excluded as foreign bodies and finally destroyed (this, however, was unique). Class hatred was replaced with racial hatred. 

    Historians deal with content and process; sociologists inquire into the impact on the structure of everyday life; philosophers concern themselves with the ideas involved and place them in the larger perspective of human thinking. There is agreement on the fact that fascism was a revolution — a revolution of the right, the only one that has ever really succeeded in Germany and caught the imagination of the people. In comparing the two totalitarian movements, Fascism and Communism, the German historian Ernst Nolte sought to provoke, and for a long time he remained a lone voice. But no one can escape the shift of perspective. Anyone who interprets this move as a sign of revisionism, as the philosopher Jürgen Habermas did in the 1980s during the famous Historikerstreit, may be right in a humanist sense, but stands in the way of a proper understanding of political dialectics, of the history of modern tyranny. The two dragons of Communism and Fascism faced each other, snorting with rage. For a time, they competed on the world stage in terms of civilization and aesthetics (as was strikingly visible at the Paris World Fair in 1937), until their totalitarian flirtation descended into war.

    But the question as to which dragon was the more successful at courting the masses leads us straight back to the present, and to a decision that faces us. And this is where Umberto Eco comes in, with his list of “Common Features of Eternal Fascism,” or ur-Fascism, delivered in a speech marking the fiftieth anniversary of Europe’s liberation from Nazism and later expanded into a tract called Il fascismo eterno. The semiotician takes the broad view without really resolving the contradictions: on the one hand, Fascism is, for him, part of the cult of tradition, a rejection of modernity (“blood and soil,” the fixation on race, the condemnation of materialism and the “evils of democracy,” the apotheosis of the Führer-state); on the other hand, it relies on the most modern technology and, over time, even adopts an avant-garde approach, such as Futurism, its aesthetic vanguard. On the one hand, he claims, it is a mythical construction; on the other hand, a praxis, based on action for action’s sake and thus materialist through and through. It exploits natural fears about social disparity (based not so much on income as on background and religion) and exacerbates them.

    This much is certain: it is a product of nationalism compounded until it is limitless — which is to say, the exploitation of people based on the simple fact of being born in a country where they grew up and were unable to leave (because poverty or loyalty kept them prisoners). Jews who had lived in Germany for generations, for their part permanent exiles, were to remain outsiders forever. Fascism determines who is an outsider and who belongs to the holy community prepared to give their lives for the nation. Fascism is a construction of belonging, the identity of the Volk, including its own people living abroad, beyond its frontiers (in the Saarland and the Sudetenland, among Romanian Germans and Volga Germans, and so on).

    It is also the ideology of the have-nots, who interpret capitalism as a conspiracy of plutocrats, millionaires (preferably Jews), but one which can sustain itself only through investment and campaign contributions on the part of big business. It postulates the struggle for survival and a cult of the strong (invoking Darwin in the process) and it doesn’t give a damn if the weak go under the wheels (the tank chains) or drown in the sea. It believes in the idea of a “final solution” for all human problems: away with the weak, the sick, the homeless, the feeble, and the mad, and with any intransigent opponents. A constant war is to be fought, but in the end a Golden Age will dawn in which the biologically superior (not the most intelligent) will lead an orderly family life. It dreams of an elite, but its foot soldiers come from the uneducated levels of society; intellectuals, the educated, the cultured are despised (“the lying press”). Fascism needs heroes, and they must be prepared to go the whole way; they train for death, the final battle, and fight it to the last cartridge. Fascism is a male affair. In the will to battle, women become mere accessories. In Fascism the situation of a woman is always precarious: she is there at best to become the bearer of children and a heroic mother (and a concentration camp supervisor), or even a deputy, an official in the right-wing populist camp, an opponent of abortion. Never a feminist, in any case.

    For some time now I, too, have been concerned with the issue of the nostalgic appeal of Fascism. Has the specter really been banished and burned like the poor witches of the Middle Ages? Can you in fact burn specters, as the Nazis burned books in Berlin’s Opernplatz? 

    Is there a myth that has remained alive — preserved under the rubble of the Thousand Year Reich — that may yet resurface? Heiner Müller wrote: “as once ghosts came from the past / now they come from the future as well.” For a long time I considered such worry to be the product of hysteria. The question of the return of the past only seemed relevant inasmuch as it was gaining traction among historians and sociologists: surely the experts were in a position to say what caused the virulence of these specters, and as long as they could assure me that we find ourselves in a new situation, and that a revival of horrors, whether as farce or operetta, could be ruled out historically, there was no danger. But now I am not so sure.

    III

    It may be that the German in me is gripped, every now and then, by a certain disquiet. That is why I stare as if transfixed at the twelve insane years of Nazi rule, and constantly immerse myself in the growing specialist literature on the subject. Just recently I discovered yet another figure in the generation of young careerists in the Third Reich, a certain Franz Alfred Six. He was one of the co-founders of the Security Service (SD) in what became the Reich Main Security Office (RSHA), the ideal henchman to his boss Heinrich Himmler and a resourceful colleague of his almost contemporary and superior Reinhard Heydrich, who was rumored in the inner circles of the SS to be the heir-apparent to Hitler. Six was one of the pioneers of the “scientific” study of journalism, as director of the Königsberg Institute of Journalism. In 1935 his position as a major in the SS moved him ahead of his colleagues and he was appointed head of the press department at the main office of the Security Service in Berlin; his specialty was research into ideological opponents. It was his office that gathered the huge stream of data concerning all those within Germany and abroad who, whether as organizers or writers or publicists, could be defined as political opponents of the Nazi regime. 

    Six was the man earmarked for the position of Commander of Security Police in London after the occupation of Britain, according to the wishes of his boss Heydrich. The order came personally from Field Marshal Göring. In the event of a successful invasion (“Operation Sea Lion”), not only did all the operational plans of the various services lie ready in the cupboard, but also the lists of all potential opponents to be found on the island. Six’s counterpart at the SS, Brigadier Walter Schellenberg, produced the manual for the planned German invasion, which included the infamous Sonderfahndungsliste G.B. (Special Wanted List GB) that fell into the hands of the Allies after the end of the war. It came into being mainly owing to the diligence of the pedantic researcher Six and his colleagues. Among the approximately 2,700 dangerous subjects to be arrested after the invasion (Winston Churchill was enemy No. 1) were the names Alfred Einstein and Sigmund Freud, but also artists and writers such as John Heartfield, Aldous Huxley, H. G. Wells, and Virginia Woolf.

    Six was the typical armchair perpetrator, an inconspicuous civil servant, presumably often in plainclothes, with the international daily newspapers in his briefcase. He would hardly have stood out at a meeting of stamp collectors, with his round nickel-framed glasses. The fact that he had quickly risen to a position that earned him a villa in Berlin-Dahlem (on Thielallee), a chauffeured service car, and a private office in the city center with three secretaries (Wilhelmstraße, Prinz-Albrecht-Palais) is a testament to the enormous opportunities for advancement in the dynamic Nazi state, whose inner workings the ordinary German people could hardly envisage. Six was the author of essays with titles such as “The Fate of the European Community” and “Russia as Part of Europe,” but his core business was the creation of comprehensive catalogues of enemies. The walls of his Berlin office were papered with detailed organizational charts in which the world was divided into political groups who presented a threat. He shared with Heydrich the fascination with a certain secret-agent aura, copied wholesale from their British opponents.

    His ambitious project was to coordinate domestic propaganda with so-called foreign studies, geopolitics, and research into the various categories of opposition (Marxists, Socialists, Jews, Freemasons, Jesuits, and members of religious sects), on the basis of the “scientific National Socialism” that he espoused, which was analogous to the ideology of the revolutionary Bolsheviks. The concept of a branch of sociologically and historically defined research into the enemy came from Heydrich, the “man with the iron heart,” as Hitler dubbed his chief functionary at the pompous funeral ceremony after his assassination in 1942, when he was Reich-Protector of Bohemia and Moravia. Like his boss Himmler, he was convinced that he was exposed to a huge and disparate army of opponents, especially after the occupation of half of Europe in a territory “with 200 million people of foreign origin and race,” as Himmler said in his infamous speech in Poznan in 1943. “To win their hearts and minds will only be possible when the great struggle between the two world empires, Germany and England, is decided. Then we will be able to affiliate these thirty million true Teutons to our own nation.”

    Thankfully, it never came to an invasion of Britain. Instead England was to be worn down by air raids until it would agree to peace terms, so that the Nazis could finally turn to Russia, which posed the most serious competition for dominance in Europe. And after the attack on the Soviet Union, we find Herr Six at work once again. With his own motorized SS command, he drives just behind the advancing front to be the first of the security police in Moscow, and to secure official archives and files from enemy authorities.

    Historians speak of polycracy in the Nazi state — the competing system of different departments that appeared overnight. Later, depending on the general political climate and the whims of the dictator, they were adapted to the real course of the war. For diplomats, academics, journalists, and born bureaucrats of all kinds, this may have been a nightmare, but in the end they all set about competing to fulfill the Führer’s requirements. They were trained to “work towards the Führer,” a formula found by Ian Kershaw in the files of a Prussian secretary of state.

    This was also what Six and his colleagues from the relevant fields set about doing. In one of the units under his command, the plans for a coordinated Jewish policy were developed, in close consultation with the Gestapo. One of the most prominent of his protégés was Adolf Eichmann, whom Six immediately sent to Vienna after the occupation of Austria in 1938, where he developed the model of a department dedicated to the expulsion and economic plundering of the Jews: the Central Office for Jewish Emigration. At that time various ideas were still in circulation about the resettlement of the Jews: for example, an agreement with Zionist associations, which imagined a separate Jewish state in Palestine, under British mandate at the time. This led to the adventurous “Madagascar Plan,” which soon proved unworkable. Adolf Eichmann, later an expert in deportation, and SS-Sturmbannführer Herbert Hagen, head of the department called “II/112: Jews” in the Security Department’s main office, were sent on a business trip to Palestine, on the orders of Six, in order to explore the possibilities of an orderly deportation. (The British authorities allowed them to stay in Haifa for only a day, September 26, 1937, and then expelled them from the country, so they set about seeing what was possible in terms of negotiations on the Arab side in Cairo.)

    As little came of this idea as the plans discussed a year later, at the International Conference on Refugees in Evian, to distribute the Jews as asylum seekers on a quota basis to the participating countries. For now, the borders were closed. The ship was full, as the expression goes — a phrase that appears on cue with every wave of refugees everywhere. After Hitler unleashed the war, the course was set for the extermination of European Jews. Soon the trains began to travel to Chelmno, Belzec, Sobibor, Treblinka, and Auschwitz. Six million Jewish people were fed into the killing machine by those driving the policy of extermination.

    The courts later had only to clarify who was directly or indirectly involved in the great crime. Someone such as Six represents the elite of those planners, those who escaped undetected — he represents the cool functionary of the hour. According to Adorno, this coldness was “the basic principle of bourgeois subjectivity, without which Auschwitz would not have been possible.” Or, in the treacherous language of the murderers, “You don’t fight rats with a revolver, but with poison and gas.” (That sentence comes from a memorandum of the Munich SD headquarters to Reinhard Heydrich.)

    In the last years of the war, Six transferred to the Foreign Office, and drove right across Europe, where, as eyewitnesses attest, he spent his time barking at employees of the diplomatic missions and at cultural representatives. For his involvement in the crimes of task forces in Smolensk in the autumn of 1941, he was convicted in one of the smaller trials in Nuremberg, but he walked free after four years in prison in Landsberg, uncorrected and undeterred. He never showed up on the list of the principal authors of Jewish extermination and he was overlooked by historians for many years. It was in reference to his type of person that scholars came to speak of “functional anti-Semitism,” as opposed to the virulent ideological sort — but the “physical elimination of Eastern Jewry” was nonetheless important enough to Six for him to give a lecture on the subject, though whether he ever got his hands dirty remains unclear. His name stands for bureaucratic preparation; he was one of many “intellectuals” who prepared the ground for mass annihilation. He is an abbreviation, a kind of shorthand for genocide, a fleeting file note and a large amount of paper which in the end led to the most extreme consequences. 

    After the war, without missing a beat, Six moved into the automobile industry, as a self-employed management consultant. The specialist for lethal propaganda became one of the leading marketing experts of Porsche-Diesel Motorenbau GmbH. Untroubled by justice, he lived through the building of the Federal Republic and, in receipt of a handsome salary, he offered his reflections on “The Nature of Marketing.” Fascist propaganda and its methods had their uses in civilian life. An observation from Paul Celan seems apposite: “The germ-free is the murderous; fascism today lies in formal design.”

    IV

    What refuses to give me any peace is the way a people can make themselves totally available to such purposes. Who would you have been in a dictatorship? I do not have to ask myself this question, because I found myself in the middle of a dictatorship and survived it. Instead I must ask myself the question, who would you have been in the Nazi era, and what would you have done against Hitler? For I, too, would have been inundated by the ubiquitous images and words of the Führer. At the time there was no outside vantage point, so it makes no sense to develop a moral standpoint out of hindsight. The principle of contemporaneity, to which we are bound, excludes us from other historical experiences. All I can say is that as a young child I slept through all knowledge of the Third Reich, and someone might have called to me: keep on dreaming, friend! So the only question can be: can anything be learned from this particular history, if anything can ever be learned from the course of history?

    This brings me to the present. I would like to come back to the problem of writing, and the question of why one writes in the first place. Why live without writing? I know that most people do not ask that question quite so urgently, but it was a question for me from early on. You wander silently within yourself for a long time before you get to the point of scribbling a few lines on a piece of paper, at first only for yourself and naturally without the least understanding of history.

    Let us hear what the philosopher Gilles Deleuze has to say about it. “Writing is a question of becoming, always incomplete, always in the midst of being formed, and goes beyond the matter of any livable or lived experience.” That is the starting point: we do not know what is driving us, and we can only pull together a few phrases to express what happens to us — to form a provisional response. Writing will sum us up, it shortens what we call life, quite inevitably. The philosopher, Deleuze, standing before the Absolute, then makes leaps and bounds, he is immediately in need of transforming himself — into a woman, an animal, a plant, a molecule — and he is right. “The shame of being a man — is there any better reason to write?”

    No, there is no better reason, but I will skip over his other thoughts on the subject and stop at a statement that immediately seized me and would not let go of me when I read his Essays Critical and Clinical for the first time. “As Moritz said, one writes for the dying calves.” He was referring to Karl Philipp Moritz, an eighteenth-century German writer, a contemporary of Goethe, who was the author of Anton Reiser, a four-volume life story told from below, from the perspective of a child born into poverty and a strict religious milieu. It was the first psychological novel in the German language.

    I was electrified by the quotation and I followed the trail — and came out at myself. My first story, the first piece of prose on which I worked seriously, started at a street crossing in Dresden. I was sixteen at the time. After visiting my grandparents, I was waiting for the tram that was supposed to take me back to the outskirts of the city, to the garden city of Hellerau where we had been living for a few years. I was standing there, staring out into the rain, when a cattle truck roared past me. I will never forget the sight of the animals, the dark eyes of the cows and the calves, the bodies of those destined for death, clearly visible behind the vents of the van. This is where my text began.

    I wrote a soliloquy of a cow being taken to the slaughterhouse. It was written in a primitive stream-of-consciousness style, an interior monologue. I did not know James Joyce or Arthur Schnitzler at the time. I knew nothing of the fact that literature was a technique that could be learned and developed. But with that momentary glance, the glint of a pupil, I had recognized myself in that animal, and the prose began to flow. The piece almost wrote itself. It ended after all the stress and terror of being unloaded on the ramp and driven through a tunnel towards the last station of its suffering, with the moment in which the beast felt the bolt gun pressed against its forehead. I knew the procedure because my grandfather, who spent his life working as a master butcher in the Dresden slaughterhouse, once told me about it. 

    As the ill-treated creature blacked out, the text suddenly broke off. Quite clearly, an unsatisfactory ending for a story. And I felt my failure acutely and buried the manuscript under other half-baked drafts, overcome with a feeling of shame. I was upset because nothing seemed to work. That is the painful secret of writing — I do not know what to expect, any more than the subject of an experiment on himself knows where the exploration will lead. “To write is not to recount one’s memories and travels, one’s loves and griefs, one’s dreams and fantasies”: that I had understood very quickly. “Literature begins only when a third person is born in us that strips us of the power to say I,” as Deleuze put it. But even without such an “I,” I was stuck for a long time. “Health as literature, as writing, consists in inventing a people who are missing”: Deleuze’s remark sounded pompous, but one day its meaning dawned on me — and so, more or less by chance, I fell into German literature as one of many who dream of a people that does not yet exist.

    “‘You write for the dying calves,’ says Moritz.” I read the German novel which included that sentence early on, but only came across the specific reference thanks to a French philosopher. It is there, in Anton Reiser, but not in quite the same words. This can happen if one simply reads. It is not about the reading itself; it is about stopping at a certain point. And that was the point I had simply passed over. When I returned to the passage, I found more. Moritz continues: “From this time forward, when he saw an animal slaughtered, he identified himself with it in thought, and as he so often had the opportunity of seeing it at the slaughterer’s, for a long time his thought was centered on this — to arrive at the distinction between himself and a slaughtered animal like that. He often stood for an hour, looking at a calf’s head, eyes, ears, mouth, and nose — and, pressing as close to it as possible, as he did with a human stranger, often with the foolish fancy that it might be possible for him to think himself gradually into the nature of the animal. His own concern was to know the difference between himself and the animal; and sometimes he forgot himself so completely as he gazed at it persistently that for a moment he really believed he had come to feel the nature of the creature’s existence. From childhood on, his thoughts were busy with the question — how would it be if I were a dog or some other animal living among men?” 

    From the philosophers, above all Descartes, who saw animals as machines, bundles of reflexes, creatures without reason, one can learn what a discourse is. Literature has always had its own discourses and themes. In this it has always been sovereign and did not need to wait for the social sciences. If asked about my poetics, I would say today that we are working towards a photosynthesis of words and images. Words work at transmission; images reach us from a tiny future that quickly becomes the past. I am referring to the images in all media that overwhelm us every day as a shock experience of the real, right down to our dreams. Every day history drives us out of ourselves and confuses our imagination: history — that brutal translation of time into collective experience. The poet is simply one of many; his problem is how to lay aside the pretensions of poetry. In the end he knows only what anyone can know: that there are so many realities, that they exist independently of us and simultaneously, and the same is true of identities. Even if they are dreamers, the poets, the only thing they do not doubt is that the words and the deeds of our predecessors will catch up with us. In this respect they are especially sensitive, specialists constantly in radio contact with the dead.

    There is something the sociologists call transgenerational transmission. No one can jump out of their historical time, and no one escapes being formed by history. Once it may have been possible, perhaps, in the unimaginable times of myth and fairy tale, but today it is impossible. In the same way, the much-vaunted attempt to draw a line under the legacy of German fascism is impossible, too. There is no question of a flight from time, or of a flight inward — even there history will catch up with everyone. Instead, history as a history of violence passes through time and imprints itself with all its dates on our bodies. There is something beyond literature that calls writing into question. And there is literature that criss-crosses history in fictions, literature as what Walter Benjamin called a “secret agreement between the past generations and the present one. Doesn’t a breath of the air that pervaded earlier days caress us as well? In the voices we hear, is there not an echo of now silent ones?” As if responding to his questions, Ingeborg Bachmann declared: “History constantly teaches but it finds no disciples.”

    She had experience with this, no doubt: she was a woman. But we cannot know, fortunately we cannot know, whether this is the last word on the matter.

    Notes on Assimilation

    There is a passage in Democracy in America in which Tocqueville observes that in a mass of land spanning the width of the continent and extending from “the edge of the tropics” in the south to the “regions of ice” in the north, “the men scattered over this area do not constitute, as in Europe, shoots of the same stock.” On the contrary, “they reveal, from the first viewing, three naturally distinct, I might almost say hostile, races.” It was not simply the seeming incompatibility of the customs, origins, habits, memories, and laws — to say nothing of class positions — that separated the whites, blacks and Native Americans from one another: “even their external features had raised an almost insurmountable barrier between them.”

    This blunt observation prompts one of the book’s more striking asides, in which Tocqueville recounts pausing during his travels at the log cabin of a pioneer in Alabama, on the edge of Creek territory. Beside a spring he encounters a microcosm of American society in the nineteenth century: an Indian woman holding the hand of one whom he assumes to be the pioneer’s young daughter, followed by a black woman. The visual descriptions contain within them the two-pronged tragedy of the American experiment: “The Indian woman’s dress had a sort of wild luxury” that preserved and advertised what we would now call her totalizing alterity; “the Negro woman was dressed in tattered European clothes,” a simulacrum of the people who would never consent to accept her. Both women lavish attention on the five- or six-year-old white “Creole” girl, who already “displayed, in the slightest of her movements, a sense of superiority.” The Indian woman, in Tocqueville’s words, remained “free,” “proud,” and “almost fierce”; the black woman was “equally divided between an almost motherly tenderness and a slavish fear.” In both cases, the Native and the African, Tocqueville sees a cursed choice — the fundamental inability to overcome “the thousand different signs of white supremacy” firmly in place in the new world. Indeed, “nature’s efforts” to draw the oppressed and the oppressors close here only “made even more striking the wide gap between them.”

    There can be no serious discussion of American reality without the sober — and I would argue, dispassionate — acknowledgment of the historical fact of European dominance, which nearly obliterated the seemingly “unassimilable” Native and for centuries diminished the African to a state of wretchedness and isolation verging on inhumanity. Whereas the European — however lowly or removed from Anglo-Saxon Protestant norms — came by choice to this new land, retaining her ties to the old countries that shaped her, the Native and the African were left with nowhere else to go, not even in the realm of imagination or conjecture. For both, the link between emancipation and assimilation is inexorable. There might be many ways of belonging, but there was no possibility of not belonging in any way. Oppressed or free, they were here. Even their exclusion by white society did not release them from finding a way to live in and with America. Their misery was an American misery, as their happiness would be, if ever it were achieved. The question of how to make a genuine home has always been a question of how to blend into a society that was based on your subjection even if — and in fact, precisely because — you coincided with or predated its foundation. Yet there is nowhere else for you to be.

    The psychological reality of this dynamic presented itself fully formed in Tocqueville’s time and has remained alarmingly present through our own. Waves of “non-white” immigrants — from southern and eastern Europe, from Latin America, from Asia, and even sometimes from the Caribbean and the African continent, too — were able successfully to integrate into the national mainstream to the precise degree that they could distance themselves from the enslaved and otherwise unfree and their descendants already living here, whose social condition, often but not always, manifested itself in those conspicuous physical characteristics set in opposition to whiteness. One thing that united a multitude of disparate groups over the years was a diligence about separating themselves from black Americans. Without being glib about the depredations that other ethnicities endured, the question of assimilation in America — of the continuous liberation and integration of a plurality of distinct peoples into a reasonably harmonious mongrel whole — has always been a question that turns on the status and fate of the people deemed “black,” the hardest case, the ultimate case, who have always been living here and on whose exclusion various other forms of belonging have been based. This is an old tension that remains highly pertinent today. (Consider the extent to which, in just the past year, we have witnessed extremely contentious discussions of the degree to which Latinos and Asians can even be welcomed under the umbrella term “POC.”)

    The challenge of finding a place in America was both eased and complicated by those millions of immigrants, whose presence necessarily rendered the question less black and white. If America did not begin as a nation of immigrants but rather as “a white Anglo-Protestant settler society,” as Samuel P. Huntington maintained (though the Puritans were also immigrants), it certainly became one, a multiethnic society of many origins. By the twentieth century the idea of assimilation was championed widely, if not by the nativists and the bigots then certainly by the striving members of various new and marginal groups themselves. These men and women weighed their own prospects for belonging and flourishing in a country that was being transformed by transplants, from a Europe ravaged by famine and war and persecution, and from the former Confederacy as well, whose recently emancipated population found itself searching north and west for the liberty and the prosperity that was denied them in the South.

    The founding text of the assimilationist ideal was The Melting Pot, a mawkish play by a British Jewish writer named Israel Zangwill, which appeared in 1908 and popularized the phrase that is its title (which, not for nothing, would be considered a microaggression today). It told the story of a young Moldovan man who comes to America in the wake of a Russian pogrom that decimated his family. Here he marries the daughter of his family’s oppressor (literally) and composes a symphony that is an ode to a future in which distinctive ethnicities macerate into something bold and new. President Theodore Roosevelt attended the premiere of the play in Washington, D.C.

    The premise of Zangwill’s play was in its titular metaphor: ethnic and religious differences needed to be overcome and annulled, so that a homogeneous social substance remained. Difference was an obstacle to social peace, and the obstacle could be removed by assimilation, which denoted a large or total erasure of prior identities. Assimilation was a particularly ruthless form of integration, because it implied an embarrassment about origins; assimilation was re-invention. “After the prohibition of large-scale immigration in 1924,” Huntington noted in 2004 in Who Are We?, his controversial book on immigration and the American character, “attitudes toward America’s immigrant heritage began to change.” By World War II, “ethnicity virtually disappeared as a defining component of national identity.” The preponderance of assimilation literature — books with titles such as How the Irish Became White, How Jews Became White Folks, and Whiteness of a Different Color: European Immigrants and the Alchemy of Race — presented an optimistic vision of the rise of European immigrants, who had been considered in the nineteenth century not just ethnically but racially distinct from “whites.” This literature ignored Asians and did not know what to make of “Hindus” or Arabs. It was mostly silent about the plight of those deemed black, whose existential predicament was dealt with most powerfully in the literature of “passing,” such as Nella Larsen’s fine novella of the same name and James Weldon Johnson’s subversive faux-memoir The Autobiography of an Ex-Colored Man. Yet, as those texts make clear, passing made acceptance a reward for deception. The only way in was to lie, to compromise one’s dignity. This was a curse.

    Full assimilation for American blacks, until quite recently, was predicated on the plausibility of self-negation. Immigrants, again, were also expected to conceal or to distort their differentiating heritages, but they were rewarded for this more quickly and easily. Anti-black racism has a special force in the panoply of American prejudices. The civil rights achievements of the 1960s were victories of the legal code, but there is more to society than law: informal and much more rigid rules of habit and custom continued to frustrate full integration for masses of black people (and to varying degrees people of Native American, Latino, and Asian descent), except for upwardly mobile and fortunate individuals who have somehow been able to traverse these inherently porous and frequently contradictory barriers. The ascent of those individuals was an important fact about what was possible in America, but it proved less than its celebrants thought it did. These exceptions were not “Uncle Toms,” but they were also certainly not the harbingers of racism’s end. With the election of Barack Obama to the presidency in 2008, it did seem, however fleetingly, as though the country had finally reconciled itself to a genuinely pluralistic future in which, at last, blacks had been fully incorporated into the multi-hued national family. But the swift backlash and visceral dissatisfaction — first from the right and then from the left — made it clear that the story of assimilation is very much still being written.

    What is race? Or, perhaps more to the point, what is racial difference? These are related questions whose answers cannot be simply taken for granted, and must continuously be spelled out in plain, jargonless language. It is obvious that human beings differ physically from each other, in both superficial and significant ways. Groups of related individuals do tend to broadly share physical traits that distinguish them from other groups. “When these physical differences involve skin color, eye shape, hair, and facial features, people have for centuries labeled them differences in race,” Huntington explains. “The physical differences exist; the identification of them as racial differences is a product of human perception and decision, and attributing significance to these racial differences is a result of human judgment.” What is decisive is not what we see, but how we evaluate what we see. Or, more fundamentally, that we look carefully. Albert Murray put it most succinctly: “Any fool can see that white people are not really white, and that black people are not black.”

    The story of racial progress has been the story of seeking to render surface-level and socially constructed differences less and less potent — to dilute the self-reinforcing logic of what Barbara and Karen Fields have called “racecraft,” and have shown in their monumental book of the same name to be the illusion of race produced by the practices of racism. (In their formulation, racism — the ideological justification for a preceding economic exploitation — creates race, and not the other way around.) We choose to organize our polity along crude color categories even though, in point of fact, and even in the time of Tocqueville — as the presence of that Creole girl attests — we have always been a heterogeneous population. The average black American derives fully 20 percent of her DNA from Europe, and almost entirely from that continent’s Anglo-Saxon precincts.

    Outside of the great modern black writers — Wright, Ellison, Murray, Baldwin — the fact of our biologically assimilated reality has often been given short shrift in the mainstream conversation. (It is not only that “blacks” contain ample amounts of Europe within them; today millions of “white” Americans possess enough Africa in their DNA to have been enslaved under the laws of hypodescent.) It makes many people uncomfortable to discover that we — as a society and as individuals — already are mixed. The discussion of identity, the sentiment of identity, leads very quickly to a need for purity and a desire for exclusiveness. But it is much too late for purity and exclusiveness: we really do contain multitudes.

    The first time I read Norman Podhoretz’s “My Negro Problem — and Ours” — an infamous and almost unbearably candid essay whose title I am powerless to glance at without cringing — a flash of recognition overtook me almost against my will. I am the titular Negro, that is a fact, though I don’t think he has described me fully. He feared me but he did not know me. Yet what vexed him in the ’60s, and what still vexes us, is not even race, it is the myth and superstition of color grafted onto social circumstance and class. That is why my all-American experience, in which encounters with racism and the disadvantages of color did not block the uses of my talents or the free expression of my views, has also made of me the authorial “white” liberal whose discomfort cascades into guilt. I reject that guilt just as I resent the tribal and despairing essentialism that describes my trajectory as “white.” What I want to be, in the fullness of my heritage, is the disembodied writer seeking, however implausibly, an exit from this whole dishonest bind. “Will this madness in which we are all caught never find a resting-place? Is there never to be an end to it?” Podhoretz raised an important if completely impolitic point: all of us will need to let go of who we were in order to become who we are capable of being.

    The question needs to be continually posed: what exactly is being preserved under the guise and the performance of our inherited racial identities — and at what near- and long-term price? This cuts both ways. What do blacks and other marginalized peoples — groups, it is true, that have always been in flux — cling to by asserting continued difference (regardless of what others would assert on their behalf)? Perhaps more importantly, what do majority and normative populations — which is to say, whites — believe they are maintaining through our exclusion, and what has been that cost?

    Passing narratives have always carried more than a whiff of shame about them. In recent years, depictions of immigrant and non-white communities in America have increasingly been judged based on the degree to which they allow a given group’s distinctiveness to obtain. We hail Junot Diaz for refusing to translate whole blocks of Dominican Spanish just as we praise Jay Z for, in middle age, suddenly deciding to dreadlock his hair. (We don’t know what, exactly, to make of a multicultural cohort of congressional leaders kneeling after the death of George Floyd on the Capitol’s polished marble floors in vivid kente cloth stoles.) The word of the day is unapologetic. This is no doubt an improvement over meek or diminished. It harks back to the earlier slogan of “black pride.” And who can argue against pride? There is, after all, so much for us to be proud of, as the “assimilationists” themselves certainly believe. But what happens when pride is diminished into mere defiance and is pointlessly glorified? It happened in the 1960s and it is happening again. (We need a new Tom Wolfe to bear witness to this era’s bizarre new radical chic.) And what happens to a human being when the part about him that he most cherishes is his defiance? Is that not a travesty, too? Defiance is not the same as self-respect — or it is self-respect turned hysterical, a tensed and defensive perversion of self-respect. A human being is infinitely more than a clenched fist.

    Embedded in this discussion is the accusation that people who desire assimilation are attempting to escape from who they really are. They are said, by the proponents of separation, to suffer from a psychological disorder, a reprehensible lack of self-esteem. Self-hatred is real, of course, and the constant experience of recognition withheld can inculcate it. But there are at least as many reasons for being ashamed of oneself as there are men and women populating the earth — many of which have nothing to do with racial identity at all. And what if that is a false objection in the first place? What if you do not have to erase anything in order to join? This is what Nathan Glazer and Daniel Patrick Moynihan discovered in the early 1960s: that Zangwill wrote for another age, that melting down was no longer necessary to establish a position and a commonality in American society. Indeed, what if we find ourselves in a society that accepts us — that frequently admires us — for  not erasing anything, that celebrates our particular and original attributes?

    The zero-sum negotiation between authenticity and acceptance deserves to end. Based on a misunderstanding of American pluralism, it has caused too much pain. It is entirely possible to remain aware of histories of exclusion and domination without conceding the motive for wanting to assimilate going forward. What we need is a new discourse, and beyond that, an infinitely more secure means of imagining ourselves and each other, one that abhors unambiguous and intimidating notions of loyalty and disloyalty, and allows for a kind of national project that does not reject one’s inherited culture but which, on the contrary, understands it as but one arrow in a shared quiver — a single source of strength among many. The most effective way to combat racism is precisely in the name of pluralism.

    What we need is an unpejorative understanding of assimilation that views it simply as a means of experiencing the capaciousness of the world. Assimilation is a welcoming impulse, an expression of nothing more sinister than curiosity. One wishes to assimilate because one wishes during this fleeting life to maximize one’s human potential. But the contemporary hyper-focus on racial difference, and above all on the singular, nefarious power of whiteness, frowns on curiosity and prohibits us from forging the inclusive societies — and the inclusive individuals — that we must ultimately will into existence. While it is true that we may have been an Anglo-Protestant culture at the founding — and I am as right to say “we” as you are — we are far beyond the point of no return: we are now, or very soon will be, a minority-majority country. Such changes to the country’s hardware require a software update — one that incorporates all of these all-American differences. This in turn requires the confidence to state a basic truth: the things that one wants to acquire by assimilating or integrating into the mainstream — or, ideally, by forming a new and holistic core culture — are not things that white people own.

    The essence of Homer or Plato is not whiteness. When a black man reads the Iliad and the Republic, he does not steal them from anybody, or trespass into a territory where he does not belong. Baldwin was liberated by the knowledge that Shakespeare also belonged to him. I relish the thought that Kafka and Borges and Dostoyevsky are mine. Yet we are moving, at warp speed, away from such generosity of spirit — away from an appreciation of the blessings of universalism, and of the extent to which they need not threaten the particular. At New York’s elite Fieldston School, to cite but one depressing recent example, even physics has become an identity-based power struggle. “We don’t call them Newton’s laws anymore,” an upperclassman confided to a journalist. “We call them the three fundamental laws of physics. They say we need to ‘decenter whiteness.’”

    Such a capitulation to racecraft rigs the discussion from the start. Assimilation — or genuine personal cultivation and interpersonal toleration — is a matter of addition, not subtraction. It hungers for more. Newton’s genius was not white, it was human. (And it may have been only his own. Think of all the white people who were not Newton.) When we acquire the ability to see the physical world as he did, we transcend a provincial identity, ours and his, an obsession with provenance, to commune with what is objectively true. We are, every one of us, lifted up. Anyone — black or white or brown — who would assert that the notion of Newtonian physics “privileges whiteness” is only betraying the thinness of their own self-understanding.

    Behind all of these debates, however high-minded, there always lurks the dire question of sex. Who sleeps with whom? Who marries whom? Whose physical features, superficial as they may be, will wax and whose will wane? These are primal and delicate questions that do not usually inspire rational analysis. And, beyond that horizon, to whose gods or idols will we continue to bow our heads, if we still bow them at all? In France, where I live, a proudly “universal” culture has been reckoning with the pressures of particularism — specifically, with how much latitude it can afford to extend to a young Muslim population that has grown more desirous of the veil the more it has been commanded to stash it away. Here, too, the question of assimilation — of how to make the de facto reality of a multiethnic society work in theory — has begun to rage again. Where America had its melting pot, Paris-centric France could point to le creuset français, which transformed both paupers from the provinces and immigrants from the colonies into raceless Frenchmen equal within the Republican ideal. Assimilationism was the public philosophy, and not much attention was paid to the matter of how much in the way of cultural and ethnic and religious self-immolation the state could legitimately demand.

    As early as the 1970s and 1980s, the assimilationist tradition faced a “differentialist” reaction. A popular slogan at the time was the droit à la différence. As in America today, such fervent identitarian sentiment in France gained ground on the right as well as on the left. The philosopher Pierre-André Taguieff has written extensively about the multiculturalist “new right” that emerged around the reclusive and prolific thinker Alain de Benoist, whose ideas have recently migrated to America and inspired the likes of Richard Spencer and other organizers of the disgraceful march on Charlottesville. What was genuinely novel in the French formulation, however, was the xenophilic, counterintuitively antiracist and even egalitarian turn of Benoist and some (but not all) members of the nouvelle droite. Cultural and racial differences were taken as paramount and practically sacrosanct, as the reactionaries sought to preserve abstract collective identities — and communal differences — that they perceived as threatened by mixing of any kind. The political rise of Jean-Marie Le Pen shifted the terms of debate. “As a result,” the American sociologist Rogers Brubaker notes, “the moral and political ambiguity, and the exclusionary potential, of culturalist differentialism were brought into sharp focus — indeed, much sharper focus in France than elsewhere.” In the wake of an active political right, the slogans had to change. Universalism had to be reaffirmed, politically and polemically. There was less talk in the mainstream about the “right to difference” and more about the right to resemblance, or better yet indifference, which is to say the right to be treated like everyone else. (Philosophers since the Enlightenment have noted that indifference is a form of toleration.) It seemed that French society was focusing intensely again on the assimilationist goal. That is why the recent spread of identity politics in France has come as such a shock — seemingly another damaging import from overseas. Les Americains! It is certainly true that French political culture does not yet possess a natural and homegrown understanding of pluralism.

    In the United States, the situation is different. We are losing any natural understanding of universalism. At least since the second Obama term, when the Black Lives Matter movement took off in earnest, there has been, on the consensus-shaping left, a pronounced rejection of assimilation as a fundamentally anti-black ideal. “Assimilationist ideas are racist ideas,” Ibram X. Kendi has ruled in his Manichean book How to Be an Antiracist. “Assimilationists can position any racial group as the superior standard that another racial group should be measuring themselves against, the benchmark they should be trying to reach. Assimilationists typically position White people as the superior standard.” This is a sorry misunderstanding of the reasons for assimilation. (Who wants to be white?) It is also the hackneyed accusation of race treason.

    In a slim book that has been anointed, in multiple didactic iterations, the definitive tool for everyone from toddlers to CEOs, Kendi builds on the success of recent “afropessimistic” discourse, and on the fashionable adulation of (early) Malcolm X over Martin Luther King, Jr., and traces the failure to emphasize cultural specificity and the maintenance of origins back to W.E.B. Du Bois. “The duel within Black consciousness seems to usually be between antiracist and assimilationist ideas,” he instructs. “Du Bois believed in both the antiracist concept of racial relativity, of every racial group looking at itself with its own eyes, and the assimilationist concept of racial standards, of ‘looking at one’s self through the eyes’ of another racial group — in his case, White people. In other words, he wanted to liberate Black people from racism but he also wanted to change them, to save them from their ‘relic of barbarism.’ Du Bois argued in 1903 that racism and ‘the low social level of the mass of the race’ were both ‘responsible’ for the ‘Negro’s degradation.’” Kendi continues by citing the master without commentary, wagering that the cultural mood has shifted enough for the offense in Du Bois’ words to be obvious to his readers: “Do Americans ever stop to reflect that there are in this land a million men of Negro blood…who, judged by any standard, have reached the full measure of the best type of modern European culture?” (Du Bois was marveling about a population still living with the memory of slavery.) Kendi expects you to recoil from those words. When I read them, I cheer.

    Kendi’s solution to the enduring reality of inequality in the absence of any assimilationist objective — which has become its own orthodoxy now — is no solution at all. The point is not only that perfect separation is impossible, and would hobble the black community. It is also that Kendi is committing the very sin he has made a career of bemoaning: race-based separatism. Insisting on the perpetual and unalloyed distinctiveness of blacks (or any other historically marginalized group) is as exclusionary as ignoring us had ever been. We are stuck, then. As he argues depressingly elsewhere in the book, “The only remedy to racist discrimination is antiracist discrimination. The only remedy to past discrimination is present discrimination. The only remedy to present discrimination is future discrimination.” Orwell would have loved those sentences. Kendi’s vision of American society is a perpetual juggling trick in which the balls may never come down.

    Cultural relativism is bad for culture, because it looks suspiciously on outside influences. It is tribalism through and through. Like all tribalism, it is easily threatened by others and it threatens others easily. What the proponents of this limiting worldview fail to grasp about assimilation is that, even when it is desired and achieved, it can only ever be partial. Human beings do not and cannot replace each other. When they undergo changes, and accept influences, there are continuities and discontinuities. The differentialists — whether white or black, left or right — have less to fear than they imagine.

    The differentialists loathe a genuinely multicultural society because they are terrified of a genuinely multicultural individual. They lay claim to the entirety of their sons and daughters, every aspect and every dimension of them. They want them to be completely faithful and completely monolithic. They fear that the world is nothing but a slippery slope to defection and betrayal. That is why they subsume the individual, with all her wants and needs, with all her decisions and acts, with all her capacity to self-create, beneath the collective. The individual, except as a loyal representative of her origins, is effaced and forgotten. Individualism itself is regarded as a corrupting deviation from community, which is finally all that matters.

    Why are they so afraid? Perhaps they have not transmitted enough of their tradition to sustain their children within it, so that they may carry it into the “outside world” with confidence. Perhaps, ironically, it is their tradition, and its capacity to meet the modern world, in which they have insufficient faith. Either concern is overplayed. Inherited identity, in all its elements, is never completely alienable. There is no such thing as perfect assimilation, nor should there be. We are all amalgamations, whether we choose to recognize and to cultivate this inner diversity or to stifle it. The notion of hybrid vigor is useful here. Arguing for segregation — whether physical, mental, ethical (one set of standards for you and yours, and another for me and mine) or cultural — is arguing for weakness, or admitting to it. To equate assimilation with racism is to shame and intimidate us out of our birthright — as much a birthright as any authenticity could be. The world and my father’s house: they are both mine.

    If we can intuit why people overemphasize their historical glories, we might also pause to ask what makes them so anxious about loosening or even severing the bonds of past adversity. Why can’t they take yes for an answer? (When it is given, that is. There is still far too much no.) In any case, it is clear that an obsessive focus backward — whether indignant or prideful — hinders the necessary forward motion. It is impossible to transcend racism by doubling down on racial differences. The problem with the anti-assimilationist program of the contemporary left is not that it rejects a diminishing movement by non-whites to an Anglo “core culture” from which everything else is a deviation from the norm, as in Tocqueville’s day. The problem — which is mirrored and even amplified in the regressive populism of a myopic Republican right — is that it fails to meet the imaginative task at hand. To make our future society not just function but flourish, we will all have to assimilate into something outside ourselves, something based on all of us that is entirely new.

    The Fall of the House of Labor

    In 1927, there was a deep economic crisis in Palestine. Unemployed workers would gather in a workingmen’s club in the cellar of Beit Brenner in Tel Aviv to vent their grievances bitterly. One evening, David Ben-Gurion, then General Secretary of the Histadrut (the Zionist Labor Federation in Palestine), addressed them about the future of Zionism and the primacy of the Jewish worker’s role in building the land of Israel. A cry of anger erupted from the audience: “Leader, give us bread!” Ben-Gurion replied: “I have no bread. I have a vision.”

    This episode provides the terms for understanding what has happened to the Labor movement in Israel. There is no famine in the country now, and until the advent of the coronavirus there was no economic distress — but neither is there a vision, or anyone worthy of being described as a leader. How did it come to pass that the movement which built and led the nascent Jewish state from 1935, and the actual Jewish state until 1977, and later wrote several important chapters in Israel’s history, evaporated into a handful of mediocre Knesset members who were attached like a final appendage, almost vestigially, to a parity government headed by Benjamin Netanyahu, who was indicted on charges of corruption? The party that built a nation, established a state, and gathered the Jews in their ancient homeland seems to be dying slowly, unattractively.

    In the beginning there was the vision: The Jewish state and the Israeli nation would be built from the bottom up, in a gradual process of shaping society and culture. It was supposed to be a project that combined nation-building and the creation of a new society, a national goal and a social goal. It was to be carried out by Jewish workers animated by universal ideals in a particular place.

    The Labor movement did not stem from the Jewish proletariat in Eastern Europe. That proletariat was not enthralled by the Zionist idea. Its members preferred to find refuge and a better life in America. Of the few who were attracted to the Zionist idea, most were from the lower-middle class, semi-educated, with ties to Jewish tradition and culture, lapsed yeshiva students who adopted the idea of national and social redemption. They fell under the spell of socialist doctrine, and were motivated by the aspiration to establish a state for the Jews in Palestine. It was a dream that seemed highly unrealistic, and certainly unparalleled in the world: a nation exiled from its homeland for millennia returning to the same land.

    They drew their socialism from the Russian tradition — the Narodniks, a movement that “went to the people,” that sought to reform the Russian nation, and placed the requirement to live in accordance with one’s beliefs at the top of its value system. Believing in socialism and occasionally taking part in demonstrations was not enough; one was expected to live every day in accordance with one’s convictions. When applied to Zionism, this high ideal of philosophical consistency made it inappropriate to advocate Zionism while continuing to live in the Diaspora. It was similarly inappropriate to strive for a life of equality and continue to live as a capitalist. The demand for overlapping belief and action was unique to the Russian Narodniks, and passed from them on to the leftist Zionists, who would come to call it “realization” (hagshamah).

    This principle, which is typical of small sects, was supposed to apply to a wide public, an entire society, in Palestine. The extreme expression of this conviction was the work of a small minority who went to live a life of equality on kibbutzim, but in the wider circle of workers who were members of the Histadrut impressive efforts were also made to create equality. Thus, for example, the pay scale in the Histadrut was determined in accordance with the number of people in the family, not the person’s standing in the functional hierarchy: according to legend, the person serving tea at the Histadrut, who had a big family, earned more than the head of his department. A Histadrut member was entitled to establish a cooperative in which all the members were partners, but not to employ hired workers, since that was considered exploitation. The aspiration to equality sometimes manifested itself in extreme ideas, such as Ben-Gurion’s proposal in 1923 that the Histadrut become the employer of all workers and pay them an equal salary — a general commune. The proposal was rejected as impractical.

    The utopian aspiration to an egalitarian society lent added value to the aspiration to establish a Jewish state in Palestine. It connected nationalism to the lofty ideals that were exciting Western intellectuals at the time. Between the two world wars and in the decade following World War II, “progressive” circles viewed the Soviet Union as a beacon in a world where fascism and Nazism were running riot, where human dignity was being trampled underfoot, especially the dignity of the Jews. In the West, Russia was perceived as a country that granted Jews equal rights and equal opportunities, and where anti-Semitism was forbidden by law. It was only in retrospect that many people in the West recognized the horrors perpetrated, not least against the Jews, by the Soviet regime. (There were some, dissidents and scholars, who knew of the horrors in real time, and there were those, true believers and ideologues, who denied them.) But at the time when the foundations for the Jewish national home in Palestine were being laid, Russia still served for many people as a prime example of the feasibility of establishing an ideal society. This mixture of socialism and Zionism in the Labor Movement served as a force for recruiting the idealistic element among young Jews in Eastern Europe. Socialism conferred moral worth and universal meaning upon a national movement, which, from its inception, was accused by its critics of undermining the country’s Arab residents. Belonging to a rising global movement for justice ameliorated the harsh life that Jewish workers in Palestine encountered every day. This was worth waking up for in the morning.

    The Labor Movement created an entire philosophy of life that shaped the culture, everyday behavior, folklore, ceremonies, and myths of the growing Jewish community in Palestine. A “New Jew” — that was the idea. This Jew was a free individual who spoke and read Hebrew, was proud and self-reliant, and prepared to defend himself and his brethren. Added to this was the demand for a life of physical labor. Since the late nineteenth century, many Jews had internalized the anti-Semitic argument that they were parasites who fed on the work of the non-Jews among whom they lived. Although there were in fact Jewish farmers, and a great many Jewish craftspeople, this did not prevent them from being cast as parasites. Only the return to the land, the early Zionists believed, to working the soil in the homeland, to “productivization,” could engender a New Jew, and erase the stain on the image of the Jewish people.

    This aspiration was congruent with the needs of the Zionist movement to settle the land, and also to provide employment for those coming to the country, which had no industry or commerce to speak of. The default option was agricultural work, which was grueling, tedious, and financially unrewarding. The Tolstoyan Zionist thinker A.D. Gordon’s notion of the “religion of labor” provided emotional compensation for the hardship, the boredom, and the poverty in the fields: the New Jew returns to the land, from which he draws vitality and virtue. He (and she) redeems the land and the land redeems him. Thus, hard work was presented not as a constraint or an obstacle, but as an ideology, a spiritual and social opportunity, a philosophy of life. Yitzhak Tabenkin, one of the leaders of the Kibbutz movement, described the New Jew as carrying a hoe in his hand and a rifle on his shoulder.

    The equal society and the New Jew were enfolded in social solidarity and assisted by an ideological system. Poems, songs, novels, and innumerable speeches and articles underscored the idea that the Jewish society being built in Palestine was a just and moral society that sought to right a historical wrong perpetrated against the Jewish people for centuries. It also served as justification for the fact that Jews were settling in a country inhabited by another nation that steadfastly refused to share it with them. There was, of course, a measure of self-righteousness in this self-image, but it enabled the so-called “Arab problem” to be disregarded for many years, and when it was no longer possible to ignore it, in the 1930s, the new awareness of the problem was accompanied by the conviction that “we” were in the right, “they” were the aggressors, and “we” were merely defending ourselves.

    The Labor movement skillfully transformed adversity into a virtue and necessity into a moral advantage. Necessity created abstemiousness, and ideology imbued modesty and poverty with pride. When members of the youth movements sang about their uniform, “The blue shirt is without a doubt superior to all jewelry,” it was an expression of a deep faith in simplicity. Ben-Gurion made the unbuttoned collar a characteristic of Histadrut members, in contrast with the bourgeois suit. Of course there were Jewish bourgeois and even industrialists in the land of Israel at the time, and also workers who longed to live in bourgeois Tel Aviv, but the dominant ethos was one of modesty, toil, solidarity, and a willingness to enlist in great national undertakings.

    In spite of economic hardship, the culture that developed in the country was high culture. Almost miraculously, people who had come from the small towns of Eastern Europe, where books were the only refining cultural force they knew, discovered in the land of Israel an aspiration for beauty. This manifested itself in cultivating gardens in the kibbutzim, where there was barely enough to eat; in the reproductions of Van Gogh’s Sunflowers that were hung in the homes of kibbutz and moshav members; in lending libraries that operated in all the workers’ councils; in music classes, choirs, and concerts held near Maayan Harod in the Jezreel Valley; in workers’ housing projects in cities built in the Bauhaus style; in the education system and the non-formal education of the youth movements; in the establishment of a national theater and many publishing houses. The aim was to establish a cultivated society possessing a spirit of service, a task-oriented society that also prized theater and art. And all this was supported by a social welfare system: a sick fund assuring medical care for every worker and an employment bureau that allocated work (and included new immigrants, in contrast with the particularistic interest of the local workers) and a guarantee of a pension fund, and much more.

    The above description idealizes a past that was not ideal. There was a degree of coercion in the imperative, “Jew, speak Hebrew!” which made it difficult for new immigrants to integrate into the cultural milieu. There was also coercion in the obligation to be a Histadrut member, which implied acceptance of the principles of socialism, in order to benefit from social welfare programs. And there was always opposition from right and left to the pragmatic compromises dictated by Mapai (the Hebrew acronym for Workers Party of the Land of Israel), which was established in 1930 and expressed what its members viewed as a version of Rousseau’s “general will” in Jewish Palestine. As long as the goal — a Jewish state — was the beacon lighting the way, the flaws of a bureaucratic system and the inequities of favoritism were regarded as minor blemishes in an otherwise glowing picture.

    All this changed after the establishment of the state. The writer Amos Kenan once exclaimed: “The state destroyed my homeland!” He meant that the landscape of Mandatory Palestine, in physical and human respects alike, was becoming extinct. The Arab villages that blended with the hilly landscape were destroyed, and were being replaced with ugly immigrant housing projects that were mutilating the land he loved. Mass immigration subsumed the old Yishuv (the pre-state Jewish community in Palestine). There is a moment in Road to Life, the novel by the influential Soviet educator Anton Makarenko, in which he and his students — abandoned children and adolescents — are about to move from their small and intimate educational establishment, which had nurtured their identity, into a grand institution in which they would become a minority. Makarenko wonders whether his slender disciples, whom he had educated in his collectivist spirit, would be able to prevail over the majority, or if the opposite would occur. This resembles the situation that developed in the State of Israel in the 1950s: within three years, the Jewish population had doubled, and this was just the beginning of an extraordinary process of growth and development. The new state’s slogan was “the melting pot”: just as immigrants to the United States adapted and integrated themselves to the existing ethos, immigrants to Israel would do the same.

    Deep down Ben-Gurion had misgivings, which were expressed in the early 1950s when he wrote that for generations the Jewish people had dreamed, anticipated, and yearned for a state, but it had never occurred to anyone that when the state was finally established there would not exist the nation to build it. He was alluding, of course, to the catastrophe of the Holocaust. He needed a critical mass of Jews in the country, and so he supported mass immigration, and fought against his advisors who sought to limit it for economic reasons. But like Makarenko, he wondered who would subsume whom. He searched for the miraculous formula that would transform what he called the “torn remnants of tribes” into a nation.

    The old ethos did not stand a chance in the new state. The veterans, the standard bearers of the ethos of modesty, physical labor, and high culture, had become bourgeois. The heroes of the War of Independence, of the struggles against the British, were weary. They now favored a less collective and more individualistic society. Very few of them responded to Ben-Gurion’s calls to enlist in the work of immigrant absorption. Inertia gripped the old pioneers. Even prior to statehood, many of them would have preferred a more comfortable life, less spartan, but it was impossible in the limited economic frameworks of the Yishuv. Now, as the economic horizons expanded, skilled workers, doctors, teachers, and engineers rebelled against the policy that sought to maintain low wage gaps between manual labor and the liberal professions.

    It was a rebellion against the egalitarian state. Equality can be sustained in a small, relatively intimate society in a situation of struggle. Now that the political mission had been accomplished, and the state was established, many people wanted to loosen the pioneering tension. Socialism was still alive, but it had softened and become middle-class, and most importantly it had shrunk into increasingly smaller circles of people who had come of age in Yishuv times. There were still youth movements, and new kibbutzim were still being founded. The Nahal (Fighting Pioneer Youth), a kind of statist continuation of the Palmach (an elite military force in pre-state Israel), was very popular. The songs of the military choirs and bands nurtured the ethos of toil and task, of the willingness to make sacrifices for the greater good. But electric refrigerators and washing machines began appearing in homes, and towards the end of the 1950s the standard of living rose. Can an ethos of simplicity and self-abnegation be maintained in an affluent society?

    Concurrent with the relaxation of the “veterans” — a category that encompassed anyone who came to Palestine before 1948 — the new immigrants arrived. In the first few years they were mainly Holocaust survivors from Europe. They came after years of war and destruction and displacement, and their landing in the immigrant camps or temporary settlements was relatively soft, since they came from nothing, with nothing, and knew they would have to rebuild their lives. These survivors accepted hardship because they were accustomed to hardship, and also because they understood what the absorbers said — in Hebrew, Yiddish, or Ladino. After them came immigrants from the Arab countries, the Mizrahim. Some of them, like the elites of Iraqi Jewry, left behind fine homes and a good life, but after a flight of just a few hours they transitioned from a life of prosperity and stability to one of difficulty and humiliation. They did not understand the absorbers and the absorbers did not understand them.

    The Ashkenazi-Mizrahi rift was born in the 1950s. There was discrimination in favor of Ashkenazim and against Mizrahim. And there was also the standing sociological insult: the melting pot policy assumed that the Mizrahim had to change and adapt themselves to the existing Eastern European socialist ethos. This had implications, among other things, for the foundational status of the father, the traditional family patriarch: he abruptly lost his standing. And there was also the issue of religion.

    The Zionist movement internalized the religious narrative of the land of the forefathers, the return to Zion, and the promised land. The entire Zionist story is founded on it. On the other hand, it made every effort to remove God from the picture. Zionism represented a defiant opposition to historical quietism, to waiting for a messiah to come and save the people of Israel, and instead it sanctified action, the human ability to change the course of history by sheer force of will. The clash between observers of the Jewish faith and secular Jews, who challenged the traditions of the forefathers, began at the inception of Zionism. Herzl did not want to rule on this issue, which could have torn the movement apart. Ben-Gurion, too, wanted to remove the subject of religion from the agenda. He refrained from creating a constitution, recognizing that the dichotomy between secular and religious was a highly charged issue, which was liable to lead to a culture war in a state that was barely standing on its own two feet.

    The spirit of compromise expressed in the Declaration of Independence — the expression “Rock of Israel,” which satisfied secular Jews since the Hebrew phrase could be interpreted literally as “the might of Israel,” and also religious Jews, for whom it was a traditional allusion to God — did not recur in the early years of the state. Most of the government crises revolved around issues of religious or secular education in the immigrant camps and settlements, the public observance of the Sabbath, and the definition of who is a Jew. So long as there was no state, it was possible to live with the dissonance between the secularism of a socialist movement and the moderate religiosity of the Mizrahim. The arguments after the establishment of the state revolved around legislative issues, and mainly touched on the education of the new immigrants from the Arab countries, most of whom were religiously observant.

    The attempt to shape the Mizrahim in the image of the New Jew failed miserably. The New Jew was a program of Westernization that many Mizrahim considered insulting and an obstacle to progress. There occurred a traumatic encounter between two different ideas of modernization. On the one hand, the conservative Mizrahi version sought to maintain the existing order, with gradual adjustments: openness to new professions, economic progress, education, as practiced in the Maghreb countries by the Alliance Israelite Universelle, the French-Jewish organization founded in the nineteenth century to protect the rights of Jews. The Mizrahi intelligentsia sought to climb the social ladder rather than enlist in revolutions. It tried to preserve the patriarch’s standing and family honor, and it viewed the extended family unit as its reference group. It strove to preserve what had been its way of life in the past, though it did not stubbornly resist Westernization. It did not want to lose itself. The other view of modernization, by contrast, was the melting pot ideology, which viewed the young generation as the agents of socialization, and regarded the closely knit family with reservations, as a hindrance to social development, and religiosity as one of the elements preventing integration into modern society. There was something aggressive and invasive in the speed with which the veterans attempted to impose change upon the newcomers. What should have taken at least two generations, they tried to implement in Israel in a decade.

    The predominantly Ashkenazi veterans — with their ideals of simple dress, restraint in expressions of joy and sorrow, and personal modesty — would come to be called “tight-asses” by Mizrahim. In fact those were traits and customs that did not typify Eastern European Jewry, but rather the socialist tribe in Israel. It was only in the quasi-ecstatic circle dancing of the “New Jews” that the shell of restraint sometimes cracked. Otherwise they propounded an austere code of conduct. This code produced an acceptance of collective discipline and a subordination of the individual to the community, an ethos that was most pronounced in the kibbutzim, which were considered a model. How else are we to understand the willingness of mothers to accept the judgment of the group and relinquish caring for their babies?

    Mizrahim, by contrast, loved elegant dress — the tailored suit was a status symbol. They loved overt emotionalism, vociferousness, and haflot (feasts). In their view, these were all indicators of familial joie de vivre and entirely appropriate behavior. In time, they would become the external expressions of protest against the “appropriate” culture that others sought to inculcate in them. In contrast with community as a coalescing unit, they nurtured the extended family, devoutly honoring father and mother. They did not strictly observe the commandments — soccer games on the Sabbath were the norm — but they did observe tradition, and over time the influence of religion gradually increased. Religion became a badge of their identity.

    One of the main causes of the rift between Ashkenazim and Mizrahim was their conflicting attitudes to manual labor. The lionization of work that was a central element of the ideology of the New Jew was, for Mizrahim, a symbol of discrimination and subordination, which made them the hewers of wood and the drawers of water for the Jewish state. The predominantly Mapai establishment viewed the construction of hundreds of so-called immigrant moshavim (collective villages), and the allocation of some of the means of production to them, as a resounding Zionist success. The immigrants, however, were not asked if this was what they wanted. They would have preferred Tel Aviv over the geosocial periphery, and the fact that they were sent there without being asked, without anyone taking the trouble to explain to them, in their language, what awaited them, created a residue of bitterness and hostility, for which the economic success of the moshavim, and the integration of the younger generation into politics, were no compensation. Manual labor in forestry, which the immigrant camp residents were sent to do since they did not have a profession, or because there was no other work available in the difficult early years of the state, again created a situation in which Ashkenazi managers employed Mizrahi laborers.

    The government’s egalitarian policy — manifested in the austerity program that it implemented in the early 1950s, which rationed essential commodities according to family size, like similar programs implemented in Britain after World War II, and maintained low wages for skilled professionals — was not perceived as especially egalitarian by the new immigrants. They encountered a bureaucracy that was staffed by predominantly Ashkenazi veterans, who did not speak their language and, in the way of bureaucracies, behaved condescendingly. The old workers had become officials. When they preached about the dignity of physical labor, it sounded hollow, coming as it did from people working in a Histadrut or government office. The daily encounter with the clerk in the labor bureau, the receptionist in the sick fund clinic, the teacher, and the school principal was a dialogue of the deaf. Moreover, in the 1960s, a big economic gap opened up between Ashkenazim and Mizrahim. The general rise in the standard of living created the so-called “Second Israel,” which was poorer and less educated, and the patina of egalitarianism had worn away, leaving them excluded. After the “Black Panthers” protests during Golda Meir’s government, significant reforms were implemented to reduce the social and economic gaps, and continued until 1977. But it was too late. Inter-ethnic bitterness and umbrage became one of Israel’s formative components.

    Today, when identity politics ostensibly suspend the contrasts between absorbers and absorbed, between accusers and accused, the difference between Mizrahim and Ashkenazim, which, to a great degree, is identical to the difference between right and left, seems to have always existed. When Ben-Gurion was in power, and also during Golda Meir’s premiership, Mizrahi support for Mapai, which went on to become the Labor Party, was taken for granted. For several decades, Menachem Begin’s Herut Party, which after the Yom Kippur War in 1973 would become the Likud, was a minority party in Israel’s political landscape. The first generation of Mizrahim in Israel accepted the political left, although it rejected the ideal of the New Jew. Despite the discrimination, there were many mixed marriages between the offspring of immigrants from Europe and the Middle East. It was only in the 1970s, after the Yom Kippur War, that the tectonic shift began. This raises the question of whether the identity politics currently dominating the political arena was inevitable. Over the years, the demographic scale has tipped steadily in favor of Mizrahim. But does that mean that the division between the political left and right should be based on ethnicity?

    Historically, the line separating the Jewish left and the Jewish right in Palestine did not revolve around political issues. Although the left was indeed more moderate and pragmatic in its attitude to the issue of the country’s partition, the aspiration shared by all was a Jewish state in the Land of Israel. The left was not “dovish” in today’s terms. The differences between left and right were more in the sphere of social and economic worldviews — between socialism, a workers’ society, a large public sector, and massive state intervention in the economy, on the one hand, and capitalism, individual initiative, and minimal state intervention in the economy, on the other. This was the situation during the Yishuv years, and also, with minor changes, after the establishment of the state. The issue of “Greater Israel” was not on the agenda: when Gahal, the alliance between Herut and the Liberals, was established in the mid-1960s, which marked the conservative Herut’s emergence from the political isolation imposed on it by Ben-Gurion, the Liberals in the coalition demanded that the issue of territory not be included in the party’s platform, a condition that Begin accepted. Mapai was perceived as an activist national ruling party.

    The transition in the definition of the Labor movement from socialist to “dovish” began gradually, almost imperceptibly. The shift started in 1967, after the Six-Day War. The messianic enthusiasm over the renewed connection with the expanses of the land of Israel swept up the vast majority of people in Israel. The Movement for Greater Israel was established by people from the left, including notable writers such as S.Y. Agnon, Haim Gouri, and Nathan Alterman. In its early days, Gush Emunim, or the Bloc of the Faithful, which would become the nucleus of Jewish settlement in the West Bank, also included secular Jews, and even people from the left. Jewish settlements were established in the “territories” during the Alignment government (a new term for the Labor movement) and were justified for security reasons, but once out of the bottle the messianic demon could not be put back. At the same time, the standard of living rose and the barriers to employing Palestinian laborers were breached — the aspiration of the Jews for productive work, as opposed to business and finance, was cast aside, as was the romance of manual labor. The former pioneers became contractors and entrepreneurs. The economic frameworks expanded, and as the standard of living continued to rise, socialism receded.

    The transition from a leftwing government to a rightwing government in 1977 was experienced as an earthquake by the left, which had always regarded such a turn of events as inconceivable. Of course it was to be expected that the rule of the Labor Party would come to an end at some point. No political power is eternal, and prolonged rule is always accompanied by corruption. The Labor Movement was no exception. After the debacle of the Yom Kippur War, with the removal of familiar and charismatic leaders from the political arena, namely Golda Meir and Moshe Dayan, and the emergence of a duo of leaders who grew up in Israel, Yitzhak Rabin and Shimon Peres, who competed over the leadership, a political turnabout was to be expected. But when it occurred, it left Labor astounded.

    When the venerable Labor politician Yitzhak Ben-Aharon announced, after the traumatic election of 1977, that he refused to accept the decision of the people, he demonstrated the tragic disconnection between the old left and the new Israel. Begin was a leader who knew how to pluck the right heartstrings, exploiting ethnic differences and old rivalries between employers and employees, between kibbutzim and their neighboring development towns, between the traditionalism of the Mizrahim and the secularism of the left. Begin — who, immediately after the election, appeared in an elegant suit at the Likud Central Committee, put a yarmulke on his head, and said the Shehecheyanu blessing — presented himself and his movement as representatives of those whom the Labor Movement had rejected, from the old members of the Irgun to the Ashkenazi middle class and, most momentously, the Mizrahi communities. It was an “alliance of the downtrodden” against those who had considered themselves the keepers of the general will of the people in Israel.

    Yet the political downfall of Labor did not lead to a resurgence, to an attempt at re-invigoration, to a closing of ranks. Instead it provoked a long lethargy accompanied by a helpless rage. Everyone talked of the need to “go to the people,” to conduct in-depth ideological self-criticism, to educate the people so that they would come to understand their great mistake. The question was, who would do the educating and what would be the content of that education.

    Setting Mizrahim against Ashkenazim was Begin’s strategy in both policy and politics. It was a way to recruit faithful followers, whose political identity would henceforth be the identity of the right. The left could not understand how the Mizrahim, whom they dogmatically regarded as the proletariat, supported the Likud, the party of capitalists, and so it constructed theories about the irrationality of the Mizrahim, about their “false consciousness.” The simpler truth is that the Likud’s rule benefited them. It opened up economic possibilities and social advancement for them, which they had not enjoyed before. The Mizrahi path to advancement was not education, as in the case of Ashkenazim, but independent business, which in their self-perception suited the capitalism of the right. The Likud did everything it could to dismantle the left’s power strongholds: a vilification of the kibbutzim, and high inflation that damaged the economic ability of the workers’ society and Histadrut’s institutions.

    With the implementation of the economic reform of 1985, whose architect was Shimon Peres, then a member of a national unity government, capitalism came to Israel in full force: privatization and the government’s withdrawal from economic intervention. Peres performed a heroic deed that saved Israel’s economy, but it was also the final nail in the coffin of the moderate socialism that remained. The dominance of privatization in the Israeli economy turned socialism into an ideology divorced from reality. These were the years when perestroika was being instituted, however incompletely, in Russia: it transpired that even in the beacon of the state-directed economy, the reputation of capitalism was changing. Israel fell into line with the new economic-social trend, alongside Ronald Reagan and Margaret Thatcher, and the Labor Movement did not have anything else to offer.

    When the left’s social and economic traditions ceased to serve as its political marker, and when nationalism and religion became the political markers of the right, the left turned to a general political moderation to establish its difference. It began with the first Lebanon war, in 1982, which created a rift in the nation. The polarization was clear: the left opposed the war, the right defended it, as well as the Begin government that waged it. A new political lexicon emerged on the left: instead of socialism, liberalism; instead of equality, social justice. In the social protest movement that erupted onto the Israeli scene in 2011, the term “socialism” was nowhere heard. In the mass demonstrations, and among the tent dwellers on Tel Aviv’s Rothschild Boulevard, the proletariat was nowhere to be found.

    In that social justice movement there were no religious people, and no Arabs. There was only the Ashkenazi leftwing-bourgeois tribe, the educated middle class. The composition of the protesters exposed the demographic and sociological origins of the left’s decline and fall. It has now been relegated to the sidelines in Israeli society. It does not represent what is considered important by the majority of the public. Instead of concern for all, it concerns itself mainly with individual rights. Liberalism in Israel is identified with defending Israeli democracy, the Supreme Court, and the rule of law. At one time, when Mapai was in government, the person who represented those values was Menachem Begin. They are the positions of an opposition defending the preservation of the existing order. They have nothing important to say about the larger policy questions of security, peace, and economy. And they are now the positions of the marginalized Labor tradition, a measure of the impotence of the left in setting the State of Israel’s agenda.

    Then there was the matter of the Palestinians and the conflict. In the 1990s there flourished the vision of a peace agreement with the Palestinians. The governments that were formed by the left, the Rabin and Barak governments, made efforts to reach an accord with the Palestinians. The Labor Party’s calling card was “two states for two peoples,” a prospect that had never before seemed so close, so real. The State of Israel was finding its place in the Middle East, it was being accepted into the regional club. Those were heady days. But they came to an abrupt end with Rabin’s murder and the Second Intifada, which broke the public’s trust in peace as something attainable. The Palestinians have always had a kind of veto over the fortunes of the Israeli left: when they reject diplomacy or practice violence, they make doves look like fools, at least in the eyes of large numbers of Israelis. Are the Palestinians truly willing to reach a historic compromise with Israel? Perhaps, at this late date, there can be no agreement over the fundamental issues, no reconciliation between security for Israelis and justice for Palestinians. The question has become less and less practical: which current Israeli political leader is capable of withdrawing Jewish settlements from the West Bank? And how many Israelis still care? In all the recent elections the issues were: yes to Netanyahu or no to Netanyahu, yes to the rule of law or no to the rule of law, a democratic Jewish state or a less democratic Jewish state. The Palestinian question, and the question of peace, hardly figured at all. Two parties with more or less similar rightwing agendas competed against one another — one more liberal, one less, but with no real differences in their political platforms.

    What does Labor, and the Israeli left, stand for? The time has come for a historical and philosophical accounting. The Zionist idea was an explosion of historical agency, an extraordinary display of political, social, and cultural optimism. The leaders of the Labor Party were avowed pessimists, who lived in constant anxiety over the fate of the Jewish people, and later of the State of Israel — but at the same time they harbored a profound belief in the elevation of humankind and in the rescue and reformation of the Jewish people. The anxiety never overcame the optimism. The first half of the twentieth century was the single most traumatic era in the history of the Jewish people, and in such a period the Zionist-socialist vision was able to enlist faithful followers who were prepared to sacrifice their lives for future generations — a grand collective mobilization in the name of safety and justice and a morally refined nationalism. This emergency period ended sometime in the late 1950s. When global society began to sanctify the success of the individual, when Israeli society became a sated and self-satisfied society, such grand mobilizations no longer seemed possible, at least on the left. In the 1980s, I argued that the Labor Movement had lost its creativity with the establishment of the state. I think this assertion was correct. The Labor Party was a victim of its success, of fulfilling its tasks. “The state destroyed my homeland.”

    The exhaustion of Israeli socialism is not unique to it. Despite the recent appearance of socialist slogans and tropes in American and European protests, socialism is no longer a living political force. It did not succeed in adapting itself to consumer society, to the new technologies, to cultural apathy, to the flattening of public discourse by the new media. Inequality is on the rise all over the world, but the motivation of the workers in the nineteenth century, to organize and to protect their rights, is not re-emerging today. The left-wing populism of today is more concerned with identity and the remedy of psychological injuries than with radical economic reform. And there is the problem of leadership, or the lack of it. Socialists used to attract and create extraordinary political leaders, who knew how to persuade the masses to follow them. The old generation of Mapai, too, produced a series of genuinely formidable leaders, such as Berl Katznelson, David Ben-Gurion, Golda Meir, Yitzhak Rabin, and Shimon Peres. Since the early 2000s, however, there has not been a single impressive figure in the gallery of Labor Party leaders. The problem is not only with the message, but also with the messengers.

    What does Labor have to offer its voters? Contempt for Benjamin Netanyahu is right and easy, but it is not a social philosophy or a political program. In an era of social media, there are powerful advantages for radical slogans, fake news, and hate speech. The traditional Mapai pragmatism, which always endeavored to find the middle ground, to create a practical solution, no longer attracts admiration: it seems disingenuous, compromising, lacking in clarity. Even worse, it seems elitist, in an era in which elitism has become a dirty word.

    And so it is hard to avoid the conclusion that the future of the Labor brand is behind it. It has, in a word, expired. The epic story is over. And the really heartbreaking thing about its feeble conclusion is that the challenge of building a just society and a just peace remains, and more urgently than ever before. Here and there we hear the rumblings of social protest, which have not been heard for decades. Will this bring about a change of course in Israel, a serious reassertion of demands for a better and more egalitarian society? Will the old spirit be revived for new reforms? There are reasons for pessimism. I hope that they are proved wrong.

    And That is Why

    And that is why I paced the corridors
    Of those great museums
    Gazing at paintings of a world
    In which David is blameless as a boy scout
    Goliath earns his shameful death
    While eternal twilight dims Rembrandt’s canvases
    The twilight of anxiety and attention
    And I passed from hall to hall
    Admiring portraits of cynical cardinals
    In Roman crimson
    Ecstatic peasant weddings
    Avid players at cards or dice
    Observing ships of war and momentary truces
    And that is why I paced the corridors
    Of those renowned museums those celestial palaces
    Trying to grasp Isaac’s sacrifice
    Mary’s sorrow and bright skies above the Seine
    And I always went back to a city street
    Where madness, pain, and laughter persisted —
    Still unpainted

    Winter Dawn

    It happens in winter, at dawn,
    that a taxi takes you to the airport
    (yet another festival).
    Half-awake, you recollect
    that Andrzej Bursa used to live
    right here, just outside.
    He once wrote: the poet suffers for millions.
    It is still dark at the bus stop,
    a few people huddle in the cold,
    seeing them you think, lucky souls,
    you only suffer for yourselves.

    Border

    The scent of gasoline crickets
    Vladimir Holan

    Poor people wait by the border
    and look hopefully at the other side
    The scent of gasoline crickets
    skylarks sing
    the abridged version of a hymn

    Both sides of the border face east
    The north is east
    And the south is east

    One car holds a giant globe
    showing only oceans

    A little girl in an ancient Fiat 125
    carefully does homework

    in a green ruled notebook —
    there are borders everywhere

    Sambor

    We drove through Sambor quickly,
    almost instantly, it took five minutes.

    But my mother, as I recall,
    passed her exams here.

    Dusk fell
    without funeral marches.

    A lone colt danced on the highway,
    though it didn’t stray far from the mare;

    freedom is sweet,
    so is a mother’s nearness.

    Over fields and forests
    gray silence reigned.

    And the little town of Sambor
    sank into oblivion again.

    Mountains

    When night draws near
    the mountains are clear and pure
    — like a philosophy student
    before exams.

    Clouds escort the dark sun
    to the shaded avenue’s end
    and slowly take their leave,
    but no one cries.

    Look, look greedily,
    when dusk approaches,
    look insatiably,
    look without fear.

    Translated by Clare Cavanagh

    Do No Harm: Critical Race Theory and Medicine

    In the winter of 1848, an epidemic of typhus ravaged Upper Silesia, a largely Polish mining and agricultural enclave in the Prussian Empire. Months earlier, heavy floods had destroyed large swaths of cropland, leaving the peasants to subsist on a paltry diet of clover, grass, and rotten potatoes. Weakened by starvation, they readily succumbed to infection. The Prussian authorities tapped a precocious twenty-six-year-old junior physician named Rudolf Ludwig Karl Virchow, at Berlin’s Charité Hospital, to perform the routine task of surveying the outbreak. For three weeks, Virchow travelled from town to town, observing that families of six or more often shared single-room dwellings, turning homes into hotbeds of contagion. He noted the stigmata of the typhus rash — angry red spots that mysteriously spared the face, the palms of the hands, and the soles of the feet — documented the nature of fevers, coughs, and diarrhea, and performed a few autopsies.

    Virchow’s report to the Prussian Minister for Religion, Education, and Medicine contained mortality statistics and clinical descriptions. He also dispensed predictable recommendations for flood control and drainage systems. But what exercised Virchow the most — and what his sponsors least wanted to confront — were the deeper causes of the epidemic. “The nouveaux riches” who extracted wealth in metals and minerals from the mines treated their Silesian workers “not as human beings but as machines,” he wrote. He blamed the Catholic Church for keeping “the people bigoted, stupid and dependent.” “If these conditions were removed,” the bold young doctor offered the minister, “I am sure that epidemic typhus would not recur.” 

    Even before he left for Upper Silesia, Virchow was primed to see the threads between social conditions and disease. In Friedrich Engels’ treatise on the working class in Manchester and the deathtrap factories in which they toiled, which appeared in 1845, he read about the cramped conditions and the poor ventilation, the gruesome machine accidents, the toxic fumes and the woolen fibers they inhaled with every breath. From Edwin Chadwick’s Report on the Sanitary Condition of the Labouring Population of Great Britain, from 1842, Virchow learned that workers in the North rarely passed their thirtieth birthdays. He read of families crammed into half-lit quarters, choking on the stench of human and animal waste, grinding away their years under a sky so thick with soot that it blocked the sun and caused their children’s legs to bow from rickets. In a letter to his father, Virchow said that his immersion in the Silesian typhus epidemic had turned him from “half a man [to] a whole one, whose medical beliefs fuse with his political and social ones.”

    Ten days after Virchow returned to Berlin in mid-March, the spring riots erupted, one of many protests against monarchy spreading across Europe. He flung himself into the short-lived revolution, brandishing a pistol at the barricades — a feat of activism followed by months of local political involvement that led Charité to suspend him. Fortunately, the University of Würzburg was eager to attract the medical prodigy and offered Virchow a position on the condition that he not use it as “a playground for radical tendencies.” He accepted the demand for his depoliticization, and used his new position instead as a proving ground for wondrous advances in medical science. He perfected new microscopic techniques that helped him discover how tumors form, how tissues proliferate, and how blood clots. Virchow was among the first to correctly trace the origin of cancers to otherwise normal cells, the first to describe leukemia, and the first to use hair analysis in a criminal investigation. He discovered the life cycle of the parasite Trichinella spiralis, or “pork worm,” which established the importance of meat inspection in Germany.

    Yet the laboratory could not contain him. The experience in Upper Silesia had convinced him that doctors, knowing as they did the true conditions of humanity, made the best statesmen. Thus, after completing in 1858 his magisterial text Cellular Pathology as Based upon Physiological and Pathological Histology, a work that is regarded as the foundation of modern pathology, Virchow moved back into politics, this time as a professional. He joined the Berlin City Council in 1859 and planned the city’s sewage system. He next entered the Prussian House of Representatives and in 1880, at the age of fifty-nine, the German Reichstag. Virchow’s well-justified faith that social reform was necessary to combat disease never left him. When he celebrated his eightieth birthday in 1901, he was hailed by physicians all over the world as the “father of social medicine,” or médecine sociale. The term had been coined by Jules Guérin, a French physician — coincidentally yet auspiciously, one day after Virchow’s return from Upper Silesia — to indicate “the numerous relations which exist between medicine and public affairs.”

    These “numerous relations between medicine and public affairs” formed the philosophical heart of public health in Europe. In America, however, the field organized itself around technical strategies aimed at the leading causes of death at the turn of the century, which were influenza, pneumonia, diphtheria, tuberculosis, and cholera. It was a time when physicians had little to offer in the way of medical treatment. Opium, laxatives, sleeping powders, bloodletting, and leeches were the mainstays of care. In a speech in 1860, Oliver Wendell Holmes Sr. famously told his colleagues in the Massachusetts Medical Society that “opium, which the Creator himself seems to prescribe, wine, which is a food, and the vapors which produce the miracle of anesthesia… I firmly believe that if the whole materia medica, as now used, could be sunk to the bottom of the sea, it would be all the better for mankind — and all the worse for the fishes.”

    Public health experts and doctors worked together closely to contain epidemics, but as drug discoveries mounted — among them penicillin for a broad swath of infections in 1928 (though it was not used until 1942) and the sulfa drugs in the 1930s — the power and prestige of the medical profession grew, and it separated from public health. Boundaries were established: public health cared for populations while medicine cared for individuals. Within American public health, an occasional champion of European-style social medicine would emerge. Charles Edward Winslow, head of Yale’s department of public health, was one. As he told his colleagues in 1948, “You and I have determined that men should not sicken and die from polluted water, from malaria-breeding swamps, from epidemics of diphtheria, from tuberculosis. Those battles have been, in large measure, won. We must now determine that men shall not be physically and emotionally crippled by malnutrition, by slum dwellings, by lack of medical care, by social insecurity.” 

    Winslow’s eloquent plea to address “social insecurity” went unheeded. The field took the opposite route, dedicating itself to individual-level risks for injury and chronic illness. Surgeons General and public service advertisements exhorted Americans to stop smoking, eat more vegetables, exercise, wear seatbelts, and so on. To protect consumers, health warnings appeared on cigarette packs. Within the academy, however, theoretical developments inspired by social medicine were underway. Sociologists and epidemiologists found common interest, for example, in 1950, in a study of the relationship of fetal and infant mortality to residential segregation. The study evoked Virchow’s commitment to quantification. “Medical statistics will be our standard of measurement: we will weigh life for life and see where the dead lie thicker: among the workers or among the privileged,” he vowed. 

    By the 1970s, a cadre of epidemiologists were studying the psychological, social, and cultural forces that make people more vulnerable to disease and that shape their choices regarding health. The term “social determinants of health,” which came into general usage in the 1990s, captured that formidable notion. Its more abstract cousin, “the social production of health,” examined how social inequalities affected health, and often did so with a nakedly ideological slant that implied no limit to the profession’s purview. Consider some sample quotations from faculty: “The practice of public health is, to a large degree, the process of redesigning society”; “Every problem is a public health problem”; “A school of public health is like a school of justice.” The last dictum was issued by a former dean of the Harvard School of Public Health.

    Interpreting these trends, the medical economist R.G. Evans and the health policy experts Morris L. Barer and Theodore Marmor wrote in their book Why Are Some People Healthy and Others Not?, in 1994, that “for those on the left, health differentials are markers for social inequality and injustice more generally and are further evidence of the need to redistribute wealth and power and restructure or overturn existing social order.” Yet not everyone welcomed the infusion of progressive norms into the public health academy. “We have nearly converted the school of public health from an institution committed to developing the scientific bases for disease prevention into one of many arenas for advancing social justice,” Philip Cole of the University of Alabama at Birmingham and his colleagues sternly observed in 2000. “Broadly speaking, public health is aligned with the left,” said the dean of the Boston University School of Public Health, “and there is no sense dancing around this.” He appealed to his colleagues to be “a fully inclusive left,” to “let go of always taking sides,” and to “abandon the hectoring tone [that] radicalism can entail.”

    During the protests triggered by the killing of George Floyd, many health professionals allowed their personal politics to bleed into their professional advice. A much-retweeted message from a senior epidemiologist at the Johns Hopkins School of Public Health instructed that “in this moment the public health risks of not protesting to demand an end to systemic racism greatly exceed the harms of the virus.” Three days later, 1,200 health professionals signed an open letter. “We do not condemn these gatherings as risky,” they wrote. They are “vital to …the threatened health specifically of Black people.” Speaking to the New York Times, one epidemiologist who marched remarked that “I certainly condemned the anti-lockdown protests at the time, and I’m not condemning the protests now, and I struggle with that… I have a hard time articulating why that is okay.” His honesty was refreshing, but the answer to his dilemma is that it is not okay. The job of epidemiologists is to inform the public about risks. It is absolutely not to tell them which risks are worth taking and what their moral prerogatives should be. 

    Months later, when it came time to distribute the coronavirus vaccine, an assortment of authorities, including legal scholars, public health experts, and state officials, argued for giving high priority to black citizens in the name of “historical injustice.” About that historical injustice there can be no doubt, but the Advisory Committee on Immunization Practices of the Centers for Disease Control and Prevention, or CDC, concluded that race should supersede age as a prioritization category because the oldest cohort in America is whiter than the general population. Elevating “health equity,” the task force said, took precedence. The CDC itself told the committee that its allocation plan would result in up to 6 percent more deaths, many of them black senior citizens — the highest-risk group — but the advisers remained loyal to it. Their loyalties, in other words, were to an ideal, not primarily to protecting health.

    While the CDC was developing its equity approach, the National Academies, non-governmental institutions that offer independent advice on science policy, proposed an allocation plan that would give priority to communities that rate high on the CDC’s Social Vulnerability Index. Using U.S. Census data, the index factors in poverty, unemployment, and health-insurance rates, among other socioeconomic vulnerabilities. Since minorities are more likely to meet criteria for social vulnerability, they would receive the vaccine early under that approach. Weeks later, public outcry forced the CDC committee to reverse itself. Still, on April 1, the governor of Vermont allowed anyone aged sixteen or older who identified as black, indigenous, or a person of color, or anyone who lives in a household with someone who does, to be vaccinated. Whites under fifty, unless they qualified for a vaccine by virtue of being a health care or public safety worker, of having a high-risk health condition, or of being a parent or caregiver of someone at medical risk, had to wait.

    Across campus at the medical school, the academic tradition was less politicized. Doctor-led activist groups have long existed, most notably Physicians for Social Responsibility, fifty years old this year, which shared the Nobel Prize in 1985 for alerting the world to the consequences of nuclear war, but physicians have mostly confined their advocacy, if they engaged at all, to healthcare financing and delivery. Doctors who specialized in caring for homeless people or pediatricians who treated poor children could not ignore poverty and decrepit housing, and they often collaborated with local social service agencies in keeping with their medical calling. But by oath and inclination, doctors’ eyes are, or should be, on treating the patients before them, not on reforming society. 

    I certainly acknowledge that the culture of American medicine has been changing over the last decade or so, at least among a vocal contingent. A dramatic validation of the shift took place on December 10, 2014, International Human Rights Day, when 3,000 medical students “died” on the lawns and walkways of medical school campuses across the country. The “national white coat die-in” was the brainchild of medical students who were moved to demonstrate for racial justice in the wake of the police killings of Michael Brown and Eric Garner. At the “die-in,” students wearing surgical scrubs and white jackets lay silent for four and a half minutes, symbolic of the four and a half hours that Michael Brown’s body remained on the street in Ferguson, Missouri after a white police officer shot him. A group called WhiteCoats4BlackLives, or WC4BL, emerged from the event. Its mission is to “prepare future physicians to be advocates for racial justice,” and one of its core convictions is that “policing is incompatible with health.”

    After George Floyd’s murder, WC4BL organized gatherings in medical centers across the country. These took place only a few months into the pandemic. Reeling from a triple tragedy — another black victim of police brutality, a viral death toll that unduly savaged black Americans, and their own bone-weariness from toiling in the plague-infested trenches — trainees and doctors took action. They persuaded almost two hundred state and local governments to declare racism a public health crisis. The dual premise was that racially motivated police violence is bad for the health of blacks and that systemic racism is the pre-eminent driver of the overall poorer health of the black population. The American College of Physicians pledged a “commitment to being an anti-racist organization”; the American Psychiatric Association (my trade organization) vowed it “would not stand for racism against Black Americans”; and the American Academy of Pediatrics implored its members to “dismantle racism at every level” of society.

    In the wake of George Floyd’s death, major journals published numerous essays on racism in medicine, often lifting the paywall for them. In the New England Journal of Medicine, for example, a psychiatrist called for “majority taxes” on white colleagues defined by the author as a mandate to “acknowledge your White privilege, no matter how uncomfortable; leverage privilege to highlight medical racism; and humbly and actively implement antiracist policies.” In the Journal of the American Medical Association, authors insisted that “researchers must name and interrogate structural racism and its sociopolitical consequences as a root cause of the racial health disparities we observe.” Writing in Health Affairs, six doctors, all closely affiliated with the American Medical Association, the AMA, cautioned that “while naming racism as a fundamental cause of health inequity is a crucial first step, our patients, colleagues, and communities will not reap the benefits of such declarations until racism is exposed, confronted, and dismantled.” 

    Some of these articles specifically cited Critical Race Theory, or CRT, a worldview that interprets social existence for minorities as a perpetual power struggle waged every day and in every aspect of their lives against a dominant group. Differences of any kind — in income, education, school performance, and, of course, health — are manifestations of racism and racism alone. Within the domain of medicine, the critical race perspective casts key institutions — the training apparatus (medical schools), the knowledge base (medical journals and funders of research) and the treatment enterprise (the delivery of healthcare) — as engines of oppression and exploitation. The practice of “equity,” the enactment of critical race theory, permits, if not endorses, unequal treatment of the dominant group in order to arrive at equal group outcomes, even to the detriment of ailing individuals.

    Despite the radical nature of critical race therapeutics, its proponents mean to deploy it in the service of a most conventional project: the reduction and elimination of health disparities, that is, the white-black gap in health status and in access to care. This is a fine goal and a decades-old campaign, put forth most prominently by the federal government, notably in a landmark report in 1985, and by foundations such as the Kaiser Family Foundation and the Robert Wood Johnson Foundation. But can approaches informed by critical race theory help to narrow the health gap? And can they do so in ways that do not create a zero-sum scenario in which the health of other groups is compromised?

    That question will be tested on a small scale at Brigham and Women’s Hospital in Boston in the form of a pilot study designed to give preferences to black and Hispanic patients with heart failure, a condition in which the heart muscle can no longer pump enough blood to meet the body’s needs. Minority patients will receive priority for admission to the cardiac specialty unit. According to two Brigham physicians writing earlier this year in the Boston Review, a review of medical records going back ten years showed that minority patients were less likely to be admitted to the specialty cardiac unit (with its private rooms and, presumably, more attentive care) than whites. Instead, they were more likely to be admitted to the general medical unit.

    Three percent of black patients hospitalized on the general unit died within a month of discharge, compared to under one percent of those cared for on the specialty unit. Retrospective studies such as this one are inherently limited, as the authors themselves admit, because they are conducted in hindsight and thus miss important variables. Even so, the data showed that the strongest predictor of where a patient would be admitted was whether or not they were being cared for by an outpatient cardiologist, making physician advocacy the most likely, though not the sole, explanation for unit of admission.

    Non-controversial remedies for differences in the quality of care include strengthening standardized admission guidelines or using decision tools to upgrade heart failure care on the general medical unit. The doctors at Brigham who devised the pilot openly considered these options in a paper in a cardiology journal two years ago, yet still they chose to pursue a pilot that followed a “reparations framework,” no matter the legal ramifications. As Bram Wispelwey and Michelle Morse explained in their Boston Review article, entitled “An Anti-Racist Agenda for Medicine”:

    Offering preferential care based on race or ethnicity may elicit legal challenges from our system of color-blind law. But given the ample current evidence that our health, judicial, and other systems already unfairly preference people who are white, we believe—following the ethical framework of [applicative justice] and others—that our approach is corrective and therefore mandated. We encourage other institutions to proceed confidently on behalf of equity and racial justice, with backing provided by recent White House executive orders.

    The physicians who designed the Brigham pilot justified it as “redress [for] the outstanding debt from the harm caused by our institutions.” 

    Similar developments are occurring in medical education. Last summer, the Association of American Medical Colleges informed the medical community and its 155 medical schools that they “must employ anti-racist and unconscious bias training and engage in interracial dialogues.” This spring, the AMA advocated “mandatory anti-racism [training]” as part of its vision that all physicians “confront inequities and dismantle white supremacy, racism, and other forms of exclusion and structured oppression.” The data on the effectiveness of such training initiatives, however, are dismal, with study after study showing that such efforts often backfire by reinforcing racial and ethnic stereotypes while failing to improve morale, collaboration, or diverse hiring within a workplace. Still, Michigan now requires implicit bias training for health professionals and Maryland has made it a condition of obtaining a medical license. Such training often includes the wildly popular Implicit Association Test, or IAT, a computer-administered reaction time test purported to measure unconscious prejudice and thus foretell whether an individual will engage in discrimination. The problem, according to several teams of research psychologists, is that the race IAT has no predictive value.

    Curricular reform in medical schools is also underway. At Stanford University School of Medicine, for example, a new “anti-racist” curriculum will instruct students in “confronting white supremacy.” Students at Brown University will take a four-week course on Racial Justice and Health Inequity to “gain a deep understanding of topic areas, such as Critical Race Theory, intersectionality, and the inequities that pervade the U.S. healthcare system.” At Kaiser Permanente Bernard J. Tyson School of Medicine, topics covered will include social identity, intersectionality, power, and privilege; history of race and racism in medicine and science; and media bias and literacy. Staple readings of the new curricula are White Fragility by Robin DiAngelo and How to Be an Antiracist by Ibram X. Kendi. 

    The integrity of the medical profession has certainly been compromised by a terrible history of racism. Physicians performed medical experiments on male and female slaves, trying to improve surgical techniques and better understand anatomy and physiology. In the Jim Crow era, many Southern hospitals, clinics, and doctors’ offices were completely segregated by race, and many more maintained separate wings or staffs that were legally barred from mixing black and white patients. Even donated blood was kept in separate blood banks. As Vann R. Newkirk II put it, “Within the confines of a segregated health-care system, these factors became poor health outcomes that shaped black America as if they were its genetic material.” In 1997, President Clinton apologized for the Tuskegee syphilis study.

    Students should also know that the AMA apologized to black physicians in 2008 for more than a century of policies that excluded them from the association, in addition to policies that barred them from some state and local medical societies. As late as 1966, black and white demonstrators picketed the AMA annual meeting, demanding the integration of all county and state medical societies. The association also failed to speak against federal funding of segregated hospitals and was silent in the face of pending civil rights legislation. Those transgressions are an important part of the record — but will they and other examples of racial injustice in medicine be taught as part of the encompassing social history of the field, or as a defeatist narrative that glosses over the moral progress that medicine, though still imperfect in many ways, has made?

    The fundamental problem with social justice in public health is that there are no limiting principles to it. And so the new pedagogy prompts other questions. Will coursework in basic medical science, early clinical skills, epidemiology, bioethics, or exposure to the medical humanities be displaced to accommodate the anti-racist curriculum? How will deans respond to students who do not regard the medical classroom as a suitable venue in which to interrogate their social conscience, to be told that they must “accept America’s racist roots,” or to be informed that “we live in a country [with] a political economy predicated on devaluing Black labor, demeaning Black bodies, and denying Black humanity” (as a group of medical educators writing in the New England Journal of Medicine would have them do)? Will the moral fitness of such future doctors be called into question?

    If health is completely at the mercy of social forces, as critical theory insists, will the importance of self-care be given adequate attention? It is hard to imagine that physicians will desist from discussing with patients matters such as diet, exercise, smoking, and so on — in short, actions they can and should take to improve their health. And yet, following a lecture I gave earlier this year, I was castigated by some psychiatric residents for drawing attention to the dimensions of personal agency in addiction. I was not “blaming the victim,” as charged. Quite the contrary. I was drawing attention to their potential, to the remnants of their agency.

    Will the anti-racist medical classroom accommodate controversy? Judging from the censorious milieu in some medical schools, I am not optimistic. One of my colleagues — here is one example among many — lost a departmental leadership position after trainees accused him of making them feel “unsafe.” The accusation came on a Zoom call during which my colleague objected to a fellow faculty member questioning his “support” for diversity. Surprised, he asked to know what he had said to give such a false impression, but he was never told. In an ethos in which an allegation is a conviction, an insinuation was enough. Another colleague told me that she stifled complaints when her school jettisoned lectures in bioethics to “make room for the anti-racist curriculum. Which is ironic, because that was where students were taught about subjects like the Tuskegee syphilis experiment.” A third colleague told me that during a group discussion of stress and suicide in black youth, the tacit rule was that only fear of police aggression and subjection to discrimination were allowable explanations, not the psychological torture of bullying by classmates or the quotidian terror of neighborhood gun violence. 

    One florid instance of the intolerance for controversy is the case of Dr. Norman C. Wang, a University of Pittsburgh cardiologist. Last March, he published an article in the Journal of the American Heart Association titled “Diversity, Inclusion, and Equity: Evolution of Race and Ethnicity Considerations for the Cardiology Workforce in the United States of America from 1969 to 2019.” Wang’s interpretation of the data on performance persuaded him that affirmative action in medicine was not working. “Excellence should not be sacrificed for short-term demographic optics,” he concluded. When news of Wang’s peer-reviewed paper hit social media last August (its initial appearance in March 2020 garnered little notice because it coincided with the onset of the pandemic), the reaction was swift.

    Physicians savaged him on Twitter. “Rise up, colleagues. The fact that this is published in ‘our’ journal should both enrage & activate all of us.” “Racism is alive, well, and publishable in medicine.” “We stand united for diversity equity and inclusion. And denounce this individual’s racist beliefs and paper.” The school fired Wang as director of the electrophysiology fellowship and banned him from having contact with medical students. The American Heart Association emphatically tweeted that Wang’s article “does NOT represent AHA values,” and it launched an “investigation to better understand how a paper that is completely incompatible with the Association’s core values was published.” Once alerted by Wang’s own medical school to allegations that his article contained “many misconceptions and misquotes” that “void … its scientific validity,” the journal retracted it. 

    Diversity is one of the most pressing issues in medical schools today. Nationwide, five percent of physicians are black, under half the national demographic of 13.4 percent. Black graduates are more likely to practice in underserved areas, and some evidence suggests that black patients enjoy better communication with doctors of the same race. To bolster those numbers, many medical school admissions committees employ a “holistic review framework,” created by the Association of American Medical Colleges to consider applicants’ experiences and attributes in addition to academic achievement. “Situational judgment” and emotional intelligence are taken into account at several dozen schools. Some colleges offer special programs to shore up the academic record of aspiring black pre-med students. Yet despite these robust efforts and others, progress has been exasperatingly slow.

    Another delicate topic is the relationship between race and disease. Some worry that putting “genes” and “race” in the same sentence will encourage the fiction that races are discrete entities defined by biological traits. With science literacy among the public so tenuous, the worry is not misplaced. But studies involving genes and race are simply about population genetics: people sharing a geographical ancestry are more likely to have particular gene variants (alleles) in their genome than are people with a different heritage. These variants may code for proteins or enzymes that cause vulnerabilities to certain diseases or determine how robust a response to treatment is likely to be.

    Race is thus a shorthand for ancestral descent — and the more precise the ancestral origin the better, as variations in genetic heritage exist even between groups within a geographical region. Genetic admixing, that is, when parents are of different “races” or are mixed race themselves, further complicates the picture to the point where the shorthand of race becomes irrelevant, or too crude a category to be of any help at all. Researchers and physicians agree that the science of pharmacogenomics — the elucidation of the relationship between treatment and individuals’ unique genomic fingerprint to create personalized therapies — will make the controversy obsolete. But until this gold standard is used widely, group-based genetic analysis will have some value.

    Even with the caveats in mind, genetic heritage can be relevant to medicine with regard to appropriate dosing of certain drugs, more accurate prediction of responses to those drugs, clinical decision-making via algorithms (an especially controversial matter that scientists are currently debating in good faith), and heightened risk for certain conditions, such as cardiovascular and renal disease. 

    A recent study in the Journal of the American Medical Association uncovered an interesting finding correlated with race. In a sample of about three hundred patients at a New York medical center, blacks had stronger expression of the gene that codes for Transmembrane Serine Protease 2, a protein known as TMPRSS2, than did white, Asian, Hispanic, or mixed-race patients. TMPRSS2 sits on the surface of cells lining the nose and is involved with entry of the coronavirus into those cells. Will that finding hold up on replication? Perhaps not. And if it does, the protein likely accounts for a small part of the racial variation in COVID-19 infections, the lion’s share accounted for by social factors. Still, the investigation yielded potentially important findings. Science, after all, is provisional, cumulative, and, eventually, self-correcting.

    Yet this study, too, provoked a swarm of angry responses from doctors and health professionals. “This is sounding way too much like blaming and rings of eugenics.” “Race IS NOT genetic.” “Stop …systemic racism is why [black, indigenous, and people of color] are disproportionately harmed by COVID-19.” “I think this would hold water if by ‘TMPRSS2,’ you meant ‘racism.’” “Shame on this publication for perpetuating racism.” “Biomedical racism to a T.” A team writing in Health Affairs warned researchers who planned to publish on health disparities to “never offer genetic interpretations of race because such suppositions are not grounded in science.” They also proposed that medical journals “reject articles on racial health inequities that fail to rigorously examine racism.” The article-review process, they say, requires “editors who are well versed in critical race theory.” But why? Genetic inquiry across groups is emphatically not “racial science” or scientific racism. The objectivity of research is not a form of complicity in structures of power; it is the very condition for the discovery of treatments that are genuinely universal.

    Concerned by the disavowal of such studies, experts spoke up. “For some applications, race may continue to be the best variable to capture the influence on health,” wrote John P. Ioannidis, Neil R. Powe, and Clyde Yancy in the Journal of the American Medical Association. “Quick dismissal,” they cautioned, “may worsen outcomes, especially for the most disadvantaged populations.” In the New England Journal of Medicine, five genetics experts, who identified themselves as black, declared that “ideally, race will be replaced with genetic ancestry as a variable in medical research and practice. But until more ancestry data are available, ignoring race and extrapolating research findings from European ancestry populations to others is neither equitable nor safe.” The authors expressed disappointment that some “curricula promote ideologies that downplay the medical achievements of genetic studies.”

    Several months after Rudolf Virchow returned from Upper Silesia, he started a weekly newsletter. Although Die Medizinische Reform lasted only a year, many of the aphorisms enshrined in its pages live on. The masthead dictum — “physicians are the natural attorneys of the poor” — is among the most famous. In Virchow’s time, a physician was able to make a powerful case to politicians that the major scourges of the day — contagion, malnourishment, and starvation — required effective sanitation, adequate nutrition, and the alleviation of extreme poverty. In short, significant social policy. The connection between health and reform — civil engineering and food — was direct and obvious, “a remedy against the recurrence of famine and of great typhus epidemics,” as Virchow told the Prussian minister. (This was especially the case because antibiotics did not yet exist.)

    Under a regime of critical medical theory — CMT? — the mandate for change — “dismantling racism” — presents doctors with an unworkable challenge. For one, physicians are wholly ill-prepared for such a task. Their primary job is to diagnose and to treat — and to do no harm in the process. They have no expertise in the redistribution of power and money — nor can triage or surgery wait for such a redistribution. By urging reform of this kind in the name of health, good intentions aside, they risk abusing their authority, using the profession as a vehicle for politics, and, ultimately, eroding the trust of the public. 

    Moreover, the mission itself is too ambiguous. Even for seasoned policy analysts, teasing out a strong causal link between health and sprawling upstream economic and social factors is very difficult. With so many “intervening variables” at play, manipulating policy in the service of health may not have its intended effect, while the odds of creating unwanted repercussions elsewhere in the system are significant. 

    None of this is to elide the fact that much of black disadvantage in health is the cumulative product of legal, political, and social institutions that have historically discriminated against them, either explicitly or through passive disregard of the differential brunt of policies, and in certain instances still do. As a result, blacks have comparatively fewer opportunities for better health. The neighborhoods in which they are more likely to reside attract lower levels of civic investment. This in turn leads to underfunded hospitals, fewer emergency services, pharmacy deserts, worse air and water quality, and fewer supermarkets and safe options for outdoor exercise.

    There is indeed a race-based story to tell about why, in aggregate, black Americans suffer poorer health and receive less care than whites. It is a story that delivers real and painful truths. The pandemic served as an object lesson in differential exposure to the virus, with rates of coronavirus infection that were three times as high in blacks as in whites. With jobs as lower-paid essential workers (e.g., transit workers, building maintenance staff, grocery store employees), dependence upon public transportation, and residency in dense quarters, African Americans were at higher risk.

    And yet “systemic racism” is not a useful medical diagnosis: it may have explanatory value but it doesn’t yield realistic prescriptions. So what are physicians supposed to do now? Only when explanations are able to bring causal dynamics into sharp focus will they reveal efficient points of entry into the healthcare apparatus for minimizing health gaps. In California, for example, too many black patients with colon cancer were falling through the cracks. When such patients were treated in an integrated health care system — a point of entry where all aspects of care were delivered under one roof — they fared much better than black patients treated in other settings. As a result, survival was the same for blacks and whites. Such initiatives are hard at work in cities across the country.

    The totalizing narrative of race, like all totalizing narratives, dangerously simplifies things. It discounts other ways to illuminate the black-white gaps in health. Consider, for example, the constellation of disadvantages called Adverse Childhood Experiences, which bear a well-documented relationship to future health. Imagine a succession of deprivations and insults, toppling one after the other like dominoes across the lifespan. Start with mothers who receive little to no prenatal care. Their poorly thriving babies are born into an often fatherless world full of chaos, physical and emotional abuse or neglect, and domestic and community violence. These are not hostile stereotypes; they are real-world phenomena that must be faced if their consequences are to be understood — and, optimally, prevented, buffered, or reversed.

    The stress of sustained trauma can alter children’s neural maturation and hormonal function, predisposing them to problems such as poor emotional regulation and stunted cognitive development, including working memory, attentional control, and cognitive flexibility. These deficits, in turn, may disrupt the formation of healthy attachments to other people and lead to weak performance in school and low educational attainment overall. As often lonely teens with a foreshortened sense of the future, they are prone to risk-taking with drugs and alcohol, reckless driving, and unprotected sex. As adults, they are often burdened by depression and despair and tend to smoke heavily, drink to excess, and abuse drugs. Next comes disease, mainly in the form of cancer, cardiovascular disease, diabetes, and renal illness. Then, premature death. The more adverse experiences, the greater the odds of these otherwise avoidable health consequences. Racial and ethnic minorities and Appalachian youth, as epidemiologists have shown, are at greater risk for more adverse experiences.

    The médecine sociale that Jules Guérin defined and the noble Rudolf Virchow once practiced was about the “numerous relations between medicine and public affairs.” Social medicine today, having been irradiated by critical race theory, has mutated into a belief that only one relation matters: systemic racism. From this monocausal vision, so constrained by its nature and animated by grievance as it is, has emerged a host of unhealthy developments. The worst of them is a doctrinal intolerance of explanations that lie outside the oppression narrative.

    Add to this an intensely politicized environment that threatens academic collegiality, open inquiry, and unapologetic discourse. Round out the project with contempt for the notion of personal responsibility in health, and permission to erode the boundary between personal politics and professional obligations. Perhaps the apotheosis of the critical medical imperative is the dispensation that it grants a group of Boston doctors to miss the real question — how a hospital can deliver the best care for each patient it serves — in favor of a righteous trial of racial preferences that might harm other patients.

    Whether critical medical theorists represent the tip of an iceberg or the far tail of a bell curve seems moot, given their formidable influence at elite medical schools. Many deans and chairmen are doubtless too intimidated to resist. At the same time, however, their youthful colleagues are likely to be sympathetic to the critical justice project. Over the past decade, according to an analysis at Stanford University in 2019, young physicians have been moving so “sharply to the left” and flocking so densely to urban areas — “ideological sorting,” the authors called it — that rural areas are suffering from shortages of physicians. 

    The spirit of social medicine is precisely what should inspire some of those young doctors to set up practice in a rural minority town. If being anti-racist is their priority, it is probably the best gift they can give. That spirit should also prompt us to challenge the status of the black-white gap in health as the dominant measure of our wellbeing as a population. Just as a hammer is predisposed to see all problems as nails, emphasizing such gaps — now routinely called “health inequities” — leads inexorably to the quixotic conclusion that dismantling racism is the medical answer. And the tyranny of this gap forecloses another, more universal definition of disparity: the differential between a person’s current health and their optimal health, between the quality and quantity of the care that they are currently receiving and what, as a matter of right, they deserve. 

    The strict imperatives of clinical practice may be the best buffer against ideology. The surgical suite, the emergency department, and the examining room are the definitive, consequential spheres of clinical intervention. When applications to medical school rose steeply last year in the wake of the pandemic, a phenomenon dubbed the “Fauci effect,” the young applicants were surely inspired by the extraordinary heroism of doctors and nurses and the technical prowess of medical science.

    Physicians are still the natural lawyers for the disadvantaged, but in their own way. In the clinic and at the bedside, they argue most eloquently through their specialized knowledge and their compassion. In medical journals, they spread knowledge through dispassionate, truth-seeking methods that speak to all. And in the realm of social medicine, they do their best work aiding those who are most vulnerable and in need, regardless of group affiliation. The best way to be an anti-racist doctor is to be a good doctor.

    Three Tales

    MONDRIAN

    Mondrian’s closest friend was the Dutch painter Eli Streep, a Jew who was caught in a raid in Paris in 1942 and murdered. Mondrian had escaped by then, via London to New York. Streep and Mondrian saw each other almost every day in Paris during the many years they both lived in the same shabby building on the Rue du Départ by the Montparnasse railway station. They had been schoolboy friends in Amsterdam, and they were among the first young painters to notice the death of the almost unknown Vincent van Gogh, a few of whose strange paintings had attracted them. They even visited Theo van Gogh’s young widow, Jo, to see more of her brother-in-law’s pictures after she had married the painter Cohen Gosschalk.

    Mondrian seems to have painted the first of his purest signature works around 1921, those with the first slightly thickened black lines, vertical and horizontal. He and Streep had always been struck by van Gogh’s intense, sometimes black outlines, his way of outlining faces and bodies as well as houses and trees. These painting-drawings of Vincent’s last few years, done in the insane asylum near Arles and then in Auvers in the north, where he killed himself, became, for Mondrian, depressed images from which he plucked the outlines, and blackened and straightened them out upon his own white canvases until the lines depressed his own art, or so he thought.

    Streep encouraged Mondrian each day so that their poverty — they subsisted on bread, potatoes, and coffee more days than not — was little noticed. Streep thrived, if that is the word, on Mondrian’s radical ambitions for his unhappy black stripes, so new to the practice of art. Streep became Mondrian, in the sense that he enhanced Mondrian’s work and meager life. I mean that Streep’s excitement about his friend’s painting was becoming to them as any pleased audience is becoming to a drama or a comedy. When one of them sold a picture, which was rare, they ate out in a little bistro and then went to a brothel, as Vincent and Gauguin used to do in Arles when they lived together.

    The thickened black lines of Mondrian began to cast a spell, even a black spell one might say, on some of the few visitors to Mondrian’s room. This room in Paris served as a studio, bed-corner, and hidden little kitchen. When visitors arrived at 4:30 in the afternoon, Streep would usually be there too, having come down from his own room for the company. Paris often gets dark early and Mondrian would put a single dangling naked light on above his canvas, above his black lines. Three or four people would huddle around the easel in the darkness. Streep would be in his element. He rejoiced for his Dutch brother and for himself.

    Mondrian’s black lines became talked about in Paris and even back in Holland, among a very few, of course, and hardly any buyers.

    “I love Mondrian’s stripes,” a woman in a café would say.

    “Mondrian’s black lines are so sad,” said another woman, “sad and good.”

    “And right,” someone added.

    “How do you mean?”

    “Right for me and my sadness. Right for the sadness of everyday streets.”

    “Yes, one line going up and down and two lines across town. No let-up, just new sets of paths from picture to picture.”

    If someone did buy, only a few francs were involved and usually in installments. Paris began to feel war in the air. Streep said that the Jews were in trouble as usual. Half of those few interested in Mondrian were Jews. Jews in trouble. Some left Paris and Europe. Mondrian’s stripes saddened more and more days. Primary colored blocks of flat color between some black lines didn’t help to lift anyone’s spirits. But when the canvases were without color, fear could strike some timid souls.

    Mondrian and Streep embraced and took leave of each other.

    “I’ll follow you soon,” said Streep. He had some family concerns to take care of first. Mondrian and a lady friend of his got to New York together. Paris fell. A refugee, who made it out of France, told Mondrian that Streep was dead.

    Mondrian painted his black, black lines in New York, in his small room there. No color. He lived alone but he saw the woman he left Paris with once or twice a week. They went dancing and then slept together. Mondrian’s pictures of the Passion back in Europe became loved by some of those who loved art in New York.

    “His crossed lines are about his Jewish friend Streep now,” wrote a critic.

    Mondrian was my favorite teacher at the Cooper Union when I went there as a teenager in 1950. He and I became good friends. I will never forget him, his suit and tie, his white painting smock, his combed-back hair. But the real great memory for me was in his darkened room, huddled with him in front of his easel upon which he had placed a little picture, a single light dangling above.

    “Is this really about what happened in Europe during those years?”

    “Yes, but it is also about now, and it is also about tomorrow. Black lines which began then, crossed, and kept on their way.”

    HOW TO PAINT A PICTURE

    “Paint on a very small canvas,” said the aged painter to his listener.

    “Why?”

    “Because small is now better than large. Taste has shifted. Well, really, my taste has shifted and that’s what counts.”

    “But will people take the time to scrutinize small pictures? I mean, people lead busy walk-by lives.”

    “Sure they will, even if only because they do lead such busy lives. People will like to slow down and sort of stop the world to study a little picture closely. That’s called Culture.”

    Then the painter put a small canvas on the easel. He was in a plastic chair in front of it. The listener stood behind him, rapt and apprehensive.

    “What have we here?” asked the painter, mostly to himself. 

    “I see a face of Jesus the Jew,” he said to the blank little canvas.

    The painter, who was about seventy-five years old and not in good health, got up and went to a bookcase, from which he withdrew a fat Rouault book from a section of about a dozen books and catalogs about Rouault, a painter who had fascinated him for many years, but not really one of his core or important painters from Giotto to Matisse. He slumped back down in his chair and flipped through the book until he came to the tipped-in color plate he was looking for. It was a head of Jesus and the text about it said: “It is one of the most monumental works ever produced by the brush of an artist.” The painter read that out loud to his listener, as if the opinion of the writer was a stunning thing. After a lifetime of, well, art, it was stunning because the Rouault plate, although he rather liked it, did not seem so extraordinary as all that. Undaunted, the painter put the book down by his chair on top of a pile of other books, which included a Mondrian book, a Greenberg book, a book of van Gogh self-portraits, an Auerbach book, a Bomberg book, a Cézanne watercolor book, and a few others. Then he went up to his canvas and drew a Jesus face very quickly with a few lines using a charcoal stick. It looked like an abstract group of about a dozen black strokes.

    “There, that’s a good start. It could be as good as the Rouault, but I hope my pictures will be better, whatever better means.”

    The listener, a very young painter, said she would come back in a few days and left him alone there.

    She was anxious to leave because she was having a hard time with her boyfriend and was determined to end her affair. At the same time she was excited to learn how to paint a picture from this old painter who was doing it in a way she had never known before. The old painter was lonely and he encouraged her to drop in whenever she liked. He enjoyed her young, untried company. She reminded him of Audrey Hepburn. Let’s call her Audrey.

    When Audrey left, he studied his abstract Jesus. He had just read a book called Rabbi Jesus by an Oxford scholar who placed Jesus among the great Jews. What if he made his pictures quite more abstract? He was devoted to the idea of painting a face unlike the faces any other painter had done before. Giotto had done that, and so had Picasso, and so had many painters. The attempt was quite a tradition, you might call it a revolutionary tradition. He got out a tube of Mars Black and began to brush in more abstract lines and splotches until you could hardly make out the features of a face at all. Some of the faces of Cézanne’s bathers are like that, are they not? He really started that radical idea to change the face. So Jesus kept changing until the painter achieved a kind of Mondrian Face, with four different lines describing the sides, top, and bottom. The face of the face seemed to be in chaos, or, as in some Kabbalist descriptions of God, Nothing.

    But the painter did not believe that Jesus was God, so he rescued his Jesus from Nothing and kept re-introducing human features after all. Color. Should there be color in the face on the little canvas? Perhaps not. How about a face decided by black strokes? Crucial black strokes if one was lucky, but maybe not final, as Jesus is not final. The painter hobbled over to a wall full of art clippings. A Mondrian self-portrait and a Manet head of Monet were pinned next to each other. Each was a collection of crucial strokes verging on abstraction. Each face was good to read because each was like new after all these years. Back in his chair, he squinted at his Jesus. It looked very unfocused, but then so were the Mondrian and the Manet. The painter talked to his dead wife, to his tall, last portrait of her, made just before she died at forty-six. In the portrait she wore a black Armani suit; the canvas leaned against the wall behind Jesus the Jew on an antique easel.

    “Love means that I want you to be,” he said to her, quoting Augustine, which was Hannah Arendt’s favorite saying.

    Silence.

    His wife had been dead thirteen years. He felt half-dead himself. He was. He was also surrounded by his latest little paintings, many almost finished, like his life.

    He went to his wife and kissed her on her lips. “Kiss me back,” he said to her, “to death, to the After, so I can go looking for you in the Great Whatever. Meet me at the Station, hold my hand and lead kindly light.”

    Silence.

    Back to Jesus, he glanced at his other Jews, those faces on the dozen little pictures he was working on around the room. Proust as a boy, Scholem, Kossoff after Auerbach, the author of the Zohar, Max Liebermann, an abstract Einstein, an Impressionist Pissarro, Hannah Arendt in Jerusalem… Then he surveyed his studio walls as he always did again and again. Photos of faces: his portrait of Isaiah Berlin, his snapshot of Guston, reproductions of Auerbach faces, Matisse faces, Rembrandt faces, an Ingres profile, dozens of van Gogh self-portraits. He was walled in.

    The next few days the painter kept changing the Jesus lines — arabesques, short stabs, straight abrupt lines, wedges, curls, angles, broom-like swathes, mouth lines, eyebrows with loose hairs, eye lines, wrinkles, grooves, jaw and cheek lines… all composing themselves, disappearing and always changing. But Jesus was still there — a Jesus. There were still only about twelve or fifteen lines, all in Mars Black. Abstract Mondrian lurked in the face and so did late Rembrandt — especially his face of Julius Civilis. It was a little line painting, cavalier as to contour. It was a little culture.

    Audrey returned one day, having broken with her lover. 

    “The little Jesus looks dense, oblique, slant, difficult, ongoing,” she remarked, delighted. “Is that how to paint a picture? I mean, surrounded by ideas?”

    “Yes,” he said. “For now it’s a way. Come over here.” He put his arm around her waist.

    “Where did you move when you left your boyfriend?”

    “Back to my mother.”

    “Come and live in my spare room. No rent. I just like you here.”

    “I don’t trust you to be good.”

    “C’mon, I’m as old as your grandfather.”

    “So was Picasso.”

    “I can’t get it up anymore. And don’t I show you how to paint a picture?”

    So Audrey moved in with him, as with Professor Higgins.

    THE GIFT

    Ron’s father owned much of Westwood Village in Los Angeles and when he died he left it all to Ron, who had just graduated from the Royal College of Art in London in 1962. Ron was already very wealthy because his maternal grandmother, who had been born to a Hassidic family in Podolia, had long before married the owner of Maggs Bros., the great rare-book sellers in Berkeley Square in London, and Ron inherited that business and its building when he was still a student. He lived above Maggs Bros., in his own flat and top floor studio. Maggs Bros. lived on in the good hands it enjoyed, plus Ron, who loved books and art and women. He took great interest in his new book business, aside from his life as an emerging young painter.

    Ron and David Hockney were close friends, having been together at the Royal College. They had begun to draw each other there. When he graduated, Ron was put under contract by the hugely successful Marlborough Fine Art in nearby Bond St. In the early 1960s, Marlborough also represented other young Jews such as Frank Auerbach, Leon Kossoff, and Lucian Freud, along with the estates of David Bomberg, Mondrian, Eli Streep, and Jackson Pollock. Ron commissioned Auerbach and Freud to paint his portrait and those paintings were duly hung in the tall house in Berkeley Square full of books and pictures. Like rich young painters before him, such as Caillebotte and Bazille, Ron collected the paintings of extraordinary painters he knew and grew up with in London during a period when London was home to an unusual number of world-class artists, of whom Francis Bacon was probably supreme. Ron’s own paintings were avidly bought by museums as well as collectors in Europe and his native America as well as in England. And his rare-book business continued to flourish.

    Of course many gems were plucked from the thriving international book trade in his ground-floor shop and spirited upstairs into the private rooms of the young owner, and a surprising number of these books, upon being read, would inspire the picture-making in the light-filled studio high in the house. Later the pictures that Ron made would travel back downstairs to Berkeley Square and around a few corners in Mayfair to Marlborough Fine Art Ltd., often to be sold within days or weeks. For many years, Ron could not or did not keep any of his paintings for his own collection. Selling his own paintings made him triply rich and by the time he was thirty he had more money than maybe any painter ever had in history.

    He lived rather simply, though. He had no car, no country house, he paid no attention to his clothes, did not drink alcohol, ate very little, and hardly ever left London, or Mayfair for that matter. Even the various attractive young women in his life had to cope with whatever life they found in the tall house in one of London’s most expensive and famous squares. Just about his only extravagance, aside from the rare books he chose from his own store, such as Pound’s first book, A Lume Spento, or the first printing (Munich) of Kandinsky’s Über das Geistige in der Kunst, was his picture-collecting. As early dusk fell and the light failed at his easel, Ron would go out and stroll around the art galleries close to his house. At least once a week he would find a picture to buy. A Sickert or a Bomberg or a Cézanne watercolor or a Degas brothel monotype, as well as paintings by his London contemporaries. From time to time he would buy a very expensive painting at auction. He owned six or seven paintings by Cézanne, his favorite painter.

    Mayfair in those days was also what one might call a red-light district, up-market from Soho, beyond Regent Street. The Street Offences Act had driven working girls out of the streets and doorways of the West End of London into flats upstairs. These flats could be found behind the art galleries that Ron loved to wander. It was his habit, now and then, to walk around his district after having prowled the galleries, perhaps having bought a Morandi just as the gallery closed at night. First prowl art, then prowl sex. Ron most often called upon his favorite girl, or woman. Her name was Denise. She was French and about the same age as Ron, in her early thirties, when Ron started to visit her. Her flat was in Bruton Mews, which curled into Berkeley Square itself. Bruton Mews was right behind art galleries such as Lefevre and Tooth’s, where examples of Ron’s own paintings were sometimes on view.

    A little card in the doorway said FRENCH MODEL (DENISE) ONE FLOOR UP. Denise looked a little like Jane Fonda as she played the prostitute in Klute, though her hair, cut like Fonda’s in that movie, was blonde. Of course Klute had not yet been made. Denise was handsome, square-jawed, and very obliging, which many whores hardly are, believe it or not. I mean to say, after studying them all my life, I can say that about half these magical women act, what shall I say, relatively unfriendly to their clients. And the other half are, well, friendly. Nice and not nice.

    Denise was a nice person, never in a hurry except that the usual drama, from the first embrace to the end of the act, rarely took more than a half hour, often less. Of course, sometimes Ron had to wait if Denise was busy. A maid would open the door and usher Ron into a room with erotic magazines on a coffee table, saying “Denise will see you soon.” He studied the pictures on the wall, which tended to be pin-ups of naked women. Before long, he would hear the activity of the last client leaving and Denise would appear smiling and wearing a short housecoat. They would then go into her bedroom, which had a large mirror alongside the bed, and a bathroom. Half an hour later Ron would stroll to the end of the Mews feeling good and not at all the shame that Flaubert said he felt upon leaving a brothel.

    Ron could see his Maggs Bros. shop across the night square and that felt good too, just thinking about all those writers whose books were on those shelves in the dark. His books. Joyce, Svevo, Babel (his scarce Benya Krik film script), Gissing (who married a prostitute), Hart Crane’s The Bridge in its Black Sun slipcase, hundreds of rare books for sale garbled in Ron’s mind with paintings and women. Ron was often not in full possession of his faculties, not free from confusion or giddiness.

    Denise. Ask her out to dinner. One night he did. She was so nice she said OK, amused to be asked. She said she took days off, so how about next Sunday. All that Sunday Ron painted in his studio, along one wall of which he kept his art books, the twenty-five painters he liked best, from Giotto to Mondrian and Matisse. He had never asked a working girl out before and as he painted on a small canvas inspired by a Lautrec oil sketch in his possession he was excited by its bravado technique. The theme of Ron’s painting was Jacob wrestling the angel, which he had tried before and failed.

    In his journal, Delacroix said he could hardly paint through the day without promising himself a woman at night. Or he said something like that. At table, in the restaurant, Denise told Ron that she had been in London eight years, having started in her profession in Paris. She had a ten-year-old boy at boarding school in France and she saw him quite often.

    “Are you OK, I mean happy?” asked Ron.

    “Reasonable happy,” she said. “I mean I’m not depressed or anything.”

    “I’m glad,” Ron said, “because my life so far has been very lucky. I’m rich as hell and someone like you makes me even richer, it seems to me.”

    “Why is that?” asked Denise.

    “Because you seem magical to me,” said Ron. “Your daily and nightly life, as I can only imagine it, is so original and traditional and secretive and daring and, well — public.”

    Denise smiled her winning smile and said, “Yeah, I suppose it’s all those things.”

    “Has anyone asked you out before, like this?”

    “No, but one or two of my regulars seem almost like friends.”

    “That’s because you act beyond the call of duty. You bring out the best in a guy like me.”

    “Well, I sort of like what I do. I like the daily grind of it, the sameness, and what you call daring — the daring sameness. Even though guys want to do a lot of different things with me, it all seems like sameness, and yet every time I’m with a new man I’m sort of aware it’s unusual to do what we’re doing — daring sameness.”

    “Sounds like painting,” said Ron the painter.

    Back in his fabulous house, he put on the lights for her in hushed Maggs Bros. and then they went up two flights to his flat and then up the next flight, which was all pictures he had collected on the walls and on easels which stood everywhere like a small forest.

    “What’s this?” she asked. “It’s beautiful.”

    “It’s a watercolor by Cézanne.”

    “Oh, I’ve heard of him.”

    “He’s my favorite painter of all,” said Ron.

    “Why?” she asked.

    “Well, it’s a long story. He’s considered to be the father of Modern Art. He was the first painter to break away from the sort of realistic painting people expected, so he was laughed at by almost everyone, and he and his pictures were always insulted. I just love him for having the audacity to re-invent the human figure in extraordinary ways. He died in 1906, when a few people had begun to appreciate his revolution. He lived in Aix-en-Provence. Have you been there?”

    “No, I never went to the Midi.”

    “He was quite rich because his father was a banker in Aix, so he always returned home from hostile Paris, but they thought he was crazy in Aix, too.”

    “What’s this? It’s so sexy.”

    “That’s a little line drawing by Egon Schiele. He died when he was twenty-eight. He was also mostly attacked. He drew beautifully, didn’t he?”

    Then they went upstairs to Ron’s studio. She said she had never been in an artist’s studio and that she knew nothing about art. Ron said that he had made the top floor into one large room with a skylight that let in northern light, which painters liked best. The concept of lofts had not yet become common and many of the London painters worked in rooms or small studios built for artists long ago. Denise walked around looking at the dozen or so paintings, some on easels, that Ron was working on. She said very little, but Ron tried to explain the few pictures to her in the context of what he called Modern Art.

    She was amazed by the hundreds of oil-paint tubes all around and said that she had no idea there were so many different colors. Ron said he would like to paint her if she would pose nude for him, but it would take a few sessions, maybe as many as ten. He would pay her well, he said. She smiled and said she would think about it. He pulled a Manet book from his long Manet shelf and showed her Olympia and Nana, and he told her that these were very great paintings of working girls like her, but that sadly he could not paint as well as Manet, though he lived in hope. He told her that these paintings and Manet’s art were hated and ridiculed in his lifetime, and that he died when he was fifty-one.

    “This is all so new to me,” she said. “My mind is spinning with the pleasure of all you have shown me. I want to, how do you say — digest it.”

    “Will you work tonight?” asked Ron.

    “I don’t know. It’s still early for me.”

    “I’ll walk you home, neighbor.” So he did. It was about 11. Back in his house, Ron felt sort of excited. He imagined Denise back at work. He picked up a little brush and painted a few strokes of Mars Black, deepening the outline of a head of Spinoza he was trying to do. He loved Spinoza and had been reading his difficult Ethics as well as Valentiner’s book Rembrandt and Spinoza. The two geniuses probably knew each other in Amsterdam’s Jewish Quarter, where they both lived, but no one has ever been able to prove it.

    Ron cleaned his brushes in turpentine, turned off his studio lights, and went down to the floor below where he kept so many of his pictures by other artists. After a while he left his house and walked across the square to Bruton Mews. The lights were on and he read the little card: FRENCH MODEL (DENISE) ONE FLOOR UP. Up he went. A man hurried past him down the stairs, having emerged from Denise’s flat. Ron rang the bell of the door on which another card said just DENISE. She opened the door herself. She was in her short housecoat and high sandals.

    “Ron!”

    “I felt like seeing you again.”

    “Come in.”

    They went into her bedroom.

    “Now what?” she smiled.

    “I’ve got something for you,” he said, and pulled the little Schiele nude drawing from a plastic shopping bag.

    “For me?” she said, astonished. “But it must be so valuable.”

    “So are you. I only paid a few hundred dollars for it when I was a student in Vienna many years ago. Yes, now it is quite valuable, so try to keep it safe. Here in your bedroom would be a good place. No one will think it’s an original drawing by a famous artist.”

    “How do you spell his name?”

    “S-C-H-I-E-L-E. Egon Schiele.”

    “You are crazy. Crazy and rich and so sweet. I’ll put it up over the bed where it belongs. Thank you. Thank you. Thank you.”

    Denise dropped her little coat. Naked, she urged him onto the bed and he joined her there. The bedspread stayed on the made-up bed as usual. While they were having sex, the doorbell rang and they heard the maid let someone in.