Fiorello La Guardia was a great mayor of New York — he even has an airport named after him — but he made some boneheaded errors. Some years after the Sixth Avenue El in Manhattan was razed, La Guardia and the city council decided to rehabilitate the neighborhoods around the thoroughfare, which had become run down from hosting the elevated train. And so, in October 1945, they officially rebranded Sixth Avenue as Avenue of the Americas.
City planners must have found the cosmopolitan-sounding name exciting. New York City was emerging as the global capital, on the cusp of the American Century: home to the new United Nations and soaring International Style skyscrapers, a hub of commerce, a dynamo of artistic creativity. But this act of renaming by fiat, against the grain of public opinion, failed spectacularly. A survey ten years later found that, by a margin of 8 to 1, New Yorkers still called the street Sixth Avenue. “You tell someone anything but ‘Sixth Avenue,’” a salesman explained to the New York Times, “and he’ll get lost.” Generations of visitors have noticed signs that still say “Avenue of the Americas” and wondered fleetingly about its genesis and meaning, but for anyone to say it out loud today would clearly mark him as a rube.
Names change for many reasons. While designing Washington, DC, in the late eighteenth century, Pierre L’Enfant renamed the local Goose Creek after Rome’s Tiber River. It was a bid for grandeur that earned him mainly ridicule. After Franklin Roosevelt was elected president, Interior Secretary Harold Ickes saw fit to cleanse federal public works of association with the most unpopular man in America, making the Hoover Dam into the Boulder Dam. With independence in 1980, Rhodesia ditched its hated eponym to become Zimbabwe, and its capital, Salisbury, became Harare soon after. When it fell to the North Vietnamese in 1975, Saigon was renamed Ho Chi Minh City, however propagandistic the appellation still sounds. On Christmas Eve, 1963, Idlewild Airport became JFK. In 2000, Beaver College, tired of the jokes, chose to call itself Arcadia. (Et in Beaver ego.) Even old New York was once New Amsterdam.
Like the misbegotten Avenue of the Americas moniker, though, new names do not always stick. Who but a travel agent calls National Airport “Reagan”? Where besides its website is the New York Public Library known as “the Schwarzman Building”? In 2017, the Tappan Zee Bridge formally became the Mario M. Cuomo Bridge, thanks to its namesake’s son, but everyone still calls it the Tappan Zee. (Few knew that for the thirteen years prior it had been named for former New York governor Malcolm Wilson; in fact, few knew that someone called Malcolm Wilson had been governor.) Everyone also still calls the Robert F. Kennedy Bridge the Triborough and the Ed Koch Bridge the Queensboro.
Political events prompt changes, too. When in 1917 German aggression forced the United States into World War I, atlases were summarily revised. Potsdam, Missouri, became Pershing. Brandenburg, Texas, became Old Glory. Berlin, Georgia, became Lens — but after the war, with the rush to rehabilitate Germany, it reverted to Berlin. (During the next world war this Berlin declined to change its name again, though 250 miles to the northwest Berlin, Alabama, rechristened itself Sardis.) In 1924, the Bolsheviks saddled splendid St. Petersburg (by then Petrograd) with the chilling sobriquet Leningrad — “after the man who brought us seventy years of misery,” as tour-bus guides tell their passengers. Only with Communism’s demise could city residents reclaim their old appellation.
The revision — and re-revision — of place names is thus a common enterprise. But how and why those in control choose to re-label streets, cities, schools, parks, bridges, airports, dams, and other institutions has always been a strange, unsystematic process — subject to changing social norms, political fashions, historical revisionism, interest-group pressure, the prerogatives of power, consistent inconsistency, and human folly. The current craze for a new public nomenclature, in other words, is far from the straightforward morality play it is often made out to be. How we think about renaming, and how we go about it, deserve more deliberation than they have received.
Today’s nomenclature battles mostly turn on a specific set of questions: about race and the historical treatment of non-white peoples. Every day, in the United States and abroad, new demands arise to scrub places, institutions, and events of the designations of men and women who were once considered heroes but whose complicity (real or alleged) in racist thoughts or deeds is now said to make them unworthy of civic recognition. Not only Confederate generals, upholders of slavery, and European imperialists are having their time in the barrel. So too are figures with complex and even admirable legacies, as diverse as Christopher Columbus and George Washington, Andrew Jackson and Woodrow Wilson, Junipero Serra and Charles Darwin, David Hume and Margaret Sanger — even, although it sounds like parody, Mohandas K. Gandhi.
What has led us to set so many august and estimable figures, along with the more flagrantly reprehensible ones, on the chopping block? It helps to look at the criteria being invoked for effacement. To be sure, advocates of renaming seldom set forth any clear, careful, and consistent set of principles. Typically, the arguments are ad hoc, each one anchored in some statement, belief, political stance, or action of the indicted individual, the wrongness of which is presumed to be self-evident. But occasionally over the years, governmental committees, university panels, or other bodies have gamely tried to articulate some criteria. Their language is telling.
One body that recently made plain its standards for naming was a Washington, D.C. mayoral “working group” with the ungainly label “DCFACES.” (An ungainly name is an inauspicious quality in a body seeking to retitle streets and buildings.) That acronym stands for the equally ungainly “District of Columbia Facilities and Commemorative Expressions.” In the summer of 2020, DCFACES released a report declaring that any historical figure would be “disqualified” from adorning a public building or space in Washington, DC if he or she had participated in “slavery, systemic racism, mistreatment of, or actions that suppressed equality for, persons of color, women and LGBTQ communities.” These rules resulted, among other absurdities, in a call to re-label Washington’s Franklin School (which now serves as a museum) because Benjamin Franklin, though a magnificent patriot, politician, democrat, diplomat, writer, thinker, inventor, publisher, and abolitionist, also owned two slaves, whom he eventually freed.
Here is how the report’s executive summary presents the rules:
IMPERATIVES
Commemoration on a District of Columbia asset is a high honor reserved for esteemed persons with a legacy that merits recognition. The DCFACES Working Group assessed the legacy of District namesakes, with consideration to the following factors:
1. Participation in slavery — did research and evidence find a history of enslaving other humans or otherwise supporting the institution of slavery.
2. Involvement in systemic racism — did research and evidence find the namesake serving as an author of policy, legislation or actions that suppressed persons of color and women.
3. Support for oppression — did research and evidence find the namesake endorsed and participated in the oppression of persons of color and/or women.
4. Involvement in supremacist agenda — did research and evidence suggest that the namesake was a member of any supremacist organization.
5. Violation of District human rights laws — did research and evidence find the namesake committed a violation of the DC Human Rights Act, in whole or in part, including discrimination against protected traits such as age, religion, sexual orientation, gender identity, and national origin.
Several difficulties with this formulation are immediately apparent. For starters, the list is at once too broad and too narrow. It is too broad because phrases such as “support for oppression” are so vague and subjective that they could implicate any number of actions that might be defensible or explicable. It is also too broad because it implies that a single violation is altogether disqualifying, so that someone like Hugo Black or Robert Byrd (both of whom joined the Ku Klux Klan as young men, only to repudiate their actions and go on to distinguished careers) can never be honored.
At the same time, the lens is also too narrow. Its single-minded focus on sins relating to race and sex (and, in one instance, other “protected traits”) in no way begins to capture the rich assortment of human depravity. A robber baron who was untainted by racist bias but subjected his workers to harsh labor would seem to pass muster in the capital. So would a Supreme Court justice with a clean record on race who curtailed freedom of speech and due process. Dishonesty, duplicity, and cowardice are nowhere mentioned as disqualifying. Neither are lawlessness, corruption, cruelty, greed, contempt for democracy, any of the seven deadly sins, or, indeed, scores of other disreputable traits any of us might easily list.
The Washington mayoral working group was not the first body to set down naming rules focused on racism and other forms of identity-based discrimination. In fact, committees have propounded such frameworks for a long time. In 2016, the University of Oregon, in considering the fate of two buildings, adopted seven criteria that largely dealt with offenses “against an individual or group based on race, gender, religion, immigration status, sexual identity, or political affiliation.” (The Oregon list, to its drafters’ credit, also contained some nuance, adding the phrase “taking into consideration the mores of the era in which he or she lived” and making room for “redemptive action” that the individual might have engaged in.) In 1997, the New Orleans school board proscribed naming schools after “former slave owners or others who did not respect equal opportunity for all.” Few objected when this policy was invoked to exchange the name of P.G.T. Beauregard on a junior high school for that of Thurgood Marshall. More controversial, though, was the elimination of George Washington’s name from an elementary school, no matter how worthy his replacement appeared to be. (He was Charles Richard Drew, a black surgeon who helped end the army’s practice of segregating blood by race.) So the battles now being waged in city councils and university senates, though intensified by the recent racial ferment, long predate the latest protests or even the Black Lives Matter movement of 2014.
Like so many skirmishes in our culture wars, these go back to the 1960s. That era’s historic campaigns for racial and sexual equality; the widespread criticisms of government policy, starting but not ending with the Vietnam War; the deepening skepticism toward political, military, and religious authority; the blurring of boundaries between public and private; the exposure of criminality in high places; the demise of artistic standards of excellence — all these elements conspired to render quaint, if not untenable, old forms of patriotism and hero worship. Debunking thrived. Not just in the counterculture, but also in the academy, there took hold what the historian Paul M. Kennedy called “anti-nationalistic” sentiment: arguments (or mere assumptions expressed via attitude and tone) that treated the nation’s past and previous generations’ values and beliefs with disapproval, disdain, or even a conviction, as Kennedy wrote, that they “should be discarded from … national life.” Growing up in the 1970s and after, Generations X, Y, and Z were never taught to passively revere the Founding Fathers or to celebrate uncritically the American experiment. On the contrary, we were steeped in dissidence, iconoclasm, suspicion, and wisecracks. At its best, this new adversarial sensibility instilled a healthy distrust of official propaganda and independence of mind. At its worst, it fostered cynicism and birthed a propaganda of its own.
The thorniest questions of the 1960s stemmed from the challenge, thrown down by the civil rights movement, for America to live up to its rhetoric of equality. “Get in and stay in the streets of every city, every village, and hamlet of this nation,” the 23-year-old John Lewis said at the March on Washington in 1963, “until true freedom comes, until the revolution of 1776 is complete.” With uneven resolve, Americans devoted to human equality have striven to meet the challenge. And this effort has included, crucially, rethinking the past. To highlight and learn about our nation’s history of racial exclusion and discrimination is among the noblest goals we can have in our public discourse, because it is the intellectual and cultural condition of justice: we will not be able to achieve equality without understanding the deep roots of inequality in our society.
By the 1990s American society had become an irreversibly multicultural one. WASP values, assumptions, priorities, and interpretations of the past could no longer dominate. “We Are All Multiculturalists Now,” declared the title of a somewhat unexpected book by Nathan Glazer in 1997. But with that watershed, Glazer noted, it became necessary to pose a new set of queries (which Americans had indeed been asking for some time): “What monuments are we to raise (or raze), what holidays are we to celebrate, how are we to name our schools and our streets?”
Probably no group of historical actors has been subject to as much contentious debate as the secessionists who founded the Confederate States of America. Yet by the third decade of the twenty-first century, there was not much of a debate left about their virtues. Arguments for their valor already seem hopelessly antiquated. Partial defenses of Robert E. Lee, of the sort that David Brooks earnestly mounted in the New York Times just five years ago, now induce cringes. (“As a family man, he was surprisingly relaxed and affectionate… He loved having his kids jump into bed with him and tickle his feet.”) Were the Times to publish a piece like Brooks’ in the current environment, the whole masthead would be frog-marched out of the building under armed guard.
The public, or some of it, has now learned that Southerners imposed most of their Lost Cause nomenclature, iconography, and narratives not in innocent tribute to gallant soldiers, but as part of a rearguard racist project of forging and upholding Jim Crow. This new awareness — along with the political agitation of the last decade — has altered how many Americans think about a military base honoring Braxton Bragg or a park memorializing Nathan Bedford Forrest. The Lincoln scholar Harold Holzer confessed last year that statues and place names which “I long regarded as quaint were in fact installed to validate white supremacy, celebrate traitors to democracy, and remind black and brown people to stay ‘in their place.’” It became increasingly incongruous, if not bizarre, to see in a redoubt of suburban liberalism such as Arlington, Virginia, a boulevard evoking the Confederacy’s leading general.
Still, as the protests in Charlottesville in 2017 showed, Lee retains his champions. Plying his demagoguery that August, Donald Trump — at the same press conference at which he defended the Charlottesville firebrands — warned that if Lee were to be scrubbed from public commemoration, George Washington (“a slave owner”) and Thomas Jefferson (“a major slave owner”) would be next. “You have to ask yourself, where does it stop?” To this slippery-slope argument, many have given a sensible and convincing answer: Lee, Jefferson Davis, Stonewall Jackson, and the others were traitors to their country; Washington, Jefferson, and the founders were not. Removing the former from streets and schools while retaining the latter admits no contradiction. As far back as 1988, Wilbur Zelinsky, in his fascinating history Nation into State, remarked that “as the military commander of an anti-statist cause, there is no logical place for Lee in the national pantheon alongside Washington, Franklin, and others of their ilk,” explaining that Lee entered the pantheon (or stood just outside its gates) only “as an archetypal martyr — the steadfast, chivalrous, sorrowful, compassionate leader of a losing cause.”
Yet the distinction between traitors and patriots, while perfectly valid so far as it goes, does not answer the big questions. It does not address, for example, whether every last venue commemorating a Confederate must be taken down. Yes, let us lose the Confederate flags and Confederate statuary, and change the place names that keep alive the Lost Cause. But would it be acceptable to keep a handful, for considered reasons? Doing so would show that we know that our history includes the bad along with the good, as all human history does; and it would remind us that our predecessors at times were not able to tell the bad from the good. It would remind us that our country was once riven to the core by a struggle over evil and inculcate sympathy for the difficulty, and the cost, of the struggle. It might also deflate a presentist arrogance that tempts us to think that our current-day appraisals of the past, fired off in the heat of a fight, are unerring and for the ages.
The distinction between traitors and patriots also fails to address the larger and more humane question of whether there is a way, notwithstanding the hateful cause for which the Confederates fought, to extend some dignity to their descendants who renounce the ideology of the Old South but wish to honor forebears who died by gun or blade. In the right context, and without minimizing those forebears’ attachment to an evil institution, this goal should, I think, be achievable. At the Gettysburg battlefield, monuments to Southern regiments stand arrayed opposite those to Northern troops, but in no way does a walk through the austere, beautiful environs suggest an exculpation or a whitewash. To erase any possible doubt, a professionally designed and intelligently curated museum nearby spells out the war’s history, including the centrality of slavery, in cold detail.
And the distinction between traitors and patriots is insufficient for yet another reason, too: it speaks only to the period of the Civil War. Outright traitors are a small, discrete subset of those who have come under fire in the recent controversies; the nomenclature wars span much wider terrain. Identifying secession as grounds for censure is fine, but it provides no limiting principle to help us think through, in other circumstances, whose names should and should not remain. It says nothing about Theodore Roosevelt, Winston Churchill, John Muir, Kit Carson, Louis Agassiz, Henry Kissinger, Voltaire, or anyone else.
Most regrettably, the distinction does not persuade everyone. In addition to the Lost Cause devotees, some on the left likewise deny the distinction. We saw New Orleans retitle George Washington Elementary School back in 1997. When Trump cited Washington in his press conference in 2017, he was unknowingly describing something that had already happened. Could it be that he recalled the campaign at the University of Missouri in 2015 to defenestrate Jefferson, whom students, apparently knowing little about his quasi-marriage to Sally Hemings, excoriated as a “rapist”? Even if Trump was ignorant of these precedents, as seems probable, he must have felt some vindication when protesters in 2020 targeted Abraham Lincoln, Ulysses S. Grant, Frederick Douglass (!), and other assorted foes of slavery. Trump and these left-wing activists agree that the current renaming rage should not “stop” with traitors to the Union. They share a fanatical logic.
Few participants in the nomenclature wars have reckoned seriously with this slippery-slope problem. The Yale University officials who renamed Calhoun College because its eponym flew the banner of race slavery were well aware that Elihu Yale earned his fortune at a powerful British trading company that trafficked in African slaves. But Yale remains Yale, for now. Similar contradictions abound. Are we to make a hierarchy of hypocrisies? If Woodrow Wilson’s name is to be stripped from Princeton University’s policy school because he advanced segregation in the federal bureaucracy, by what logic should that of Franklin Roosevelt, who presided over the wartime Japanese internment, remain on American schools? If the geneticist James Watson’s name is scratched from his research institution’s graduate program because he believed that racial IQ differences are genetic, why should that of Henry Ford — America’s most influential anti-Semite, who published the Protocols of the Elders of Zion in his Dearborn Independent — remain on the Ford Motor Company or the Ford Foundation? In what moral universe is Andrew Jackson’s name erased from the Democratic Party’s “Jefferson-Jackson” dinners, but Donald Trump’s remains on a big blue sign near the 79th Street off-ramp on the West Side Highway? How can the District of Columbia go after Benjamin Franklin and Francis Scott Key but not Ronald Reagan, whose name adorns the “international trade center” downtown? It is not a close contest as to who made life worse for the city’s black residents.
The problem with the contemporary raft of name alterations is not that historical or commemorative judgments, once made, cannot be revised. Change happens. It may have been silly for the Obama administration to rechristen Mt. McKinley “Denali,” but it was not Stalinist. The real problem (or one problem, at any rate) is that no rhyme or reason underwrites today’s renaming program. Like the social media campaigns to punish random innocents who haphazardly stumble into an unmarked political minefield, the campaign of renaming follows no considered set of principles. It simply targets whoever wanders into its sights.
If we wish to impose some coherence on the Great Renaming Project, a good first step would be to create a process of education and deliberation. Our debates about history generally unfold in a climate of abysmal ignorance. How much is really known about the men and women whose historical standing is now being challenged? What matters most about their legacies? Were they creatures of their age or was their error perfectly evident even in their own time? What harm is perpetuated by the presence of their name on a street sign or archway? The answers are rarely straightforward.
In many public debates, the participants know little about what the men and women under scrutiny did. In April 2016, a Princeton undergraduate and stringer for the New York Times wrote incorrectly in the paper of record that Woodrow Wilson “admired” the Ku Klux Klan. The next day the paper ran a letter correcting the error, noting, among other facts, that in his History of the American People Wilson called the Klan “lawless,” “reckless” and “malicious”; but just two weeks later another stringer, one year out of Yale, parroted the same mistake. That even Ivy-educated youngsters got things so wrong should not be surprising. The undergraduates I teach tend to know about Andrew Jackson’s role in Indian Removal, and that he owned slaves. But most know little of his role in expanding American democracy beyond the elite circles of its early days. Millions of young people read in Howard Zinn’s A People’s History of the United States about the horrors that Columbus inflicted on the Arawaks of the Caribbean. But Zinn was rebutting the heroic narratives of historians like Samuel Eliot Morison, whose Columbus biography won a Pulitzer Prize in 1943. How many students read Morison anymore? How many have a basis for understanding why so many places in North America bear Columbus’ imprint in the first place? Were all those places consecrated to genocidal conquest? Without efforts to educate the young — and the public in general — about the full nature of these contested figures, the good and the bad, the inexorable complexities of human thought and action, these debates will devolve into a simplistic crossfire of talking points.
On occasion, mayors, university presidents, and other officials have recognized that a process of education and deliberation is necessary before arriving at a verdict on a controversial topic. In 2015, Princeton University came under renewed pressure to address the racism of Woodrow Wilson, who was not only America’s twenty-eighth president but a Princeton graduate, professor, and, eventually, a transformational president of the college. At issue was whether to take his name off the university’s policy school, a residential dorm, and other campus institutions (professorships, scholarships, book awards, etc.). Desiring a process that was democratic and deliberative, the president of the university, Christopher Eisgruber, convened a committee. Multiracial and multigenerational in composition, it included members of the board of trustees, Wilson experts, higher education leaders, and social-justice advocates. It solicited the views of students, faculty, staff, and alumni. Historians wrote long, thoughtful, well-researched letters weighing the merits of the case. Some 635 community members submitted comments through a dedicated website (only a minority of whom favored eliminating Wilson’s name).
The committee weighed the evidence, which included the record not just of Wilson’s deplorable racism but also of his undeniable achievements. Although many students today know little about Wilson besides the racism — which, we must be clear, went beyond private prejudice and led him to support Cabinet secretaries Albert Burleson and William McAdoo in segregating their departments — he was for a century considered one of America’s very best presidents. Wilbur Zelinsky, in his meticulous study, called Wilson “one of four presidents since Lincoln whom some would consider national heroes” (the others being the Roosevelts and John F. Kennedy). Wilson could claim in his day to have enacted more significant progressive legislation than any president before him; since then, only Franklin Roosevelt and Lyndon Johnson have surpassed him. Wilson also built upon Theodore Roosevelt’s vision of a strong presidency to turn the White House into the seat of activism, the engine of social reform, that it has been ever since. Nor was Wilson successful just domestically. He was a historic foreign-policy president, too, and a winner of the Nobel Peace Prize. After exhausting all bids for peace with Germany, he reluctantly led America into World War I, which proved decisive in defeating Teutonic militarism, and he pointed the way toward a more democratic and peaceful international order — though, crippled by a stroke and his own arrogance, he tragically failed to persuade the Senate to join the League of Nations, leaving that body all too ineffectual in the critical decades ahead.
The Princeton committee’s fair-minded report was adopted by the board of trustees in April 2016. It recommended keeping Wilson’s name on the buildings. But Eisgruber and the board of trustees simultaneously promised that campus plaques and markings would henceforth provide frank accounts of Wilson’s career and beliefs, including his racism. More important, the university would, it said, take bold steps in other aspects of campus life to address the underlying grievance: that many black Princetonians do not feel they are treated as equal members of the campus community. And there the matter rested, until 2020. Following the Memorial Day killing of George Floyd by a Minneapolis policeman, protests erupted nationwide calling for police reform and other forms of racial justice — including, once again, the reconsideration of names. This time Eisgruber launched no deliberative process, appointed no diverse committee, solicited no external input, convened no searching conversation. He simply declared that the board of trustees had “reconsidered” its verdict of a few years before. His high-handed decree, more than the ultimate decision, violated the principles on which a university ought to run. For Eisgruber, it also gave rise to some new headaches: in what can only be seen as an epic troll, Trump’s Department of Education opened an investigation into whether Princeton’s confession of rampant racism meant it had been lying in the past when it denied engaging in racial discrimination.
Curiously, at the same time as Princeton banished Wilson, Yale University also performed a banishment — this one with regard to John C. Calhoun, whose name graced one of its residential colleges. But there were crucial differences between the two cases. Although Calhoun has been recognized as a statesman, grouped with Henry Clay and Daniel Webster as the “Great Triumvirate” of senators who held the nation together in the fractious antebellum years, he is a far less admirable figure than Wilson. He made his reputation as a prominent defender of slavery and a theorist of the nullification doctrine that elevated states’ rights over federal authority — a doctrine that later provided a rationale for Southern secession. But beyond the huge political differences between Wilson and Calhoun are the differences in the processes that Princeton and Yale pursued. Princeton jettisoned a deliberative decision to implement an autocratic one. Yale did something like the reverse.
Following the Charleston massacre of 2015, the president of Yale, Peter Salovey, told his campus that Yale would grapple with its own racist past, including its posture toward Calhoun. Then, the following spring, he declared that after much reflection on his part — but no formal, community-wide decision-making process — Calhoun would remain. Salovey contended, not implausibly, that it was valuable to retain “this salient reminder of the stain of slavery and our participation in it.” To get rid of Calhoun’s name would be to take the easy way out. At the same time, Salovey also announced (in a ham-handed effort to balance the decision with one he expected students and faculty would like) that one of Yale’s two new residential colleges would be named for Pauli Murray, a brilliant, influential, underappreciated midcentury civil rights lawyer who was black and, for good measure, a lesbian.
Students and faculty rebelled. Salovey backtracked. He now organized a committee, chaired by law and history professor John Fabian Witt, to tackle the naming question systematically. Wisely, however, Salovey charged the committee only with developing principles for renaming; the specific verdict on Calhoun would come later, decided by still another committee, after the principles were set. To some, the whole business seemed like a sham: it was unlikely that, having agreed to take up the question a second time, Salovey would simply affirm the earlier result. Still, the exercise of formulating principles — in the tradition of a storied Yale committee that the great historian C. Vann Woodward led in the 1970s to inscribe principles for free speech on campus — was worthy, and Salovey populated the Witt Committee with faculty experts on history, race, and commemoration. Even more than the Princeton report, the Witt Committee’s final document was judicious and well-reasoned. When, in 2017, Yale finally dropped Calhoun’s name from the residential college, no one could accuse the university of having done so rashly.
Deliberation by committee, with democratic input, may be necessary to ensure an informed outcome on a controversial subject, but as the example of DCFACES shows, it is not always sufficient. Setting forth good principles is also essential. One mistake that the Washington group made was in asking whom to disqualify from recognition, rather than who might qualify. Historians know that the categories of heroism and villainy are of limited value. Everyone is “problematic.” And as Bryan Stevenson likes to say, each of us is more than the worst thing we have ever done.
Thus if we begin with the premise that certain views or deeds are simply disqualifying, we have trouble grasping the foolishness of targeting Gandhi (for his anti-black racism), Albert Schweitzer (for his racist and colonialist views), or Martin Luther King, Jr. (for his philandering and plagiarism). In any case, how can we insist that racism automatically denies a historical actor a place in the pantheon when the new reigning assumption — the new gospel — is that everyone is (at least) a little bit racist? We all have prejudices and blind spots; we all succumb to stereotyping and “implicit bias.” By this logic, we are all disqualified, and there is no one left to bestow a name on the local library.
A more fruitful approach is the one that Yale’s Witt Committee chose: to ask about the “principal legacies” of the person under consideration, the “lasting effects that cause a namesake to be remembered.” We honor Wilson for his presidential leadership and vision of international peace. He is recognized not for his racism but in spite of it. We honor Margaret Sanger as an advocate of reproductive and sexual freedom, not for her support of eugenics but in spite of it. Churchill was above all a defender of freedom against fascism, and the context in which he earned his renown matters. Of the recent efforts to blackball him, one Twitter wag remarked, “If you think Churchill was a racist, wait until you hear about the other guy.” Not everything a person does or says is of equal significance, and people with ugly opinions can do great things, not least because they may also hold noble opinions.
Principal legacies can evolve. They undergo revision as people or groups who once had little say in forging any scholarly or public consensus participate in determining those legacies. It may well be that by now Andrew Jackson is known as much for the Trail of Tears as for expanding democracy, and perhaps that is appropriate. Arthur M. Schlesinger, Jr., made no mention of Indian Removal in his classic The Age of Jackson in 1945, but by 1989 he had come to agree that the omission, all too common among the Jackson scholars of his generation, was “shameful.” But as the Witt Committee noted, our understandings of someone’s legacies “do not change on any single person’s or group’s whim; altering the interpretation of a historical figure is not something that can be done easily.” For all that Americans have learned about Thomas Jefferson’s racial views and his slaveholding in recent decades, his principal legacies — among them writing the Declaration of Independence, articulating enduring principles of rights and freedom, steering a young country through intense political conflict as president — remain unassailable. We will have to learn to live with all of him.
The Witt Committee also asked whether the criticisms made of a historical figure were widely shared in his or her own time — or if they are a latter-day imposition of our own values. The difference is not trivial. As late as 2012, when Barack Obama finally endorsed gay marriage, many Democrats still opposed the practice. But norms and attitudes evolved. Today most Democrats think gay marriage unremarkable, and the Supreme Court has deemed it a constitutional right. It might be fair to condemn someone who in 2020 seeks to overturn the court’s decision, but it would be perverse to label everyone who had been skeptical of gay marriage ten years ago a homophobe or a bigot. Historians must judge people by the values, standards, and prevailing opinions of their times, not our own. No doubt we, too, will one day wish to be judged that way. Yet the pervasive impulse these days to moralize, to turn analytical questions into moral ones, has also made us all into parochial inquisitors.
It is also worth asking what harm is truly caused by retaining someone’s name, especially if the person’s sins are obscure or incidental to his reputation. Many buildings and streets commemorate people who are largely forgotten, making it hard to claim that their passing presence in our lives does damage. A federal court ordered Alabama’s chief justice, Roy Moore, to remove the giant marble Ten Commandments monument he had installed in the state judicial building, but the phrase “In God We Trust” is allowed on coins because in that context it is considered anodyne and secular — wallpaper or background noise — without meaningful religious content. By analogy, most place names hardly evoke any associations at all. They are decorations, mere words. The State University of New York at Buffalo removed Millard Fillmore’s name from a campus hall because Fillmore signed the Fugitive Slave Act. But it is doubtful that Fillmore’s surname on the edifice had ever caused much offense, for the simple reason that almost no one knows anything about Millard Fillmore.
Then, too, as Peter Salovey initially suggested about Calhoun, a person’s name can sometimes be a useful and educational reminder of a shameful time or practice in our past. In 2016, Harvard Law School convened a committee to reconsider its seal, which depicted three sheaves of wheat and came from the family crest of Isaac Royall, a Massachusetts slaveowner and early benefactor of the school. While the committee voted to retire the seal, historian and law professor Annette Gordon-Reed and one law student dissented, arguing that keeping the seal would serve “to keep alive the memory of the people whose labor gave Isaac Royall the resources to purchase the land whose sale helped found Harvard Law School.” Historical memory is always a mixed bag — if, that is, we wish to remember as much as we can about how we came to be who we are. Sometimes, a concern for history is precisely what warns us not to hide inconvenient or unpleasant pieces of the past.
Often context can serve the purposes of promoting antiracism or other noble principles better than erasure. Museums and other forms of public history are experiencing a golden age. Historic sites that once lacked any significant information for tourists are being redesigned to satisfy the hungriest scholar. Plaques, panels, touch-screen information banks, and other displays can educate visitors about the faults and failings — as well as the virtues — of the men and women whose names appear on our buildings and streets. Addition — more information, more explanation, more context — may teach us more than subtraction. But even here, there are limits. A recent show at the National Gallery of Degas’ opera and ballet pictures did not mention that he was a virulent anti-Semite. Should we care? If the museum had “contextualized” the tutus with a wall caption about Captain Dreyfus, the information would not have been false, but it would have been irrelevant, and in its setting quite strange. We don’t need asterisks everywhere.
Above all, renaming should be carried out in a spirit of humility. The coming and going of names over the decades might inspire in some a Jacobin presumptuousness about how easy it is to remake the world. But what it should more properly induce is a frisson of uncertainty about how correct and authoritative our newly dispensed verdicts about the past truly are. “We readily spot the outgrown motives and circumstances that shaped past historians’ views,” writes the geographer David Lowenthal; “we remain blind to present conditions that only our successors will be able to detect and correct.” Public debates and deliberation about how to name our institutions, how to evaluate historical figures, and how to commemorate the past are an essential part of any democratic nation’s intellectual life and political evolution. Our understandings of our history must be refreshed from time to time with challenges — frequently rooted in deeply held political passions — to widely held and hardened beliefs. There are always more standpoints than the ones we already possess. Yet passions are an unreliable guide in deriving historical understanding or arriving at lasting moral judgments. In light of the amply demonstrated human capacity for overreach and error, there is wisdom in treading lightly. Bias is everywhere, even in the enemies of bias. Nobody is pure.