War and the Liberal Hegemony

Why did the United States intervene in the Second World War? The question is rarely asked because the answers seem so obvious: Hitler, Pearl Harbor, and what more needs to be said? To most Americans, World War II was the quintessential “war of necessity.” As the late Charles Krauthammer once put it, “wars of choice,” among which he included Vietnam and the first Gulf War, are “fought for reasons of principle, ideology, geopolitics or sometimes pure humanitarianism,” whereas a “war of necessity” is a “life-or-death struggle in which the safety and security of the homeland are at stake.” If World War II is remembered as the “good war,” the idea that it was “necessary” is a big part of the reason why. The enemies were uniquely wicked and aggressive; Americans were attacked first; they had no choice but to fight.

This perception of World War II has had a paradoxical effect on the broader American foreign policy debate. On the one hand, writers of an anti-interventionist bent rightly perceive that the war’s reputation as “necessary” and therefore “good” has encouraged Americans to believe that other wars can be “necessary” and therefore “good,” too. (Krauthammer believed the “war on terror” was also one of “necessity,” Richard Haass put the Gulf War in the “necessary” category, and in 1965 even David Halberstam and The New York Times editorial page believed that American intervention in Vietnam was necessary.) On the other hand, anti-interventionists are not alone in believing that, even if World War II was necessary, the circumstances were unique and therefore irrelevant to subsequent foreign policy discussions. There will never be another Hitler, and the idea that a foreign great power (as opposed to a terrorist group) might launch a direct attack on the United States seems far-fetched even today.
