The country has walked to the brink of war or
beyond with Iran, North Korea, Iraq, and Syria to prevent or stop their pursuit or use of weapons of mass destruction.
The American foreign-policy establishment is fond of
punishing those who threaten to use weapons of mass destruction against
innocents. For CNN commentator Fareed Zakaria, all
it took was a Tomahawk missile strike on the Syrian air base suspected of
sending jets to bomb the rebel-held village of Khan Sheikhoun with sarin gas to
make Donald Trump “presidential.” The New Yorker’s Ryan Lizza questioned neither
the moral nor the strategic purpose of the strike, only its legality. Former Obama
staffers like Anne-Marie Slaughter struck
positive notes in response, grateful that some message, any message, had
finally been delivered to Bashar al-Assad. Nicholas Kristof and David Brooks both
praised the country for standing up for the taboo against chemical weapons that
arose among European nations after World War I.
Since the 1960s,
categorical opposition to weapons of mass destruction—biological, chemical, and nuclear—has been a hallmark of American
foreign-policy thinking. John F. Kennedy recast the American president as a
protective father to the world. So when reports noted that images of “young
children and babies being gassed” had moved Trump to
action in Syria, they were echoing a long tradition, one that has both helped
to justify international cooperation to combat the most terrible weapons of
war, and given license to presidents to strike out in humanity’s name.
The U.S. campaign
against the use of inhumane weapons began not with the interwar debates about
the prohibition of chemical weapons after the Somme and Verdun, nor with the U.S. Senate's
refusal to ratify the 1925 Geneva Protocol on Chemical Weapons, but with the
arrival of the atomic age. After Hiroshima and Nagasaki, Americans viewed the
bomb as the fruit of their own ingenuity, its use justified as the only means to
end World War II at a minimal cost to American lives. In a poll
conducted by Fortune magazine four months later, more than four out of five respondents approved of the
bombings, with nearly one in four wishing more bombs had fallen. (One
woman explained: “There may be innocent women and children, but they only in my opinion breed more of
the same kind of soldiers.”)
John Hersey’s
issue-length account of six individuals struggling to survive in the ruins of
Hiroshima, published in The New Yorker just
over a year after the bombing, sought to subvert that callous narrative. Hersey
set part of his piece in the city’s Red Cross hospital, where shattered
residents shuffled like zombies after the bombing. He concluded with
a quotation from the diary of 10-year-old survivor Toshio Nakamura: “Next day I
went to Taiko Bridge and met my girlfriends Kikuki and Murakami. They were
looking for their mothers. But Kikuki’s mother was wounded and Murakami’s
mother, alas, was dead.”
By humanizing the
victims, Hersey helped inaugurate a reappraisal of total war’s morality;
gradually, American approval for the atomic bombings declined. When former Secretary of War Henry Stimson
told Harper’s magazine the following February that he and
President Harry Truman had believed that dropping the bomb in Japan would provide
“an effective shock [that] would save many times the number of lives, both
American and Japanese, than it would cost,” he implicitly conceded Hersey’s
basic point—human lives, regardless of nationality, were the only measure by
which to judge such instruments of death.
American attitudes
toward nuclear weapons would shift, however, after plans to place atomic energy
under international control fell apart with the onset of the Cold War. When the
Soviet Union entered the nuclear club in 1949, Truman refused to heed the counsel of the Atomic Energy Commission’s General
Advisory Committee not to build the hydrogen bomb, which the body had
likened to “a weapon of genocide.” Phrases such as “massive retaliation,”
“brinkmanship,” and “megadeaths” entered the strategic lexicon as the nuclear
arsenal mushroomed from 1,000 to 24,000 warheads under Dwight Eisenhower. The
muscular grammar of deterrence, credibility, and risk-taking initiated a
contest to see which politician could talk the toughest.
But Eisenhower saw the
dangers that lay ahead. In his “Chance for Peace” speech in 1953, he called military spending “a theft,” complaining that
“[t]he cost of one modern heavy bomber is this: a modern brick school in more
than 30 cities.” He warned in his farewell address that “we cannot mortgage
the material assets of our grandchildren” to feed the country’s growing
military-industrial complex. Ironically, Eisenhower’s messages of caution,
delivered in the language of domestic protection, laid the groundwork for what
was to come.
Where committed
peacemakers like Adlai Stevenson, the Democratic presidential candidate in 1952
and 1956, advocated a ban on nuclear testing to save children from fallout,
John Kennedy advertised his youth and masculinity, winning a Pulitzer
for Profiles in Courage, highlighting his wartime naval service,
waxing eloquent about his touch football days, hiding the truth of his chronic
back pain, and charging that Eisenhower, the hero of Normandy, had allowed a
non-existent “missile gap” to open up between the United States and its
Soviet nemesis.
Kennedy softened his image only after the
Cuban Missile Crisis bolstered his alpha-male bona fides. Members of Camelot’s
inner circle recalled the crisis as a test of wills between
Kennedy and Nikita Khrushchev; Secretary of State Dean Rusk reportedly intimated to National Security Advisor McGeorge Bundy:
“We’re eyeball to eyeball, and I think the other fellow just blinked.” In
truth, Kennedy chose forbearance and flexibility, secretly trading U.S. missiles
in Turkey for the Soviets’ in Cuba. Afterward, Kennedy pursued negotiations to
prevent a return to the brink as well as to stymie potential nuclear
challengers to American power in Europe, Asia, and the Middle East. To do so,
he needed to rewrite the script for presidential masculinity amid the
Cold War.
Kennedy’s commencement address at American University on June 10,
1963, marked a return to the more familiar style of
Franklin Roosevelt. In the speech, he staked out a more intimate role, casting
himself as the worried father of the human family, expanding on “the kind of
peace that makes life on earth worth living, the kind that enables men and
nations to grow and to hope and to build a better life for
their children.”
Kennedy’s appeal
helped win public support for the Limited Test Ban Treaty (LTBT) after it had
been finalized that August over opposition from his Joint Chiefs of Staff.
The address, co-written by anti-nuclear activist Norman Cousins and inspired by such organizations as
Women Strike for Peace and such individuals as baby expert Benjamin Spock,
whose counsel that women should forgo employment to rear their children at home
typified the era’s cult of domesticity, used language that acknowledged
parents’ fears of nuclear fallout. Kennedy and Khrushchev had ulterior motives—namely,
freezing Mao’s China out of the nuclear club even as American officials
contemplated preventive strikes against its nuclear facilities. But
when the White House wrestled over how to defend the first major nuclear-arms
control treaty before Congress, Kennedy scribbled the key to public opinion
down on a sheet of scratch paper—“Diminish danger, war by other means …
fallout—radiation. Children.”
Lyndon Johnson, of
course, went on to weaponize Kennedy’s paternalism. In his 1964 “Daisy”
commercial, a Shirley Temple lookalike picks petals as a
portentous voice counts down to Armageddon. “These are the stakes,” the
voiceover concluded, “[t]o make a world in which all of God’s children can
live, or to go into the dark.” The ad sank Barry Goldwater’s campaign after the
Republican candidate suggested nukes might turn the tables in Vietnam.
Children became a
fixture of speeches by those who supported nuclear arms control. On the first
anniversary of the LTBT, Robert Kennedy echoed his brother, urging action “if we would
preserve the air we breathe and the health of our children and our children’s
children.” He soon emerged as the most vocal champion of the Nuclear
Non-Proliferation Treaty (NPT); in his maiden speech, the junior senator from New York pushed
Johnson to refocus his administration on halting the spread of nuclear weapons
to new states rather than escalating in Southeast Asia. He followed up with an
article in Frontier, entitled “Will There Be Any World Left For Our
Children?” When Johnson’s torch-bearer, Vice President Hubert Humphrey, ran
against Richard Nixon in 1968, the president stumped for the NPT to
contrast the two men’s fitness for office. Johnson charged that withholding
ratification of the treaty, which the United Nations had finalized that summer,
as Nixon demanded, would result in “the world our children will inhabit made
far more perilous.”
Historians have illuminated how images of suffering children emboldened
human rights and humanitarian activists in the 1970s, leading them to support
targeted interventions in the name of universal values and moral sentiment. But
however much images of suffering children from Bangladesh to Kosovo to Darfur
have galvanized action to aid those threatened by civil war, famine, natural
disaster, or genocide, humanitarian interventions have been rare, while the
country has walked to the brink of war or beyond with China, Iran, North Korea,
Iraq, and Syria, on grounds that these states had pursued or used
monstrous weapons.
Years of linking the
threat posed by these arms to paternal or maternal impulses have helped liberals
lock arms with conservatives where weapons of mass destruction are concerned.
Beginning with Barry Goldwater, conservatives tended to be less troubled by
their use, a trend that continued with Ronald Reagan, who largely turned
a blind eye to Saddam Hussein’s use of chemical weapons in
the Iran-Iraq War.
When Hillary
Clinton explained to the Council on Foreign Relations why she
voted to authorize lethal force to disarm Saddam Hussein’s Iraq, she concluded
that “the most important” reason to intervene was “our children, our future
grandchildren, all the children who deserve from this generation of leadership
the same commitment to building a safer, more secure world that we inherited
from the last generation.” Clinton thus reinforced her interventionist impulses
with her lifelong advocacy for women and children at home and abroad.
Trump’s impulse to save “beautiful babies” might be a welcome
change of tone for him and his administration, and the desire to protect the
innocent springs from humane sentiments. But in empowering one man to act as a
vengeful father, Americans abjure their democratic duties. Arthur M.
Schlesinger, Jr., Kennedy’s court historian, explained how the “imperial
presidency” had severed the checks and balances that traditionally subjected
the executive’s war-making powers to democratic scrutiny. The paternal
presidency has made American citizens, including those who cherish liberal
values such as international norms and human rights, similarly docile.
As President Trump
opts for spectacular shows of force in Syria, Afghanistan, and Korea, Americans
would do well to tell their representatives that military actions, including
standoff air or missile strikes, warrant vigorous public debate unless they
answer threats to “the United States, its territories or possessions, or its armed forces.”
Jonathan Hunt is a lecturer in modern global
history at the University of Southampton in the United Kingdom. He is the
author of a forthcoming history of nuclear non-proliferation.