Ken Larson

WHAT RECENT WAR-MAKING DECISION HISTORY TELLS US - About Power And The Price We Pay


“THE ATLANTIC” - From “The Iraq War and the Inevitability of Ignorance” by James Fallows

“The U.S. is destined to keep over-learning the lessons of the last conflict: a caution for leaders considering war or peace, for media stoking or questioning pro-war fever, and for the 99 percent of the public weighing the causes for which the military’s 1 percent will be asked to kill, and die.”

_______________________________________________________________________________

“There’s a specific reason it is so hard to be president—in normal circumstances—and why most incumbents look decades older when they leave the job than when they began. The reason is that the only choices normal presidents get to make are the impossible ones—decisions that are not simply very close calls on the merits, but that are guaranteed to lead to tragedy and bitterness whichever way they go.


Take Barack Obama’s famed choice not to back up his “red line” promise in Syria, which was a focus of Jeffrey Goldberg’s “The Obama Doctrine” Atlantic cover story two years ago. The option Obama chose—not intervening in Syria—meant death and suffering for countless thousands of people. The option he rejected—intervening—would have meant death and suffering for countless thousands of the same people or others.


Agree or disagree on the outcome, any such decision is intellectually demanding and morally draining. Normal presidents have to make them, one after another, all day long. (Why don’t they get any easier choices? Because someone else has made all of those before they get to the president.) Obama’s decision to approve the raid on Osama bin Laden’s compound turned out to be a tactical and political success. When he made it, he had to weigh the possibility that it could end in world-publicized failure—like Jimmy Carter’s decision to attempt a rescue of American hostages in Iran, which ended in chaos, and which Carter later contended was what sealed his fate in his re-election run.


A special category of impossible decision, which I was introduced to in the two years I worked for Jimmy Carter in the White House and have borne in mind ever since, turns on the inevitability of ignorance. To be clear, I don’t mean “stupidity.” People in the government and military are overall smarter than press portrayals might suggest. Instead I mean really registering the uncomfortable fact that you cannot know enough about the big choices you are going to make, before you have to make them.


Sometimes that is because of deadline rush: The clock is ticking, and you have to act now. (To give a famous example: In 1980, U.S. radar erroneously indicated that the Soviets had launched a nuclear-missile attack, and Zbigniew Brzezinski, as Jimmy Carter’s national-security adviser, had to decide at 3 a.m. whether to wake the president to consider retaliation. The warning was revealed as a false alarm before Brzezinski could place the call, and the world stepped back from possible nuclear obliteration.) Most of the time it is because the important variables are simply unknowable, and a president or other decision-maker has to go on judgment, experience, hunch.


This point sounds obvious, because we deal with its analogues in daily-life decisions big and small. No one who decides to get married can know what his or her spouse will be like 20 years in the future, or whether the partners will grow closer together or further apart. Taking a job—or offering one—is based at least as much on hope as on firm knowledge. You make an investment, you buy a house, you plan a vacation knowing that you can’t possibly foresee all the pitfalls or opportunities.

But this routine truism takes on life-or-death consequences in the choices that presidents must make, as commander in chief and as head of U.S. diplomatic and strategic efforts. The question of deciding about the unknowable looms large in my mind, as I think back 15 years to the run-up to the Iraq war, and think ahead to the similar choices future presidents will weigh.

There’s a long list of books I wish presidents would have read before coming to office—before, because normal presidents barely have time to think once they get there. To give one example from my imagined list: the late David Fromkin’s A Peace to End All Peace is for me a useful starting point for thinking about strains within the modern Middle East. The book argues, in essence, that the way the Ottoman Empire was carved up at the end of World War I set the stage for conflicts in the region ever since. In that way it is a strategic counterpart to John Maynard Keynes’s famous The Economic Consequences of the Peace, written just after the conclusion of the Versailles agreements, which argues that the brutal economic terms dealt out to the defeated Germans practically guaranteed future trouble there.


Also high up on my “wish they’d read” list is Thinking in Time: The Uses of History for Decision Makers, by two Harvard professors (and one-time mentors of mine), Ernest May and Richard Neustadt. In this book, May and Neustadt reverse the chestnut attributed to an earlier Harvard professor, George Santayana, that “those who do not remember the past are condemned to repeat it.” Instead they caution against over-remembering, or imagining that a choice faced now can ever be exactly like one faced before.


The most famous and frightening example is Lyndon Johnson’s, involving Vietnam. Johnson “learned” so thoroughly the error of Neville Chamberlain, and others who tried to appease (rather than confront) the Nazis, that he thought the only risk in Vietnam was in delaying before confronting communists there. A complication in Johnson’s case, as this book and all other accounts of Vietnam make clear, is that he was worried both about the reality of waiting too long to draw a line against Communist expansion, and perhaps even more about appearing to be weak and Chamberlain-like.


Because of the disaster Johnson’s decisions caused—the disaster for Vietnam, for its neighbors, for tens of thousands of Americans, all as vividly depicted in last year’s Ken Burns / Lynn Novick documentary—most American politicians, regardless of party, “learned” to avoid entanglement in Asian-jungle guerrilla wars. Thus in the late 1970s, as the post-Vietnam war Khmer Rouge genocide slaughtered millions of people in Cambodia, the U.S. kept its distance. It had given up the international moral standing, and had nothing like the internal political stomach, to go right back into another war in the neighborhood where it had so recently met defeat.


From its Vietnam trauma, the United States also codified a crass political lesson that Richard Nixon had applied during the war. Just before Nixon took office, American troop levels in Vietnam were steadily on the way up, as were weekly death tolls, and monthly draft calls. The death-and-draft combination was the trigger for domestic protests.


Callously but accurately, Nixon believed that he could drain the will to protest if he ended the draft calls. Thus began the shift to the volunteer army—and what I called, in an Atlantic cover story three years ago, the Chickenhawk Nation phenomenon, in which the country is always at war but the vast majority of Americans are spared direct cost or exposure. (From the invasion of Iraq 15 years ago until now, the total number of Americans who served at any point in Iraq or Afghanistan comes to just 1 percent of the U.S. population.)


May and Neustadt had a modest, practical ambition for their advice to study history, but to study it cautiously. “Marginal improvement in performance is worth seeking,” they wrote. “Indeed, we doubt that there is any other kind. Decisions come one at a time, and we would be satisfied to see a slight upturn in the average. This might produce much more improvement [than big dramatic changes] measured by results.”

My expectation is more modest still: I fear but expect that the U.S. is fated to lurch from one over-“learning” to its opposite, and continue making a steadily shifting range of errors.

The decision to invade Iraq was itself clearly one of those. The elder George Bush fought a quick and victorious war to drive Saddam Hussein out of Kuwait in 1991. But he stopped short of continuing the war into Iraq to remove Saddam Hussein from power—and so his son learned from that “failure” that he had to finish the job of eliminating Saddam. (As did a group of the younger George Bush’s most influential advisors: Dick Cheney, who had been secretary of defense during the original Gulf war, and returned as George W. Bush’s vice president. Colin Powell was chairman of the Joint Chiefs of Staff the first time around, and secretary of state the second. Paul Wolfowitz was undersecretary of defense during the first war, and deputy secretary of defense during the second. And so on.)


Two of the writers who were most eloquent in making their case for the war—Christopher Hitchens, who then wrote for the Atlantic among other places, and Michael Kelly, who was then our editor-in-chief—based much of their case on the evils Saddam Hussein had gotten away with after the original Gulf War. (Hitchens died of cancer in 2011; Kelly was killed in Iraq, as an embedded reporter in the war’s early stage.) Then Barack Obama, who had become president in large part because he opposed the Iraq war—which gave him his opening against the vastly better known and more experienced Hillary Clinton—learned from Iraq about the dangers of intervention in Syria. And on through whatever cycles the future holds.


Is there escape from the cycles? In a fundamental sense, of course not, no. But I’ll offer the “lesson” I learned—50 years ago, in a classroom with Professor May; 40 years ago, when I watched Jimmy Carter weigh his choices; 15 years ago, in warning about the risks of invading Iraq. It involves a cast of mind, and a type of imagination. As the Bush administration moved onto a war footing soon after the 9/11 attacks, no one could know the future risks and opportunities. But, at the suggestion of my friend and then-editor Cullen Murphy, I began reporting on what the range of possibilities might be. Starting in the spring of 2002, when the Bush team was supposedly still months away from a decision about the war, it was clear to us that the choice had been made. I interviewed dozens of historians, military planners, specialists in post-war occupations, and people from the region to try to foresee the likely pitfalls.


The result, which appeared in our November 2002 issue (and which we put online three months earlier, in hopes of affecting the debate), was called “The Fifty-First State?” Its central argument was: The “war” part of the undertaking would be the easy part, and deceptively so. The hard part would begin when U.S. troops had reached Baghdad and the statues of Saddam Hussein were pulled down—and would last for months, and years, and decades, all of which should be taken into consideration in weighing the choice for war.


It conceivably might have gone better in Iraq, and very well could have, if not for a series of disastrously arrogant and incompetent mistakes by members of the Bush team. I won’t go into details here: I laid them out in several articles, and eventually a book. But the premise of most people I interviewed before the war, who mostly had either a military background or extensive experience in the Middle East, was that this would be very hard, would hold a myriad of bad surprises, and was almost certain to go worse than its proponents were saying.


Therefore, they said, the United States should do everything possible to avoid invading unless it had absolutely no choice. Wars should be fought only out of necessity; this, they warned, would be a war of choice, and folly.

The way I thought of the difference between those confidently urging on the war and those carefully cautioning against it was: cast of mind. The majority of people I spoke with expressed a bias against military actions that could never be undone, and whose consequences could last for generations. I also thought of it as a capacity for tragic imagination, of envisioning what could go wrong as vividly as one might dream of what could go right. (“Mission Accomplished!”)

Any cast of mind has its biases and blind spots. But I’m impressed, in thinking about the history I have lived through and the histories I have read, by how frequently people with personal experience of war have been cautious about launching future wars. This does not make them pacifists: Harry Truman, infantry veteran of World War I, decided to drop the atomic bomb. But Ulysses Grant, Dwight Eisenhower, Colin Powell (in most of his career, other than the Iraq-war salesmanship at the United Nations)—these were former commanding generals, cautious about committing troops to war. They had a tragic imagination of where war could lead and what it might mean.

What lesson do we end with? Inevitably, any lesson from the past will mismatch our future choices. The reasons not to invade Iraq 15 years ago are different from the risks to consider in launching a strike on North Korea or on Iran, or provoking China in some dispute in the East China Sea.”

https://www.theatlantic.com/international/archive/2018/03/iraq-war-anniversary-fifty-first-state/555986/

ABOUT THE AUTHOR: James Fallows is a national correspondent for The Atlantic and has written for the magazine since the late 1970s. Fallows won the National Magazine Award for his 2002 story “The Fifty-First State?,” warning about the consequences of invading Iraq, and has been a finalist four other times. He has also won the American Book Award for nonfiction for his book National Defense and a New York Emmy Award for the documentary series Doing Business in China. He was the founding chairman of the New America Foundation. His recent books Blind Into Baghdad (2006) and Postcards From Tomorrow Square (2009) are based on his writings for The Atlantic. His latest book is China Airborne (2012).
