One of the best books I’ve read provides a powerful lesson in the pitfalls that come from great intelligence and ability. David Halberstam’s “The Best and the Brightest” tells the story of certifiably brilliant people who took the United States to war, dug in deeper as things got worse and never internalized messages coming from the front — messages that might have weakened their resolve to send Americans to die in a far-off country.
Halberstam’s book is about the Vietnam War. Though a work of history, it — like other great histories of conflict — serves as a parable. The men involved — they were all men back then — included McGeorge Bundy, former dean of the Faculty of Arts and Sciences at Harvard University; Robert McNamara, a Harvard Business School graduate and former Ford Motor Co. president; a young Harvard law professor named John McNaughton and Daniel Ellsberg, the Harvard-educated aide to McNamara whose eventual publication of thousands of secret documents proved that deception of the American public had been central to the war’s progress. President John F. Kennedy, who appointed these men, was himself a Harvard graduate.
The lesson of “The Best and the Brightest” is that mental agility can be a trap. Smart people can fit new information into the narratives they have already constructed — because they are so good at fitting ideas together, it can be exceedingly difficult for them to recognize when facts call assumptions into question. A person with great talent for abstraction can fit round pegs into square holes by proving to himself or herself that the peg is square, or the hole is round, or that both are somewhere in between.
When commonly held assumptions shape quick thinkers’ perceptions, they, more than others, have the ability to protect those assumptions from facts. The hardest person to convince that, say, the domino theory of international communism might not make sense is the person with the mental dexterity to show that evidence against the theory is in fact no evidence at all. Yet at the same time, such a mind is fully capable of switching from one set of assumptions to another and, being comfortable with abstractions, may find it hard to recognize which narrative is actually worth pursuing. It is easy to slide from knowing that the peg is self-evidently round to knowing that the peg is self-evidently square, and someone who can do that has a hard time knowing when it’s important to choose a firm conclusion and stick up for it.
Thus one reads of McNaughton, “No one in the high levels of government in 1964 had greater and more profound doubts about the wisdom of the policy the nation was following in Vietnam, and no one argued more forcefully with his immediate superior against the particular course. And having lost that argument, when someone else … made the same points … no one tore those arguments apart more ferociously.”
Qualities that help people reach high office sometimes leave them unable to bring a critical eye to their own positions. Those whose approaches to problem solving have brought them success often lack the ability to question whether those approaches are appropriate for the task at hand. It is, after all, their mastery of procedures that brings them to positions of public trust.
Former Defense Secretary Robert McNamara is notorious for spending the Vietnam War at his desk, poring over statistical reports, unable — because of his training and success — to ask whether the numbers measured anything useful, or were even honest. They weren’t. When he finally understood that, he became one of the strongest voices within the administration speaking out against the war. Yet it was his decisions, made on premises he’d failed to question, that created the quagmire in the first place.
The most important questions are about assumptions. Without an understanding of the premises on which a decision is being made, it is almost impossible to sense whether the narratives you’ve imposed have in turn limited your ability to see other possible outcomes. The tragedy of “The Best and the Brightest” is that brilliant men failed to see how their own abilities could keep them from asking the right questions.
Those of you who hope someday to make important decisions owe it to yourselves — and to the people whose lives your decisions will affect — to read this book, think deeply about it and learn to question your own assumptions. In that endeavor, the best and brightest among you will have to work the hardest.
Seth Soderborg can be reached at sethns@umich.edu. Follow him on Twitter at @thedailyseth.