My friends tell me they’re taking Economics 101 to gain some practical knowledge about the field, but I think they’d learn more about how people make decisions and respond to incentives by going to a college party. There, they would see real life cost-benefit analyses — and few of them would be rational, consistent or optimal, as their textbooks assume.
I’m not sure if Nobel Prize winner Daniel Kahneman conducted any of his research at college parties, but his new book, “Thinking, Fast and Slow,” in blurring the line between psychology and economics, reveals how unconscious biases lead people to systematically make irrational, sub-optimal decisions.
What Kahneman’s book does is invalidate the cold, calculating and emotionless persona that economists and policymakers have built institutions around. He says he hopes that the vocabulary he outlines to describe unconscious biases — the halo effect, representativeness, the illusion of validity, to give just a few examples — leads to more nuanced conversations about human judgment.
What his book doesn’t do, unfortunately, is help us correct these systematic biases. Even Kahneman himself still falls for the same biases he’s been studying for more than 40 years. The ancient maxim “know thyself,” while important, is simply not enough to change behavior. Being aware that we’re susceptible to priming, loss aversion or the halo effect doesn’t mean we can avoid them. Self-description is one thing; self-correction is something else entirely.
Yet some economists, like Richard Thaler and Cass Sunstein, refuse to settle for mere self-description. Their book, “Nudge,” proposes an interesting premise: Helping ourselves — changing habits, altering unconscious biases — on our own is very difficult. Our environments, however, can give us nudges that can compensate for these biases and help us make better decisions.
For example, want to encourage people to lose weight? Don’t show them how restaurants “prime” them to overeat. Just make the salad bar the first thing they see in the cafeteria. Want people to save more money? Don’t tell them why they’re overspending. Just enroll them in a 401(k) by default. Want students to have college aspirations? Don’t merely show them the statistics. Bring in real, live examples of people who’ve succeeded and who’ve failed because of their education choices.
Thaler and Sunstein call this designing of incentives “choice architecture.” The designs can be institutional, but they can also be personal. The application SelfControl, for instance, blocks users from distracting websites for a set period of time. The website Stickk helps users “stick” to their goals by attaching monetary stakes to them. The community of life hackers — a group that posts tips and “hacks,” everyday time-saving tricks, to be more productive — is so prolific that its members create life hacks to stop themselves from posting life hacks.
The applications and benefits of choice architecture are most illuminating, in my opinion, when considered alongside author David Foster Wallace’s commencement speech, “This Is Water.”
Wallace states that what your education really teaches you is not how to think but, rather, what to think. When you’re shopping in the supermarket late at night, and you’re lonely, tired and hungry, the ability to analyze classroom arguments isn’t going to save you. It’s going to be up to you to consciously choose not to be angry, not to give other people a hard time.
Choice architecture concedes that controlling what to think is a daunting task. As humans, we are way more influenced by our environments than we’d like to admit. So choice architecture aims to reverse the approach. Instead of merely encouraging people to, well, overcome their systematic tendencies to make mistakes, let’s change their environments to help them prevent themselves from making these mistakes in the first place.
So at the supermarket, you won’t stress over finding groceries; you’ll have your template list, with the aisle each item is on, readily available. You’ll shoot off your must-send e-mails just by speaking into your phone. And if you’re angry, your phone will sense that and confirm that you still want to send that passive-aggressive message.
In effect, there are two different sciences here. Kahneman’s science, the descriptive science, is an extensive study of the systematic flaws our intuition commits. Knowledge of this science, however, is not enough to change behavior. Choice architecture, the prescriptive science, aims to fill the gap left by descriptive science — to design institutions and choice structures that compensate for our biases and help us make better decisions.
Kahneman has just compiled more than 40 years of descriptive science research. For prescriptive science, Thaler and Sunstein’s book was just the beginning.
If Kahneman is like the Lewis and Clark of the human mind, whoever masters the prescriptive science will be like Neil Armstrong.
Erik Torenberg can be reached at firstname.lastname@example.org.