Thinking in Bets: Making Smarter Decisions When You Don't Have All the Facts

Annie Duke, 1965-

Book - 2018

"Poker champion turned business consultant Annie Duke teaches you how to get comfortable with uncertainty and make better decisions as a result. In Super Bowl XLIX, Seahawks coach Pete Carroll made one of the most controversial calls in football history: With 26 seconds remaining, and trailing by four at the Patriots' one-yard line, he called for a pass instead of a hand off to his star running back. The pass was intercepted and the Seahawks lost. Critics called it the dumbest play in history. But was the call really that bad? Or did Carroll actually make a great move that was ruined by bad luck? Even the best decision doesn't yield the best outcome every time. There's always an element of luck that you can't contro...l, and there is always information that is hidden from view. So the key to long-term success (and avoiding worrying yourself to death) is to think in bets: How sure am I? What are the possible ways things could turn out? What decision has the highest odds of success? Did I land in the unlucky 10% on the strategy that works 90% of the time? Or is my success attributable to dumb luck rather than great decision making? Annie Duke, a former World Series of Poker champion turned business consultant, draws on examples from business, sports, politics, and (of course) poker to share tools anyone can use to embrace uncertainty and make better decisions. For most people, it's difficult to say "I'm not sure" in a world that values and, even, rewards the appearance of certainty. But professional poker players are comfortable with the fact that great decisions don't always lead to great outcomes and bad decisions don't always lead to bad outcomes. By shifting your thinking from a need for certainty to a goal of accurately assessing what you know and what you don't, you'll be less vulnerable to reactive emotions, knee-jerk biases, and destructive habits in your decision making. You'll become more confident, calm, compassionate and successful in the long run"--



1 / 1 copies available
Location Call Number   Status
2nd Floor 658.40353/Duke Checked In
Published
New York : Portfolio, 2018.
Language
English
Main Author
Annie Duke, 1965- (author)
Physical Description
pages cm
Bibliography
Includes bibliographical references and index.
ISBN
9780735216358
  • Introduction
  • Chapter 1. Life Is Poker, Not Chess
  • Pete Carroll and the Monday Morning Quarterbacks
  • The hazards of resulting
  • Quick or dead: our brains weren't built for rationality
  • Two-minute warning
  • Dr. Strangelove
  • Poker vs. chess
  • A lethal battle of wits
  • "I'm not sure": using uncertainty to our advantage
  • Redefining wrong
  • Chapter 2. Wanna Bet?
  • Thirty days in Des Moines
  • We've all been to Des Moines
  • All decisions are bets
  • Most bets are bets against ourselves
  • Our bets are only as good as our beliefs
  • Hearing is believing
  • "They saw a game"
  • The stubbornness of beliefs
  • Being smart makes it worse
  • Wanna bet?
  • Redefining confidence
  • Chapter 3. Bet to Learn: Fielding the Unfolding Future
  • Nick the Greek, and other lessons from the Crystal Lounge
  • Outcomes are feedback
  • Luck vs. skill: fielding outcomes
  • Working backward is hard: the SnackWell's Phenomenon
  • "If it weren't for luck, I'd win every one"
  • All-or-nothing thinking rears its head again
  • People watching
  • Other people's outcomes reflect on us
  • Reshaping habit
  • "Wanna bet?" redux
  • The hard way
  • Chapter 4. The Buddy System
  • "Maybe you're the problem, do you think?"
  • The red pill or the blue pill?
  • Not all groups are created equal
  • The group rewards focus on accuracy
  • "One Hundred White Castles ... and a large chocolate shake": how accountability improves decision-making
  • The group ideally exposes us to a diversity of viewpoints
  • Federal judges: drift happens
  • Social psychologists: confirmatory drift and Heterodox Academy
  • Wanna bet (on science)?
  • Chapter 5. Dissent to Win
  • CUDOS to a magician
  • Mertonian communism: more is more
  • Universalism: don't shoot the message
  • Disinterestedness: we all have a conflict of interest, and it's contagious
  • Organized skepticism: real skeptics make arguments and friends
  • Communicating with the world beyond our group
  • Chapter 6. Adventures in Mental Time Travel
  • Let Marty McFly run into Marty McFly
  • Night Jerry
  • Moving regret in front of our decisions
  • A flat tire, the ticker, and a zoom lens
  • "Yeah, but what have you done for me lately?"
  • Tilt
  • Ulysses contracts: time traveling to precommit
  • Decision swear jar
  • Reconnaissance: mapping the future
  • Scenario planning in practice
  • Backcasting: working backward from a positive future
  • Premortems: working backward from a negative future
  • Dendrology and hindsight bias (or, Give the chainsaw a rest)
  • Acknowledgments
  • Notes
  • Selected Bibliography and Recommendations for Further Reading
  • Index
Booklist Review

Duke (Decide to Play Great Poker, 2011) brings together her two areas of expertise, championship poker and cognitive psychology, in this practical guide to making smarter decisions. By treating decisions as bets, she writes, we can more easily recognize the learning opportunities in uncertain situations throughout our lives. In six concise chapters, she presents practical strategies for improved decision making, such as separating outcomes from decisions, acknowledging uncertainty, becoming less reactive and less emotional decision makers, and cultivating groups of fellow truth-seekers to help us overcome our natural biases. Duke is as comfortable with popular culture as she is with psychology and behavioral economics, and she illustrates her points with examples from professional sports, popular films, politics, and business while also providing overviews of scholarly topics from game theory to groupthink. Concise, practical, and accessible, this book will be of interest to all those looking to improve their decision making in their personal or professional lives.--Harmon, Lindsay. Copyright 2018 Booklist.

From Booklist, Copyright (c) American Library Association. Used with permission.

Chapter 1. Life Is Poker, Not Chess

Pete Carroll and the Monday Morning Quarterbacks

One of the most controversial decisions in Super Bowl history took place in the closing seconds of Super Bowl XLIX in 2015. The Seattle Seahawks, with twenty-six seconds remaining and trailing by four points, had the ball on second down at the New England Patriots' one-yard line. Everybody expected Seahawks coach Pete Carroll to call for a handoff to running back Marshawn Lynch. Why wouldn't you expect that call? It was a short-yardage situation and Lynch was one of the best running backs in the NFL.

Instead, Carroll called for quarterback Russell Wilson to pass. New England intercepted the ball, winning the Super Bowl moments later. The headlines the next day were brutal:

USA Today: "What on Earth Was Seattle Thinking with Worst Play Call in NFL History?"
Washington Post: "'Worst Play-Call in Super Bowl History' Will Forever Alter Perception of Seahawks, Patriots"
FoxSports.com: "Dumbest Call in Super Bowl History Could Be Beginning of the End for Seattle Seahawks"
Seattle Times: "Seahawks Lost Because of the Worst Call in Super Bowl History"
The New Yorker: "A Coach's Terrible Super Bowl Mistake"

Although the matter was considered by nearly every pundit as beyond debate, a few outlying voices argued that the play choice was sound, if not brilliant. Benjamin Morris's analysis on FiveThirtyEight.com and Brian Burke's on Slate.com convincingly argued that the decision to throw the ball was totally defensible, invoking clock-management and end-of-game considerations. They also pointed out that an interception was an extremely unlikely outcome. (Out of sixty-six passes attempted from an opponent's one-yard line during the season, zero had been intercepted. In the previous fifteen seasons, the interception rate in that situation was about 2%.)

Those dissenting voices didn't make a dent in the avalanche of criticism directed at Pete Carroll. Whether or not you buy into the contrarian analysis, most people didn't want to give Carroll the credit for having thought it through, or having any reason at all for his call. That raises the question: Why did so many people so strongly believe that Pete Carroll got it so wrong?

We can sum it up in four words: the play didn't work.

Take a moment to imagine that Wilson completed the pass for a game-winning touchdown. Wouldn't the headlines change to "Brilliant Call" or "Seahawks Win Super Bowl on Surprise Play" or "Carroll Outsmarts Belichick"? Or imagine the pass had been incomplete and the Seahawks scored (or didn't) on a third- or fourth-down running play. The headlines would be about those other plays. What Pete Carroll called on second down would have been ignored.

Carroll got unlucky. He had control over the quality of the play-call decision, but not over how it turned out. It was exactly because he didn't get a favorable result that he took the heat. He called a play that had a high percentage of ending in a game-winning touchdown or an incomplete pass (which would have allowed two more plays for the Seahawks to hand off the ball to Marshawn Lynch). He made a good-quality decision that got a bad result.

Pete Carroll was a victim of our tendency to equate the quality of a decision with the quality of its outcome. Poker players have a word for this: "resulting."
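To make the base-rate point above concrete, here is a minimal sketch (not from the book) that replays the same call many times. The roughly 2% interception rate is the historical figure the excerpt cites for passes from an opponent's one-yard line; the trial count and the framing are assumptions chosen only for illustration.

    import random

    # Illustrative sketch, not from the book. The ~2% interception rate is the
    # figure the excerpt cites for goal-line passes; the number of replays is
    # an arbitrary choice for demonstration.
    P_INTERCEPTION = 0.02
    TRIALS = 100_000

    interceptions = sum(random.random() < P_INTERCEPTION for _ in range(TRIALS))

    print(f"Replaying the identical call {TRIALS:,} times: "
          f"{interceptions:,} interceptions ({interceptions / TRIALS:.1%}).")
    # The same decision produces the disastrous outcome roughly 1 time in 50,
    # so judging the call purely by the result ("resulting") conflates an
    # unlucky draw with a poor decision process.

The point is not the specific numbers but that a decision with a small, nonzero failure rate will sometimes fail even when the process behind it was sound.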
When I started playing poker, more experienced players warned me about the dangers of resulting, cautioning me to resist the temptation to change my strategy just because a few hands didn't turn out well in the short run.

Pete Carroll understood that his universe of critics was guilty of resulting. Four days after the Super Bowl, he appeared on the Today show and acknowledged, "It was the worst result of a call ever," adding, "The call would have been a great one if we catch it. It would have been just fine, and nobody would have thought twice about it."

Why are we so bad at separating luck and skill? Why are we so uncomfortable knowing that results can be beyond our control? Why do we create such a strong connection between results and the quality of the decisions preceding them? How can we avoid falling into the trap of the Monday Morning Quarterback, whether it is in analyzing someone else's decision or in making and reviewing the decisions in our own lives?

The hazards of resulting

Take a moment to imagine your best decision in the last year. Now take a moment to imagine your worst decision. I'm willing to bet that your best decision preceded a good result and the worst decision preceded a bad result.

That is a safe bet for me because resulting isn't just something we do from afar. Monday Morning Quarterbacks are an easy target, as are writers and bloggers providing instant analysis to a mass audience. But, as I found out from my own experiences in poker, resulting is a routine thinking pattern that bedevils all of us. Drawing an overly tight relationship between results and decision quality affects our decisions every day, potentially with far-reaching, catastrophic consequences.

When I consult with executives, I sometimes start with this exercise. I ask group members to come to our first meeting with a brief description of their best and worst decisions of the previous year. I have yet to come across someone who doesn't identify their best and worst results rather than their best and worst decisions.

In a consulting meeting with a group of CEOs and business owners, one member of the group identified firing the president of his company as his worst decision. He explained, "Since we fired him, the search for a replacement has been awful. We've had two different people on the job. Sales are falling. The company's not doing well. We haven't had anybody come in who actually turns out to be as good as he was."

That sounds like a disastrous result, but I was curious to probe into why the CEO thought the decision to fire his president was so bad (other than that it didn't work out). He explained the decision process and the basis of the conclusion to fire the president.

"We looked at our direct competitors and comparable companies, and concluded we weren't performing up to their level. We thought we could perform and grow at that level and that it was probably a leadership issue."

I asked whether the process included working with the president to understand his skill gaps and what he could be doing better. The company had, indeed, worked with him to identify his skill gaps. The CEO hired an executive coach to work with him on improving his leadership skills, the chief weakness identified. In addition, after executive coaching failed to produce improved performance, the company considered splitting the president's responsibilities, having him focus on his strengths and moving other responsibilities to another executive.
They rejected that idea, concluding that the president's morale would suffer, employees would likely perceive it as a vote of no confidence, and it would put extra financial pressure on the company to split a position they believed one person could fill.

Finally, the CEO provided some background about the company's experience making high-level outside hires and its understanding of the available talent. It sounded like the CEO had a reasonable basis for believing they would find someone better.

I asked the assembled group, "Who thinks this was a bad decision?" Not surprisingly, everybody agreed the company had gone through a thoughtful process and made a decision that was reasonable given what they knew at the time. It sounded like a bad result, not a bad decision.

The imperfect relationship between results and decision quality devastated the CEO and adversely affected subsequent decisions regarding the company. The CEO had identified the decision as a mistake solely because it didn't work out. He obviously felt a lot of anguish and regret because of the decision. He stated very clearly that he thought he should have known that the decision to fire the president would turn out badly. His decision-making behavior going forward reflected the belief that he made a mistake.

He was not only resulting but also succumbing to its companion, hindsight bias. Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable. When we say, "I should have known that would happen," or, "I should have seen it coming," we are succumbing to hindsight bias. Those beliefs develop from an overly tight connection between outcomes and decisions. That is typical of how we evaluate our past decisions.

Like the army of critics of Pete Carroll's decision to pass on the last play of the Super Bowl, the CEO had been guilty of resulting, ignoring his (and his company's) careful analysis and focusing only on the poor outcome. The decision didn't work out, and he treated that result as if it were an inevitable consequence rather than a probabilistic one.

In the exercise I do of identifying your best and worst decisions, I never seem to come across anyone who identifies a bad decision where they got lucky with the result, or a well-reasoned decision that didn't pan out. We link results with decisions even though it is easy to point out indisputable examples where the relationship between decisions and results isn't so perfectly correlated. No sober person thinks getting home safely after driving drunk reflects a good decision or good driving ability. Changing future decisions based on that lucky result is dangerous and unheard of (unless you are reasoning this out while drunk and obviously deluding yourself).

Yet this is exactly what happened to that CEO. He changed his behavior based on the quality of the result rather than the quality of the decision-making process. He decided he drove better when he was drunk.

Quick or dead: our brains weren't built for rationality

The irrationality displayed by Pete Carroll's critics and the CEO should come as no surprise to anyone familiar with behavioral economics. Thanks to the work of many brilliant psychologists, economists, cognitive researchers, and neuroscientists, there are a number of excellent books that explain why humans are plagued by certain kinds of irrationality in decision-making. (If you are unaware of these books, see the Selected Bibliography and Recommendations for Further Reading.) But here's a summary.
To start, our brains evolved to create certainty and order. We are uncomfortable with the idea that luck plays a significant role in our lives. We recognize the existence of luck, but we resist the idea that, despite our best efforts, things might not work out the way we want. It feels better for us to imagine the world as an orderly place, where randomness does not wreak havoc and things are perfectly predictable. We evolved to see the world that way. Creating order out of chaos has been necessary for our survival.

When our ancestors heard rustling on the savanna and a lion jumped out, making a connection between "rustling" and "lions" could save their lives on later occasions. Finding predictable connections is, literally, how our species survived. Science writer, historian, and skeptic Michael Shermer, in The Believing Brain, explains why we have historically (and prehistorically) looked for connections even if they were doubtful or false. Incorrectly interpreting rustling from the wind as an oncoming lion is called a type I error, a false positive. The consequences of such an error were much less grave than those of a type II error, a false negative. A false negative could have been fatal: hearing rustling and always assuming it's the wind would have gotten our ancestors eaten, and we wouldn't be here.

Seeking certainty helped keep us alive all this time, but it can wreak havoc on our decisions in an uncertain world. When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer. We will pound a lot of square pegs into round holes to maintain the illusion of a tight relationship between our outcomes and our decisions.

Different brain functions compete to control our decisions. Nobel laureate and psychology professor Daniel Kahneman, in his 2011 best-selling Thinking, Fast and Slow, popularized the labels of "System 1" and "System 2." He characterized System 1 as "fast thinking." System 1 is what causes you to hit the brakes the instant someone jumps into the street in front of your car. It encompasses reflex, instinct, intuition, impulse, and automatic processing. System 2, "slow thinking," is how we choose, concentrate, and expend mental energy. Kahneman explains how System 1 and System 2 are capable of dividing and conquering our decision-making but work mischief when they conflict.

I particularly like the descriptive labels "reflexive mind" and "deliberative mind" favored by psychologist Gary Marcus. In his 2008 book, Kluge: The Haphazard Evolution of the Human Mind, he wrote, "Our thinking can be divided into two streams, one that is fast, automatic, and largely unconscious, and another that is slow, deliberate, and judicious." The first system, "the reflexive system, seems to do its thing rapidly and automatically, with or without our conscious awareness." The second system, "the deliberative system . . . deliberates, it considers, it chews over the facts."

The differences between the systems are more than just labels. Automatic processing originates in the evolutionarily older parts of the brain, including the cerebellum, basal ganglia, and amygdala. Our deliberative mind operates out of the prefrontal cortex.
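As a side note on the type I / type II asymmetry described a few paragraphs above, the following minimal sketch (not from the book) compares the expected cost of two extreme strategies; the lion probability and the two cost figures are hypothetical numbers chosen only to encode "a false negative is far costlier than a false positive."

    # Illustrative sketch, not from the book. All numbers are hypothetical;
    # they only encode "ignoring a real lion (false negative) is far more
    # costly than fleeing from wind (false positive)."
    p_lion = 0.05                 # assumed chance the rustling really is a lion
    cost_false_positive = 1.0     # type I error: wasted energy fleeing from wind
    cost_false_negative = 1000.0  # type II error: ignoring a real lion

    # Expected cost per rustle for two extreme strategies
    always_flee = (1 - p_lion) * cost_false_positive   # errs only via false positives
    always_dismiss = p_lion * cost_false_negative      # errs only via false negatives

    print(f"Always assume lion: expected cost {always_flee:.2f}")
    print(f"Always assume wind: expected cost {always_dismiss:.2f}")
    # Even with a small p_lion, the dismissive strategy loses badly, which is
    # the evolutionary pressure toward pattern-seeking that the excerpt describes.

Under any plausible assignment of these costs, the cautious, pattern-seeking strategy wins, which is the excerpt's point about why our brains tolerate false positives.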
Colin Camerer, a professor of behavioral economics at Caltech and leading speaker and researcher on the intersection of game theory and neuroscience, explained to me the practical folly of imagining that we could just get our deliberative minds to do more of the decision-making work. "We have this thin layer of prefrontal cortex made just for us, sitting on top of this big animal brain. Getting this thin little layer to handle more is unrealistic." The prefrontal cortex doesn't control most of the decisions we make every day. We can't fundamentally get more out of that unique, thin layer of prefrontal cortex. "It's already overtaxed," he told me. These are the brains we have and they aren't changing anytime soon.

Making more rational decisions isn't just a matter of willpower or consciously handling more decisions in deliberative mind. Our deliberative capacity is already maxed out. We don't have the option, once we recognize the problem, of merely shifting the work to a different part of the brain, as if you hurt your back lifting boxes and shifted to relying on your leg muscles.

Both deliberative and reflexive mind are necessary for our survival and advancement. The big decisions about what we want to accomplish recruit the deliberative system. Most of the decisions we execute on the way to achieving those goals, however, occur in reflexive mind. The shortcuts built into the automatic processing system kept us from standing around on the savanna, debating the origin of a potentially threatening sound while its source devoured us. Those shortcuts keep us alive, routinely executing the thousands of decisions that make it possible for us to live our daily lives.

Excerpted from Thinking in Bets. All rights reserved by the original copyright owners. Excerpts are provided for display purposes only and may not be reproduced, reprinted or distributed without the written permission of the publisher.