Elvis Chidera

Thinking in Bets — Book summary

summary, book · 9 min read

This is my summary of the key ideas of Thinking in Bets written by Annie Duke.


One of the key insights of this book that resonated with me is being able to think and communicate probabilistically. In most cases, we don’t know something for a fact, but somehow we communicate as if we do.

Some feedback I recently gave to someone:

Going from “That doesn’t make sense” (or its other variants) to “I think that doesn’t make sense [because …]” (or its other variants) has a few advantages:

  • It’s a better model of your knowledge. There are uncertainties, things you don’t know, and variables you are not considering (see Chesterton's fence).
  • It invites a conversation that seeks an objective truth. The former statement is more likely to put the other person on the defensive or have them abandon the alternative regardless of its merit (because some people don’t want to say you are wrong, especially when you are a senior). Expressing uncertainty upfront removes the right-vs-wrong dichotomy.

  1. A bet is a decision about an uncertain future.
  2. This is not a book about poker strategy or gambling. It is, however, about things poker taught the author about learning and decision-making.
  3. The quality of one’s life is a function of skill (decision quality) and luck.
  4. Resulting: A term used by poker players to describe our tendency to equate the quality of a decision with the quality of its outcome.
  5. Test:
    • What was your best decision last year?
    • What was your worst decision last year?
    • For most people, the best decision preceded a good result and the worst decision preceded a bad result.
  6. Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable.
  7. Our brains evolved to create certainty and order:

    We are uncomfortable with the idea that luck plays a significant role in our lives. We recognize the existence of luck, but we resist the idea that, despite our best efforts, things might not work out the way we want. It feels better for us to imagine the world as an orderly place, where randomness does not wreak havoc and things are perfectly predictable. We evolved to see the world that way. Creating order out of chaos has been necessary for our survival.

  8. Psychologist Gary Marcus in his book, Kluge: The Haphazard Evolution of the Human Mind:
    • “Our thinking can be divided into two streams, one that is fast, automatic, and largely unconscious, and another that is slow, deliberate, and judicious.”
    • The first system, "the reflexive system, seems to do its thing rapidly and automatically, with or without our conscious awareness."
    • The second system, "the deliberative system ... deliberates, it considers, it chews over the facts."
  9. Game theory is the study of mathematical models of conflict and cooperation between intelligent rational decision-makers. — Roger Myerson

  10. von Neumann on Poker vs Chess:

    Chess is not a game. Chess is a well-defined form of computation. You may not be able to work out the answers, but in theory, there must be a solution, a right procedure in any position. Now, real games are not like that at all. Real life is not like that. Real life consists of bluffing, of little tactics of deception, of asking yourself what is the other man going to think I mean to do. And that is what games are about in my theory.

  11. Chess, for all its strategic complexity, isn't a great model for decision-making in life, where most of our decisions involve hidden information and a much greater influence of luck.
  12. Poker is a game of incomplete information. It is a game of decision-making under conditions of uncertainty over time.

    You could make the best possible decision at every point and still lose the hand because you don't know what new cards will be dealt and revealed.

  13. In chess, outcomes correlate more tightly with decision quality. In poker, it is much easier to get lucky and win, or get unlucky and lose.
  14. Don’t underestimate the amount and effect of what you don’t know:
    • Suppose someone says, "I flipped a coin and it landed heads four times in a row. How likely is that to occur?"
    • It would be a mistake to immediately start calculating based on the assumption of a fair coin.
    • You need information about the coin and/or the person flipping it (see the coin-flip sketch after the list).
  15. "Thoroughly conscious ignorance is the prelude to every real advance in science. — James Clerk Maxwell

  16. A great decision is the result of a good process (not outcome), and that process must include an attempt to accurately represent our state of knowledge. That state of knowledge, in turn, is some variation of "I'm not sure."
  17. "I'm not sure" does not mean that there is no objective truth. It means that we treat our beliefs as works in progress, as under construction.
  18. Decisions are bets on the future, and they aren't “right” or “wrong” based on whether they turn out well on any particular iteration. An unwanted result doesn't make our decision wrong if we thought about the alternatives and probabilities in advance and allocated our resources accordingly.
  19. We know from Daniel Kahneman and Amos Tversky's work on loss aversion that losses in general feel about two times as bad as wins feel good (see the loss-aversion sketch after the list).
  20. Whenever we choose an alternative, we are automatically rejecting every other possible choice.
  21. All decisions are bets. Not placing a bet on something is itself a bet.
  22. For survival-essential skills, type Ⅰ errors (false positives) were less costly than type Ⅱ errors (false negatives). In other words, better to be safe than sorry, especially when considering whether to believe that the rustling in the grass is a lion (see the expected-cost sketch after the list).
  23. We form beliefs without vetting most of them, and maintain them even after receiving clear, corrective information.
  24. Our pre-existing beliefs influence the way we experience the world.
  25. Motivated reasoning: we notice and seek out evidence confirming our belief, rarely challenge the validity of confirming evidence, and ignore or work hard to actively discredit information contradicting the belief.
  26. Disinformation is different from fake news in that the story has some true elements, embellished to spin a particular narrative.
  27. Being smart can make bias worse: the smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalizing and framing the data to fit your argument or point of view.
  28. Being asked "Wanna bet?" triggers us to:
    • Remember that our beliefs are works in progress.
    • Examine our information in a less biased way.
  29. We can train ourselves to view the world through the lens of "Wanna bet?":

    Once we start doing that, we are more likely to recognize that there is always a degree of uncertainty, that we are generally less sure than we thought we were, and that practically nothing is black and white (0% or 100%).

  30. We would be better served as communicators and decision-makers if we thought less about whether we are confident in our beliefs and more about how confident we are.
  31. Not much is ever certain. Samuel Arbesman's The Half-Life of Facts is a great read about how practically every fact we've ever known has been subject to revision or reversal.
  32. Incorporating uncertainty in the way we think about what we believe creates open-mindedness, moving us closer to a more objective stance toward information that disagrees with us.
  33. Statements of science are not of what is true and what is not true, but statements of what is known to different degrees of certainty[…] Every one of the concepts of science is on a scale graduated somewhere between, but at neither end of, absolute falsity or absolute truth. — Richard Feynman

  34. There is a big difference between getting experience and becoming an expert. That difference lies in the ability to identify when the outcomes of our decisions have something to teach us and what that lesson might be.
  35. The way we field outcomes is predictably patterned:
    • We take credit for the good stuff and
    • Blame the bad stuff on luck so it won't be our fault.
  36. Self-serving bias is the term for this pattern of fielding outcomes. The result is that we don't learn from experience well.
  37. The pattern is flipped when evaluating the performance of others. As Jean Cocteau said: "We must believe in luck. For how else can we explain the success of those we don't like?"
  38. Just as with motivated reasoning, self-serving bias arises from our drive to create a positive self-narrative.
  39. What accounts for most of the variance in happiness is how we're doing comparatively.
  40. Habits operate in a neurological loop consisting of three parts: the cue, the routine, and the reward.
  41. Charles Duhigg, in The Power of Habit, offers the golden rule of habit change — that the best way to deal with a habit is to respect the habit loop:

    To change a habit, you must keep the old cue, and deliver the old reward but insert a new routine.

  42. While a group can function to be better than the sum of the individuals, it doesn't automatically turn out that way. The group mustn’t turn into an echo chamber that exacerbates our tendency to confirm what we already believe.
  43. Groups can improve the thinking of individual decision-makers when the individuals are accountable to a group whose interest is in accuracy and open to a diversity of ideas.
  44. Our craving for approval is incredibly strong and incentivizing. In studies, participants changed their behavior merely because they expected to explain their actions to someone they'd never met and never expected to meet again.
  45. Accountability is a willingness or obligation to answer for our actions or beliefs to others. A bet is a form of accountability.
  46. Diversity and dissent are not only checks on fallibility, but the only means of testing the ultimate truth of an opinion.
  47. IQ is positively correlated with the number of reasons people find to support their side in an argument.
  48. Acceptance or rejection of an idea must not depend on the personal or social attributes of the protagonist.
  49. The accuracy of the statement should be evaluated independently of its source.
  50. If the group is blind to the outcome, it produces a higher-fidelity evaluation of decision quality. The group is less likely to succumb to ideological conflicts of interest when it doesn't know what the interest is.
  51. Another way a group can de-bias members is to reward them for skill in debating opposing points of view and finding merit in opposing positions.
  52. Skepticism is about approaching the world by asking why things might not be true rather than why they are true. It's a recognition that, while there is an objective truth, not everything we believe about the world is true.
  53. Thinking in bets embodies skepticism by encouraging us to examine what we do and don't know and what our level of confidence is in our beliefs and predictions. This moves us closer to what is objectively true.
  54. There are several ways to communicate to maximize our ability to engage in a truth-seeking way with anyone:
    • Express uncertainty: It invites everyone around us to share helpful information and dissenting opinions.
    • Lead with assent: For example, listen for the things you agree with, state those, and be specific, and then follow with "and" (supplementing) instead of "but" (negating).
    • Ask for a temporary agreement to engage in truth-seeking. If someone is offloading emotion to us, we can ask them if they are just looking to vent or if they are looking for advice: "Do you want to just let it all out, or are you thinking of what to do about it next?"
    • Focus on the future.
  55. When we make in-the-moment decisions (and don't ponder the past or future), we are more likely to be irrational and impulsive.
  56. This tendency we all have to favor our present self at the expense of our future self is called temporal discounting.
  57. Business journalist and author Suzy Welch developed a popular tool known as 10-10-10 that has the effect of bringing future-us into more of our in-the-moment decisions. “Every 10-10-10 process starts with a question. ... What are the consequences of each of my options in ten minutes? In ten months? In ten years?"
  58. The overestimation of the impact of any individual moment on our overall happiness is the emotional equivalent of watching the ticker in the financial world.
  59. Watching the ticker doesn't just magnify what has happened in the very recent past. It distorts our view of it as well.
  60. The way we field outcomes is path-dependent. It doesn't so much matter where we end up as how we got there. What has happened in the recent past drives our emotional response much more than how we are doing overall.
  61. Our feelings are not a reaction to the average of how things are going.
  62. Our in-the-moment emotions affect the quality of the decisions we make in those moments, and we are very willing to make decisions when we are not emotionally fit to do so.
  63. If you blow some recent event out of proportion and react drastically, you're on tilt.
  64. Most illustrations of Ulysses contracts, like the original, involve raising a barrier against irrationality. But these kinds of pre-commitment contracts can also be designed to lower barriers that interfere with rational action.
  65. When faced with highly uncertain conditions, military units and major corporations sometimes use an exercise called scenario planning. The idea is to consider a broad range of possibilities for how the future might unfold to help guide long-term planning and preparation. — Nate Silver

  66. Being able to respond to the changing future is a good thing; being surprised by the changing future is not.
  67. Scientists found that prospective hindsight — imagining that an event has already occurred — increases the ability to correctly identify reasons for future outcomes by 30%.
  68. The most common form of working backward from our goal to map out the future is known as backcasting. In backcasting, we imagine we've already achieved a positive outcome, then we think about how we got there.
  69. A premortem is an investigation into something awful, but before it happens.
  70. Backcasting and premortems complement each other. Backcasting imagines a positive future; a premortem imagines a negative future.
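
A rough sketch of point 14's coin-flip example (the candidate biases and the uniform prior below are my own illustrative assumptions, not from the book). Under the fair-coin assumption the arithmetic is trivial, but once you admit uncertainty about the coin, the same four heads should also update your belief about the coin itself.

```python
# Illustrative sketch: "how likely are four heads in a row?" depends on
# what you assume about the coin.

# Under the fair-coin assumption the answer is a one-liner:
p_fair = 0.5 ** 4  # = 1/16 = 0.0625

# If we are honest about not knowing the coin, we can put a simple prior
# over a few candidate biases and update it after seeing four heads.
candidate_bias = [0.25, 0.5, 0.75, 1.0]   # hypothetical values of P(heads)
prior = [0.25, 0.25, 0.25, 0.25]          # uniform prior (an assumption)

likelihood = [b ** 4 for b in candidate_bias]             # P(4 heads | bias)
evidence = sum(p * l for p, l in zip(prior, likelihood))  # P(4 heads)
posterior = [p * l / evidence for p, l in zip(prior, likelihood)]

print(f"P(4 heads | fair coin) = {p_fair:.4f}")
print(f"P(4 heads), averaged over our uncertainty about the coin = {evidence:.4f}")
for b, post in zip(candidate_bias, posterior):
    print(f"P(bias = {b} | 4 heads) = {post:.2f}")
```

After four heads, most of the belief shifts toward the coin being heads-heavy, which is exactly the information the fair-coin shortcut throws away.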
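
A minimal sketch of point 19's loss aversion, in the spirit of Kahneman and Tversky's value function (the factor of 2 comes from the summary above; the function shape and the $100 flip are my own simplification).

```python
# Illustrative sketch: losses weighted about twice as heavily as gains.

LOSS_AVERSION = 2.0  # "losses feel about two times as bad as wins feel good"

def felt_value(outcome: float) -> float:
    """Subjective value of an outcome relative to the status quo."""
    return outcome if outcome >= 0 else LOSS_AVERSION * outcome

# A fair coin flip for $100 is neutral in expected dollars...
expected_dollars = 0.5 * 100 + 0.5 * (-100)                         # = 0
# ...but feels like a losing proposition to a loss-averse decision-maker.
expected_feeling = 0.5 * felt_value(100) + 0.5 * felt_value(-100)   # = -50

print(expected_dollars, expected_feeling)
```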
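
A quick expected-cost sketch of point 22 (all numbers are made up for illustration): when a false negative is catastrophic and a false positive is cheap, treating even a faint rustle as a lion is the better bet.

```python
# Illustrative sketch: asymmetric error costs make "safe" beat "sorry".

p_lion = 0.01                   # small chance the rustle is a predator
cost_false_negative = 1000.0    # ignoring a real lion is catastrophic
cost_false_positive = 1.0       # fleeing needlessly just wastes energy

expected_cost_of_ignoring = p_lion * cost_false_negative  # = 10.0
expected_cost_of_fleeing = cost_false_positive            # = 1.0

# Fleeing wins whenever p_lion * cost_false_negative > cost_false_positive,
# i.e. for any p_lion above 0.1% with these numbers.
print(expected_cost_of_ignoring, expected_cost_of_fleeing)
```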

Once something occurs, we no longer think of it as probabilistic — or as ever having been probabilistic. This is how we get into the frame of mind where we say, "I should have known" or "I told you so." This is where unproductive regret comes from.

© 2024 by Elvis Chidera. All rights reserved.