I've recently reread the book *The Drunkard's Walk* by Leonard Mlodinow, which is about the many ways that probabilistic phenomena occur in daily life and the consequences for understanding individual & collective decisions. I say "reread" because the first time I read it was in high school (as I recall; I don't remember exactly when, though I did mention it in a review of a different book, saying then that I didn't finish it because I didn't find it as engaging as the book in that review); a few days ago, I happened to see it on top of a stack of books, and I figured it would be nice to reread for a few reasons. First, I have learned a lot of science, and my worldview has developed & matured a lot, since I was in high school, so I thought it would be good to see how this book would hold up in my view in that context. Second, I figured it would be nice to read a book about probabilistic phenomena, as that wasn't something I had to worry much about in my college studies or in my PhD work (which is a little ironic, given that van der Waals forces and radiative heat transfer are phenomena of statistical thermodynamic fluctuations, but it turns out that certain mathematical formulations hide all essential randomness). Third, it will be relevant to my postdoctoral work as I get more into travel surveys with associated statistical analysis. Fourth, concepts like the base rate fallacy are relevant to things like false positive rates for tests associated with this coronavirus (**please note that I am not a public health expert, and please consult governmental public health agencies for guidance with respect to this ongoing pandemic**). Finally, I've been thinking over the last several months about how many of the conceptual quandaries associated with quantum mechanics can actually be tied to the question of whether probability is emergent or fundamental.

The book is not too long, and it is a quick & engaging read. The author uses many interesting examples to motivate the discussion of fallacious reasoning in the context of probability, as well as the ways that probability enters daily life even in areas where people expect more determinism. There are also many interesting historical anecdotes about the development of probability theory, especially how ancient Greece and certain medieval European societies believed that any discussion of uncertainty would go against their conceptions of a pure & deterministic universe (whatever the prime mover might be). Also, in the tenth chapter, there is an interesting discussion of how the development of chaos theory is itself an example of the unpredictable & seemingly random nature of human life (though I didn't like the conflation of chaos theory with probability, as chaos is a separate mathematical phenomenon that can emerge in purely deterministic systems). In that chapter, I also appreciated how the author is careful to state that determinism is a bad model only of human behavior (at individual & societal levels), making no claim about the applicability of determinism to the universe at large, and how the author calls for humility and for rewarding people based on their character, instead of perpetuating beliefs that successful people are wholly responsible for their successes while people in marginalized circumstances are somehow rightfully being punished for past mistakes. Overall, I think the book does a good job of achieving its purpose of illustrating to lay readers how ubiquitous probabilistic phenomena are in even seemingly deterministic aspects of daily life.

Before getting into other criticisms, I should point out that my copy of this book has several printing errors (mostly missing words) and a few typographical errors, but these occurred maybe once every 10 pages (by my instinctive guess), so they didn't affect my understanding of the book. Also, the author errs in claiming that Germanic rule in the Dark Ages (commonly understood to be the early medieval period) preceded the ancient Roman civilization, but this again doesn't undercut the overall argument.

Where this book falls short is in fulfilling its purpose of diving deeper into the implications of such randomness for human behavior at individual and societal levels; the author's sloppy treatment of human behavior is a recurring problem throughout the book. In particular, a few related broad issues come up at various points. The first is the question of how to reconcile the apparent randomness of daily events believed to be deterministic (including the phenomenon of regression to the mean) with the real phenomenon of collective self-fulfilling prophecies (including emergent segregation of social groups that reinforces outcomes believed to be deterministic even if they are not, thereby reinforcing belief in determinism itself). The second is the treatment of things like superstitions as examples of self-fulfilling prophecies: even if the contents of superstitions have no direct effects, they may change mindsets enough to change outcomes. The author doesn't do a good job of addressing these issues throughout the book, and only partially acknowledges the power of self-fulfilling prophecies at the end (in the tenth chapter); the author makes it seem like a slow & methodical build-up to a satisfying conclusion, but frankly, the discussion of these issues could have been much clearer & more concise and could have come much sooner in the book. The discussion of superstition in particular is rife with condescension: the author never acknowledges how superstitions may change mindsets & lead to self-fulfilling prophecies, but instead summarily dismisses them as silly relics of a mostly bygone era, reinforced by statements about how science and religion were irreparably separated by the trial of Galileo, with no nuanced discussion of how religious beliefs (even if not organized religious institutions per se, to the extent that distinction held before Galileo) played a role in motivating scientific discoveries even after Galileo.
Another example is how the author glibly dismisses many claims of clusters of environmentally-caused cancer; it may well be true that some such claims arise from biased after-the-fact statistical analysis, but this dismissal doesn't really address why inequitable outcomes seem to occur so frequently in this context, and it doesn't do justice to the gravity of the problem. (**UPDATE**: I recognize that my argument against the author's treatment of the incidence of environmentally-caused cancer can easily be dismissed as an overly emotional reaction that is not justified by the statistics, so it is worth clarifying further. My concern is that the statistical arguments claiming that environmentally-caused cancer is not really a problem, and that those who claim it is a problem only do so by drawing arbitrary boundaries after the fact to inflate apparent concentrations of carcinogens in specific areas, may themselves be rife with the same sorts of bias that are perpetuated in situations like machine learning determining the provision of health care, but are cast in a way that seems "neutral" and therefore "superior" to "emotionally-driven" arguments.) Furthermore, although there is discussion both of the failures of superficial statistical arguments in favor of DNA testing in the criminal justice system and of the way that Bayesian analysis can systematically codify the learning of new information in terms of probabilities, there is little discussion of how these issues can combine in toxic ways to perpetuate existing societal biases under the veneer of formal Bayesian analysis (as occurs with machine learning now); I admit that I wouldn't have been thinking about this as much had I not read and reviewed the book *Weapons of Math Destruction* by Cathy O'Neil, but similar examples were already available at the time that the book reviewed in this post was being written.
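To make concrete the kind of Bayesian update at issue in the DNA testing discussion, here is a small sketch of my own (the numbers are hypothetical, not the book's): even a test with a tiny false-positive rate yields a surprisingly modest posterior probability when the prior (base rate) is low, which is exactly the trap of superficial statistical arguments.

```python
# Hypothetical numbers: a forensic test with a 1-in-10,000 false-positive
# rate, applied against a pool of 1,000,000 people of whom exactly one
# is the true source of the sample.
prior = 1 / 1_000_000          # Pr(source) before seeing the match
p_match_if_source = 1.0        # assume the test always matches the true source
p_match_if_not = 1 / 10_000    # false-positive rate

# Bayes' rule: Pr(source | match) = Pr(match | source) Pr(source) / Pr(match)
evidence = p_match_if_source * prior + p_match_if_not * (1 - prior)
posterior = p_match_if_source * prior / evidence

# A "1-in-10,000 error rate" does not mean 99.99% certainty of guilt;
# here the posterior is only about 1%.
print(posterior)
```

The point of the sketch is that the same machinery that corrects the prosecutor's fallacy can also launder a biased prior, which is the combination the book never examines.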
Throughout these discussions of human behavior, I see an essential condescension and lack of humility, masked by the pithy & irreverent writing, that are at odds with the author's own calls for humility & deeper understanding.

There are other aspects of human behavior that this book fails to adequately capture; these may technically be beyond the scope of this book, but I think they are worth noting anyway, as they speak to larger problems with the ability of people (even those well-trained in STEM fields) to really understand probability theory. The second chapter goes over many examples of how, in the technical language of probability, given events \( A \) and \( B \), certain questions can be framed such that laypeople and professional specialists (particularly doctors & lawyers) fall into the trap of believing that \( \operatorname{Pr}(A \cap B) > \operatorname{Pr}(A) \), even though mathematically \( \operatorname{Pr}(A \cap B) \le \operatorname{Pr}(A) \) always holds. However, I can already see that the phrasing of many of those questions, particularly the way that events \( A \) & \( B \) are juxtaposed (especially if \( B \) is additional information that may be relevant to the assessment of \( A \)), may make people believe either that what should be interpreted as \( \operatorname{Pr}(A) \) is actually \( \operatorname{Pr}(A \cap \neg B) \), in which case \( \operatorname{Pr}(A \cap B) > \operatorname{Pr}(A \cap \neg B) \) could in fact be true, or that what should be interpreted as \( \operatorname{Pr}(A \cap B) \) is actually the conditional probability \( \operatorname{Pr}(A|B) \), in which case \( \operatorname{Pr}(A|B) > \operatorname{Pr}(A) \) could in fact be true. This speaks more to the way that natural human language is unsuited to the subtleties of the language of probability theory; yet rather than address these possibilities, the author again leaves the discussion there, implying disdain for people who are too stupid to know better.
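To make the distinction concrete, here is a small sketch of my own (the joint distribution is hypothetical, chosen so that \( B \) is strong evidence for \( A \)), verifying that the conjunction can never exceed the marginal even while both of the alternative readings above can legitimately hold:

```python
# Hypothetical joint distribution over two binary events A and B,
# given as probabilities of the four elementary outcomes (A, B).
p = {
    (True, True): 0.30,    # A and B
    (True, False): 0.10,   # A and not B
    (False, True): 0.05,   # not A and B
    (False, False): 0.55,  # neither
}

p_A = p[(True, True)] + p[(True, False)]   # Pr(A) = 0.40
p_B = p[(True, True)] + p[(False, True)]   # Pr(B) = 0.35
p_A_and_B = p[(True, True)]                # Pr(A ∩ B) = 0.30
p_A_and_not_B = p[(True, False)]           # Pr(A ∩ ¬B) = 0.10
p_A_given_B = p_A_and_B / p_B              # Pr(A | B) ≈ 0.857

# The conjunction can never beat the marginal...
assert p_A_and_B <= p_A
# ...but both misreadings described above can be true statements:
assert p_A_and_B > p_A_and_not_B           # Pr(A ∩ B) > Pr(A ∩ ¬B)
assert p_A_given_B > p_A                   # Pr(A | B) > Pr(A)
```

So a respondent who silently reads the question as asking about \( \operatorname{Pr}(A|B) \) or \( \operatorname{Pr}(A \cap \neg B) \) is not being innumerate; they are answering a different, well-posed question.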
Another problem is that throughout the book, the author raises the question of how to determine whether a particular sequence of observed outcomes of a possibly random process reflects a specific probability distribution model, but never clearly explains how to do this in practice, instead only giving hints through various examples. This is related to the question of why one may prefer an explanation based on probabilities rather than one based on deterministic phenomena, particularly for small sample sizes. For this, I will give an example. Consider exactly 5 observations of an event with binary outcomes (either success or failure), for which no other observations are made, and in which success occurs every single time. Intuitively, laypeople might be inclined to believe that there is a deterministic cause of this, while if a probability theorist were to initially believe that this is consistent with a binomial distribution with \( (N, p) = (5, 0.6) \) but then later revise this to \( (N, p) = (5, 0.99) \), laypeople could reasonably wonder why this revision would be justified, and why the probability theorist refuses to believe in the possibility of some deterministic causal relationship. Of course, this is a contrived example; I understand why causation needs to be proved as an alternative to a null hypothesis, and I understand that probability distributions closer to uniform are favored as those that maximize entropy (which essentially means that, subject to certain known constraints, the probability distribution that best reflects the state of ignorance about a system is the one closest to uniform), but the author does not properly explain these points.
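The model comparison in my contrived example can be sketched numerically (this is my own illustration, not the book's, using the parameters above): comparing the likelihood of "5 successes out of 5" under the two candidate binomial models shows why the revision toward larger \( p \) is justified, and also why the deterministic explanation lurks at the boundary.

```python
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    """Probability of exactly k successes in n independent trials,
    each succeeding with probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Five observations, all successes.
n, k = 5, 5

lik_initial = binomial_pmf(k, n, 0.6)   # ≈ 0.078: data are possible but unlikely
lik_revised = binomial_pmf(k, n, 0.99)  # ≈ 0.951: data are highly likely

# The maximum-likelihood estimate here is p = 1 -- exactly the
# "deterministic" explanation -- so preferring any p < 1 requires an
# appeal to prior beliefs or to maximum-entropy reasoning, which is
# the step the book never spells out.
print(lik_initial, lik_revised)
```

The layperson's puzzlement is thus well-founded: on likelihood alone, the data favor determinism, and the probability theorist's reluctance rests on considerations the book only hints at.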
Finally, the broadest problem with this book is that the author only superficially acknowledges a deeper issue: if every calculation in probability theory or statistics, whether of a certain event happening, a string of events being a true "hot streak", or a model fitting data correctly, is itself a probability, then the aforementioned disconnect between the language of probability and natural human language makes it difficult to translate probabilities into robust rules for the deterministic (usually binary) decisions that laypeople must make. This is related to the idea in game theory that a single person playing a single-shot game cannot meaningfully play a mixed strategy; the concept of a mixed strategy only makes sense in the context of observing a large ensemble of independent players, possibly playing repeatedly. Perhaps asking the author to address this problem is too much, but I still feel that such failures diminish the book in comparison to its hype.

Without hyping my own credentials, I admit that my reaction to this book may be more a reflection of my greater experience with STEM, humanities, and social science fields and with science education/communication compared to when I was in high school. Furthermore, just as this book exhorts, I should not let myself be overcome by either positivity bias or negativity bias; it would only be fair to take the good & bad parts of this book together as appropriate, without believing that one outdoes or cancels the other. Given all this, I can't really make a strong recommendation that readers should or should not read this book.