Few people would argue with the statement that math is at the heart of most of our modern world. What is less well understood is what happens when that math goes wrong. And it does. All the time!

Mr. Parker’s highly amusing and thought-provoking book is about math and computers, but what becomes clearer as the book goes on is that this is also a book about systems and how and why systems can fail. There are lots of examples of people adding up numbers incorrectly or taking shortcuts to make the math simpler, which in turn leads to devastating and sometimes lethal consequences. However, it is in the subtler applications of mathematics that “Humble Pi” really scores.

For example, looking at a 30- or 40-year-old kitchen appliance that is still in use is often accompanied by a phrase such as “they don’t make things today like they used to.” While this might seem obvious at first glance, given that we are talking about an appliance working well beyond its expected lifespan, it is actually an example of “Survivor Bias.” If we looked at how many of these appliances had been manufactured, and then at how many were still in daily use, the chances are we would recognize that this surviving appliance is an outlier and that the vast majority have long since been replaced or broken down. It is only the existence of this surviving outlier that prompts the idea; had more of the appliances survived, we would likely not comment on it at all. The appliance’s rarity generates a false narrative, one that can only be corrected by understanding the underlying math: how many appliances were produced versus how many remain.

For managers there is much to take away from “Humble Pi.” Mr. Parker encourages us to look at systems like layers of sliced Swiss cheese. Every system should be made of multiple layers – the checks and balances of any good system. But it is important to understand that there are possibilities for mistakes in every layer – the holes in the cheese. The challenge for designers of systems is to ensure that the holes in each layer do not align. The author uses the example of two different nurses in a hospital performing a complicated drug calculation the same way, both making the same math mistake, which led to a medical error.

Related to this idea of errors being a natural part of a system is the impact that a lack of tolerance for errors has on new employee training. If managers terminate employees for making mistakes, the people who are left to train new employees are those who are much less likely to make mistakes. These are probably the worst people to train new employees, who are naturally more prone to making mistakes. If instead we teach employees to work within a system that can detect mistakes and provide feedback, a system where the holes do not line up, then we will have far fewer mistakes overall, even when people are new. As the book says, humans can be very resourceful in finding ways to make mistakes.

This is not just a book about rounding errors and why you should turn your computer off regularly. It is a book about what it means to be human in a world that relies on, and is built upon, mathematics, something humans are inherently not very good at. It is a fun and interesting read that will stay with you long after you put it down.