For me, one of the best parts of a holiday is the time sitting by the pool or at the beach (and less happily the airport) catching up on some serious reading. If there’s one profession that requires an insatiable capacity to read it is information security. I find it hard to keep up with the constant stream of journals, email newsletters, legislation, reports, blog updates and white papers to read and I don’t find enough time for books. On holiday I leave the laptop at home, and fill a suitcase with my reading pile.
One book I read this summer was Dan Gardner’s Risk – The Science and Politics of Fear. It is a well-written examination of the psychology of risk and fear, and of how we perceive risk. Bruce Schneier recommended it a few months ago and I wholeheartedly second his recommendation. A large part of our business is about assessing risk and then communicating that assessment to other people. Risk explains the ways our risk assessments are subconsciously biased, and why people don’t always buy into our risk assessments.
The book has three parts:
In the first, Gardner condenses the academic research into risk perception. He explains the heuristics (rules of thumb) and biases that our unconscious mind (Gardner calls this ‘gut’) uses to assess risk, and how these affect the conscious, reason-based decisions we’d like to think our conscious mind (‘head’) makes.
The second part of the book examines how governments, corporations and the media have used fear as an influencing tool to take advantage of these unconscious biases.
Finally, Gardner looks at how these biases and heuristics affect our modern-day, risk-assessed world in the areas of crime, the environment and terrorism.
Why is this book useful to an information security professional?
All security professionals have to assess risk; it is the essence of what we do. So gaining an insight into how our own risk assessments can be hijacked is really useful. Understanding how other people can (intentionally or otherwise) manipulate our risk assessments, and those of our colleagues and managers, is even more valuable. Here are a few examples:
- I don’t know exactly how many new pieces of malware are produced each day, but because I’ve heard the figure 10,000 quite a few times, my personal ‘guess’ will be biased towards 10,000, not 10. This is the anchoring and adjustment heuristic. If you tell me that the real figure is 7,000 I’ll accept it as ‘about right’; if you tell me it’s really 1,000 I probably won’t believe you, because my unconscious has anchored on ‘around 10,000’.
- The more stories appear in the news about companies losing personal data, the more likely we – and our colleagues – think this is to happen in our own organisation. This is the availability heuristic at work: the easier it is for our unconscious mind to recall an example of an event, the more we overestimate the likelihood of that event occurring. The opposite holds too, of course: the harder it is to recall an example of an event, the more we underestimate its likelihood.
- There’s also the affect heuristic, which describes how people’s assessments of the probability and impact of events are biased by whether their gut perception of the event is emotionally good or emotionally bad. Data theft is bad, so the affect heuristic means that our unconscious mind, and those of our colleagues and managers, instinctively overestimates the probability of data theft occurring.
Reading Risk – The Science and Politics of Fear will help you to be a more complete security professional. The next time you sit in a meeting arguing that something bad is extremely unlikely to occur, you will understand why you’re the only person in the room with that opinion. More importantly, you’ll know what to do to move everyone at the table from an instinctive to a rational approach to the risk assessment.