For me, one of the best parts of a holiday is the time sitting by the pool or at the beach (and less happily the airport) catching up on some serious reading. If there’s one profession that requires an insatiable capacity to read it is information security. I find it hard to keep up with the constant stream of journals, email newsletters, legislation, reports, blog updates and white papers to read and I don’t find enough time for books. On holiday I leave the laptop at home, and fill a suitcase with my reading pile.
One book I read this summer was Dan Gardner’s Risk – The Science and Politics of Fear, a well-written examination of the psychology of risk and fear and of how we perceive risk. Bruce Schneier recommended it a few months ago and I wholeheartedly second his recommendation. A large part of our business is about assessing risk and then communicating that assessment to other people. Risk explains the ways our risk assessments are subconsciously biased, and why people don’t always buy into them.
The book has three parts:
In the first, Gardner condenses the academic research into risk perception. He explains the heuristics (rules-of-thumb) and biases that our unconscious mind (Gardner calls this ‘gut’) uses to assess risk, and how these affect the reason-based decisions we’d like to think our conscious mind (‘head’) takes.
The second part of the book examines how governments, corporations and the media have used fear as an influencing tool to take advantage of these unconscious biases.
Finally Gardner looks at how these biases and heuristics affect our modern-day, risk-assessed world in the areas of crime, the environment and terrorism.
Why is this book useful to an information security professional?
All security professionals have to assess risk, it is the essence of what we do. So gaining an insight into how our own risk assessments can be hijacked is really useful. Understanding how other people can (intentionally or otherwise) manipulate our risk assessments and those of our colleagues and managers is even more valuable. Here are a few examples:
I don’t know exactly how many new pieces of malware are produced each day, but because I’ve heard the figure 10,000 quite a few times, my personal ‘guess’ will be biased towards 10,000, not 10. This is the anchoring and adjustment heuristic. If you tell me that the real figure is 7,000 I’ll accept it as ‘about right’; if you tell me it’s really 1,000 I probably won’t believe you, because my unconscious has anchored on ‘around 10,000’.
The more times stories appear in the news about companies losing personal data, the more likely we – and our colleagues – think this will happen in our organisation. This is the availability heuristic at work: the easier it is for our unconscious mind to recall an example of an event, the more we overestimate the likelihood of that event occurring. The opposite also holds, of course – the harder it is for us to recall an example of an event, the more we underestimate its likelihood.
There’s also the affect heuristic, which describes how people’s assessment of the probability and impact of events is biased, based on whether their gut perception of the event is emotionally good or emotionally bad. Data theft is bad, so the affect heuristic means that our unconscious mind, and those of our colleagues and managers, instinctively overestimates the probability of data theft occurring.
Reading Risk – The Science and Politics of Fear provides information that will help you to be a more complete security professional. The next time you sit in a meeting arguing that something bad is extremely unlikely to occur you will be able to understand why you’re the only person in the room with that opinion. More importantly you’ll know what to do to get everyone at the table to move to a rational rather than instinctive approach to the risk assessment.
A couple of days ago I highlighted a post about security issues with Facebook applications from the Light the Blue Touchpaper blog (the security research team at Cambridge). It came the week after I had spent two days giving repeated sessions of a “how to stay safe on the Internet” course, showing people how they could change their privacy settings to prevent non-Friends from seeing their personal information.
Joseph Bonneau has posted some research he’s done into the information that a rogue Facebook application can read from your profile. The brief summary is:
Facebook applications can access all information that you can access.
This means that they can access your profile and any information in your friends’ profiles they have shared with you.
There is nothing to stop an application harvesting all this information and sending it to a third-party web site.
Put another way:
Your friend installs a Facebook application.
Because you’ve shared parts of your profile with your friend, the application your friend just installed reads your information.
The application your friend installed sends your information off to a database somewhere else.
So without you doing anything, or even knowing about it, someone’s just harvested your profile. My advice now would be to delete all your profile information from Facebook – or, if you do keep any there, to share it with no-one.
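The three-step flow above can be sketched as a tiny simulation. Everything here is hypothetical (invented names and data structures); it models the permission logic described in Bonneau’s post in miniature, not the real Facebook platform API:

```python
# Alice shares parts of her profile with her friend Bob.
profiles = {
    "alice": {"birthday": "1980-01-01", "phone": "555-0100"},
    "bob": {"birthday": "1975-06-15"},
}
friends = {"bob": ["alice"]}  # who has shared their profile with whom

harvested = []  # stands in for the rogue third-party database


def rogue_application(installing_user):
    """An application installed by one user reads everything that user can see."""
    visible = {installing_user: profiles[installing_user]}
    for friend in friends.get(installing_user, []):
        visible[friend] = profiles[friend]  # the friend's shared data
    harvested.append(visible)               # "sent" off to a third-party site
    return visible


# Bob installs the application; Alice does nothing and is never asked...
rogue_application("bob")

# ...yet Alice's profile has been harvested along with Bob's.
print(sorted(harvested[0]))  # ['alice', 'bob']
```

The key point the sketch makes is that only the installing user’s consent is ever sought, yet the application walks away with data their friends shared with them.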
Of course this application violates Facebook’s rules, and they’ve now removed the offending application that Bonneau described, but I’m sure there will be others – especially as Facebook sometimes makes it hard to understand exactly what information you are sharing, and with whom.
This won’t be a surprise to anyone who works in IT security, but last week the BBC reported how easy it was for an experienced security consultant doing some pretty basic ‘social engineering’ to:
Walk into buildings unchallenged and steal data from a company
Get users to divulge their password over the telephone to someone who claimed ‘to be from IT’
The video is about 90 seconds long — and really worth watching.
What lessons can we learn from this?
All security programmes tell users not to divulge their password to anyone — not even people in IT. However this doesn’t work: when it comes to their computer systems, users see IT as ‘authority figures’ and so will divulge their password — especially if they are coerced.
What you need to do is:
1. Train the IT team not to ask users for passwords
Put a process like this in place, so that if an IT person needs to log in as a user they:
Raise a ticket / case in the organisation’s help desk — which should ideally be authorised by someone else
Confirm with the user that they are about to log in using the user’s account
Log in as a system administrator and change the user’s password to something that only they know
Now the IT person can log in as the user (the time of the login and the activity carried out whilst logged in should be logged against the original ticket)
Once completed, they should log in as a system administrator and set the user’s password to a one-time password which is communicated to the user
The user can now log in and is forced to reset their password to one of their own choosing
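The steps above can be sketched in code. This is a minimal sketch under stated assumptions: every function, field name and ticket structure here is invented to model the control flow and audit trail, not any particular help-desk product:

```python
import secrets

tickets = []  # stands in for the help-desk system


def open_ticket(it_person, user, reason, authoriser):
    """Step 1: raise a help-desk ticket, authorised by someone else."""
    assert authoriser != it_person, "ticket must be authorised by a second person"
    ticket = {"it_person": it_person, "user": user, "reason": reason,
              "authoriser": authoriser, "activity": []}
    tickets.append(ticket)
    return ticket


def work_as_user(ticket, actions):
    """Steps 2-4: tell the user, set a temporary password, log all activity."""
    print(f"Notifying {ticket['user']}: IT is about to log in with your account")
    temp_password = secrets.token_urlsafe(12)  # known only to the IT person
    ticket["activity"].extend(actions)         # logged against the ticket
    return temp_password


def hand_back(ticket):
    """Steps 5-6: set a one-time password; the user must choose a new one."""
    one_time = secrets.token_urlsafe(12)
    print(f"Telling {ticket['user']}: log in with {one_time}, then reset it")
    return one_time


t = open_ticket("carol_it", "dave", "mail profile corrupt", "helpdesk_manager")
work_as_user(t, ["rebuilt mail profile"])
hand_back(t)
print(t["activity"])  # ['rebuilt mail profile']
```

Note that at no point does anyone ask the user for their password, and every login is tied back to an authorised ticket.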
This is a long process, and it’s no wonder that IT people take a short-cut and just ask the user for their password. That is why you also have to:
2. Train all staff that if anyone “from IT” asks for their password, it’s a security incident
As part of your awareness training, emphasise that the only people who would ask for their password are:
Data pirates (criminals)
Rogue IT people
So if someone rings up and asks them for their password, they should refuse to give it and report it to the Information Security Manager (or their equivalent).
Of course it’s rare nowadays that IT should need to log in as a user; there are better ways to diagnose a problem, such as remotely sharing the user’s screen (once the appropriate authorisation has been given).
Here’s a little cartoon I’ve used in training and included in internal newsletters.
(if you want a print-quality version, or one customised to your company then just send me an email)