A warning to the serially incompetent and the wicked

At last week’s Data Protection conference the new Information Commissioner – Christopher Graham – made his first public speech. With the title ICO: new powers, new funding and a new Commissioner, it was bound to set out the direction the ICO would take over the next five years. The slides from the speech are available on the ICO’s web site (PDF), and the Commissioner didn’t disappoint.

All organisations need to be aware of the Commissioner’s new powers to fine those that breach the Data Protection Act. These powers come into force in April 2010. The good news is that the Commissioner still wants to take a carrot-driven approach and help organisations to do the right thing; the ICO’s first reaction will always be to advise and assist.

However, the Commissioner was clear that he planned to use his new powers. The level of the fine has not yet been set by Government, and he’s lobbying for fixed fines with a maximum of “hundreds of thousands of pounds”. He anticipated that around 20 organisations – “the serially incompetent and wicked” – would feel his stick-based sanction in the first year.

If you’re concerned about how well your organisation complies with the Data Protection Act and how securely you look after the information you hold then there’s no better time for someone to have a look. I offer an integrated Information Security and Data Protection gap analysis that will show you just how well you’re doing, and suggest simple (and often low cost) ways to improve.

If you’d like to find out more then please call me on 020 8144 8456 or contact me. On the other hand if you are incompetent or just plain wicked then watch out — the ICO may still focus on the carrot-driven approach to compliance, but he’s about to get a really big stick that he intends to use.

The Other C, I and A of Information Security

Ask anyone who works in Information Security what the initials CIA mean and they will say “Confidentiality, Integrity and Availability”. These are the three measures used to assess the impact that an unwelcome event would have on an asset.

When I train people, I talk about another more important Information Security meaning of CIA: Common Sense, Intent and Application.

Common Sense

Good Information Security requires everyone to use their common sense. Have you ever wondered why some people have common sense and others do not? Why some users remember strong passwords, and others would think it is OK to use their cat’s name and then write it down on a post-it?

This common-sense imbalance used to worry me until the day I heard Ira Winkler give a presentation in which he argued that “there’s no common sense without common knowledge” – and you know, he’s absolutely right.

When users (and sometimes security professionals) do something that’s as far from common sense as can be, I’ve found it’s generally because we don’t share a common knowledge. For instance:

I know it is wrong to write your password down, because it allows someone to easily logon to a system and perhaps do bad things while pretending to be you – that’s my common sense. They don’t share that knowledge: they don’t understand why anyone would ever want to do this.

I know that writing the password to an encrypted file on the CD holding the file devalues the encryption – that’s my common sense. They don’t understand what encryption means or why the data needed to be encrypted in the first place, and decided that the password was less likely to get lost if they wrote it on the CD.

In Information Security we are all guilty of assuming that everyone understands threats and vulnerabilities in the same way that we do; but they don’t, which is why their common sense doesn’t match ours. To develop an instinct for good common sense, you need common knowledge – which means proper education for your users and for the whole of your Information Security team.


Intent

My dad used to have a phrase that really annoyed me when I was a kid. He’d say “If a job’s worth doing it’s worth doing well” – especially when my homework came back with C grades. I’m reminded of this whenever people talk about doing projects or initiatives in Information Security.

My experience is that it is a waste of time to be half-hearted about security. Worse still, it can have the opposite effect to the one you intended.

Take any simple control that’s documented in a process or a policy. If people see it’s not enforced, or has a variable implementation based on someone’s position in the organisation chart, then it sends the message that all controls are optional.

It is better to do a few things well than lots of things poorly.

Implement security with a positive intent to do it well. If you know you’re going to make a half-baked attempt at a project, pick a simpler project you know you’ll do well.


Application

Much of what we do in Information Security is dull. Checking, maintaining, documenting, cleaning, auditing, testing – just making sure that what needs to be done is done, and done well.

My observation is that people – and especially technical people – get more excited about playing with new things than they do about keeping the old things going. Sure, they might not describe it as ‘playing’ and use words like ‘evaluating’, ‘installing’ or ‘configuring’, but at the end of the day it is the challenge and excitement of learning the new that excites them.

Good security though isn’t always about the new. It’s about doing the tedious stuff well and paying attention to it. It is about:

  • Checking the logs on a regular basis
  • Making sure that the roles defined in the role-based access are correct
  • Doing the lessons learned from an incident and following up the action points until they’re all completed
  • Updating the DR documentation when you change a server configuration
  • Cleaning the backup tape heads and verifying the backup worked properly
  • Filling out the visitor book for the server room
  • Writing the documentation for X before moving on to Y
  • Chasing the last person who didn’t complete the Data Protection training course

It takes application from everyone in the team to keep on top of these and hundreds of other little tasks. It takes application from management to make sure it happens.
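Some of these routine tasks can even be partially automated. As an illustration of the first item – checking the logs on a regular basis – here is a minimal sketch; the log format, the FAILED_LOGIN marker and the threshold are all illustrative assumptions, not any standard:

```python
# A minimal sketch of automating the "check the logs" task.
# The log line format and the threshold are assumptions -- adapt
# them to whatever your systems actually produce.
from collections import Counter

def failed_logins(log_lines, threshold=3):
    """Return users with `threshold` or more failed-login entries."""
    counts = Counter()
    for line in log_lines:
        # assumed format: "<timestamp> FAILED_LOGIN user=<name>"
        if "FAILED_LOGIN" in line:
            user = line.split("user=")[-1].strip()
            counts[user] += 1
    return {u: n for u, n in counts.items() if n >= threshold}

sample = [
    "2009-07-01T09:00 FAILED_LOGIN user=alice",
    "2009-07-01T09:01 FAILED_LOGIN user=alice",
    "2009-07-01T09:02 FAILED_LOGIN user=alice",
    "2009-07-01T09:03 OK_LOGIN user=bob",
]
print(failed_logins(sample))  # {'alice': 3}
```

Automation like this doesn’t remove the need for application, of course – someone still has to run it, read the output and chase up the findings.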

So there you have it. My alternative exposition of the CIA triad:

Common Sense: Invest in the security education of users and the IT team

Intent: Plan on doing each security project or initiative well

Application: Keep doing the dull things

Risk – a book worth reading

For me, one of the best parts of a holiday is the time sitting by the pool or at the beach (and less happily the airport) catching up on some serious reading. If there’s one profession that requires an insatiable capacity to read it is information security. I find it hard to keep up with the constant stream of journals, email newsletters, legislation, reports, blog updates and white papers to read and I don’t find enough time for books. On holiday I leave the laptop at home, and fill a suitcase with my reading pile.

One book I read this summer was Dan Gardner’s Risk – The Science and Politics of Fear. It is a well-written examination of the psychology of risk and fear, and of how we perceive risk. Bruce Schneier recommended it a few months ago and I wholeheartedly second his recommendation. A large part of our business is about assessing risk and then communicating that assessment to other people. Risk explains the ways our risk assessments are sub-consciously biased, and also explains why people don’t always buy into our risk assessments.

The book has three parts:

In the first, Gardner condenses the academic research into risk perception. He explains the heuristics (rules-of-thumb) and biases that our unconscious mind (Gardner calls this gut) uses to assess risk and how these affect the conscious, reason-based decisions we’d like to think that our conscious mind (head) takes.

The second part of the book examines how governments, corporations and the media have used fear as an influencing tool to take advantage of these unconscious biases.

Finally Gardner looks at how these biases and heuristics affect our modern-day, risk-assessed world in the areas of crime, the environment and terrorism.

Why is this book useful to an information security professional?

All security professionals have to assess risk; it is the essence of what we do. So gaining an insight into how our own risk assessments can be hijacked is really useful. Understanding how other people can (intentionally or otherwise) manipulate our risk assessments and those of our colleagues and managers is even more valuable. Here are a few examples:

  • I don’t know exactly how many new pieces of malware are produced each day, but because I’ve heard the figure 10,000 quite a few times, my personal ‘guess’ will be biased towards 10,000 not 10. This is the anchoring and adjustment heuristic. If you tell me that the real figure is 7,000 I’ll accept it as ‘about right’; if you tell me it’s really 1,000 I probably won’t believe you because my unconscious has anchored on ‘around 10,000’.
  • The more times stories appear in the news about companies losing personal data, the more likely we – and our colleagues – think this will happen in our organisation. This is the availability heuristic at work: the easier it is for our unconscious mind to recall an example of an event, the more we overestimate the likelihood of that event occurring. The opposite also holds, of course – the harder it is for us to recall an example of an event, the more we underestimate its likelihood.
  • There’s also the affect heuristic, which describes how people’s assessment of the probability and impact of events is biased, based on whether their gut perception of the event is emotionally good or emotionally bad. Data theft is bad, so the affect heuristic means that our unconscious mind, and those of our colleagues and managers, instinctively overestimates the probability of data theft occurring.

Reading Risk – The Science and Politics of Fear will help you become a more complete security professional. The next time you sit in a meeting arguing that something bad is extremely unlikely to occur, you will understand why you’re the only person in the room with that opinion. More importantly, you’ll know what to do to get everyone at the table to move to a rational rather than instinctive approach to the risk assessment.

Facebook applications can really steal your personal data

A couple of days ago I highlighted a post about security issues with Facebook applications from the Light Blue Touchpaper blog (the security research team at Cambridge). It came the week after I had spent two days giving repeated sessions of a “how to stay safe on the Internet” course and showing people how they could change their privacy settings to prevent non-Friends from seeing their personal information.

Joseph Bonneau has posted some research he’s done into the information that a rogue Facebook application can read from your profile. The brief summary is:

  1. Facebook applications can access all information that you can access.
  2. This means that they can access your profile and any information in your friends’ profiles they have shared with you.
  3. There is nothing to stop an application harvesting all this information and sending it to a third-party web site.

Put another way:

  1. Your friend installs a Facebook application.
  2. Because you’ve shared parts of your profile with your friend, the application your friend just installed reads your information.
  3. The application your friend installed sends your information off to a database somewhere else.

So without you doing anything, or even knowing about it, someone’s just harvested your profile. My advice now would be simply to delete all your profile information from Facebook and, if you do keep any there, share it with no-one.

Of course this application violates Facebook’s rules, and they’ve now removed the offending application that Bonneau described, but I’m sure there will be others – especially as Facebook sometimes makes it hard to understand exactly what information you are sharing with whom.

The full article: http://www.lightbluetouchpaper.org/2009/06/09/how-privacy-fails-the-facebook-applications-debacle/

and a similar one from the SocialHacking blog: http://theharmonyguy.com/2009/05/28/about-that-verification/

What’s the connection between human rights and information security?

I attended a couple of events over the past week. On Saturday I went to Liberty’s 75th Birthday Conference and on Thursday the ISSA UK Chapter event on the Data Protection Act (DPA).

I had planned to write about Lord Bingham’s excellent speech at the Liberty conference but after an interesting discussion at the ISSA event I thought it would be useful to first describe the fundamental link between Data Protection and Human Rights.

The 1998 Data Protection Act can trace its ancestry directly to Article 8 of the European Convention on Human Rights (ECHR) through the EU Directive on Data Processing (95/46) and the Council of Europe’s Treaty 108.

The European Convention for the Protection of Human Rights and Fundamental Freedoms


The ECHR was agreed by the Council of Europe after the (horrors of the) second world war and aimed to set out the basic Human Rights which all citizens of Europe would enjoy. There are some 13 rights defined within the 18 articles, including the right not to be tortured, the right to a fair trial and the right to freedom of thought and religion.

Article 8 of the ECHR states:

1. Everyone has the right to respect for his private and family life, his home and his correspondence.

2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.

The European Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data

Agreed in 1981 and generally known as Treaty 108 (as ECPIAPPD isn’t a good acronym), this convention aimed to establish a pan-European framework to balance the Article 8 right to privacy with the fact that data is processed by computers and exchanged across national borders.

The preamble states:

The member States of the Council of Europe, signatory hereto,

  • Considering that the aim of the Council of Europe is to achieve greater unity between its members, based in particular on respect for the rule of law, as well as human rights and fundamental freedoms;
  • Considering that it is desirable to extend the safeguards for everyone’s rights and fundamental freedoms, and in particular the right to the respect for privacy, taking account of the increasing flow across frontiers of personal data undergoing automatic processing;
  • Reaffirming at the same time their commitment to freedom of information regardless of frontiers;
  • Recognising that it is necessary to reconcile the fundamental values of the respect for privacy and the free flow of information between peoples,

Have agreed as follows:

And then if you go on to read the Convention you’ll find many familiar definitions, such as ‘Data Subject’, and sections which are clearly forerunners of the DPA’s eight principles:

Article 5 – Quality of data

Personal data undergoing automatic processing shall be:

  • obtained and processed fairly and lawfully;
  • stored for specified and legitimate purposes and not used in a way incompatible with those purposes;
  • adequate, relevant and not excessive in relation to the purposes for which they are stored;
  • accurate and, where necessary, kept up to date;
  • preserved in a form which permits identification of the data subjects for no longer than is required for the purpose for which those data are stored.

Article 7 – Data security

Appropriate security measures shall be taken for the protection of personal data stored in automated data files against accidental or unauthorised destruction or accidental loss as well as against unauthorised access, alteration or dissemination.

The first UK Data Protection Act came about in 1984 as a result of this convention.

(In fact both the 1984 DPA and Treaty 108 have a common influence in the work done in the 1970s by the UK Government’s Younger Committee on privacy.)

Directive 95/46/EC

To give it its full name, Directive 95/46/EC of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data aimed to create a standard data protection environment across all countries in the EU. In broad terms the remit of the Council of Europe is political, and the EU’s remit is economic. The purpose of the EU directive was to harmonise and create a level playing field for data protection across the member states so that:

  • No organisation can gain a competitive advantage by processing data in a member state with poor (and therefore cheaper-to-implement) data protection legislation
  • Any European citizen will be confident that their personal data will be looked after to the same standard by any company based in any member state

And again, if you look at some of the clauses of the EC directive, they show their parentage in Treaty 108 and point to what their descendants will look like:

Article 6
1. Member States shall provide that personal data must be:

  1. processed fairly and lawfully;
  2. collected for specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes. Further processing of data for historical, statistical or scientific purposes shall not be considered as incompatible provided that Member States provide appropriate safeguards;
  3. adequate, relevant and not excessive in relation to the purposes for which they are collected and/or further processed;
  4. accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that data which are inaccurate or incomplete, having regard to the purposes for which they were collected or for which they are further processed, are erased or rectified;
  5. kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected or for which they are further processed. Member States shall lay down appropriate safeguards for personal data stored for longer periods for historical, statistical or scientific use.

The 1998 Data Protection Act

All member states were required to enact legislation to implement the EC directive, and so the UK government passed the 1998 Data Protection Act with its now-familiar eight principles. You can see that the principles trace their ancestry back to Article 8 of the ECHR via the EC Directive and the Council of Europe Treaty 108.

  1. Personal data shall be processed fairly and lawfully
  2. Personal data shall be obtained only for one or more specified and lawful purposes
  3. Personal data shall be adequate, relevant and not excessive
  4. Personal data shall be accurate and, where necessary, kept up to date
  5. Personal data processed for any purpose or purposes shall not be kept for longer than is necessary for that purpose or those purposes
  6. Personal data shall be processed in accordance with the rights of data subjects under this Act.
  7. Appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data
  8. Personal data shall not be transferred to a country or territory outside the European Economic Area ….

On a day-to-day basis I deal with all sorts of technical matters, talk about ‘risks’ and ‘controls’ and help organisations comply with the Data Protection Act. It’s good to remember that in a small way, alongside commercial imperatives, I’m also helping to protect the fundamental human right to privacy.

Liberty – formerly the National Council for Civil Liberties – is dedicated to protecting civil liberties and promoting human rights for everyone. Founded in 1934, they held their 75th birthday conference last weekend with an impressive set of speakers including Lord Bingham, Jack Straw, Nick Clegg, Dominic Grieve, Tony Benn, Ken Macdonald (ex-DPP), Sarah Ludford MEP and Privacy International’s Simon Davies.

Liberty’s 75th Birthday Conference

The conference keynote, given by Lord Bingham (the most senior Law Lord until his retirement), addressed the position of the 1998 Human Rights Act (HRA). The HRA incorporates the ECHR into UK law, and after the next general election it is likely that whoever wins will try to amend or replace it. The debate on the HRA and the European Convention is often ill-informed, with people confusing the ECHR, the UK Act, the EU and the Council of Europe and calling for a plague on all of it.

Lord Bingham’s speech was a lesson in clarity and a well-argued defence of the HRA. He made these ten points:

  1. Just because the ECHR starts with the word ‘European’ doesn’t mean that it’s some foreign import. British politicians made a huge contribution in drafting it and the UK was the first country to ratify it.
  2. The UK is bound by the ECHR under international law. If a Government repealed the HRA, it would still be bound by the convention.
  3. All that the 1998 HRA did was allow people in the UK to assert their rights under the convention in UK law, without having to go to the European Court of Human Rights in Strasbourg.
  4. The HRA does not transfer interpretative power from politicians to judges. Before the HRA, European judges in the European Court had interpretative power, now after the HRA it is UK judges who have that interpretative power.
  5. The HRA is not undemocratic. Judges cannot overturn the will of Parliament; the ‘worst’ they can do is declare an Act of Parliament incompatible with the ECHR, but it is Parliament that has to work out what to do – only Parliament can repeal and amend Acts.
  6. The HRA is criticised as elevating the rights of the individual above the community. In some respects this is true – articles such as the right not to be enslaved are absolute. For the non-absolute rights (such as Article 8’s right to privacy) the rights of the individual always have to be balanced with the rights of the community.
  7. Another criticism is that the HRA and ECHR only mention rights and not responsibilities. True – but our responsibilities as citizens are enshrined in a detailed manner in countless other Acts. If there are duties and responsibilities that are not defined they need to be clearly defined in law, not in well-meaning statements.
  8. The ECHR is a minimum standard, not a ceiling. It doesn’t stop a UK government from creating better protection for human rights if it wants to.
  9. People criticise the HRA for ‘foolish decision making’. In Lord Bingham’s opinion the level of judicial decisions in cases on the HRA and ECHR is no more foolish than elsewhere.
  10. The fundamental rights and freedoms protected by the ECHR and the HRA are just that – basic rights which everyone should enjoy by virtue of their existence. Which should be discarded?

If you’ve ever wanted to understand where the Human Rights Act came from, or why it’s important then I highly recommend reading Lord Bingham’s full speech (PDF).

Users divulge their passwords to strangers

This won’t be a surprise to anyone who works in IT security, but last week the BBC reported how easy it was for an experienced security consultant doing some pretty basic ‘social engineering’ to:

  • Walk into buildings unchallenged and steal data from a company
  • Get users to divulge their password over the telephone to someone who claimed ‘to be from IT’

The video is about 90 seconds long — and really worth watching.

What lessons can we learn from this?

All security programmes tell users not to divulge their password to anyone – not even people in IT. However, this doesn’t work: when it comes to their computer systems, users see IT as ‘authority figures’ and so will divulge their password, especially if they are coerced.

What you need to do is:

1. Train the IT team not to ask users for passwords

Put a process like this in place, so that if an IT person needs to login as a user they should:

  1. Raise a ticket / case in the organisation’s help desk – which should ideally be authorised by someone else
  2. Confirm with the user that they are about to login using the user’s account
  3. Login as a system administrator and change the user’s password to something that only they know
  4. Now the IT person can login as the user (the time of the login and the activity carried out whilst logged in should be recorded against the original ticket)
  5. Once completed, login as a system administrator and set the user’s password to a one-time password which is communicated to the user
  6. The user can now login and must be forced to reset their password to one of their own choosing
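For those who like things concrete, the six steps can be modelled as a short sketch in Python. The Directory class is a stand-in for a real user store; the names, the ticket handling and the use of the standard secrets module are illustrative assumptions, not a ready-made implementation:

```python
# A sketch of the six-step support-login process, so the control flow
# and the audit trail are explicit. Directory is a hypothetical stand-in
# for your real user store and help-desk system.
import secrets

class Directory:
    def __init__(self):
        self.passwords = {}      # user -> current password
        self.must_reset = set()  # users forced to reset at next login
        self.audit_log = []      # (ticket, event) pairs

    def support_login_as(self, ticket, admin, user):
        # Steps 1-4: ticket raised, user informed, admin sets a password
        # known only to IT, then works as the user under that ticket.
        temp = secrets.token_urlsafe(12)
        self.passwords[user] = temp
        self.audit_log.append((ticket, f"{admin} logged in as {user}"))
        return temp

    def hand_back(self, ticket, user):
        # Steps 5-6: a one-time password goes to the user, who must
        # choose a new password of their own at next login.
        otp = secrets.token_urlsafe(12)
        self.passwords[user] = otp
        self.must_reset.add(user)
        self.audit_log.append((ticket, f"one-time password issued to {user}"))
        return otp

d = Directory()
d.support_login_as("HD-1234", "admin", "alice")
d.hand_back("HD-1234", "alice")
print("alice" in d.must_reset)  # True
```

The point of the sketch is the audit trail: every support login and password hand-back is tied back to the original ticket, which is exactly what asking a user for their password destroys.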

This is a long process, so it’s no wonder that IT people take a short-cut and just ask a user for their password, so you also have to:

2. Train all staff that if anyone “from IT” asks for their password, it’s a security incident

As part of your awareness training, emphasise that the only people who would ask for their password are:

  • Data pirates (criminals)
  • Rogue IT people

So if someone rings up and asks them for their password, they should refuse to give it and report it to the Information Security Manager (or their equivalent).

Of course it’s rare nowadays that IT should need to login as a user; there are better ways to diagnose a problem, such as remotely sharing a user’s screen (once the appropriate authorisation has been given).

Here’s a little cartoon I’ve used in training and included in internal newsletters.

Data Pirates stealing a password

(if you want a print-quality version, or one customised to your company then just send me an email)