Category Archives: Data Protection Act

Grand Central: Great trains, terrible terms

Recently I travelled to York on Grand Central Railway. I really like their train service because you pay the same fare whether you buy your ticket in advance, at the station, or on the train. I really dislike the terms and conditions for using their on-board wi-fi.

“Grand Central reserves the right to include the name, address and other relevant information relating to the User in a directory for the use of Grand Central users or other third parties, unless specifically requested by the User in writing not to do so.”

As a fair processing notice – designed to let users know what Grand Central will do with their data – this fails.

  • I guess by ‘directory’ they mean ‘database’. Directory is a terrible word to use, as most people’s mental model will be of something that’s open to anyone to consult – like a telephone directory.
  • It doesn’t say what use will be made of the data, just the types of people (Grand Central users and other third parties) who can use it.
  • It gives no indication of what could be relevant information. It could mean that they collect details of all the web sites you visit when using that connection, and add those to their ‘directory’.
  • It fails the Information Commissioner’s Principle One test: what would the user expect Grand Central to do with their data?

Needless to say, I didn’t use the wi-fi, but emailed their customer service department once I was back on a real connection. Their response was:

“This is a generic condition from our WiFi service provider. The only detail we collect is email address and we may use this from time to time to contact users with details of Grand Central, offers and promotions. If you wish to be removed from the directory please inform us in writing.”

Which is a much better statement of the data they are collecting, and what they plan to do with it — essentially the fair processing notice that should have been available for using the wi-fi.

There are some lessons here:

  • Telling a user what data you’re collecting and what you are going to do with it is one of the fundamental principles of the DPA.
  • If you use generic text from someone else, then you risk being in breach of the first and second data protection principles.
  • Breaching the DPA at best gets you a letter from the ICO, and perhaps you’re added to his list of ‘potential incompetents’. After all, if you can’t write a basic statement of what you’re going to do with people’s data, you might be equally relaxed about how you look after it. Perhaps all the routers and file servers at Grand Central still have their generic passwords?

Data Sharing and the Blue Badge Parking Scheme

Back in 2008 the government announced that they were going to reform some of the ways the disabled parking / blue badge scheme worked, in order to reduce fraudulent use. When I heard this discussed on the radio, the government’s spokesman talked about providing £10 million towards a data sharing scheme to enable a council parking attendant to check on the validity of a blue badge issued by another council.

I have a knee-jerk adverse reaction to the words “government” and “data sharing” – especially when they are used in the same context as “the prevention and detection of crime”, so I checked out the strategy document (PDF) on the Department for Transport’s (DfT) site and was pleasantly surprised to find a sensible proposal:

“The preferred option going forward is to create a system which allows sharing of data through the linking of existing local authority databases. DfT will provide local authorities with up to £10m in funding over the next three years to establish a system of data-sharing.”

That was back in October 2008. Since then a consultant has completed a survey of the IT systems local councils use to administer the scheme, and the DfT is starting to run data-sharing workshops with local councils as it begins to design the system (December status update – PDF).

In the meantime Rochdale council has made a successful bid to the Government Connect benefits realisation fund to investigate the “establishment of a national database with local access” for the blue badge scheme.

So it will be interesting to see whether a distributed approach is maintained, and I’d like to offer my suggestions so that privacy is built in from the start. Because when you look at the problem, there is probably no need to share data at all.

Implement a simple question and answer approach. Not data sharing and not a centralised database.

Whose data is it?

People apply to their local council for a permit, so it is that council’s job to look after the data. It’s the permit holder’s data that they entrust to the local council, and in Data Protection Act terms the local council is the Data Controller. The name of the issuing council is written on the permit along with a permit number (which also identifies the gender of the owner) and the date the permit expires.

Who needs to access it?

Parking enforcement officers from all over the UK (and perhaps eventually Europe) don’t need access to any more data than is written on the permit.

All they need is the answer to one question: “is this permit valid, invalid or being used illegally?”.

They don’t need to see any of the information that the issuing council has about the permit owner.

A parking officer may also want to report a concern to the issuing council – that they suspect the permit is being used illegally. Sending this information to the council that issued the permit would allow that council to get in touch with the permit holder directly. This keeps the relationship between the local council and the permit holder, and doesn’t make the permit holder subject to the potentially inconsistent actions of parking attendants anywhere in the country.

A network of local databases

From a technical perspective, the system constraints are simply these:

  • Each council needs to retain responsibility for looking after the data of its permit holders.
  • Other authorities (who are properly authorised and validated by the issuing council) need to be able to ask a question of this information, and receive an answer.

So here’s one way of building this system.

Each council maintains their own database of permits and permit holders (as the DfT initially suggests). They look after the security of the data and they don’t export the data to any other system.

Each council issues all of the other councils an electronic access key that allows them to ask a validity question of the issuing council’s database.

Whenever a parking enforcement officer needs to check whether a permit is valid, they send:

  • The permit ID in question
  • Their ID (e.g. their badge number – something that can individually identify them)
  • Their council’s access key

to the council that issued the permit (they can read this from the permit). The issuing council would then reply with one of four answers:

  1. We didn’t issue that permit. (It’s probably a forgery.)
  2. We issued that permit, and the permit is valid.
  3. The permit is invalid (it may have just expired — this allows the issuing council to set their own grey-area) so doesn’t confer any rights to disregard parking restrictions.
  4. The permit is invalid and has been reported stolen or withdrawn by the issuer and should be seized.

The parking attendant can then perform the relevant statutory actions.

No personal data needs to be shared between the issuing council and the parking attendant, wherever they are in the country.
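To make the flow concrete, here’s a minimal sketch of the question-and-answer check in Python, assuming each issuing council keeps its own permit register. The class names, the four statuses, the access-key check and the grace-period parameter are illustrative assumptions of mine, not anything specified by the DfT.

from dataclasses import dataclass
from datetime import date
from enum import Enum


class PermitStatus(Enum):
    NOT_ISSUED = "We didn't issue that permit"      # probably a forgery
    VALID = "Permit is valid"
    INVALID = "Permit is invalid (e.g. expired)"    # confers no parking rights
    SEIZE = "Permit stolen or withdrawn - seize it"


@dataclass
class Permit:
    permit_id: str
    expires: date
    withdrawn: bool = False


class IssuingCouncil:
    def __init__(self, name, permits, authorised_keys):
        self.name = name
        self.permits = {p.permit_id: p for p in permits}
        self.authorised_keys = authorised_keys  # access keys issued to other councils

    def check_permit(self, permit_id, officer_id, access_key, grace_days=0):
        """Answer the single validity question; no personal data is returned."""
        if access_key not in self.authorised_keys:
            raise PermissionError("Unrecognised access key")
        # Every query is logged against the individual officer who made it.
        print(f"audit: {officer_id} queried {permit_id} on {date.today()}")
        permit = self.permits.get(permit_id)
        if permit is None:
            return PermitStatus.NOT_ISSUED
        if permit.withdrawn:
            return PermitStatus.SEIZE
        if (date.today() - permit.expires).days > grace_days:
            return PermitStatus.INVALID
        return PermitStatus.VALID


# An officer from another council checks a permit they read on the windscreen.
york = IssuingCouncil(
    "York",
    [Permit("YORK-12345", expires=date(2026, 3, 31))],
    authorised_keys={"key-issued-to-rochdale"},
)
print(york.check_permit("YORK-12345", "officer-0042", "key-issued-to-rochdale"))

Whatever the reply, the permit holder’s name, address and other details never leave the issuing council’s system – the enforcement officer only ever sees one of the four statuses.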

Notes

  1. I’m not an expert on parking, permit fraud or enforcement. There may be many reasons why this simple query / answer approach wouldn’t solve the problems with fraudulent permit use. However, this is the best place to start. If people think that a parking enforcement officer needs more information then they should make the case for this. It is always best to share the minimum amount of data necessary to remain compliant with the third (only get and use data you need) data protection principle.
  2. I’ve simplified this discussion to the broad question of data copying, data sharing, or my preferred question-and-answer approach, which would share the minimum of personal information. There’s a separate technical discussion about the best way of achieving this, and whether it would be best implemented using public-private key encryption, with a central key-management system operated jointly by all councils. There would be some other issues to explore around how long a key is valid for, and how a local council revokes another authority’s access (a minimal sketch of one possible approach follows these notes).
  3. I’d also be tempted to consider whether using near-field RFID chips in the permits would add value to the system and make the permits harder to forge. It would also reduce the frequency of number keying errors by a glove-wearing parking attendant on a cold day, as their terminal would just be able to read the permit ID through the windscreen.
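Picking up on note 2, here’s a minimal sketch of how signed queries and revocation might work, assuming each council holds an Ed25519 key pair and a jointly operated registry maps council names to public keys. The library (Python’s cryptography package), the message format and all the names are my own illustrative choices, not anything proposed by the DfT or the councils.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Each council generates its own key pair and publishes only the public half.
rochdale_key = Ed25519PrivateKey.generate()

# Jointly operated registry: council name -> public key.
# Revoking Rochdale's access is simply deleting this entry.
trusted_councils = {"Rochdale": rochdale_key.public_key()}


def signed_query(council, private_key, permit_id, officer_id):
    """Build a validity query signed by the asking council."""
    message = f"{council}|{permit_id}|{officer_id}".encode()
    return message, private_key.sign(message)


def verify_query(message, signature):
    """The issuing council checks the query really came from a trusted council."""
    council = message.decode().split("|")[0]
    public_key = trusted_councils.get(council)
    if public_key is None:
        return False  # unknown or revoked authority
    try:
        public_key.verify(signature, message)
        return True
    except InvalidSignature:
        return False


msg, sig = signed_query("Rochdale", rochdale_key, "YORK-12345", "officer-0042")
print(verify_query(msg, sig))  # True while Rochdale's key remains in the registry

Key lifetimes could then be handled by rotating registry entries on a schedule, and revocation takes effect immediately because every check goes back to the shared registry.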

An analysis of the T-Mobile breach

There’s been a lot in the press for the past few days about the recent T-Mobile breach. Basically it appears that a number of staff at the mobile phone company have been selling customer data, which included the customer’s name, their mobile number and when their contract expired. There hasn’t been a great deal of information about this other than the BBC’s report, the Information Commissioner’s press release (PDF) and a short post on T-Mobile’s customer support forum.

From an information security and Data Protection Act compliance perspective there are three breaches of the Act.

T-Mobile

There’s no information about how the data was extracted from T-Mobile’s systems, and I accept that it could have been by people copying the information down onto pieces of paper. However, as the BBC story talked about “millions of records from thousands of customers”, I’ll assume there was a bulk extract of data.

T-Mobile is probably in breach of the seventh principle in that they failed to ensure:

“Appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data”

It is a breach of section 4(4) of the Act if a data controller fails to comply with the data protection principles in relation to the personal data it controls. The Information Commissioner can (for the moment) commence enforcement proceedings against the company, in the course of which T-Mobile would have to undertake to implement better security and processes.

However, what’s interesting to me is whether T-Mobile had ever properly quantified the commercial value of a customer’s name, mobile number and contract expiry date, and if so whether this was adequately reflected in their risk analysis.

If this were the case then two technical steps I’d expect them to have taken would have been:

  1. to make it very hard for people to run and save a report containing more than (say) 20 such records – most people working in customer service wouldn’t even need this many records in a report (a minimal sketch of this control follows this list)
  2. to implement some Data Leakage Prevention (DLP) technology that looked at the type of data moving out of the organisation in email, on removable media such as CDs, USB sticks and as physical printouts
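As a rough illustration of the first control, here’s a minimal sketch assuming the reporting layer can count records before it saves or exports them; the 20-record cap, the exception name and the alerting behaviour are illustrative assumptions, not anything T-Mobile actually runs.

MAX_EXPORT_ROWS = 20  # illustrative cap; most customer-service tasks need far fewer


class ExportLimitExceeded(Exception):
    pass


def export_report(user_id, records, limit=MAX_EXPORT_ROWS):
    """Refuse to save or export a report larger than the agreed cap."""
    if len(records) > limit:
        # A blocked oversize export is itself a useful security signal.
        print(f"alert: {user_id} tried to export {len(records)} customer records")
        raise ExportLimitExceeded(
            f"Reports are limited to {limit} customer records; larger exports need approval."
        )
    return records

Oversize attempts would then feed the same monitoring that a DLP tool applies to email, removable media and printouts.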

The employee / employees

The employee(s) [the T-Mobile site now appears to indicate that it was just the action of a single employee] have committed a clear offence under Section 55(1) of the Act.

“A person must not knowingly or recklessly, without the consent of the data controller obtain or disclose personal data or the information contained in personal data”

If convicted they face a maximum fine of £5,000 (and if the Information Commissioner gets his way then next year this could be a custodial sentence).

The data recipient

The person buying the data has also committed a Section 55 offence as they obtained the data without T-Mobile’s consent.

The identity of the person or company who purchased the data hasn’t been made public. It will be interesting to see whether it was a small phone dealer, a broker or one of the other big mobile phone companies. If the latter, then there’s a real issue to explore – was this the action of a ‘rogue’ salesperson or something that was tacitly condoned by the organisation?

For a market to exist in personal data there has to be both a buyer and a seller, and the value of the data is defined by the buyer: if no one wanted to buy this information then the T-Mobile employee(s) wouldn’t have stolen it to sell. Even if the data was traded through a list broker, the recipient organisation should still have asked themselves where it came from: alongside the section 55 offence, they will have breached the first (be fair when you get, use and share data) and second (tell people what you will do with their data, do nothing more) data protection principles.

When this case finally comes to court I’ll be really interested to see the action taken against the purchasers of the personal data.

In the future I expect to see all databases that hold personal information equipped with full read auditing, which would create an audit log entry whenever a user reads an individual record or runs a report that includes that record.

Audit: User JohnDoe viewed this record at 10:23 on 22/10/09
Audit: User JaneDoe included this record in the report CustomersAboutToLeave at 19:47 on 23/10/09

I’d also expect mobile phone companies to correlate the read activity of their users (recorded in this type of audit log) against the customers who went elsewhere at the end of their contracts.
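Here’s a minimal sketch of what that read auditing and churn correlation might look like, assuming a simple in-memory audit log; the field names, the helper functions and the 50-read threshold are illustrative assumptions.

from collections import Counter
from datetime import datetime

audit_log = []  # one entry per individual record read


def log_read(user_id, customer_id, context):
    audit_log.append({
        "user": user_id,
        "customer": customer_id,
        "context": context,          # e.g. "screen view" or a report name
        "when": datetime.now(),
    })


def view_customer(user_id, customer_id):
    log_read(user_id, customer_id, "screen view")
    # ... fetch and return the record to the CRM front end ...


def run_report(user_id, report_name, customer_ids):
    for cid in customer_ids:
        log_read(user_id, cid, report_name)
    # ... build and return the report ...


def suspicious_readers(churned_customer_ids, threshold=50):
    """Which users read unusually many records of customers who later left?"""
    churned = set(churned_customer_ids)
    reads = Counter(entry["user"] for entry in audit_log if entry["customer"] in churned)
    return {user: count for user, count in reads.items() if count >= threshold}

Run against a list of customers who left at the end of their contracts, the last function flags the accounts whose read patterns deserve a closer look.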

New data security law book launched

On Monday I had the pleasure of attending the launch of Stewart Room’s new book ‘Butterworths Data Security Law and Practice’. Stewart wrote the definitive guide to the Data Protection Act for techies, the equally snappily-named Data Protection and Compliance in Context. This is also the course book for the ISEB Practitioner-level certificate in Data Protection.

Stewart’s new book is – as he admitted – elephantine in its size and coverage (for comparison it’s physically larger than Ross Anderson’s Security Engineering). It is the first book that addresses both infosec and law, and I’m really looking forward to getting hold of a copy. I had a chance to browse one of the display copies at the launch and it looks really useful.

With probably about a hundred infosec and law professionals in the same room, the conversations were really engaging. There was a lot of talk about the prominence of breaches in the news, especially after last week’s T-Mobile revelations, along with the ongoing consultation on the Information Commissioner’s new powers. A few of the people I spoke to were curious to see what changes there would be in non-financial services companies once the Commissioner had levied his first sizable fine.

A warning to the serially incompetent and the wicked

At last week’s Data Protection conference the new Information Commissioner – Christopher Graham – made his first public speech. With the title ICO: new powers, new funding and a new Commissioner, it was certain to establish the direction we’d see the ICO taking for the next five years. The slides from the speech are available on the ICO’s web site (PDF), and the Commissioner didn’t disappoint.

All organisations need to be aware of the Commissioner’s new powers to fine those that breach the Data Protection Act. These powers come into force in April 2010. The good news is that the Commissioner still wants to take a carrot-driven approach and help organisations to do the right thing: the ICO’s first reaction will always be to advise and assist.

However, the Commissioner was clear that he planned to use his new powers. The level of the fine has not yet been set by the Government, and he’s lobbying for fixed fines with a maximum of “hundreds of thousands of pounds”. He anticipated that around 20 organisations – “the serially incompetent and wicked” – would feel his stick-based sanction in the first year.

If you’re concerned about how well your organisation complies with the Data Protection Act and how securely you look after the information you hold then there’s no better time for someone to have a look. I offer an integrated Information Security and Data Protection gap analysis that will show you just how well you’re doing, and suggest simple (and often low cost) ways to improve.

If you’d like to find out more then please call me on 020 8144 8456 or contact me. On the other hand if you are incompetent or just plain wicked then watch out — the ICO may still focus on the carrot-driven approach to compliance, but he’s about to get a really big stick that he intends to use.

What’s the connection between human rights and information security?

I attended a couple of events over the past week. On Saturday I went to Liberty’s 75th Birthday Conference and on Thursday the ISSA UK Chapter event on the Data Protection Act (DPA).

I had planned to write about Lord Bingham’s excellent speech at the Liberty conference but after an interesting discussion at the ISSA event I thought it would be useful to first describe the fundamental link between Data Protection and Human Rights.

The 1998 Data Protection Act can trace its ancestry directly to Article 8 of the European Convention on Human Rights (ECHR), through the EU Data Protection Directive (95/46/EC) and the Council of Europe’s Treaty 108.

The European Convention for the Protection of Human Rights and Fundamental Freedoms

http://conventions.coe.int/treaty/en/Treaties/Html/005.htm

The ECHR was agreed by the Council of Europe after the (horrors of the) second world war and aimed to set out the basic Human Rights which all citizens of Europe would enjoy. There are some 13 rights defined within the 18 articles, including the right not to be tortured, the right to a fair trial and the right to freedom of thought and religion.

Article 8 of the ECHR states:

1. Everyone has the right to respect for his private and family life, his home and his correspondence.

2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.

The European Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data

http://conventions.coe.int/Treaty/en/Treaties/Html/108.htm

Agreed in 1981 and generally known as Treaty 108 (as ECPIAPPD isn’t a good acronym), this convention aimed to establish a pan-European framework to balance the Article 8 right to privacy with the fact that data is processed by computers and will be exchanged across national borders.

The preamble states:

The member States of the Council of Europe, signatory hereto,

  • Considering that the aim of the Council of Europe is to achieve greater unity between its members, based in particular on respect for the rule of law, as well as human rights and fundamental freedoms;
  • Considering that it is desirable to extend the safeguards for everyone’s rights and fundamental freedoms, and in particular the right to the respect for privacy, taking account of the increasing flow across frontiers of personal data undergoing automatic processing;
  • Reaffirming at the same time their commitment to freedom of information regardless of frontiers;
  • Recognising that it is necessary to reconcile the fundamental values of the respect for privacy and the free flow of information between peoples,

Have agreed as follows:

And then if you go on to read the Convention you’ll find many familiar definitions such as ‘Data Subject’ and sections which are clearly forerunners of the DPA’s eight principles:

Article 5 – Quality of data

Personal data undergoing automatic processing shall be:

  • obtained and processed fairly and lawfully;
  • stored for specified and legitimate purposes and not used in a way incompatible with those purposes;
  • adequate, relevant and not excessive in relation to the purposes for which they are stored;
  • accurate and, where necessary, kept up to date;
  • preserved in a form which permits identification of the data subjects for no longer than is required for the purpose for which those data are stored.

Article 7 – Data security

Appropriate security measures shall be taken for the protection of personal data stored in automated data files against accidental or unauthorised destruction or accidental loss as well as against unauthorised access, alteration or dissemination.

The first UK Data Protection Act came about in 1984 as a result of this convention.

(In fact, both the 1984 DPA and Treaty 108 share a common influence: the work done in the 1970s by the UK Government’s Younger Committee on privacy.)

Directive 95/46/EC

http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:EN:HTML

To give it its full name, Directive 95/46/EC of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data aimed to create a standard data protection environment across all countries in the EU. In broad terms the remit of the Council of Europe is political, and the EU’s remit is economic. The purpose of the EU directive was to harmonise and create a level playing field for data protection across the member states so that:

  • No organisation can gain a competitive advantage by processing data in a member state with poor (and therefore cheaper-to-implement) data protection legislation
  • Any European citizen will be confident that their personal data will be looked after to the same standard by any company based in any member state

And again if you look at some of the clauses of the EC directive they show their parentage from Treaty 108 and point to what their descendants will look like:

Article 6
1. Member States shall provide that personal data must be:

  1. processed fairly and lawfully;
  2. collected for specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes. Further processing of data for historical, statistical or scientific purposes shall not be considered as incompatible provided that Member States provide appropriate safeguards;
  3. adequate, relevant and not excessive in relation to the purposes for which they are collected and/or further processed;
  4. accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that data which are inaccurate or incomplete, having regard to the purposes for which they were collected or for which they are further processed, are erased or rectified;
  5. kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected or for which they are further processed. Member States shall lay down appropriate safeguards for personal data stored for longer periods for historical, statistical or scientific use.

The 1998 Data Protection Act

http://www.opsi.gov.uk/Acts/Acts1998/ukpga_19980029_en_1

All member states were required to enact legislation to implement the EC directive, and so the UK government passed the 1998 Data Protection Act with its now-familiar eight principles. You can see that the principles trace their ancestry back to Article 8 of the ECHR via the EC Directive and the Council of Europe’s Treaty 108.

  1. Personal data shall be processed fairly and lawfully
  2. Personal data shall be obtained only for one or more specified and lawful purposes
  3. Personal data shall be adequate, relevant and not excessive
  4. Personal data shall be accurate and, where necessary, kept up to date
  5. Personal data processed for any purpose or purposes shall not be kept for longer than is necessary for that purpose or those purposes
  6. Personal data shall be processed in accordance with the rights of data subjects under this Act.
  7. Appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data
  8. Personal data shall not be transferred to a country or territory outside the European Economic Area ….

On a day-to-day basis I deal with all sorts of technical matters, talk about ‘risks’ and ‘controls’ and help organisations comply with the Data Protection Act. It’s good to remember that in a small way, alongside commercial imperatives, I’m also helping to protect one of the fundamental human rights of people’s privacy.

Liberty – formerly the National Council for Civil Liberties – is dedicated to protecting civil liberties and promoting human rights for everyone. Founded in 1934, they held their 75th birthday conference last weekend with an impressive set of speakers including Lord Bingham, Jack Straw, Nick Clegg, Dominic Grieve, Tony Benn, Ken Macdonald (ex-DPP), Sarah Ludford MEP and Privacy International’s Simon Davis.

Liberty’s 75th Birthday Conference

http://www.liberty-human-rights.org.uk/about/1-history/75th-anniversary-conference/index.shtml

The conference keynote given by Lord Bingham (who was the most senior Law Lord until retirement) addressed the position of the 1998 Human Rights Act (HRA). The HRA incorporates the ECHR into UK law, and after the next general election there’s the probability that whoever wins will try to amend or replace it. Often the debate on the HRA and the European Convention is ill-informed, with people confusing the ECHR, the UK Act, the EU and the Council of Europe and calling for a plague on all of it.

Lord Bingham’s speech was a lesson in clarity and a well argued defence of the HRA. He made these ten points:

  1. Just because the ECHR starts with the word ‘European’ doesn’t mean that it’s some foreign import. British politicians made a huge contribution in drafting it and the UK was the first country to ratify it.
  2. The UK is bound by the ECHR under international law. If a Government repealed the HRA, it would still be bound by the convention.
  3. All that the 1998 HRA did was allow people in the UK to assert their rights under the convention in UK law, without having to go to the European Court of Human Rights in Strasbourg.
  4. The HRA does not transfer interpretative power from politicians to judges. Before the HRA, judges in the European Court of Human Rights had interpretative power; after the HRA, it is UK judges who have that power.
  5. The HRA is not undemocratic. Judges cannot overturn the will of Parliament; the ‘worst’ they can do is declare an Act of Parliament incompatible with the ECHR, but it is Parliament that has to work out what to do – only Parliament can repeal and amend Acts.
  6. The HRA is criticised as elevating the rights of the individual above the community. In some respects this is true – articles such as the right not to be enslaved are absolute. For the non-absolute rights (such as Article 8’s right to privacy), the rights of the individual always have to be balanced with the rights of the community.
  7. Another criticism is that the HRA and ECHR only mention rights and not responsibilities. True – but our responsibilities as citizens are enshrined in a detailed manner in countless other Acts. If there are duties and responsibilities that are not defined, they need to be clearly defined in law, not in well-meaning statements.
  8. The ECHR is a minimum standard, not a ceiling. It doesn’t stop a UK government from creating better protection for human rights if it wants to.
  9. People criticise the HRA for ‘foolish decision making’. In Lord Bingham’s opinion the level of judicial decisions in cases on the HRA and ECHR is no more foolish than elsewhere.
  10. The fundamental rights and freedoms protected by the ECHR and the HRA are just that – basic rights which everyone should enjoy by virtue of their existence. Which should be discarded?

If you’ve ever wanted to understand where the Human Rights Act came from, or why it’s important then I highly recommend reading Lord Bingham’s full speech (PDF).