Author Archives: John

Pre-authorisation data (PCI DSS Q&A)

Question: Is pre-authorisation data in scope of PCI DSS?

Answer: Yes.

There’s quite a bit of misleading information on the internet about the status of pre-authorisation data. As far as all the card schemes are concerned there’s no difference between pre-authorisation data and post-authorisation data. If you store, process or transmit pre-authorised cardholder data then the PCI DSS requirements apply.

However, subject to your card brands’ agreement, you are permitted to store sensitive authentication data (SAD) – which includes track data, encrypted PIN blocks and CVV2 values – before authorisation, as long as it is deleted immediately after authorisation.

The best argument I once heard about this subject was from a QSA who said that a card number that had not been authorised “was just a random 16 digit number” and it was only the process of authorisation that made it cardholder data. He argued that the fact that it passed a Luhn check and was entered into a web form field labelled “card number” was immaterial. Nonsense: if it walks like a PAN, and quacks like a PAN, then it’s a PAN.
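For the curious, the Luhn check mentioned above is just a simple checksum over the digits. Here is a minimal sketch in Python (the function name is my own):

```python
def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in number if d.isdigit()]
    if not digits or len(digits) != len(number):
        return False  # reject empty strings and non-digit characters
    total = 0
    # Double every second digit from the right; subtract 9 if the result exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# 4111111111111111 is a well-known test PAN from payment documentation,
# not a real card number.
print(luhn_valid("4111111111111111"))  # True
print(luhn_valid("4111111111111112"))  # False
```

As the check is public and trivially computed, it tells you nothing about whether a number is a live card – which is rather the QSA’s point, and rather mine too.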

There’s also a PCI SSC FAQ about this.

Grand Central: Great trains, terrible terms

Recently I travelled to York on Grand Central Railway. I really like their train service because you pay the same fare whether you buy your ticket in advance, at the station, or on the train. I really dislike the terms and conditions for using their on-board wi-fi.

“Grand Central reserves the right to include the name, address and other relevant information relating to the User in a directory for the use of Grand Central users or other third parties, unless specifically requested by the User in writing not to do so.”

As a fair processing notice designed to let the user know what Grand Central will do with your data, this fails.

  • I guess by ‘directory’ they mean ‘database’. Directory is a terrible word to use, as most people’s mental model will be of something that’s open to anyone to consult – like a telephone directory.
  • It doesn’t say what use will be made of the data, just the types of people (Grand Central users and other third parties) who can use it.
  • It gives no indication of what could be relevant information. It could mean that they collect details of all the web sites you visit when using that connection, and add those to their ‘directory’.
  • If you were to apply the Information Commissioner’s Principle One test – what would the user expect Grand Central to do with their data?

Needless to say, I didn’t use the wi-fi, but emailed their customer service department once I was back on a real connection. Their response was:

“This is a generic condition from our WiFi service provider. The only detail we collect is email address and we may use this from time to time to contact users with details of Grand Central, offers and promotions. If you wish to be removed from the directory please inform us in writing.”

Which is a much better statement of the data they are collecting, and what they plan to do with it — essentially the fair processing notice that should have been available for using the wi-fi.

There are some lessons here:

  • Telling a user what data you’re collecting and what you are going to do with it is one of the fundamental principles of the DPA.
  • If you use generic text from someone else, then you risk being in breach of the first and second data protection principles.
  • Breaching the DPA at best gets you a letter from the ICO, and perhaps you’re added to his list of ‘potential incompetents’. After all, if you can’t write a basic statement of what you’re going to do with people’s data, you might be equally relaxed about how you look after it. Perhaps all the routers and file servers at Grand Central still have their generic passwords?

Filing cabinet breaches

I like to analyse the ICO’s undertakings and enforcement notices to see whether there are lessons you can learn from other people’s unfortunate mistakes.

Last year the Orbit housing association moved offices and in the process sold off some of their surplus-to-requirements filing cabinets. The problem was that some 57 customer files were left in them. With 42 recovered, that left 15 customers’ files in the wild. The ICO insisted on an undertaking (PDF).

At the time I resisted pointing out the obvious – that this was a bad idea – and reminding people how important it is to involve your DPA or security manager in office moves, and to embed DPA considerations in your business change process.

However a couple of weeks ago Lancashire County Council left some social work records in an old filing cabinet that was bought by a member of the public. Again the ICO required an undertaking (PDF).

There are a couple of lessons to take from these two incidents.

  1. It is worth reminding everyone in the organisation that the Data Protection Act applies to paper files that contain personal data. Just emphasising this in the next DPA or security training may help someone stop and think.
  2. Make sure that there’s a DPA or security check in all of your business change processes.

Data Sharing and the Blue Badge Parking Scheme

Back in 2008 the government announced that they were going to reform some of the ways the disabled parking / blue-badge scheme worked to reduce the amount of fraudulent use. When I heard this discussed on the radio, the government’s spokesman talked about providing £10 million towards a data sharing scheme to enable a council parking attendant to check on the validity of a blue badge issued by another council.

I have a knee-jerk adverse reaction to the words “government” and “data sharing” – especially when they are used in the same context as “the prevention and detection of crime”, so I checked out the strategy document (PDF) on the Department for Transport’s (DfT) site and was pleasantly surprised to find a sensible proposal:

“The preferred option going forward is to create a system which allows sharing of data through the linking of existing local authority databases. DfT will provide local authorities with up to £10m in funding over the next three years to establish a system of data-sharing.”

That was back in October 2008. Now that a consultant has finished a survey of all the IT systems local councils use to administer the scheme, the DfT is starting to run data sharing workshops with local councils and beginning to design the system (December status update – PDF).

In the meantime Rochdale council has made a successful bid to the Government Connect benefits realisation fund to investigate the “establishment of a national database with local access” for the blue badge scheme.

So, it will be interesting to see if a distributed approach is maintained, and I’d like to offer my suggestions so that privacy is built in from the start – because when you look at the problem, there is probably no need to share data at all.

Implement a simple question and answer approach. Not data sharing and not a centralised database.

Whose data is it?

People apply to their local council to issue a permit, so it is the job of the local council to look after that data. It’s the permit holder’s data that they entrust to the local council and in Data Protection Act terms, the local council is the Data Controller. The name of the issuing council is written on the permit along with a permit number (that also identifies the gender of the owner) and the date the permit expires.

Who needs to access it?

Parking enforcement officers from all over the UK (and perhaps eventually Europe) don’t need access to any more data than is written on the permit.

All they need is the answer to one question: “is this permit valid, invalid or being used illegally?”.

They don’t need to see any of the information that the issuing council has about the permit owner.

A parking officer may also like to report a concern to the issuing council – that they suspect the permit may be being used illegally. Sending this information to the council that issued the permit would then allow the council to get in touch with the permit holder directly. This keeps the relationship between the local council and the permit holder and doesn’t make the permit holder subject to potentially inconsistent actions of parking attendants anywhere in the country.

A network of local databases:

From a technical perspective, the system constraints are simply this:

  • Each council needs to keep the responsibility of looking after the data of their permit holders.
  • Other authorities (who are properly authorised and validated by the issuing council) need to be able to ask a question of this information, and receive an answer.

So here’s one way of building this system.

Each council maintains their own database of permits and permit holders (as the DfT initially suggests). They look after the security of the data and they don’t export the data to any other system.

Each council issues all of the other councils an electronic access key that allows them to ask a validity question from the issuing council’s database.

Whenever a parking enforcement officer needs to check whether a permit is valid, they send:

  • The permit ID in question
  • Their ID (e.g. their badge number – something that can individually identify them)
  • Their council’s access key

to the council that issued the permit (they can read this from the permit). The issuing council would then reply with one of four answers:

  1. We didn’t issue that permit. (It’s probably a forgery.)
  2. We issued that permit, and the permit is valid.
  3. The permit is invalid (it may have just expired — this allows the issuing council to set their own grey-area) so doesn’t confer any rights to disregard parking restrictions.
  4. The permit is invalid and has been reported stolen or withdrawn by the issuer and should be seized.

The parking attendant can then perform the relevant statutory actions.

No personal data needs to be shared between the issuing council and the parking attendant, wherever they are in the country.
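To make the shape of this concrete, here is a sketch in Python of the question-and-answer interface. Everything here – the class and status names, the simple shared-key check – is my own illustration of the idea, not anything proposed by the DfT:

```python
from enum import Enum

class PermitStatus(Enum):
    NOT_ISSUED = 1   # "We didn't issue that permit" - probably a forgery
    VALID = 2        # Issued and currently valid
    INVALID = 3      # e.g. expired - the issuer sets its own grey-area
    SEIZE = 4        # Reported stolen or withdrawn - should be seized

class IssuingCouncil:
    """The issuing council's side of the query/answer interface (hypothetical)."""

    def __init__(self, name, permits, access_keys):
        self.name = name
        self.permits = permits          # permit_id -> PermitStatus
        self.access_keys = access_keys  # asking council's name -> shared key
        self.audit_log = []

    def check_permit(self, permit_id, officer_id, council, key):
        # Only councils the issuer has authorised may ask the question.
        if self.access_keys.get(council) != key:
            raise PermissionError("unrecognised access key")
        # Record who asked about which permit; no personal data is returned.
        self.audit_log.append((council, officer_id, permit_id))
        return self.permits.get(permit_id, PermitStatus.NOT_ISSUED)

# Example: York holds the permit records; a Leeds officer checks a permit.
york = IssuingCouncil("York", {"Y1234": PermitStatus.VALID}, {"Leeds": "key-leeds"})
print(york.check_permit("Y1234", "officer-42", "Leeds", "key-leeds"))  # PermitStatus.VALID
print(york.check_permit("Y9999", "officer-42", "Leeds", "key-leeds"))  # PermitStatus.NOT_ISSUED
```

Note that the answer is one of four statuses and nothing more: the permit holder’s name and address never leave the issuing council’s database, and the issuer gets an audit trail of every question asked.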


  1. I’m not an expert on parking, permit fraud or enforcement. There may be many reasons why this simple query / answer approach wouldn’t solve the problems with fraudulent permit use. However, this is the best place to start. If people think that a parking enforcement officer needs more information then they should make the case for this. It is always best to share the minimum amount of data necessary to remain compliant with the third (only get and use data you need) data protection principle.
  2. I’ve simplified this discussion to the broad question of data copying, data sharing or my preferred question:response which would share the minimum of personal information. There’s a separate technical discussion about the best way of achieving this, and whether it would be best implemented using public-private key encryption, with a central-key management system operated jointly by all councils. There would be some other issues to explore around how long a key is valid for, and how a local council revokes another authority’s access.
  3. I’d also be tempted to consider whether using near-field RFID chips in the permits would add value to the system and make the permits harder to forge. It would also reduce the frequency of number keying errors by a glove-wearing parking attendant on a cold day, as their terminal would just be able to read the permit ID through the windscreen.

The future of privacy talk at ORG

Bruce Schneier spoke on the subject of The Future of Privacy at the Open Rights Group on Friday. The ORG is the ‘UK equivalent’ of the EFF and I’m proud to be one of its founder members. I’ve heard Bruce speak a few times, most recently at WEIS 09, and I’ve always been impressed by his relaxed presentation style. This was a great event and ORG has posted a video of it on its web site. I’d recommend watching both the presentation and the Q&A afterwards.

UPDATED: Here are the links to the presentation and the Q&A.

A few highlights (with comments):

  • In relation to large government databases, built to facilitate data mining techniques for suspicious activities, Bruce commented that if you’re looking for a needle in a haystack, it doesn’t seem very sensible to add more hay!
  • On CCTV he posited that we’re living in a unique time: ten years ago there were no cameras; now there are hundreds of cameras and we can see them all; in ten years’ time there will be many hundreds of cameras, but we won’t be able to see any of them.
  • When ‘life recorders’ become widely used (and they’d only need about 1TB a year to record your entire life) he could see that not having an active life recorder would be seen as suspicious — much like leaving or turning off your mobile phone is now presented as “evidence” that you were up to no good.
  • Ephemeral conversation is dying.
  • The real dichotomy is not security v privacy, but liberty v control. He argued that privacy increases power, and openness decreases power. So citizens need privacy and governments need to be open for a balanced democracy to prosper.
  • The death of privacy has been predicted for centuries (for instance, see Warren and Brandeis’ The Right to Privacy published in 1890). Without a doubt privacy is changing and this is a natural process — but it isn’t inevitable. Our challenge is to either accept this, or to reset the balance between privacy and the mass of identity-based data gathered for commercial gain and state security. Laws are the prime way to reset that balance.
  • When asked the one thing he’d like to change, he replied it would be to implement European style data protection legislation (like our own Data Protection Act) in the US.

Abuse of radio buttons and check boxes

I’m particularly sensitive to interface design and I saw a real horror this week. [The] BCS1 recently conducted a members’ survey. Question six managed to break the long established model of radio buttons (select one) and check boxes (select more than one).

I guess they wanted to make sure that people had answered the question, so they required a ‘none’ option. If you selected this radio button, some JavaScript cleared any of the check boxes you’d previously selected.

One of the best bits of interface design advice I ever heard was from Jakob Nielsen. In his list of Top Ten Mistakes it is number eight.

“Consistency is one of the most powerful usability principles: when things always behave the same, users don’t have to worry about what will happen. Instead, they know what will happen based on earlier experience. Every time you release an apple over Sir Isaac Newton, it will drop on his head. That’s good.

“The more users’ expectations prove right, the more they will feel in control of the system and the more they will like it. And the more the system breaks users’ expectations, the more they will feel insecure. Oops, maybe if I let go of this apple, it will turn into a tomato and jump a mile into the sky.”

It’s important that any application or website uses mental models that people are familiar with. In security you’re often asking a critical question, and that’s all you want the user to think about, not a newly invented or misapplied design metaphor.

1Formerly the British Computer Society, it has recently become “bcs – The Chartered Institute for IT” and is no longer referred to as “The BCS”.

An analysis of the T-Mobile breach

There’s been a lot in the press over the past few days about the recent T-Mobile breach. Basically it appears that a number of staff at the mobile phone company have been selling customer data, including the customer’s name, their mobile number and when their contract expired. There hasn’t been a great deal of information about this other than the BBC’s report, the Information Commissioner’s press release (PDF) and a short post on T-Mobile’s customer support forum.

From an information security and Data Protection Act compliance perspective there are three breaches of the Act.


T-Mobile

There’s no information about how the data was extracted from T-Mobile’s systems, and I accept that it could have been by people copying the information down onto pieces of paper. However, as the BBC story talked about “millions of records from thousands of customers”, I’ll assume there was a bulk extract of data.

T-Mobile is probably in breach of the seventh principle in that they failed to ensure:

“Appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data”

It is a breach of section 4(4) of the Act if a data controller fails to comply with the data protection principles in relation to all personal data, and the Information Commissioner (for the moment) can commence enforcement proceedings against the company, in the course of which T-Mobile will have to undertake to implement better security and processes.

However, what’s interesting to me is whether T-Mobile had ever properly quantified the commercial value of a customer’s name, mobile number and contract expiry date – and, if so, whether this was adequately reflected in their risk analysis.

If this were the case then two technical steps I’d expect them to have taken would have been:

  1. to make it very hard for people to run and save a report that had more than (say) 20 such records (most people working in customer service wouldn’t even need this many records in a report)
  2. to implement some Data Leakage Prevention (DLP) technology that looked at the type of data moving out of the organisation in email, on removable media such as CDs, USB sticks and as physical printouts
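As an illustration of the first control, here is a minimal sketch in Python of a report-export guard. The names and the 20-row threshold come from my suggestion above, not from anything T-Mobile actually runs:

```python
MAX_REPORT_ROWS = 20  # the illustrative threshold suggested above

class ReportTooLargeError(Exception):
    """Raised when a report exceeds the bulk-extract threshold."""

def export_report(rows, authorised_for_bulk=False):
    """Refuse to save a report with more than MAX_REPORT_ROWS records
    unless the user holds an explicit bulk-export authorisation."""
    if len(rows) > MAX_REPORT_ROWS and not authorised_for_bulk:
        raise ReportTooLargeError(
            f"report of {len(rows)} rows exceeds the {MAX_REPORT_ROWS}-row limit")
    return list(rows)

# A customer service user trying to pull 25 records is blocked:
rows = [{"id": n} for n in range(25)]
try:
    export_report(rows)
except ReportTooLargeError as e:
    print(e)  # report of 25 rows exceeds the 20-row limit
```

The point of the control isn’t that 20 is a magic number; it’s that bulk extracts become an explicit, authorised, auditable event rather than something any user can quietly do.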

The employee / employees

The employee(s) [the T-Mobile site now appears to indicate that it was just the action of a single employee] have committed a clear offence under Section 55(1) of the Act.

“A person must not knowingly or recklessly, without the consent of the data controller obtain or disclose personal data or the information contained in personal data”

If convicted they face a fine of up to £5,000 in a magistrates’ court (and if the Information Commissioner gets his way then next year this could become a custodial sentence).

The data recipient

The person buying the data has also committed a Section 55 offence as they obtained the data without T-Mobile’s consent.

The identity of the person or company who purchased the data hasn’t been made public. It will be interesting to see whether it was a small phone dealer, a broker or one of the other big mobile phone companies. If the latter, then there’s a real issue to explore – was this the action of a ‘rogue’ salesperson or something that was tacitly condoned by the organisation?

For a market to exist in personal data there has to be both a buyer and a seller, and the value of the data is defined by the buyer: if no one wanted to buy this information then the T-Mobile employee(s) wouldn’t have stolen it to sell. Even if the data was traded through a list broker, the recipient organisation should still have asked themselves where it came from: alongside the section 55 offence they will have breached the first (be fair when you get, use and share data) and second (tell people what you will do with their data, do nothing more) data protection principles.

When this case finally comes to court I’ll be really interested to see the action taken against the purchasers of the personal data.

In the future I expect to see all databases that hold personal information equipped with full read auditing, which would create an audit log entry whenever a user read an individual record or ran a report that included that record.

Audit: User JohnDoe viewed this record at 10:23 on 22/10/09
Audit: User JaneDoe included this record in the report CustomersAboutToLeave at 19:47 on 23/10/09

I’d also expect mobile phone companies to correlate the read activity of their users (recorded in this type of audit log) against the customers who went elsewhere at the end of their contracts.
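A minimal sketch in Python of what such full read auditing might look like – all names here are hypothetical:

```python
from datetime import datetime

audit_log = []

def read_record(user, record_id, report=None):
    """Fetch a record, writing an audit entry for every read."""
    audit_log.append({
        "user": user,
        "record": record_id,
        "report": report,  # set when the read came from a saved report
        "at": datetime.now().strftime("%H:%M on %d/%m/%y"),
    })
    # ... the actual database fetch would happen here ...
    return record_id

def reads_by_record(record_id):
    """Every user who ever viewed a given record - the data you would
    correlate against customers who later went elsewhere."""
    return [e["user"] for e in audit_log if e["record"] == record_id]

read_record("JohnDoe", 1001)
read_record("JaneDoe", 1001, report="CustomersAboutToLeave")
print(reads_by_record(1001))  # ['JohnDoe', 'JaneDoe']
```

In a real system the log would of course live in the database layer itself, beyond the reach of the application users being audited; the sketch just shows the shape of the records and the correlation query.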

New data security law book launched

On Monday I had the pleasure of attending the launch of Stewart Room’s new book ‘Butterworths Data Security Law and Practice’. Stewart wrote the definitive guide to the Data Protection Act for techies, the equally snappily-named Data Protection and Compliance in Context. This is also the course book for the ISEB Practitioner-level certificate in Data Protection.

Stewart’s new book is – as he admitted – elephantine in its size and coverage (for comparison it’s physically larger than Ross Anderson’s Security Engineering). It is the first book to address infosec and the law together, and I’m really looking forward to getting hold of a copy. I had a chance to browse one of the display copies at the launch and it looks really useful.

With probably about a hundred infosec and law professionals in the same room the conversations were really engaging. There was a lot of talk about the prominence of breaches in the news, especially after last week’s T-Mobile revelations along with the ongoing consultation on the Information Commissioner’s new powers. A few of the people I spoke to were curious to see what changes there would be in non-financial services companies once the Commissioner had levied his first sizable fine.

Yet another meaning for C, I and A

Yesterday I heard Andy Smith, the Chief Security Architect for the Identity and Passport Service (IPS) speak at the BCS Central London branch meeting about the security behind the new National Identity Register which supports the National Identity Card.

On one slide he highlighted what he considered the three biggest threats to Information Security:

  • Complacency
  • Apathy
  • Inattention (Andy called it Human Error, but I hope he’ll excuse my re-wording to fit into the familiar triad)

So now there are three security meanings for C, I and A.

  1. Confidentiality, Integrity and Availability : The original
  2. Common Sense, Intent and Application : Plan on doing sensible things well, and keep doing them
  3. Complacency, Inattention and Apathy : It is really hard for humans to do security things 100% of the time

Andy’s presentation was really interesting and I’m glad to have had the opportunity of hearing his views, but in my view the session failed to address the publicised topic of “ID Cards: The end of the Private Citizen – or good corporate ID management?” There wasn’t a speaker to address whether this was the “end of the Private Citizen”, and questioners were discouraged from being “too political”. As IT professionals it is really important that we participate in the debate about state-wide databases and the consequences of insecurity and secondary uses. That’s not a political discussion, but a socio-technical discussion about the future application of technology. The UK chapter of the ISSA held a similar event in July this year which included former home secretary David Blunkett, a speaker from the Home Office, and Pete Bradwell from Demos, alongside many technical presentations. Perhaps it was just the table I was sat at, but our discussion there ranged widely through technology, security and ethical issues.

At last night’s BCS event I’d have liked to hear Andy talk more about the technical details of how his team resolved some of the many interesting challenges they will have faced over the past few years, especially the architectural solutions and processes devised to maintain separation of duties within the IPS.

As a root identity provider the ID card and the NIR are attractive, however I can’t help thinking of Bruce Schneier’s 2007 essay on The Risks of Data Reuse which ended:

“History will record what we, here in the early decades of the information age, did to foster freedom, liberty and democracy. Did we build information technologies that protected people’s freedoms even during times when society tried to subvert them? Or did we build technologies that could easily be modified to watch and control? It’s bad civic hygiene to build an infrastructure that can be used to facilitate a police state.”