How must security managers apply the law – and, in particular, the Data Protection Act 1998 – to the increasing use of biometric security technologies in the private sector? As Nick Mallett and Luke Plimmer suggest, there’s a clear preference for developing proportional systems which minimise abuse. Certainly, data controllers must adhere to the eight Data Protection principles. Illustrations by Jonathan Edwards

Business is beset as never before by increasing legislation in the form of either new or modified Acts of Parliament. Engaging around 500,000 personnel and subject to rather high levels of staff turnover, the private security industry is affected more than most by employment legislation and regulation of the employer-employee relationship, which appears to change 50 or so times every year! Most recently, of course, it’s been the National Minimum Wage and the Disability Discrimination Act that have come in for detailed scrutiny.

In September 1997, the Government created a Better Regulation Task Force with the laudable objective of stemming the tide of new laws. Alas, that Task Force hasn’t been hugely effective, particularly for the security sector.

In addition to the incoming regulation originating in both Whitehall and Brussels, the industry is now having to face up to the Private Security Industry Act 2001 as well as the normal vicissitudes of life (including ever-changing technology that’s supposedly created in order to make our lives easier).

One of those technologies is biometrics. Although still not fully rolled-out to the majority of end users, in truth biometric security has been with us forever. At the non-technological level, we’re all applying a form of biometric security when we recognise others by sight and sound. What we’re actually doing is comparing what we are seeing and hearing with the corresponding data we hold in our own database (or memory).

Biometrics has a jargon all of its own, of course, but thankfully not too much. In that jargon, we authenticate or verify the other party according to whether we recognise the data received (or not). Verification is a one-to-one check of a claimed identity, whereas identification is a one-to-many search to establish who someone is. Enrolment means the original provision of the reference data (or template) to the database against which a subject will (when required) be compared for authentication, verification or identification purposes. Therefore, when we first encounter someone via one or more of our senses, we enrol that person by means of our memory.
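The three terms can be illustrated with a minimal sketch. The feature-vector templates, Euclidean matcher and decision threshold below are illustrative assumptions only; real systems rely on specialised feature extraction and matching algorithms.

```python
# Sketch of enrolment, verification (1:1) and identification (1:N).
# Templates are plain feature vectors; the distance threshold is a
# hypothetical tuning parameter, not a real system's value.

def distance(a, b):
    """Euclidean distance between two templates."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

class BiometricStore:
    def __init__(self, threshold=1.0):
        self.templates = {}          # subject id -> enrolled template
        self.threshold = threshold

    def enrol(self, subject_id, template):
        """Store the reference template captured at enrolment."""
        self.templates[subject_id] = template

    def verify(self, subject_id, sample):
        """1:1 check: is this sample close enough to the claimed identity?"""
        enrolled = self.templates.get(subject_id)
        return enrolled is not None and distance(enrolled, sample) <= self.threshold

    def identify(self, sample):
        """1:N search: which enrolled subject, if any, does this sample match?"""
        candidates = [(distance(t, sample), sid)
                      for sid, t in self.templates.items()]
        candidates = [(d, sid) for d, sid in candidates if d <= self.threshold]
        return min(candidates)[1] if candidates else None

store = BiometricStore()
store.enrol("alice", [0.1, 0.9, 0.4])
store.enrol("bob", [0.8, 0.2, 0.7])
print(store.verify("alice", [0.15, 0.85, 0.45]))  # True: close to Alice's template
print(store.identify([0.82, 0.18, 0.72]))         # bob
```

Note the asymmetry: verification compares one sample against one claimed template, while identification must search the whole database, which (as discussed later) has consequences for both accuracy and proportionality.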

Is biometrics an imperfect science?

Whereas most human beings rely on the five senses (some claim a sixth!), there are at least ten commonly available methods of recording essential – and arguably unique – human characteristics in a way which renders them suitable for use in biometric security. These include: acoustic emissions (waves generated by the eardrum in response to sound), DNA, facial geometry, fingerprints, gait, handwriting, hand shapes, iris pattern, retina pattern and voice recognition.

With the possible exception of DNA, all are imperfect bases because of the danger that the identifying feature will be affected by temporary or permanent alterations brought about voluntarily or otherwise after enrolment.

Likewise, some of the technologies are more acceptable to the public than others. In some cases, enrolment can occur automatically and/or unconsciously. Speech, fingerprints, facial shape, gait, someone’s handwriting or DNA may be captured and recorded with or without the conscious participation of the subject. On the other hand, the accurate recording of iris information is extremely difficult, and the capture of hand geometries is haphazard without subject co-operation.

The form in which data subjects’ templates are stored is crucial. In practice, it will depend on the type of application for which the biometric device will be used, as well as the size of the templates (which can be stored in the memory of a biometric device, in a central database or on plastic, optical or smart cards).

The shortcomings of biometric techniques are such that, in order to perform authentication procedures to the highest achievable level of accuracy, three different methods may be used in tandem – based on something an individual knows (eg a password or PIN), owns (a token or smart card) or is (ie the biometric feature).
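The know/own/is combination described above can be sketched as follows. The record structure, PIN-hashing parameters and field names are illustrative assumptions, not any real product’s API; the biometric factor is represented simply as the boolean result of a 1:1 template comparison.

```python
# Sketch of three-factor authentication: something the user knows (a PIN),
# owns (a token) and is (a biometric match).  All names are hypothetical.

import hashlib
import hmac

def hash_pin(pin, salt):
    """Store PINs only as salted hashes, never in the clear."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

def authenticate(record, pin, token_serial, biometric_match):
    """All three factors must pass; failure of any one rejects the attempt."""
    knows = hmac.compare_digest(hash_pin(pin, record["salt"]), record["pin_hash"])
    owns = hmac.compare_digest(token_serial, record["token_serial"])
    is_ = biometric_match          # result of a 1:1 template comparison
    return knows and owns and is_

salt = b"demo-salt"
record = {
    "salt": salt,
    "pin_hash": hash_pin("4921", salt),
    "token_serial": "TOK-0042",
}
print(authenticate(record, "4921", "TOK-0042", True))   # True
print(authenticate(record, "4921", "TOK-0042", False))  # False: biometric failed
```

Combining factors in this way means a stolen card or guessed PIN is useless on its own, which is precisely why the tandem approach compensates for the shortcomings of any single biometric technique.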

If the corporate security manager decides to procure a biometrics-based security system and use the data captured, what are the legal implications of doing so? We need look no further than the Working Document published on 1 August 2003 by the Data Protection Working Party established under Article 29 of the original EU Data Protection Directive (Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995) for the answer.

The Working Party has advisory status, and is required to act independently. It comprises representatives of the supervisory authority designated by each Member State to have responsibility for Data Protection matters. To a great degree the Working Party is autonomous, and free to consider any items placed on its agenda by the chairman or at the European Commission’s request.

It’s somewhat heartening that the UK representative – Richard Thomas, the Information Commissioner – holds and expresses very robust opinions on privacy issues, and is certainly not afraid to disagree with the Government. Thomas has, for instance, a distinctly more liberal approach to privacy in the context of the proposed national identity card than did former Home Secretary David Blunkett.

The introduction to the Working Document underlines this point very clearly. “The rapid progress of biometric technologies, and their expanded application in recent years, necessitates a careful scrutiny from a Data Protection perspective. A wide and uncontrolled use of biometrics raises concerns with regard to the protection of the fundamental rights and freedoms of individuals. This kind of data is of a special nature, as it relates to the behavioural and physiological characteristics of an individual, and may allow his or her unique identification.”

The publication goes on to state: “A specific concern related to biometric data is that the public may become desensitised – through the widening use of such data – to the effect that processing may have on daily life. By way of an example, the use of biometrics in school libraries can make children less aware of the Data Protection risks that may impact upon them in later life.”

For the purposes of the Data Protection Act 1998, biometric data constitutes ‘personal data’ and, in some cases, ‘sensitive personal data’. The examples of biometric data previously identified may all constitute personal data as they will “relate to a living individual who can be identified… from those data and other information which is in the possession of – or is likely to come into the possession of – the data controller.”

Some biometric data could be considered as ‘sensitive’ within the meaning of Article 8 of the EU Directive (in particular data revealing racial or ethnic origin, or that which concerns health). For example, in those biometric systems based on facial recognition, data revealing racial or ethnic origin may be processed. In such cases, the special safeguards provided by Article 8 (Section 2 of the 1998 Act) will apply, in addition to the general protection provisions of the Directive.

That doesn’t mean that any processing of biometric data will necessarily include sensitive data. Whether a processing procedure involves sensitive data is a matter of judgement linked with the specific biometric characteristics used, and the biometric application itself. It’s more likely to be the case if biometric data in the form of images is processed since, in principle, the raw data cannot be reconstructed from a binary template.

Obligations of data controllers

All controllers of biometric data must adhere to the eight Data Protection principles. These are as follows: (1) Personal data shall be processed fairly and lawfully (2) Personal data shall be obtained only for specified and lawful purposes, and shall not be processed in a manner incompatible with those purposes (3) Personal data shall be adequate, relevant and not excessive (4) Personal data shall be accurate and, where necessary, kept up-to-date (5) Personal data shall be kept for no longer than is necessary.

The remaining three principles for consideration are: (6) Personal data shall be processed in accordance with the rights of the data subject (7) Appropriate measures shall be taken to prevent unauthorised use – or the accidental loss – of personal data, and (8) Personal data shall not be transferred outside of the European Economic Area (EEA) unless that country ensures adequate protection.

Principle 1: Personal data shall be processed fairly and lawfully

In order that this principle be met, the data subject must be told exactly why the data is being processed, and the identity of the data controller. Generally speaking, the consent of each subject must also be obtained for processing (unless one of the limited exemptions applies). Indeed, the EU Data Protection Working Party states that systems which collect data without the knowledge of their subjects (distance facial recognition systems spring to mind) must be avoided.

If the data being collected is ‘sensitive’ then the requirements are even more stringent. In the majority of cases, the subject must be fully informed of all the relevant information and their ‘explicit’ consent obtained. Where ‘sensitive’ data is concerned, implied consent will simply not suffice. The form of that consent will vary according to circumstance. For example, notices concerning the mere existence of CCTV cameras carry with them an implication of consent to individuals being filmed and – possibly – recorded.

Principle 2: Personal data shall be obtained only for specified and lawful purposes, and shall not be processed in a manner incompatible with those purposes

This principle, which overlaps with the first, concerns the obtaining and processing of information. It prohibits data controllers from the further processing of information that would be incompatible with the defined purpose for which the data was collected.

In addition, data subjects must not be deceived or misled as to the intended purpose of collection. For instance, biometric data processed for access control purposes mustn’t be used to assess the emotional state of the data subject or for surveillance at work. All possible steps must be taken to avoid any incompatible re-use of captured data.

It’s thought that the centralised storage of biometric data increases the risk that databases could be linked together, thus leading to more detailed profiles of individuals. If this were to occur then clearly the remit of any original information capture would be exceeded. The EU Data Protection Working Party recommends that biometric data should remain with the individual – for example, on a smart card, mobile phone or bank card.

This principle imposes an obligation on those who disclose biometric data to a third party to impose contractual obligations on that third party to process the information only for purposes compatible with the data controller’s original (and specified) purpose.

Principle 3: Personal data shall be adequate, relevant and not excessive

The central idea behind this principle is proportionality. Essentially, the data controller (ie the security manager) should ask whether or not their intended purpose could be achieved in a less intrusive way, taking into account the risks to a given individual’s fundamental rights and freedoms.

For example, the French authorities refused to allow the use of fingerprints to control children’s access to school restaurants, but accepted the outline of hand patterns for exactly the same purpose. Essentially, data controllers need to tailor each system to the specific requirements of the situation.

A certain difficulty may arise as biometric data often contains more information than is necessary for its identification or verification purposes, particularly where raw data (such as an original image) is concerned. Data controllers should destroy unnecessary and irrelevant data as soon as possible, and construct users’ templates so as to preclude the processing of this data.

Data Protection authorities have suggested that biometric systems relating to physical characteristics which leave traces (eg fingerprints, as opposed to hand geometries) – or those which store information in the control access device or a central database – may be excessive as they present more of a risk to the rights and freedoms of individuals.

It’s recommended that biometrics be stored in an object exclusively available to the user (like a microchip, mobile phone or bank card). However, a central database will be needed if the function of identification (“Who am I?”) is to be carried out rather than the function of verification (“Am I who I say I am?”). In these cases, particular care must be taken – and safeguards put in place – to preserve the individual’s rights and freedoms.

Principle 4: Personal data shall be accurate and, where necessary, kept up-to-date

Data controllers are under an obligation to take reasonable steps to verify the accuracy of the data they obtain but, given the current state of the technology, accuracy is still proving problematic for biometric systems to achieve.

To be frank, most biometric systems suffer from serious flaws. For instance, it’s estimated that 5% of people don’t have readable fingerprints (whether because of manual labour or genetic make-up). Iris scans can be affected by watery eyes, and even long eyelashes. Privacy campaigners have argued that facial recognition systems may be fooled just as easily by people with disguises, beards, prosthetics or even plastic surgery. Where systems are used for the purpose of identification rather than verification, comparing a user’s template with all other records in a database engenders even lower success rates.

The problem is that this could leave biometric systems open to challenge. In fact, the EU Data Protection Working Party emphasises the importance of accuracy in this area. Errors occurring in biometric systems, states the Working Party, can have quite severe consequences (including the false rejection of those authorised, and the false acceptance of those unauthorised). Faced with such ‘indisputable’ evidence, individuals may find it impossible to prove the contrary. The only viable option for data controllers is to employ a combination of measures, or a multi-biometric system to achieve greater accuracy.
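One common way of combining measures is score-level fusion: each modality produces a match score, and a weighted average is compared with a single decision threshold. The weights and threshold below are illustrative assumptions, not values from any real deployment.

```python
# Sketch of score-level fusion in a hypothetical multi-biometric system.
# Each modality returns a match score in [0, 1]; a weighted average is
# compared with one decision threshold.

def fused_decision(scores, weights, threshold=0.7):
    """Accept the subject if the weighted-average match score clears the threshold."""
    assert len(scores) == len(weights)
    fused = sum(s * w for s, w in zip(scores, weights)) / sum(weights)
    return fused >= threshold

# Fingerprint and iris scored well, face poorly: the combined decision
# still accepts, smoothing over a single modality's bad day.
print(fused_decision([0.9, 0.5, 0.85], [0.4, 0.2, 0.4]))  # True
```

The attraction for data controllers is that a fused decision can be tuned to lower both false-rejection and false-acceptance rates below what any single modality achieves alone.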

Principle 5: Personal data shall be kept for no longer than is necessary

This principle overlaps with the third and is pretty much self-explanatory. It imposes an obligation on data controllers to ‘keep personal data under constant review, and delete all information which is no longer required for the purpose that it was originally obtained’. By way of example, it may be the case that a particular threat to security is no longer present and, therefore, the data is no longer needed for that purpose.

Principle 6: Personal data shall be processed in accordance with the rights of the data subject

Principle 6 will be breached whenever a data controller fails to comply with an individual’s justified request to cease processing their biometric data, or fails to respond to any request within 21 days of receipt.

A breach will also occur if the data controller doesn’t comply with any subject access request. In certain circumstances, data subjects have a right to make such requests concerning the information being processed that’s specifically to do with themselves. These requests must be made in writing, and the data controller can charge up to £10 for dealing with each of them.

Data controllers must be sure of the individual’s identity (so as to adhere to Principle 7), and can ask for details as to the location of the data. Each request has to be administered within a 40-day window.
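The two statutory deadlines mentioned here (21 days to respond to a notice to cease processing, 40 days to answer a subject access request) are easy to diarise programmatically. The helper names below are hypothetical; the day counts come from the 1998 Act as described above.

```python
# Sketch of deadline tracking for subject access requests under the
# Data Protection Act 1998: a written request must be answered within
# 40 days of receipt.

from datetime import date, timedelta

SAR_WINDOW = timedelta(days=40)

def sar_deadline(received):
    """Last day on which a subject access request may lawfully be answered."""
    return received + SAR_WINDOW

def is_overdue(received, today):
    """True once the 40-day window has been missed."""
    return today > sar_deadline(received)

received = date(2004, 3, 1)
print(sar_deadline(received))                   # 2004-04-10
print(is_overdue(received, date(2004, 4, 15)))  # True: the window was missed
```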

Principle 7: Appropriate measures shall be taken to prevent unauthorised use – or the accidental loss – of personal data

Maintaining the security of biometric data is fundamental to safeguarding the rights and freedoms of individuals.

The dangers of failing to meet this obligation are severe. Were a person to steal another individual’s biometric identity, the ‘victim’ couldn’t change their biometric attributes as easily as they could a compromised computer password. This may cause irreparable damage to the individual concerned, and limit their freedom in the future.

Data controllers should therefore ensure that protective measures afford a level of security appropriate to the harm which may otherwise result. Particular care is required wherever biometric data is transmitted over a network or the Internet. Here, security measures could include the encryption of users’ templates, the protection of encryption keys and access control.

That said, the Data Protection Act 1998 allows an account to be taken of the state of technology (and its cost) available to the data controller at the relevant time. Thus it’s advisable for the data controller to monitor changes in technology such that they don’t breach the legislation simply because they’ve failed to upgrade their own security systems.

The EU Data Protection Working Party advocates developing encryption keys based on biometric data. These would allow an individual’s biometric data to be decoded only on the basis of a new collection of data from the data subject.

Security managers should note that Principle 7 also imposes an obligation on the data controller to ensure – as far as is reasonable under the circumstances – the reliability of all employees who have access to personal data. Given the greater importance of biometric data, this obligation is likely to be more stringent (with a greater degree of training required of security staff).

Principle 8: Personal data shall not be transferred outside the European Economic Area (EEA) unless that country ensures an adequate level of protection

This is hugely relevant for multinational companies operating offices and employing staff overseas. Recently, software giant Microsoft was fined by the Spanish Data Protection authorities for sending details on its members of staff based in Spain to the company’s global HQ in the States, as the USA isn’t deemed to have adequate levels of protection.

If data has to be transferred, play safe and use a Trans-Border Data Flow Agreement containing appropriate contractual provisions that will secure compliance with the other seven principles, and adopt the European Commission’s approved model clauses.

Non-compliance with the principles

For those data controllers who fail to comply with the Data Protection principles, the outcome may be rather nasty.

For starters, an individual can sue for damages and/or compensation if non-compliance has caused them any distress. Under Section 13 of the Data Protection Act 1998, individuals are entitled to compensation for distress caused by a contravention. This is principally available where an action for damages is also brought, but may be permitted as a free-standing claim in limited circumstances.

Neglect of the principles might also result in appropriate enforcement action by the Information Commissioner. The Commissioner is primarily responsible for enforcing the Data Protection Act. He has powers to investigate complaints, and can issue data controllers with Enforcement Notices specifying the steps that need to be taken to ensure compliance. Failure to comply with any such notice is a criminal offence, which can also bring with it a hefty fine and negative publicity.

It’s true to say that increasing numbers among the general public are becoming fully aware of their rights in this area. In 2003-2004, the Information Commissioner dealt with over 10,000 complaints under the terms of the Data Protection Act 1998.