Facial recognition 'a threat to privacy', says ICO

19 Sept 2019

Facial recognition technology is in the news almost daily, and will be of special importance to anyone responsible for managing premises and people.

This week, UK privacy group Big Brother Watch joined with politicians and campaign groups - including Amnesty International and Liberty - to submit their concerns in a formal letter to the government. In it, they describe the increasing use of facial recognition technology as a "surveillance crisis", arguing that it has not been subject to sufficient scrutiny by parliament.

The technology behind facial recognition makes it a highly efficient way of identifying people in large numbers for security purposes. It is already used successfully by the UK Border Force: many of us will be familiar with the 'ePassport gates' that let travellers enter the country by scanning their passport while their face is simultaneously scanned by a camera. The Home Office and Border Force are looking to roll out this technology more widely to make the immigration process even more efficient. This week, Gatwick Airport announced it would become the first airport in the UK to install facial recognition software that allows passengers to board aircraft without physical document checks (though passengers would still need to carry other forms of ID).

The rules around the use of facial recognition software in public and private places in the UK are strict, however. The recent revelation that the King's Cross estate (a 67-acre space that is privately managed, though not obviously so to the general public) had used the technology for surveillance came as a shock to many, and prompted a public statement from Information Commissioner Elizabeth Denham:

“Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all. That is especially the case if it is done without people’s knowledge or understanding.

“I remain deeply concerned about the growing use of facial recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector. My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used.

“Facial recognition technology is a priority area for the ICO and, when necessary, we will not hesitate to use our investigative and enforcement powers to protect people’s legal rights.

“We have launched an investigation following concerns reported in the media regarding the use of live facial recognition in the King's Cross area of central London, which thousands of people pass through every day.

“As well as requiring detailed information from the relevant organisations about how the technology is used, we will also inspect the system and its operation on-site to assess whether or not it complies with data protection law.

“Put simply, any organisations wanting to use facial recognition technology must comply with the law - and they must do so in a fair, transparent and accountable way. They must have documented how and why they believe their use of the technology is legal, proportionate and justified.

“We support keeping people safe but new technologies and new uses of sensitive personal data must always be balanced against people’s legal rights.”

Employers who are concerned about the operational risks of deploying advanced technologies can find more information on the ICO's website.