UK Police using Facial Recognition systems since 2009

It came as a shock to the establishment and public in November 2014 when the Association of Chief Police Officers (ACPO) was forced to reveal that it had been using biometric facial recognition technology with custody suite photographs on the Police National Database (PND).  This came to light during a House of Commons Select Committee hearing on the current and future uses of biometric data and technologies, when the UK Biometrics Commissioner's written submission detailed the facial recognition system used by all UK police forces nationally, which had been 'live' since 28th March 2014.
 

Both the Home Office, which took responsibility for the PND in 2012, and ACPO failed to mention in their submissions to the Select Committee looking at biometrics that they were using a biometric facial recognition system.  The Home Office has subsequently submitted an 'updated' report to the Select Committee detailing its use of facial recognition.
So the use of facial recognition technology would seem to be a relatively recent addition to policing in the UK, albeit one introduced without any discussion or even basic honesty about how it has been applied.  Not so.  The Metropolitan (Met) Police have been running a facial recognition system on their database of more than 2.9 million custody suite photographs since 2009.  This came to light in recent Freedom of Information requests (FOIR) sent to the Met Police after the Biometrics Commissioner's report (see above) also outed them as using facial recognition technology on their photograph database.
This would explain, then, the eagerness of the Commissioner of the Met Police, Sir Bernard Hogan-Howe, to lower CCTV cameras; in his own words, in an interview from March 2015:
“Over the last year as facial recognition software has got better it means we can apply the software to the images of burglaries… taking the tops of their heads is not that helpful with facial recognition which relies on your eyes and this configuration around the nose and the mouth”
Again, there was no mention in the interview of the fact that they have been using facial recognition technology since 2009.  How efficient their facial recognition system has been since 2009 is, however, debatable.  When used at airports, facial recognition's accuracy is "around 95 per cent, but that's down to the fact that the technology is applied to high quality, head on images", says the Met Police's DCI Mick Neville, but CCTV images are often poor and not taken at the right height.  At a conference where Neville revealed that their facial recognition technology had positively identified only one person from the London riots in 2011, "Neville underlined the point by holding his hands up to frame his own face and repeatedly emphasised that 'we need the face'".  One can almost sense his frustration at surveillance cameras being mounted too high.
(Source: Engadget, http://engt.co/1KNlUiY)

Maybe this is why the Met Police are planning to procure 20,000 body cameras, the largest roll-out of body cams worldwide.  The Met Police's use of body cams has grown from 116 in 2013, to 230 in 2014, to 1,221 in 2015, and now to a planned 20,000-plus body cameras in 2016.  The trialling of 1,000 body-worn cameras during 2015 across 10 London boroughs saw around 6,000 videos uploaded per month.  Twenty thousand body-worn cameras operating in a similar fashion would mean 120,000 videos uploaded each month, or more than 1.4 million videos uploaded in a year, all at head height.  No doubt music to Sir Bernard Hogan-Howe's and DCI Neville's ears.  Surely this must be one of the largest planned data grabs of public life by a UK police force to date?
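For anyone wanting to check the arithmetic, here is a rough back-of-the-envelope sketch in Python. It simply assumes the trial's upload rate of roughly six videos per camera per month scales linearly with the number of cameras, which is an assumption rather than anything the Met has published:

# Back-of-the-envelope extrapolation from the Met Police's 2015 body cam trial figures.
# Linear scaling with camera numbers is assumed; it is not a Met Police projection.
trial_cameras = 1000            # body-worn cameras trialled across 10 London boroughs in 2015
trial_videos_per_month = 6000   # videos uploaded per month during that trial

videos_per_camera = trial_videos_per_month / trial_cameras    # roughly 6 per camera per month

planned_cameras = 20000
monthly_uploads = planned_cameras * videos_per_camera         # 120,000 videos per month
yearly_uploads = monthly_uploads * 12                         # 1,440,000 videos per year

print(f"{monthly_uploads:,.0f} videos a month, {yearly_uploads:,.0f} a year")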

Public data grabs are not just confined to life in London.  Facial recognition capabilities have improved enough to be deployed not only on the PND and on regional police forces' databases but also in the field, literally, at the Download Music Festival in June 2015, where 90,000 festival-goers' images were compared to a Europol database of custody suite photographs as they partied.  True to form on the lack of transparency when police use biometric facial recognition systems, Leicestershire Police, who policed the Download festival, were not impressed that their use of facial recognition technology went public after the initial article appeared in the Police Oracle, causing significant upset for management at Leicestershire Police, who did not want any advance publicity of their "new" surveillance project.
(Here is the Police Oracle article, which can be read without signing into their website)

Leicestershire Police were also the first regional police force to use facial recognition technology on their custody suite database of 92,000 images.  The biometric identification system used at the Download Festival is apparently separate from NeoFace, the system they use on their custody suite images.  A Freedom of Information request (FOIR) is currently pending to establish how this 'on location' facial recognition technology was deployed.
Images of us taken from public and private CCTV, police and enforcement officers' body cameras, mobile camera units, public transport, social media and more are run past these police image databases to identify us.  What happened at the Download Festival was no more than a fishing expedition by police to highlight persons of interest.  This is 'predictive' policing.  No solving of crime, just identifying persons who had been arrested in Europe.  People who have not necessarily been charged with or found guilty of a crime, or who, if guilty, were presumably at the music festival because they had paid society back.  People now back in society, going about their personal business, innocently.  If one has committed a crime and served the penalty due, is it not right that one is allowed to continue one's life free from suspicion or tracking because of a previous misdemeanour?
How efficient this current use of facial recognition technology is, is also unknown.  Freedom of Information requests sent to the Met Police and the Home Office reveal that they do not know how many innocent people's images or duplicate images are held, nor how many successful or false positive searches have been performed on their facial recognition databases.  How can these biometric identification systems be evaluated for effectiveness or accuracy if no one is looking at this, and, more importantly, how does one defend oneself if the system wrongly identifies one of us?
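If those figures were ever logged, evaluating a system of this kind would not be mathematically difficult.  Below is a minimal sketch of the sort of audit any force could run; every number in it is an invented placeholder, since the real statistics are, by the forces' own admission, not held:

# Minimal sketch of the audit figures the FOIR replies say are not recorded.
# All values are invented placeholders, not real police statistics.
total_searches = 10000       # hypothetical searches run against a custody image database
confirmed_matches = 300      # returned candidates later confirmed as the right person
false_positives = 450        # returned candidates who turned out to be the wrong person

candidates_returned = confirmed_matches + false_positives
precision = confirmed_matches / candidates_returned         # share of 'hits' that were actually correct
wrong_person_share = false_positives / candidates_returned  # share of 'hits' that pointed at the wrong person

print(f"Searches returning a candidate: {candidates_returned / total_searches:.1%}")
print(f"Candidates that were correct:   {precision:.1%}")
print(f"Candidates that were wrong:     {wrong_person_share:.1%}")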
FOIRs also revealed that Privacy Impact Assessments (PIA) have not been done on these facial recognition systems to measure just how privacy invasive this biometric surveillance technology is.
Home Office FOIR reply: "The Home Office does not hold a Privacy Impact Assessment specific to the facial recognition system."
Met Police FOIR reply: "Privacy impact assessments did not exist as a requirement at the point of going live, therefore this information is not held."
No thought, no consideration, no discussion, no regard for our rights to be private – the right of an individual to be let alone.
This is what the Information Commissioner's Office (ICO), which ensures the UK Data Protection Act is adhered to, advises in its guidance on conducting a PIA where biometric technology is involved:
“Does the project involve you using new technology which might be perceived as being privacy intrusive?  For example, the use of biometrics or facial recognition.
What do we mean by privacy?
Privacy, in its broadest sense, is about the right of an individual to be let alone. It can take two main forms, and these can be subject to different types of intrusion:
Physical privacy – the ability of a person to maintain their own physical space or solitude. Intrusion can come in the form of unwelcome searches of a person’s home or personal possessions, bodily searches or other interference, acts of surveillance and the taking of biometric information
Informational privacy – …It can include the collection of information through the surveillance or monitoring of how people act in public or private spaces…”
The guidance also goes on to state that "The use of biometric information or potentially intrusive tracking technologies may cause increased concern and cause people to avoid engaging with the organisation."  How a member of the public could 'avoid engaging' with any organisation that ubiquitously collects our facial data, when the simple act of walking out of our front doors potentially engages us with the surveillance system, is mystifying.

Two interesting documents obtained under the Freedom of Information Act on the UK national facial recognition system reveal that the Home Office has issued guidelines for the use of the Cognitec facial recognition system, detailing that "Facial search can be used on facial images found in fraudulently obtained documents in order to assist in identifying perpetrators" (date of publication unknown).  They also state there are 15 million images on the system, contradicting the BBC's claim of 18 million and ACPO's claim of 13.7 million facial photographs.  The discrepancies are both puzzling and worrying.  If there is clearly no system for logging the number of images on the PND, how can the facial recognition system be evaluated for any PIA?

The minutes of the ACPO Cross Business Area Working Group on Facial Imaging, held in April 2014, make for interesting reading, detailing plans to upload what would seem to be non-custody-suite photographs from Operation Amberhill, "the MPS's proactive identity fraud prevention team," and stating that it "would probably become a national service and that Amberhill images would be added to PND".  It seems that the police are making the rules up as they go along.

 
Another addition to the voyeuristic biometric surveillance technology that UK establishments are considering is what the biometric industry refers to as 'soft biometrics': using characteristics such as height, weight, body geometry, scars, marks, tattoos and gender to remotely identify persons of interest.  "Unlike many primary biometric traits, soft biometrics can be obtained at a distance without subject cooperation and from low quality video footage, making them ideal for use in surveillance applications." (Reid et al)

The two UK cities considering this are Birmingham and Glasgow, Britain's first 'smart' city.  Glasgow is working with NICE Systems to provide its city monitoring, with the largest operations centre in Europe, costing £12 million.  The city has 422 fixed and 30 mobile HD IPTV cameras and, according to a Glasgow City Council spokesperson, is planning to trial NICE Systems' 'Suspect Search' using soft biometrics sometime in 2015.


ACPO's submission to the House of Commons Science and Technology Committee inquiry into the 'Current and future uses of biometric data and technologies' stated:

“In West Midlands Police, for example, the innovation partner Accenture are looking at the application of ‘face in the crowd’ technologies across the wealth of CCTV footage available. It is recognised that infrastructure challenges will need to be overcome for this technology to  be applied successfully.”
Face in the Crowd is meant to help find 'missing persons' but is also capable of tracking anyone, not just 'missing' adults but normal law-abiding people.  In reply to a FOIR asking whether Face in the Crowd had been implemented, West Midlands Police responded that it has not been commissioned, that they have no plans to implement it and are not working with any companies, but that they are considering the "potential opportunity" of facial recognition.

Both NeoFace, supplying Leicestershire Police, and Cognitec, supplying the PND facial recognition system, offer ‘soft biometric’ person tracking systems.  When will these be switched on without our knowledge?

The covert roll-out of human biometric tracking systems planned by police and establishments across the UK seems out of control.  That there is no public discussion, outrage or even interest from our representatives in parliament is highly concerning.  Where is our right to be "let alone"?  How can we function in our society and value our privacy, free from state surveillance?  We are on the brink of a tsunami of data grabbing, with assumptions being made from the information we unwittingly submit via these remote biometric systems.  Assumptions that we have no idea of, whether true or false.

We are more than our assimilated data. 

————————————-

The current state of surveillance from UK police biometric facial recognition technology:

Nationally

Police National Database (PND):             
Started using biometric facial recognition on 28th March 2014, using Canadian CGI hardware and German Cognitec facial recognition software.  The BBC reported in February 2015 that there were 18 million facial images on the PND; however, a FOIR revealed in March 2015 that the PND stored "13.7 million facial images, with an undetermined number of duplicates.  It should also be stressed that ACPO [Association of Chief Police Officers] does not know how many different people are represented by the images."  Cost: £1,168,758 to implement (assessment, functional design, procurement of hardware and software, build and release).  The Data Controller is each police force's Chief Officer.  CC Mike Barton, Chief Constable of Durham Constabulary, is the National Police Chiefs' Council (NPCC) lead for the PND as Chair of the Cross Business Area Working Group on Facial Images.

Regionally

Metropolitan Police:
Started using biometric facial recognition in 2009, using L1 Identity Solutions' ABIS (Automated Biometric Identification System) Face Examiner.  L1 Identity Solutions was acquired in 2011 by Safran, a French aerospace, defence and security company, and now operates as Morpho Trust.  A stand-alone database of more than 2.9 million custody suite images was transferred from the Metropolitan Police's database onto the Morpho facial recognition database.  No images other than custody suite images are stored on the database.  The Data Controller is Sir Bernard Hogan-Howe.

Leicestershire Police:
Started trialling biometric facial recognition in April 2014 using NeoFace, supplied by NEC Corporation, a Japanese communications company.  A stand-alone database of 100,000 custody suite images was transferred from Leicestershire Police's database onto the NeoFace database.  No images other than custody suite images are stored on the database.  Cost: unknown; Leicestershire Police declined to answer this under FOIR, citing Section 43 of the Freedom of Information Act (commercial interests).  Remote, in-the-field use of facial recognition took place at the Download Music Festival 2015; system unknown.


Other forces showing an interest in NeoFace via Leicestershire Police:

(quotations taken from FOIRs)

Lancashire Police – "Lancashire Constabulary are currently running an internal trial of the NeoFace system which is being sponsored by ACC Bates."

Essex Police – “still exploring future options.”


North Wales Police – "still under review and a decision has not been made."
Northamptonshire Police – “no organisational projects to pursue one”
Kent Police – “no plans to introduce facial recognition technology, this is on the grounds of cost”
————————————-
by Pippa King
