Facial Recognition Technology: Lessons Learned
Whether you like it or not, Facial Recognition Technology (FRT) is gaining increased exposure across the globe. Governments, corporations and organisations are investing in and developing the technology, and there’s an angle for it absolutely everywhere – from law enforcement and crime detection to social media and marketing.
Understandably, FRT doesn’t always generate positive publicity. It was only a couple of years ago that the digital behemoth, Facebook, was dragged through the courts when it started using facial data without user consent. Quite rightly, privacy laws across the globe (and particularly in the EU) are becoming tougher. That’s why, when DataSparQ made its first foray into the world of FRT, we did so with much trepidation and care.
We announced our first FRT product and application, the AI Bar, with a small fanfare in August 2019. The press coverage it achieved and the response it received across the world were astounding. It’s only now that we’ve had time to sit back and reflect on the lessons we learned during the process of developing and trialing it.
For those of you who somehow managed to avoid the media frenzy that followed the AI Bar’s launch this summer, and failed to catch a glimpse of our television debut on BBC’s The One Show in October, here’s a brief heads-up on the product and its application. We designed the AI Bar to help speed up the process of getting served at a bar, and to make the system fairer and easier for punters and bar staff alike. It uses facial recognition technology to note the order in which punters approach the bar and then puts them in a virtual queue – letting the bar staff know, via an image on a screen, who to serve next. If you’re interested, you can read more about how it works here.
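The virtual queue described above is, at its core, a first-come, first-served list keyed on recognised faces. Here’s a minimal sketch of that idea in Python. The face IDs are hypothetical opaque identifiers assumed to come from a face-recognition model; this is an illustration, not DataSparQ’s actual implementation.

```python
import time
from collections import OrderedDict
from typing import Optional


class VirtualQueue:
    """A minimal first-come, first-served queue keyed on face IDs.

    Assumes an upstream face-recognition model supplies a stable,
    opaque ID for each punter it sees (hypothetical interface).
    """

    def __init__(self) -> None:
        # Maps face_id -> arrival timestamp, in insertion order.
        self._queue: "OrderedDict[str, float]" = OrderedDict()

    def face_seen(self, face_id: str) -> None:
        # Enqueue a face only the first time it appears at the bar;
        # repeat sightings of a waiting punter don't re-queue them.
        if face_id not in self._queue:
            self._queue[face_id] = time.time()

    def next_to_serve(self) -> Optional[str]:
        # The earliest arrival still waiting is served next.
        return next(iter(self._queue), None)

    def served(self, face_id: str) -> None:
        # Remove a punter from the queue once they've got their drinks.
        self._queue.pop(face_id, None)
```

The key design point is that re-detecting a face (which happens many times per second on a live camera feed) must not move a punter back in the queue, hence the membership check before enqueueing.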
We were well aware that a lot of solutions that use FRT invoke criticism regarding the legality and ethics of their application. Making sure all the right boxes were ticked was certainly an interesting learning curve for us – and thirsty work!
At a basic level, FRT collects biometric data relating to a person’s facial features. Under GDPR, this data is classed as “sensitive personal data” and is therefore subject to special conditions. The GDPR rules on sensitive personal data are very restrictive, but there are exceptions that allow its collection and use. We relied on one of those exceptions for the AI Bar: the ‘Data Subject’ (in our case, the punter) willingly gives their consent for us to use their image.
GDPR also requires strict measures to ensure that an individual’s personal identifiable information is never compromised. In terms of the AI Bar we’ve ticked that box by making sure that the recorded footage collected at the bar is stored on a private, secure network, and not in the cloud. In addition, no images are stored to disk and the data is deleted every evening.
Limitations of the technology
Perhaps the biggest challenge we are still working to overcome is the current limitations of the technology. Biometric technologies for facial recognition require machine-learning algorithms that have been trained on a dataset of labeled images. The system can only recognise faces within the parameters of the data that it has been trained on and previously exposed to.
Where ‘face comparisons’ are concerned, a machine learning model will come back with a probability score on whether a face is a match – somewhere between 0 and 100%. We have to decide where to draw the line. For example, we may decide that an 80% match is good to go. You can’t be too strict and aim for a 100% match, because you have to allow for anomalies such as different angles and lighting. And if you’re too lax, then you’ll be making some very unlikely matches.
There have been some high profile concerns that there is intrinsic racial and gender bias within FRT systems, and during its feature on the AI Bar, the BBC’s The One Show demonstrated that our technology fails to recognise any difference between identical twins. They did however also point out that it would be unusual for a pair of twins to go to the bar and buy their drinks individually! We’re still trialing different thresholds in order to fine tune the balance and set up the system for complete success.
For all the publicity (good and bad) that FRT receives, it’s worth remembering that the technology isn’t as advanced as the media would actually have you believe. We’ve absolutely loved working on the AI Bar, and whilst we’ve got a myriad of ideas on how we can develop it further, we recognise that we’ve also still got a number of challenges with regards to the technology and the regulations that surround it. But that’s what being innovative is all about!
We’re currently focusing our attention on the area of consent. Getting written consent from everyone who enters a public bar is a trip hazard for the AI Bar. We’ve got a couple of options – either reduce the amount or sensitivity of the data collected to eliminate the need for explicit consent, or develop an easier, quicker way of obtaining that consent – one that doesn’t negatively impact the experience of the ‘punter’. Perhaps our next product?
One thing’s for sure: the AI Bar, and applications like it, have piqued public interest. It’s got legs and we’re running with it! We’ll continue to work with the Information Commissioner’s Office (ICO) to understand restrictions and concerns, and hopefully we’ll soon be seeing you at the bar to celebrate the launch of the next generation of AI Bar and other exciting AI products.