Why Bunnings may have crossed a line recording customer faces
What’s the difference between being recorded on grainy VHS tape and Bunnings’ use of facial recognition technology? The answer is darker than you think.
Being recorded in shops, on streets and even by your neighbour’s doorbell camera is nothing new, so why has Bunnings ignited a storm over its use of facial recognition cameras?
Closed-circuit television cameras have been around for years, and so has coupling them with facial recognition. The technology first surfaced 23 years ago in Las Vegas, where it was used to identify relationships between gamblers and casino staff to combat collusion. It was later used to identify the September 11 hijackers and is now used widely by law enforcement and intelligence agencies around the world.
But it’s not just used to fight crime – which Bunnings said was its intention – it also makes life more convenient. Passing through an airport is now easier thanks to smart passports and biometrics. Unlocking your mobile phone can be done with the swipe of a finger or a glance at its camera.
Despite its widespread use, a retailer recording a customer’s face and storing that information is a no-go zone. In Australia it is classified as sensitive information, which is illegal to collect without a person’s consent. And the Office of the Australian Information Commissioner said Bunnings did just that over a three-year period, invading the privacy of hundreds of thousands of customers.
But how is this different to being recorded on grainy VHS tape from a CCTV system in the 1990s?
The answer lies in what a retailer can do with that information and the potential for misuse.
For years – even before the artificial intelligence boom – data has been king. If a retailer can record a customer and store their biometric data without their knowledge, it opens up an Aladdin’s cave of revenue-boosting opportunities.
This information can be used to track how many people visit a store, how they react to different in-store promotions, their shopping behaviour and their associations with family and friends. That may seem no different from what a storekeeper on top of their game could do by paying close attention to loyal customers.
But this is where it potentially gets darker, and what concerns the privacy commissioner. Bunnings said it was only using facial recognition to prevent shoplifting and anti-social behaviour. But that information can be linked to social media, where many customers willingly post pictures of themselves, creating rich customer profiles and gleaning information that extends way beyond “loss prevention”.
The technology can also be used to aggregate socio-economic and political data – a sensitive prospect given that Bunnings’ parent Wesfarmers was accused of being out of step with most Australians over its support for the failed Voice referendum last year.
Combined with artificial intelligence, those insights can be generated in a heartbeat.
Then there is the risk of a cyber attack. What happens if that information is stolen by state-sponsored hackers from China or Russia and used to commit identity fraud or produce deepfakes? It is classified as sensitive information for a reason.
This is why many Australians feel uneasy about ‘Minority Report’-style tracking. Nicholas Dynon – innovation and risk manager at Optic Security Group, a counter-terrorism expert who has published a peer-reviewed article in the National Security Journal – says people are generally less accepting of facial recognition technology (FRT) cameras in retail stores than they are in airports and police investigations.
While people are happy for FRT to be used to help fight crime – even in shops – Mr Dynon said they are less accepting of it being used for “other purposes” such as loyalty programs, advertising, payments and the tracking of customer behaviour.
Nobody likes Big Brother watching.
“In short, people place relatively little trust in retailers’ ability to operate the technology responsibly or in ways that are demonstrably beneficial to the public,” Mr Dynon said.
“The use of FRT by retailers to identify known shoplifters and anti-social patrons, for example, is considered no more acceptable by the public than the idea of its use by police to identify minor offenders or traffic rule breakers. The identification of minor offenders is widely seen as a disproportionate use of the technology.
“FRT is a powerful tool that offers many potential security – and other – benefits. But what may be viewed by an organisation as a revolutionary crime prevention and business improvement capability may be viewed by many of its customers or employees as technological overreach and a threat to individual privacy and freedoms.”