

Why Victoria’s privacy chief is so worried about facial-recognition technology

By Kieran Rooney and Carla Jaeger

Victorian gambling venues are increasingly using off-the-shelf facial recognition technology to help identify problem gamblers, prompting concerns that sensitive data is being put at risk.

The state’s information commissioner this week said some government contractors were not being held to strict privacy laws, and revealed that his office was aware of several data breaches at government bodies that it had not investigated.

Facial recognition technology is becoming increasingly prevalent. Credit: AP

Community Clubs Victoria chief executive Andrew Lloyd has written to the national privacy commissioner requesting assistance over the issue of facial recognition. He warned that it was “problematic” that self-exclusion files were being handed over to third-party suppliers that process and install the systems.

In that letter, seen by The Age, he raised concerns that venues and retailers were purchasing facial recognition technology from sellers that “essentially provide no guidance” about whether they complied with Australia’s data security and privacy guidelines, such as how information is stored and whether it is deleted appropriately.

“I think facial recognition systems can be a wonderful tool, however the implementation and protections need to be embedded properly within the business utilising the technology,” Lloyd told The Age.


“For licensed premises, facial recognition systems can provide definite protections, and in South Australia the government has implemented a gold-standard system in licensed clubs that interfaces with the gambling self-exclusion system, which is working extremely well.

“I am concerned to hear that some venues are giving their self-exclusion files to third-party suppliers to process images and install these systems. This practice is problematic.

“I think the privacy commissioner needs to provide more education and guidance for the retail and hospitality industry, so operators are not installing systems that are not compliant and not understanding what due diligence they need to go through to meet the Australian Privacy Principles and legislation.”


Problem gamblers can choose to be excluded from gaming venues, whose facial recognition software will alert security if those on the self-exclusion register attempt to enter a casino or gaming establishment. In most cases, people who sign on to self-exclusion registers in Victoria are warned their data could be used for facial recognition.

But as states across Australia have sought to strengthen protections around pokies venues and tackle organised crime, they have encountered challenges over the use of facial-recognition technology.

New South Wales Attorney-General Michael Daley this month said he was committed to the use of facial-recognition technology but warned there was a “way to go” in how it should be used.

“We must make sure that the technology works and that its implementation protects people’s privacy,” he said in parliament.

“With cybercrime and a range of other things, we all worry that the more data these systems collect the more is put at risk of being stolen.”

Australian Privacy Foundation immediate past chair David Vaile said the spread of cheap internet-connected technology increased the risk of personal information being exposed, particularly if it wasn’t regularly updated with security upgrades.


“All the attacker needs is a tiny, hairline crack to get through, and what the defender needs to do is have 100 per cent perfect perimeter security, which is impossible,” he said.

“The business model of the attackers has proven very successful, so they’ve grown over 20 to 30 years.”

Vaile said the use of different technology across each venue made it almost impossible for people walking through the door to judge whether the facial recognition technology in use was safe.

Retailer Bunnings this month was found by the national regulator to have breached Australia’s privacy laws with its use of facial-recognition technology, a finding it has sought to review.

Following this, the Office of the Australian Information Commissioner (OAIC) issued guidance on how businesses should approach the technology.

The office said it was up to organisations to justify that the collection was necessary in the first place and that they should take reasonable steps to identify risks for how information was used, stored, destroyed and de-identified.

“The fact that FRT is available, convenient or desirable should not be relied on to establish that it is necessary to collect the information,” the OAIC guidance said.

An OAIC spokesperson said the “onus is on organisations to reassess their own practices and ensure they comply with our guidance”.

Victorian Information Commissioner Sean Morrison told a parliamentary hearing last week that his office had concerns about potential breaches because government agencies were failing to enforce privacy laws when outsourcing work.

“The expectation there is that when agencies are contracting with the provider that they pass on all of the [privacy and security] requirements to those agencies – the access to information privacy or information security. And … we don’t believe that’s happening now,” Morrison told the hearing.

Morrison said his office was aware of several data breaches that did not result in investigations, and that agencies were not reporting breaches due to fears they would be investigated.

“There were also a couple of other breaches that we are aware of that we didn’t do investigations on … but where, again, the volume of information that was [taken] was much higher than it should have been.”

Victoria’s Privacy and Data Protection Deputy Commissioner, Rachel Dixon, would not provide further information about these breaches when asked by The Age, but said: “[OVIC] cannot investigate where it does not have jurisdiction over the organisation or type of information impacted”.

In 2022, tens of thousands of Victorians had sensitive personal information exposed after a ransomware attack on a state government contractor, Datatime.

Datatime held several contracts with six different state departments spanning decades. Dixon launched an investigation into the breach.

The report, handed down in May this year, found Datatime had troves of public-sector data dating back to as early as 2003, including sensitive personal information like medical records and family histories.

“It was an organisation that was contractually required by several government agencies to delete the data that they were collecting within a matter of months,” Dixon said at last week’s parliamentary hearing.

The report also found that along with other cybersecurity issues, Datatime and the two departments it held contracts with at the time of the breach were unclear on their obligations to destroy and de-identify government data.

Dixon was unable to complete her investigation as the company entered voluntary administration.

Under OVIC guidelines, government agencies are supposed to bind all third-party providers to comply with laws which govern the handling of personal information and public sector data. The information watchdog is currently consulting agencies as it develops a new set of guidelines.

Unless these privacy obligations are imposed on contractors, Dixon said, there is little recourse if a privacy breach occurs.

A state government spokesperson said any agency aware of a breach should report it to the Office of the Victorian Information Commissioner.


Original URL: https://www.theage.com.au/politics/victoria/why-victoria-s-privacy-chief-is-so-worried-about-facial-recognition-technology-20241129-p5kuo7.html