AFP used Clearview AI facial recognition software to counter child exploitation

The Australian Federal Police (AFP) has admitted to using a facial recognition tool to help counter child exploitation, despite not having an appropriate legislative framework in place.

In response to questions taken on notice by deputy commissioner Karl Kent, the AFP said while it did not adopt the facial recognition platform Clearview AI as an enterprise product and had not entered into any formal procurement arrangements with the company, it did use a trial version.

The AFP-led Australian Centre to Counter Child Exploitation (ACCCE) registered for a free trial of the Clearview AI facial recognition tool and conducted a pilot of the system from 2 November 2019 to 22 January 2020.

“The trial was to assess the capability of the Clearview AI system in the context of countering child exploitation,” the AFP wrote.

See also: Facial recognition: Convenient or creepy?

During the trial, Clearview AI sent nine invitations to AFP officers to register for free accounts; seven officers activated their accounts and conducted searches.

“These searches included images of known individuals, and unknown individuals related to current or past investigations relating to child exploitation,” it said. “Outside of the ACCCE Operational Command there was no visibility that this trial had commenced.”

In response to three requests lodged under Australia’s Freedom of Information Act 1982, the AFP had said there were no cases of it using Clearview AI.

Last month, Kent told a Parliamentary Joint Committee on Intelligence and Security (PJCIS), as part of its review of the country’s mandatory data retention regime, that he had received advice from his legal team on the matter and that they needed to do “some further digging”.

“The requests were processed in accordance with the FOI Act, in that reasonable searches were undertaken by the AFP portfolio with responsibility for facial identification capabilities. No information relating to Clearview AI was identified,” the AFP wrote in response.

“It was subsequently discovered … the AFP-hosted Australian Centre to Counter Child Exploitation (ACCCE) held information relevant to Clearview AI, which was not identified in response to the earlier freedom of information requests.”

Clearview AI suffered a data breach in February that exposed its customer list, the number of accounts each customer holds, and the number of searches those customers have made. In light of the breach, the federal opposition has asked Minister for Home Affairs Peter Dutton to explain whether the use of Clearview AI without legal authorisation has jeopardised AFP investigations into child exploitation. It has also asked Dutton whether he can ensure that no Australians have had their privacy breached as a result of the use of Clearview AI by AFP officers.

“Peter Dutton must immediately explain what knowledge he had of Australian Federal Police officers using the Clearview AI facial recognition tool despite the absence of any legislative framework in relation to the use of identity-matching services,” a statement from Labor said.

See also: Committee orders complete redrafting of Biometric Bills as privacy safeguards are deemed inadequate

Clearview AI, founded by Australian entrepreneur Hoan Ton-That, also faced criticism last month from Australian Privacy Commissioner Angelene Falk, who had made inquiries to determine whether the data of Australians had been collected.

Meanwhile, the Office of the Australian Information Commissioner (OAIC) has issued a notice to produce under section 44 of the Privacy Act in relation to Clearview AI.

“The AFP is continuing to review this matter internally. It is our understanding that, excepting the limited pilot outlined above, no other areas or individuals have utilised the Clearview AI product or engaged with the company,” the AFP said in response to questions on notice.

“The AFP is fully cooperating with the OAIC and is continuing to review and evaluate our governance and policy setting in this space.

“The AFP seeks to balance the privacy, ethical and legal challenges of new technology with its potential to solve crime and even save victims. We are actively looking to improve our processes and governance without necessarily constraining innovative investigative approaches.”
