
Facebook and Google refuse 1 in 5 Australian law enforcement data access requests

Both Facebook and Google have told a House of Representatives Standing Committee that they each refused around 20% of requests from Australian law enforcement for data held on their platforms.

For the 2019 calendar year, Google received 4,363 requests from Australian law enforcement agencies to disclose account-level data to assist their investigations, along with a further 23 requests under the search giant’s emergency disclosure policy, which applies in cases where a life is deemed to be at imminent risk. The approval rate for these requests, Google government affairs and public policy manager Samantha Yorke said, sat at around 80%.

Facebook received 943 requests in total and likewise disclosed data in around 80% of cases.

“In the 20% where we didn’t, it’s typically because there was not enough either legal authority demonstrated or the request was too vague or broad for us to be able to comply,” Facebook director of public policy in Australia, New Zealand, and the Pacific Islands Mia Garlick said.

Similarly, Yorke said requests were denied when the relevant parties provided too little information, or when the account holder was not an Australian resident or citizen and local law enforcement therefore lacked the jurisdiction to request such information.

See also: NZ Privacy Commissioner labels Facebook as ‘morally bankrupt pathological liars’

Yorke and Garlick were appearing before the Standing Committee on Social Policy and Legal Affairs, with Tuesday’s hearing focusing on family, domestic, and sexual violence. The committee took the opportunity to discuss Facebook’s move into end-to-end encryption across its Messenger platform.

“We did announce that we would be taking many years to make this transition because we do work globally with law enforcement in all parts of the world and with global security agencies,” Garlick said.

She said Facebook has been engaged in discussions with Australian law enforcement agencies, as well as the Department of Home Affairs, to talk through what law enforcement “looks like in the end-to-end encrypted world”.

“We’re aiming to be an industry leader in this space and work with them not just on how things can stay the same with respect to unencrypted services but also thanks to the investment that we’ve made for over a decade in artificial intelligence and machine learning, there can continue to be reliance on that to assist with identifying behavioural signals that can assist with law enforcement operations,” she continued.

Also appearing before the committee was Australian eSafety Commissioner Julie Inman-Grant, who has publicly taken issue with Facebook’s end-to-end encryption plans since August 2019, before law enforcement joined the debate.

“We are concerned about industry going down [this path] without actually openly talking about some of the technologies and techniques that are out there, including homomorphic encryption that can be used to scan for child sexual abuse images even in end-to-end encrypted situations,” she said.

Inman-Grant highlighted the number of reports tech companies made to the US National Center for Missing & Exploited Children.

“In 2019, there were almost 60 million from Facebook. Now that may change if they actually go to end-to-end encryption, but if you look at companies like Apple, there were something like 230 — now they have billions of users, lots of storage capacity in iCloud, they’ve got iMessage — you can’t tell me that there are only 230 child sexual abuse images on their platform,” she said.

“Amazon, look at AWS, that hosts most of the world’s data — they had eight. Even my former employer Microsoft, who owns Skype — Skype for the past 10 years has been the most prevalent vector for livestreaming of child sexual abuse.”

Inman-Grant said she has personally sent three letters and had five conversations with Microsoft about how it could use technologies across Skype to catch predatory material.

“‘If you’re saying a Skype conversation is end-to-end encrypted, if you can insert a simultaneous translator in there, why can’t you eat your own dog food and use PhotoDNA or an algorithm called Project Artemis that uses grooming technologies?’, and they say it’s because of the privacy of the customer,” she said.

“I think we need to stop giving all of these companies a free pass.

“Over time, if we don’t see the issues addressed and we think the harms to children and vulnerable users are too great, I think legislation is an option.”
