
Experts renew calls for a government body to tackle foreign disinformation

Analysts from the Australian Strategic Policy Institute (ASPI) have reignited their calls for a government body to tackle disinformation and the incentives of social media and technology companies that operate in Australia.

ASPI’s Dr Jake Wallis and Tom Uren, appearing in a personal capacity at a hearing held by the Select Committee on Foreign Interference through Social Media on Monday, said the best policy response would be for platforms to publish content moderation guidelines and regular transparency reports covering all harms, as well as their responses to them.

“Currently there are different degrees of transparency across the platforms — even particular distinctions between the positions that Twitter and Facebook take — but also, more concerningly, across non-Western platforms,” Wallis said.

“We suggest some sort of independent body, perhaps a statutory authority, that is empowered to observe and report on how the incentives, policies, algorithms, and enforcement actions of social media platforms are operating, with the ultimate goal being to maximise benefits and reduce harm for society and its citizens.”

Wallis said the authority would be granted explicit insight into how content is filtered, blocked, amplified, or suppressed, both from a moderation point of view and from an algorithmic amplification one.

Uren said government agencies that might want to investigate what is happening on social media platforms simply do not have the data that the platforms themselves hold, which makes it very hard for outsiders to see what is happening behind the curtain.

“My personal observation is that no-one in government thinks that they own this problem,” he said.

“I think foreign disinformation is just one aspect of the problem. You’d need something that tackles, more directly, the incentives of social media and technology companies. That, to me, sits more naturally in communications. But there’s also a part around promoting a vibrant media industry in Australia, which I think also would be communications.”

See also: ASPI wants statutory authority to prevent foreign interference through social media

Despite statistics on takedowns from the social media giants, Wallis is unsure of the extent to which those actions would serve the interests of the Australian public.

“I think a huge challenge here is that the platforms themselves are the frontline for this activity. They are the coalface. They’re the ones undertaking the earliest stages of investigation. They have sets of content moderation policies and terms of service that they can apply in regulating what happens on their platforms,” he said.

“But as to the extent to which that serves the interests of the Australian public, I’m not sure that we know or that we have a good answer to that. I’m not sure we have a great sense of the extent to which this activity is happening given that it’s the platforms that identify it and that provide takedown datasets, to a greater or lesser degree.

“But unless we have some sort of public capability that allows us to understand what the extent or scale of targeting is, we’re not in a position to have evidence-based discussions around it.”

Also appearing before the committee in a personal capacity was Katherine Mansted from the Australian National University’s National Security College, who echoed calls for moving beyond a reactive approach to social media disinformation, to one that is based on resilience and context building.

“Actions by social media companies to identify and remove bots, look at controls and take down disinformation networks are key, but alone these actions are insufficient,” she said. “In fact, they condemn us to playing whack-a-mole, so we need to move beyond this tactical approach to addressing misinformation towards a more strategic one.”

Mansted believes the best approach is to work more on exposing and disrupting the precursors to disinformation, such as theft of citizen data, the black market trade in stolen account handles, and efforts by governments to control social media and other digital platforms.

“We need to look at funding efforts to build public awareness of the strategic narratives that underpin and promote disinformation, to help politicians, opinion leaders, and ordinary citizens better understand the purpose and context of disinformation,” she added.

“We also need to, where disinformation is covert, pay attention to facilitating public attribution to the actors responsible for covert disinformation.”

According to Mansted, COVID-19 has very much been an accelerant and will continue to be an accelerant for propaganda and disinformation.

“There is a significant body of cognitive research which demonstrates that people are more susceptible to propaganda and disinformation in times of high anxiety and uncertainty,” she said.

“COVID-19 — not just the acute health crisis but also the economic and social consequences that will continue to flow for some time — will create fertile ground for disinformation and propaganda.

“This is something that we’ve seen a number of actors take advantage of, such as state actors, peddling disinformation and trying to enhance social polarisation, and extremist groups … there’s been an increase in the spread of conspiracy theories associated with COVID-19.”

Psychologists at the UK’s Northumbria University have touted scientific evidence of the link between violent behaviour towards the telecommunications sector and 5G COVID-19 conspiracy beliefs.

Read more: Facebook comments manifest into real world as neo-luddites torch 5G towers

While telcos, law enforcement bodies, and media outlets have drawn the link, Northumbria University’s findings revealed that belief in 5G COVID-19 conspiracy theories was positively correlated with state anger.

The researchers assessed 601 UK participants’ levels of 5G COVID-19 conspiracy beliefs, paranoia, and state anger, which consists of temporary, short-lived outbursts of anger. Paranoia, in the researchers’ context, refers to participants’ belief that there is hostile intent towards them personally, as opposed to the conspiratorial belief that powerful organisations are harming society at large.

Additionally, participants were asked questions about whether they thought violence was a justified response to the alleged link between 5G mobile technology and COVID-19. Participants similarly stated how likely they would be to engage in such behaviours, the university said.

“The findings revealed that belief in 5G COVID-19 conspiracy theories was positively correlated with state anger. In turn, this state anger was associated with a greater justification of violence in response to a supposed connection between 5G mobile technology and COVID-19,” Dr Daniel Jolley and Dr Jenny Paterson wrote.

The psychologists’ research also indicated that these patterns were not specific to 5G conspiracy beliefs: general conspiracy theorising was linked to a justification of, and willingness to engage in, violent behaviour more broadly, because such theorising was also associated with increased state anger.
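For readers unfamiliar with how survey findings like these are quantified, the minimal sketch below shows how a positive correlation of the kind the researchers describe could be computed. The scores and column names (conspiracy_belief, state_anger, violence_justified) are invented for illustration only and are not the Northumbria team’s actual data or analysis.

```python
# Illustrative only: hypothetical questionnaire scores, not the study's dataset.
import pandas as pd
from scipy import stats

# Each row is one respondent's scores on the three measures.
df = pd.DataFrame({
    "conspiracy_belief":  [1, 2, 5, 4, 3, 5, 2, 1, 4, 5],  # 5G COVID-19 conspiracy belief
    "state_anger":        [1, 1, 4, 3, 2, 5, 2, 1, 3, 4],  # short-lived anger
    "violence_justified": [1, 1, 3, 3, 2, 5, 1, 1, 3, 4],  # justification of violence
})

# Step 1: is conspiracy belief positively correlated with state anger?
r1, p1 = stats.pearsonr(df["conspiracy_belief"], df["state_anger"])

# Step 2: is state anger in turn associated with justifying violence?
r2, p2 = stats.pearsonr(df["state_anger"], df["violence_justified"])

print(f"belief vs anger:   r={r1:.2f}, p={p1:.3f}")
print(f"anger vs violence: r={r2:.2f}, p={p2:.3f}")
```

A positive r in both steps would mirror the pattern reported by the researchers, in which state anger sits between conspiracy belief and the justification of violence.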

Also appearing before the committee on Monday was Alex Stamos from the Stanford Internet Observatory, who said that in the United States, a lot of the COVID-19 disinformation has been domestic.

“A lot of it has been from people who are motivated around antivax, who believe that everything’s a conspiracy — Bill Gates, despite having more money than God, has got some kind of plan to use vaccines to get more money; I’m not exactly sure what their theory is,” he said.

Stamos, who in a previous life was Facebook’s chief security officer and is now contracting to controversial video conferencing platform Zoom, used this example to highlight how domestically spread disinformation is just as powerful as foreign interference.
