Countering malign online activities


As Australia’s defence capabilities adapt to address emerging threats in cyberspace, enhanced information warfare capabilities within the Department of Defence could help mitigate malign online activities that may compromise national security, such as fake news and influence campaigns spread across social media. 

In new research commissioned by the Department of Defence, a team of 50 academics from UNSW, the University of Adelaide, the University of Melbourne, Macquarie University and Edith Cowan University examined contemporary digital technologies and influence campaigns.

Case studies included Facebook, the Russian Internet Research Agency and Cambridge Analytica. The research into Facebook was led by UNSW Canberra academics.  

“UNSW looked at the Facebook business model, how it operated, how it was potentially open to misuse by malign entities, and how it collects and uses large amounts of user data,” UNSW Canberra Research Associate Peter Job explained.  

Citing the example of Russian interference throughout the 2016 US election, the research considered ways in which an Australian entity could detect and counter a similar attack.  

“We analysed how people use Facebook, how the platform collects and uses data, why it draws certain people into echo chambers and how it can polarise people,” Dr Job said.  

“Users start looking at some polarising material, so the algorithms suggest more extreme material, and they get drawn into certain groups and conspiracy theories. Malign entities have managed to exploit this effect to spread fake news through Facebook, and we looked at various ways in which that could happen.” 

Postdoctoral Research Fellow Garry Young said Facebook was just one example of a social media platform that could be misused, and other platforms were just as susceptible to malign activities.  

“The means by which we operate in modern society and on social media are quite vulnerable to malign activities, certainly the way the business model is working at the moment,” Dr Young said.  

“There are a lot of third parties and a lot of potential for manipulation.” 

To counter this, the research suggests that a workforce with the skills to identify nefarious activity is required.  

“An Australian capability to counter malicious influence campaigns would require a strategy and framework for clear and transparent communication with the public. It would likely consist of a mix of personnel with diverse experience and expertise, and would likely include increased government cooperation with social media companies,” Dr Young said. 

These findings will be examined by the Department of Defence alongside the other two case studies.  

The University of Melbourne led a study into the Russian Internet Research Agency as an example of a contemporary state-sponsored mass influence operation.  

A University of Adelaide-led team analysed Cambridge Analytica as an example of the ways in which non-state actors can achieve mass influence.  

The cross-university, multi-disciplinary team featured experts spanning disciplines including cyber security, psychology, law, artificial intelligence and data science.  

“This project has been a very interesting experience, exploring the issues around social media and the potential implications of its misuse for Australia,” Dr Job said.

“The multi-disciplinary approach, collaborating with academics across a number of universities around the country, has worked very well.” 

The final report can be found on the Defence Research Institute website.