ADM+S Submission to The Joint Select Committee on Social Media and Australian Society
This submission considers some important and complex questions around the role of social media in Australian society by a large group of researchers from the ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S).
Facial recognition
When considering the use of age verification to protect Australian children on social media, ADM+S research shows that facial recognition technologies are unreliable, of limited efficacy, and exhibit racial and gender bias. Where facial recognition is deployed, cumbersome real-life double-checking of the system’s alerts is required, with humans still performing the actual work of age verification.
The submission also considers developments in China, where authorities placed a series of restrictions on video-game play by people aged 18 or below and relied on facial recognition technology to enforce the reform. Despite the significant technological capabilities of the Chinese state, facial recognition has proven largely ineffective there.
News Media Bargaining Code
The submission questions the viability of the News Media Bargaining Code, whose model wrongly presumes that all digital platforms consistently need news content to provide their services. The committee is encouraged to reconsider the recommendations of the Digital Platforms Inquiry Preliminary Report, which proposed holistic regulatory oversight that would directly remedy the information asymmetry news organisations currently face. If the Australian Government wants to compel certain platforms to carry a certain amount of Australian news content, a different legislative instrument may be required.
Mis- and disinformation
Australian journalism, news and public interest media currently play both constructive and less constructive roles in the circulation of mis- and disinformation on digital platforms. While responsible reporting is a critical element of the wider information ecosystem, mis- and disinformation can also be amplified through reporting by mainstream media outlets. The practice of sourcing stories from social media, such as reporting on viral TikTok videos or Twitter controversies, is a key driver of this amplification. The submission also highlights the important role that independent fact-checkers continue to play.
Further research
There is very little systematic knowledge about how platforms’ algorithms, recommender systems and business tactics influence what Australians see (and hear). The submission details a range of new and ongoing research projects and infrastructure across the ADM+S Centre, which are developing the methods and tools to better understand these systems. The newly launched, federally funded Australian Internet Observatory (funded by NCRIS through the ARDC) will enable such work to be scaled further to a national level.
The ADM+S Australian Ad Observatory has also found that harmful and illegal content can form a substantial component of the everyday advertising served to Australians on social media. Using novel data-donation methodologies, the Australian Ad Observatory has provided unprecedented insights into the online advertising experience of Australian Facebook users, identifying harmful or illegal advertising that includes unlawful scam content; harmful and, in some cases, potentially unlawful gambling advertising; and concerning patterns of alcohol and unhealthy food advertising.
Funding
ARC Centre of Excellence for Automated Decision-Making and Society
Australian Research Council