Tech companies Amazon, Google and Meta have been criticized by a Senate select committee for being especially vague about how they used Australian data to train their powerful artificial intelligence products.
Labor senator Tony Sheldon, chair of the inquiry, was frustrated by the multinationals’ refusal to answer direct questions about their use of Australians’ personal and private information.
“Watching Amazon, Meta and Google dodge questions during the hearings was like sitting through a cheap magic trick: lots of hand gestures, a puff of smoke and nothing to show at the end,” Sheldon said in a statement after the inquiry’s final report was published on Tuesday.
He called tech companies “pirates” who were “plundering our culture, data and creativity for their profit while leaving Australians empty-handed”.
The report recommended that general-purpose AI models, such as OpenAI’s GPT, Meta’s Llama and Google’s Gemini, automatically be placed in a “high risk” category and made subject to mandatory transparency and accountability requirements.
Several key themes emerged during the investigation and in its report.
Separate AI laws needed
Sheldon said Australia needed “new stand-alone AI laws” to “control big tech” and that existing laws should be amended as necessary.
“They want to make their own rules, but Australians need laws that protect rights, not the bottom line of Silicon Valley,” he said.
He said Amazon had refused during the investigation to reveal how it used data recorded from Alexa, Kindle or Audible devices to train its AI.
Google, he said, had also refused to answer questions about which user data from its services and products it had used to train its AI products.
Meta admitted it had been collecting data from Australian Facebook and Instagram users since 2007 in preparation for future AI models. But the company could not explain how users could have consented in 2007 to their data being used for a technology that did not yet exist.
Sheldon said Meta dodged questions about how it used data from its WhatsApp and Messenger products.
AI is ‘high risk’ for creative workers
The report found that creative workers faced the most imminent risk of AI severely affecting their livelihoods.
It recommended payment mechanisms be put in place to compensate creatives when AI-generated work was based on their source material.
AI model developers needed to be transparent about the use of copyrighted works in their datasets, the report said. Any such work should be licensed and paid for.
Among the report’s 13 recommendations is a call for the introduction of stand-alone AI legislation covering AI models deemed “high risk”.
AI that affects people’s rights at work should be considered high risk, requiring consultation, cooperation and representation before it is adopted.
Music rights management organization Apra Amcos said the report recognized the detrimental impact of AI on workers, particularly in the creative sector. It said the report’s recommendations proposed “clear steps” to mitigate the risks.
The Media Entertainment and Arts Alliance said the report’s call for the introduction of legislation to establish an AI Act was “clear and unambiguous”.
Don’t stifle AI with bureaucracy
The two Coalition members on the committee, Senators Linda Reynolds and James McGrath, said AI posed a greater threat to Australia’s cybersecurity, national security and democratic institutions than the creative economy.
They said it was necessary to implement mechanisms “without infringing on the potential opportunities that AI presents in relation to job creation and productivity growth.”
They did not accept the report’s conclusion that all uses of AI affecting people at work should automatically be classified as “high risk”.
Additional comments from the Greens argued that the final report did not go far enough.
“(The report) does not recommend an overall strategy that aligns Australian AI regulation with that of the UK, Europe, California or other jurisdictions,” the party said.
The Guardian has approached Amazon, Google and Meta for comment.