In what appears to be a significant breakthrough, the Tech Council of Australia, the ACTU and the Media, Entertainment & Arts Alliance (MEAA) plan to meet to discuss compensation for creatives whose work is used to train AI models.
While this is an initial dialogue, the MEAA has described it as “a breakthrough.”
“A year after launching our campaign to force Big Tech giants to stop stealing the work of Australia’s media and creative workers, the tech industry has agreed to come to the table and negotiate a compensation deal,” MEAA said in a statement.
“We look forward to sitting down with the Tech Council of Australia and the ACTU to reach a deal that serves our members’ interests swiftly and fairly. MEAA welcomes the acknowledgment by technology companies that Australia’s media and creative workers deserve to be paid for the content used to develop highly profitable AI models.”
The Tech Council appears receptive. CEO Damian Kassabgi noted that tech platforms have helped creatives gain exposure, clicks, and new revenue streams—including international reach for Australian artists.
“We’re keen to explore what AI means for different sectors, including opt-out models,” he said. “There’s been real progress in recent days toward a shared understanding of the need to collaborate and seize the national opportunity AI presents. This is just one part of that.
“We’re hopeful we can find a path forward on copyright that enables AI training in Australia while protecting the livelihoods of creators. What that path looks like has not yet been determined.”
This willingness to talk marks a major shift. Until now, big tech has warned that Australia risks falling behind unless our writers, artists, composers, photographers, researchers, and news organisations surrender their copyright to AI systems that consume their work—without consent or compensation. Many suspect the real goal is simply to use creative works for free.
This development also represents a reversal by the Tech Council, whose newly appointed chair, Atlassian co-founder Scott Farquhar, had previously urged Australian creatives to forgo copyright. His comments were disappointing, especially given the Council’s central role in shaping Australia’s tech economy.
Decades ago, I worked in the TAFE system and saw firsthand the value of collaborating with government to forecast employment needs and align education and industry. That’s precisely the kind of strategic foresight the Tech Council seeks to offer today.
The Council’s membership includes major Australian and U.S. tech firms, making its position especially influential. These companies are among the largest providers of high-tech jobs in Australia and must be part of any forward planning.
The Productivity Commission also weighed in, though its recent media remarks on issues including copyright felt more like thought bubbles than considered analysis. The claim that paying creatives would collapse productivity and ruin the economy was patently absurd.
Speak to those involved in AI governance and you’ll hear a different story. They’ll tell you that AI development won’t be impeded by copyright claims. They’ll explain that copyrighted material can be logged as it’s ingested into AI systems. The sky won’t fall if creatives assert their rights.
In fact, compensation claims are feasible, especially at the point of original use by AI. A log could form the basis of a searchable register, maintained by the Copyright Agency or another authority, where creatives could lodge royalty claims.
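To make that idea concrete, here is a minimal sketch, in Python, of how an ingestion log could feed a searchable register that accepts royalty claims. Every name in it (IngestionRecord, RoyaltyRegister, the sample identifiers) is a hypothetical illustration under assumptions of my own, not a description of any actual Copyright Agency system.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class IngestionRecord:
    """One copyrighted work, logged at the moment it is ingested for training."""
    work_id: str        # e.g. an ISBN, ISSN or rights-holder reference
    creator: str
    model_name: str     # the AI model the work was used to train
    ingested_at: datetime
    use_category: str   # e.g. "text", "image", "news article"

@dataclass
class RoyaltyRegister:
    """Searchable register a collecting body could maintain; claims attach to logged uses."""
    records: list[IngestionRecord] = field(default_factory=list)
    claims: dict[str, list[str]] = field(default_factory=dict)  # work_id -> claimants

    def log_ingestion(self, record: IngestionRecord) -> None:
        self.records.append(record)

    def search(self, creator: str) -> list[IngestionRecord]:
        """Let a creator see every logged use of their work."""
        return [r for r in self.records if r.creator == creator]

    def lodge_claim(self, work_id: str, claimant: str) -> bool:
        """Accept a royalty claim only if the work actually appears in the log."""
        if any(r.work_id == work_id for r in self.records):
            self.claims.setdefault(work_id, []).append(claimant)
            return True
        return False

# Example: a work is logged at ingestion time, then its author lodges a claim.
register = RoyaltyRegister()
register.log_ingestion(IngestionRecord(
    work_id="AU-2025-000123",
    creator="Jane Citizen",
    model_name="example-llm-v1",
    ingested_at=datetime.now(timezone.utc),
    use_category="news article",
))
assert register.lodge_claim("AU-2025-000123", "Jane Citizen")
```

The point of the sketch is simply that the register itself is a modest data structure; the genuinely hard parts are governance, participation and enforcement, not engineering.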
Yes, complexities exist, particularly with visual media. Legal disputes may arise over whether an image contains unlawfully copied elements. For news articles, direct negotiation between publishers and AI firms (akin to the News Media Bargaining Code) would be more efficient than tracking individual stories.
Importantly, many cutting-edge industries will rely on bespoke sovereign LLMs trained on proprietary business and industry data and not necessarily on creative works. At the recent TechLeaders 2025 conference, Australian company Maincode briefed journalists on its sovereign LLM strategy, which may not require local creative content at all.
Governance experts agree: technology itself doesn’t prevent us from paying creatives. What does? Greed.
Further, these discussions could lead to compensation models outside the Copyright Act. Historically, AI-related copyright disputes have played out under U.S. law, where broad “fair use” provisions have left creatives shortchanged. But Australian law is poised to take a more active role, especially as locally developed LLMs train on domestic material. Representative organisations are urging the government to address AI-related copyright explicitly.
There’s clear logic in updating the Copyright Act to reflect the realities of AI training. The Act was never designed with AI in mind, and without reform, we risk wasting millions on litigation to clarify ambiguous law. The U.S. has already endured costly, protracted battles over whether LLM training breaches copyright. Big Tech denies it outright—and so far, U.S. courts have sided with them.
Adding provisions to Australian law that affirm copyright protections in the context of AI training would eliminate this ambiguity. Yet Industry Minister Tim Ayres recently reiterated that the government has no plans to amend the Act in either direction.
Meanwhile, some U.S. companies may still face sanctions for using pirated content—Anthropic, developer of Claude AI, being one example.
The irony? AI itself shows it’s not hard to devise ways to pay creatives. I recently brainstormed this issue with Microsoft Copilot. Together, we explored a model that logs works as they’re used to train LLMs, feeding them into a searchable register for royalty claims across various use categories. The dialogue was illuminating – see below.
Of course, Australian organisations representing creatives have already proposed legal solutions. I defer to their expertise entirely.
Originally published by iTWire, August 22, 2025