Authors, artists take the fight to Big Tech over AI

Professional associations covering artists, authors and publishers have hit out at calls by big tech firms for Australia to water down its copyright law, and to allow for cheap or free access to protected works and data.

They have also warned against Australia ceding too much power to international regulators on copyright, a scenario that big tech is promoting.

Google, Microsoft and Amazon, along with industry groups backed by global technology firms, say Australia will be left behind in the emerging era of generative AI if the companies don’t have freer access to local data for training their large language models.

The Australian Publishers Association, Australian Society of Authors, Arts Law Centre of Australia and Australian Copyright Council have vented their concerns in submissions to the Department of Industry’s safe and responsible AI inquiry launched in June.

Australian authors are also reeling at the discovery of dozens of copyrighted Australian works in a dataset of 183,000 pirated titles compiled and used by developers, without permission, to train large language models.

Host website The Eye was forced to delete the dataset after receiving a take-down notice from the Denmark-based Rights Alliance.

The Australian Society of Authors (ASA) said about 18,000 ISBN book identifiers had a connection with Australia. “The ASA has received phone calls or emails from over 150 affected authors,” said chief executive Olivia Lanchester.

She named Helen Garner, Richard Flanagan, Dervla McTiernan, Sophie Cunningham, Trent Dalton, Shaun Tan, Markus Zusak, Tim Winton, Christos Tsiolkas, Andy Griffiths, Melissa Lucashenko, Tom Keneally, Jane Harper, and Robbie Arnott among local authors whose copyrighted works were accessed without permission.

Ms Lanchester said authors and artists were “vulnerable to mimicry and having their distinctive voice and style copied” by large language models.

“AI-generated content can be generated quickly and cheaply, and we therefore worry about a devaluing of books and a devaluing of the labour of authors and illustrators.”

She said the market had been diluted by a sudden influx of titles due to the ease of text and image generation by AI.

The unauthorised Books3 dataset was created for independent AI developers, but Meta stands accused of accessing it to train its LLaMA language models.

US lawyers have confirmed they are targeting Meta’s use of Books3 in litigation brought by authors, which now has court approval to proceed.

“In July 2023, Joe Saveri and I filed the first case challenging Meta’s use of Books3, on behalf of authors Richard Kadrey, Sarah Silverman, and Christopher Golden,” US lawyer Matthew Butterick said in an email this week.

“The judge recently confirmed that the core copyright infringement claim can move forward. We are starting discovery.”

Meta’s global policy head, former UK deputy prime minister Sir Nick Clegg, has told Reuters that Meta expected litigation over “whether creative content is covered or not by existing fair use doctrine. We think it is.”

The federal Department of Industry’s inquiry into safe and responsible AI has been running in parallel with a copyright enforcement review by the Attorney-General’s Department, which is hosting private roundtables with stakeholders.

Nevertheless, submissions to the Department of Industry have much to say on copyright.

Google’s submission said Australia’s copyright framework was already impeding the company’s ability to build AI research capacity and investment in Australia “compared to other more innovation-friendly legal environments fostered by nations like the United States, with its fair use protections, or Singapore which updated its copyright law in 2021 to include exceptions to copyright for the purpose of computational data analysis, including training machine learning systems”.

“This is the core risk to Australian talent, investment and local development capability we would encourage the government to address in its technology governance framework.”

Microsoft called for Australia to adopt an explicit fair dealing exemption for text and data mining, a measure that would let AI developers scoop up copyrighted data for analysis. “Microsoft sees a great opportunity for local copyright laws to be modernised to keep pace with digital change.”

Amazon said it was important to prioritise global engagement and consistent international standards. “Australia should seek out opportunities to contribute to the development of global technical standards, rather than creating standalone domestic standards. Keep it simple.”

The Digital Industry Group, whose membership includes Apple, eBay, Google, Meta, TikTok and Yahoo, said Australia’s current fair dealing-based exceptions to copyright infringement were potential barriers to the training and development of AI models.

Another representative body, the Australian Digital Alliance, said Australia remained out of step with jurisdictions “more favourable to AI development.”

Ms Lanchester said big tech wanted access to copyright material for free. “Why should authors – who earn on average $18,200 per annum – subsidise the businesses of big tech?” she said.

“We think what they are really saying is we want AI development in Australia to be free of input costs. The obvious solution is licensing,” she said.

The Australian Publishers Association is opposing big tech’s call for Australia to replace its fair dealing exemptions with the wider range of exemptions available under US fair use law.

The association’s policy and government relations manager Stuart Glover said the creative industries were “very encouraged” by the federal government’s current position on copyright articulated earlier this year.

Dr Glover said the association also opposed calls for a text and data mining exemption. He gave the example of big tech wanting to use AI models to summarise data generated by Australian researchers, scientific institutions and universities without paying.

He said the UK government this year had rejected a similar push.

Australian Copyright Council chief executive officer Eileen Camilleri said existing copyright law was “set up perfectly to deal with this”.

“If you want to use copyright information, in a way which accepts a copyright owner’s statutory rights, then you will get permission and/or you pay for it, that’s pretty simple.

“There’s nothing further legislatively which is required, other than making it easier, perhaps, to streamline the enforcement opportunities for particularly smaller copyright owners.”

It has also been revealed that about 40 per cent of Australian artists are using AI tools in their practice, with generative AI contributing about 10 per cent of a final work.

The Arts Law Centre of Australia said those surveyed were overwhelmingly visual artists. However, 86 per cent of respondents were worried about the threat generative AI poses.

“Generative AI is bad for artists, not just because it scrapes the internet for images to use for training, but also because it threatens to deskill the entire industry,” said one respondent.

“Visual artists train for years to develop the skills they have to create their art, but AI steals all the fruits of that training.”

Centre chief executive Louise Buckingham said her main concern was that artists were treated fairly.

“If their work is going to be used in training data, they get to control that, they get to be remunerated for it, and they get to be acknowledged or attributed accordingly, and not have their works subjected to derogatory treatment. That’s the key concern.”

Dr Buckingham said the world was used to each jurisdiction having its own copyright law.

Published in InnovationAus.com on 24 November 2023.
