Do you want AI? A look at Senate inquiry submissions

Sector submissions to the Senate inquiry into the risks and opportunities of AI adoption include those from Screen Producers Australia and the guilds for composers and production designers.

Some of you may be aware that the Senate is currently conducting an inquiry into the opportunities and impacts for Australia arising from the uptake of AI technologies.

The Senate committee was established on 26 March 2024, and hearings have been conducted in Sydney and Canberra during May and July. The last hearing is today (17 July 2024).

A total of 236 submissions were tabled. Among them, 16 were submitted by organisations in the creative sector, including those covered below.

The committee will report on the outcomes to Parliament on, or before, 19 September 2024.

Some recommendations and observations in submissions

Overall, there were three commonalities across the submissions:

  • Federal and state laws are urgently needed to define the compliance framework for AI datasets in relation to the copyright of the artists whose works have been copied or scraped.
  • Permission for the use of a creative’s works for AI must be sought and obtained before use, including works used to “train” AI systems by data-mining companies and AI generators.
  • Government must mandate transparency whenever AI is used.

National Aboriginal and Torres Strait Islander Music Office

The Indigenous-led initiative of APRA AMCOS and peak body for Aboriginal and Torres Strait Islander (ATSI) music creators, NATSIMO, expressed deep concern about the impact of AI on this particularly vulnerable sector.

It warned, ‘The government should be mindful not to draw artificial distinctions between fields of ATSI cultural creation.’ The organisation added that while much of the Government’s focus has been on unauthorised use of ICIP (Indigenous Cultural and Intellectual Property) in arts and crafts, including in mass-produced items aimed at tourists and the international market, ATSI music creators are often also visual artists, dancers and craftspeople.

‘This interwoven complexity of ATSI cultural creation means that if one piece is taken out – such as the hijacking of traditional music forms by generative AI replicas – the entire fabric is torn, with unknown collateral effects for the community.’

NATSIMO concluded: ‘ATSI people have a right to decide how cultural content is used, and a right to protect cultural heritage. One of the principal issues is that AI models are trained by way of unauthorised scraping and copying of ATSI cultural material already uploaded in digital space – this results in taking cultural content out of context, out of its space and thus compartmentalising Indigenous culture.’


Australian Society of Authors

‘Creators are not anti-tech – this is an issue of fairness,’ said the Australian Society of Authors (ASA).

‘Authors and illustrators have been locked out of the financial benefits of the AI boom,’ and ‘thus far, tech companies have built a new parasitic technology, generating their own wealth on the basis of work they have not paid for.’

The ASA called for the Government to mandate transparency. ‘Creators should know if an AI developer seeks to use their work for AI training and must have the right to grant or decline permission. This transparency would also facilitate voluntary direct licensing and collaboration between the tech and creative sectors.’

It added the recommendation: ‘Government to require all AI developers and deployers conducting business in Australia to rely on licensed content only for training, as a condition of doing business in Australia.’

Australian Production Design Guild

The Australian Production Design Guild (APDG) said that many screen and live performance designers (52.9% of those surveyed) are already using, or intend to use, Generative AI in their work as a research tool to develop their design ideas – adding that it is proving an effective way to enhance existing workflows, rather than replace the designer’s practice, supporting ethical use.

And yet it also reported that 82.7% of the designers it recently surveyed were either somewhat concerned (46.2%) or very concerned (36.5%) about the use of Generative AI impacting their livelihoods and the quality of productions as a whole. Other concerns tabled by the APDG included appropriate credit and chain of title, payment of the designer when their work is used within an AI dataset, and identification, acknowledgement and transparency when AI is used within a creative process. It also raised the problem of a designer being appropriately indemnified for their use of AI-created designs.

Australian Library and Information Association

Australian Library and Information Association (ALIA) said, ‘One of the most critical components of AI inclusion is ability – or the need to be AI literate. Unfortunately, there are already substantial disparities in people’s literacy ability.’

It added, ‘There are early indications that AI will only deepen existing literacy gaps. For example, the ‘Digital Lives of Australians 2024’ report released in May 2024 reports a 27 percentage point knowledge gap between older (50+) and younger Australians when asked if they knew a little about AI technologies.’ It also relayed concerns about training data and generative AI output in relation to Aboriginal and Torres Strait Islander peoples’ content.

ALIA tabled the recommendation that funds be committed to upskilling the library sector in AI so that it can support the public with AI skills, arguing this would be a cost-effective way to support students and the general public across all regions of Australia.

Media, Entertainment & Arts Alliance

The Media, Entertainment & Arts Alliance (MEAA) wrote that the ‘challenge that AI presents is that it has the potential to dislocate and devalue the work of artists, creators and journalists’.

It continued: ‘MEAA is also concerned about the push by industry operators to downplay the case for creators to receive compensation for the use of their work to train Large Language Models (LLMs). The arguments being put forward imply that these commercially owned LLMs require large amounts of reliable data to ensure they are free from bias and to ensure that Australia can develop its own AI industry. While this may be the case from a technical aspect, such arguments ignore the fact these LLMs are commercial ventures and are being developed for commercially driven purposes. They are not public goods, nor are they publicly owned. Therefore, they should be obligated to compensate.

‘The work of Australian creatives is being systematically scraped to train AI, without their knowledge, consent or compensation. It is impossible to ascertain the extent of this use due to a lack of transparency.’

MEAA continued, ‘AI generated content places performers in competition with synthetic versions of themselves, undermining their livelihoods and the sustainability of the media and cultural sector… While AI advocates spruik the technology’s capacity to improve the productivity of workers and “democratise” the production of creative content, the reality is that businesses are utilising AI tools to replace their workforce with automation products.’

It also reiterated concerns for First Nations creatives. It concluded, ‘The spread of AI-generated misinformation has the potential to exacerbate the crisis of trust in journalism by reducing the transparency, objectivity and accuracy of news media.’

MEAA called for laws requiring the public disclosure of data used to train AI, and for an “AI Tax” on businesses that replace human workers with AI tools. It also called for a parliamentary inquiry into the impact of AI on copyright and intellectual property.

Australian Guild of Screen Composers

The Australian Guild of Screen Composers (AGSC) said the greatest problems for its sector are unauthorised copying, unpaid uses of copyright and multiple breaches of exclusive rights of copyright owners – and made the point that scraping is currently in breach of Australian copyright law.

The AGSC’s submission also notes that there is growing duress on composers, as companies are presenting contracts to composers insisting that they agree to all future AI manipulations of their current work within the contract. ‘We believe that this is forcing composers to surrender their exclusive statutory rights of control of their original works,’ the AGSC wrote.

It says there is ‘a real risk of destroying a whole industry of creatives,’ adding, ‘we will see a homogenisation of material, in favour of cheap, generic material and a marked contraction in quality.’ It also makes the case that AI puts our economy at risk.

‘It’s not made as a tool for us composers; it’s made as a direct competitor to the work that we do. It will likely have a significant impact on the market value of what we do as professional creatives, and probably mark the end of the idea of Intellectual Property as we know it.

‘In 2020-21, screen drama production spend reached $1.9 billion, driven by increases in Australian and foreign films created in Australia. The Bureau of Communications, Arts and Regional Research (BCARR) estimates that the expenditure in Australia by film and TV productions through to 2026-27 could be over $4.3 billion.’

Australia New Zealand Screen Association

The Australia New Zealand Screen Association (ANZSA) expressed particular concern about “deepfakes” that ‘undermine trust in what is real and what is not’.

‘As such there are calls for all AI outputs to be labelled; however, we caution against an over-broad application of that principle.’


ANZSA explained: ‘Any future regulatory reform should take a risk-based approach, only prescribing labelling for use cases that are high risk and have potential to deceive or harm. Uses of AI as a tool to create expressive audiovisual works from the motion picture industry, for example, would not be considered “high risk” and therefore not fit into this criteria.’

Australian Publishers Association

The Australian Publishers Association (APA) advocates for the licensing of content for AI ingestion, which it believes will protect and boost the creative industries, rather than destroy them. It adds that publishers are already supplying vast corpuses of quality content for training generative AI models. The sector is also integrating AI into its products and processes, such as enhancing digital textbooks and making learning materials more intuitive. However, it warned that a more robust copyright system is needed to operate ethically and legally.

It made the point that Australia is currently lagging behind EU and US legislation in this area. ‘The UK’s recent rejection of a broad text and data mining (TDM) exception confirms that such provisions are unnecessary for innovation and harmful to creative industries,’ said the APA.

Screen Producers Australia

Representing a membership of around 800 and driving a $2 billion independent sector, Screen Producers Australia (SPA) said, ‘Future generative AI systems present amazing opportunities to improve efficiency across many aspects of screen production.’ It added that the use of AI in the screen industry is not new; however, it raised specific sector concerns around established distribution models.

‘Smaller Australian production companies (of which the industry is largely made up) may not have the resources to access these systems in a way that mitigates the copyright concerns. Only those companies, such as broadcasters and larger internationally-based or supported businesses, which have considerable resources would. This imbalance could lead to smaller Australian production companies, particularly animation companies, losing work to broadcasters or larger internationally supported companies that take production in-house.’

It expressed concern also for the loss of entry pathways into the industry, with these roles being replaced by AI systems.

National Association for Visual Arts

In July 2023, the National Association for Visual Arts (NAVA) and the Arts Law Centre of Australia, supported by the Australian Society of Authors, conducted a comprehensive survey of their members on AI. Nearly 40% acknowledged incorporating generative AI into their creative process.

NAVA supports a similar approach to that of the UK, including statutory compliance with cross-sector AI-specific principles, greater transparency and a toolkit to assist organisations to assess the risks to individual rights and freedoms caused by their AI models and platforms.

NAVA also raised particular concern for the protection of Aboriginal and Torres Strait Islander visual arts and crafts.

Arts Law

Arts Law also supports the UK approach of legislating to create a statutory duty for regulators regarding the use of AI. ‘This same approach could be taken in Australia,’ it said, adding that ‘the AI Survey (conducted with NAVA) revealed that 93% of respondents support the introduction of guidelines, regulations and/or a code of practice of legislative protections to regulate generative AI platforms.’

It added in its submission: ‘Almost half of the respondents to the AI Survey use generative AI in the creative process via a mix of assistance with written works such as editing and grant writing via ChatGPT and some creators use it for creative content and ideation processes. However, while most respondents confirmed that if they use generative AI in the creative process, only about 10% of the final work comes from generative AI use.’

Its submission published a quote from that survey, which captured the mood of many creatives in the sector: ‘Generative AI is a danger to the visual arts industry for many reasons. Generative AI is bad for artists, not just because it scrapes the internet for images to use for training, but also because it threatens to deskill the entire industry. Visual artists train for years to develop the skills they have to create their art, but AI steals all the fruits of that training. Artists already live in poverty and find it difficult to make a living from their work, yet AI is threatening to take the few jobs that exist for artists, as generative AI can be used as a disciplinary tool to erode the bargaining power of working artists.’

‘Our entire creative culture is worse off when we buy into the hype of AI,’ the respondent concluded.

Arts Law also raised concern about the environmental impact of AI data mining, an issue overlooked by most submissions to the inquiry, but a real and significant impact of this new technology.

Note: Since publishing, the following submission has been added to this story.

Australian Writers’ Guild

In a joint submission by the Australian Writers’ Guild (AWG), Australian Writers’ Guild Authorship Collecting Society (AWGACS), Australian Screen Editors (ASE), Australian Production Design Guild (APDG) and Australian Cinematographers Society (ACS), the voices of screen sector professionals were tabled at the inquiry. They push for urgent regulation.

Their joint submission states: ‘Major international studios and video game publishers are already embracing this technology. By cutting creative people out of the creative process, companies may cut costs and increase profit. We will see industry homogenisation, consolidation, contraction and a reduction of the economic contribution of the creative sector. Livelihoods will be at risk and over time we will see a devastating erosion of the skill base of Australian creatives.’

The other key point they make is that ‘Australian stage, screen, performance, broadcast and interactive content sectors are essential Australian industries, stimulating investment and economic activity, while employing Australian artists and workers,’ adding their content is ‘an indispensable projection of Australian identity globally’. This bolsters tourism and opportunities for soft diplomacy.

‘They represent critical tools and expressions of our cultural sovereignty. The unregulated use of AI by corporate content producers, including the major international studios and major video game publishers, and, more recently, local production companies, represents a threat to Australian creative work.’

The organisations call for fair remuneration, authorial control and a “notice and takedown” system with essential transparency of AI use.

ArtsHub will continue to report on this topic, and the Inquiry’s report in September.

Gina Fairley is ArtsHub's National Visual Arts Editor. For a decade she worked as a freelance writer and curator across Southeast Asia and was previously the Regional Contributing Editor for Hong Kong based magazines Asian Art News and World Sculpture News. Prior to writing she worked as an arts manager in America and Australia for 14 years, including the regional gallery, biennale and commercial sectors. She is based in Mittagong, regional NSW. Twitter: @ginafairley Instagram: fairleygina