With the third trilogue meeting between the EU Council, Parliament and Commission taking place next week, the SAA strongly calls on the negotiators to build on the EU Parliament’s proposals: to impose transparency obligations on the providers of foundation models and to establish clear rules that protect and promote the continued development of human creativity and original works.
The SAA welcomes and supports the EP’s proposed transparency obligation requiring providers of foundation models to document and make publicly available a sufficiently detailed summary of the use of training data protected under copyright law, “without prejudice to national or Union legislation on copyright” (Art 28b.4c). This “without prejudice” clause is essential to keep copyright rules out of the scope of the AI Act. Addressing copyright rules separately is a condition for the AI Act to remain a future-proof horizontal instrument.
In addition, we ask for:
A comprehensive and up-to-date list of the protected works used by generative AI systems for training purposes. A summary of the training data alone is not sufficient to enable authors to enforce their claims.
Clear and strict rules on the labelling of AI-generated content as such. The labelling of content, when artificially generated or manipulated, should not be subject to any exception for the so-called exercise of the right to freedom of expression and the right to freedom of the arts (Art 52.3a). We see no contradiction between these fundamental freedoms and the principle of transparency on the use of AI.
“With adequate transparency and safeguards, AI can serve authors and society. It has the potential to boost creativity and cultural diversity. But the key is: it will only work if policymakers make human well-being their top priority in innovation, while dealing with copyright separately. In the end, what we need is for AI to preserve and enhance human creativity, not replace it.”
Barbara HAYES, Chair of the Board of the SAA.