Lost in Compression: Models of Authorship in Generative AI

Research output: Contribution to journal › Journal article › Research › peer-review

Abstract

Recent developments in image generation have introduced a new point of contention in the already contested field of artificial intelligence: the ownership of images. In 2023, Getty Images sued the company Stability AI, accusing it of illegally appropriating photographs for the purpose of training its models. Analysing image generators and stock agencies as probabilistic systems, this article argues that their significant difference lies in their models of appropriation. Where Stability AI proceeds through direct appropriation, the stock agency proceeds through contractual appropriation, using its dominant position in the market. The article discusses Getty Images’ release of an image generator trained on its own image collection and critically reflects on the stock agency’s attempt to insert its contractual engine into the core of generative AI technology, making copyright both the regulating principle of the system’s relations of ownership and the principle that constrains the range of images it can produce.
Original language: English
Journal: Media Theory
Volume: 8
Issue: 1
Pages (from-to): 205-228
ISSN: 2557-826X
Publication status: Published - Jun 2024
