How AI Is Changing Artistic Creation and Challenging IP Laws
"These are ethical concerns that, as a society, we're really confronted with for the first time,” said trademark and copyright lawyer Jessica Neer McDonald.
The use of artificial intelligence (AI) is gaining traction among traditional and Web3 artists alike. While some creatives worry that AI will one day replace their jobs or steal their work, others have welcomed the technical experimentation and have even found ways to use bots to execute their artistic vision.
This feature is part of CoinDesk's Culture Week.
But questions about the moral implications of AI have left artists divided, and concerns ranging from plagiarism to data privacy have been raised. Some artistic communities have gone so far as to ban the sharing of AI-generated images created with tools such as DALL-E, Midjourney and StarryAI.
Meanwhile, the artists who embrace AI speak passionately about their work and are eager to provide insight into the expansive technology. Some artists view their relationship with AI bots in a similar fashion to a parent teaching their child how to read, write or draw. “I did lots of drawing with my kids when they were little,” said Australian artist Lilyillo on Twitter Spaces in February. “I was watching his brain actually learn in real-time … I feel the same wonder when watching these [AI] tools.”
Artists who work closely with AI are helping those of us outside the field understand it through their craft and experimentation. Along with researchers, authors and technologists developing AI tools, the world of AI art illuminates some pressing questions regarding intellectual property (IP) and ownership in Web3.
Intellectual property and originality
Artificial intelligence is creating a new artistic canon, says AI researcher and poet Sasha Stiles, who has been a poetry mentor to Hanson Robotics’ humanoid robot BINA48 since 2018.
Language models such as the AI chatbot ChatGPT and similar tools like Sudowrite are neural networks trained on vast quantities of text and other data from the internet; they learn statistical patterns from that data, which they then draw on to form comprehensible statements.
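The mechanics described above can be illustrated with a toy sketch. Real language models use deep neural networks, but even a simple bigram model shows the same basic idea: learn patterns from a "pool" of existing text, then generate new sequences from those patterns. The corpus and function below are invented for this illustration, not taken from any actual model.

```python
import random
from collections import defaultdict

# Toy corpus standing in for the vast "pool of data" a real model trains on.
corpus = "the artist draws the line and the artist paints the canvas".split()

# Record which word follows which -- a bigram model, vastly simpler than
# ChatGPT's neural network, but the same principle: patterns learned from
# existing text are recombined into new sequences.
transitions = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev].append(nxt)

def generate(start, length=5, seed=0):
    """Generate text by repeatedly sampling a plausible next word."""
    random.seed(seed)
    words = [start]
    for _ in range(length):
        options = transitions.get(words[-1])
        if not options:  # no known continuation; stop early
            break
        words.append(random.choice(options))
    return " ".join(words)

print(generate("the"))
```

Every word the toy model emits comes from its training corpus, which is the crux of the originality debate: the output is new as a sequence, but nothing in it is invented from thin air.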
The human-led process of developing language models creates what Stiles describes as a “pool of data,” drawn from sources like books, films and encyclopedias, that users can collectively tap, much as they would a physical or digital library.
“As a writer I think there is this romantic stereotype of the solitary writer, toiling away in the empty room, and that the ideas we're coming up with spring out of thin air,” said Stiles. Yet, in reality, artists have always drawn inspiration from existing cultural works, she said.
“How many ideas inform the things I'm doing as a poet?” Stiles asks, noting the similarities between how humans and AI gather information to create new things.
As for plagiarism and concerns about originality, Stiles said some writers and artists are looking into ways to opt out of having their text included in language model data sets. But at this time there is no clear avenue to protecting one’s published words from AI crawlers, or tools used to scrape data from online sources.
These challenges tend to be more fraught in the context of text-to-image art generators such as DALL-E or Midjourney, which have the capability to create images from text prompts in the style of a particular artist – sometimes without that artist’s knowledge or permission.
For instance, a New Yorker staff writer used DALL-E 2 to generate an image of a Havanese dog in the style of the late Harper’s Bazaar photographer Richard Avedon. This capability has prompted accusations that AI tools make plagiarism trivial and intellectual property protections too easy to violate.
The challenges of applying existing IP laws to AI works
Many of these ideas and technologies are still emerging, and therefore fall under several legal gray areas, said Jessica Neer McDonald, a Florida-based trademark and copyright lawyer at Neer McD PLLC.
“The question here is to try to figure out where the copyright ownership begins,” she said. The first consideration is input – who actually owns the unique combination of prompts, text and data used to generate a signature style, and what effect does that ownership have on the output generated?
These boundaries will likely be worked out through court cases and statute interpretation that funnel through the U.S. Copyright Office (USCO) and U.S. Patent and Trademark Office (USPTO).
“These are ethical concerns that, as a society, we're really confronted with for the first time,” said McDonald.
Another consideration is intent. Is the artist purposefully using a protected logo, slogan or design for monetary gain or promotional purposes? It’s not yet clear how blatant one is allowed to be in their description or how closely an output may resemble a protected image without legal implications. There are few existing parameters when it comes to AI-generated works, and some AI tools have limits in place for how closely they can mimic protected symbols, characters and texts.
While there isn’t yet settled precedent for these questions surrounding AI and intellectual property, the USCO and USPTO have generally held that AI-generated artwork is not protected under IP law.
“Intellectual property law has historically focused on protecting the fruit of human intellect,” said McDonald, citing an August 2022 case in which the United States District Court for the Eastern District of Virginia affirmed that only a “natural person” can be considered an inventor. In another recent instance, USCO rescinded a copyright registration granted to AI artist Kristina Kashtanova for her graphic novel because the artist failed to disclose that the images in the book were generated with the help of Midjourney. After careful review, the office granted protection only for the text and compilation of the book but not for the images because they were “not the product of human authorship.”
An updated set of guidelines on the "Human Authorship" requirement, published by the USCO on March 16, 2023, was lauded by AI artist Claire Silver as a "vital win"; she noted that the text calls out exceptions that may qualify AI-generated work as "original."
In her tweet, Silver cited a passage in the new guidelines that said "a work containing AI-generated material will also contain sufficient human authorship to support a copyright claim. For example, a human may select or arrange AI-generated material in a sufficiently creative way that ‘the resulting work as a whole constitutes an original work of authorship.’"
Even though AI-generated art is not protected by IP law, AI mimicry still poses multiple concerns. Even fan-generated artwork with no obvious monetary incentives may, as digital artist Greg Rutkowski says, conflate the artist’s authentic works with computer-generated images. Rutkowski, who is famous for illustrating fantasy landscapes in games like Dungeons & Dragons and Magic: The Gathering, has complained that the internet is so crowded with AI artwork resembling his own that his original work no longer receives attention.
“There are many lawsuits where an artist's style may try to be protected under copyright law, and those are decided on a case-by-case basis,” said McDonald.
While style is not always something that can be copyrighted, the larger question being posed is whether the ability to create artwork more easily with an AI tool interferes with the exclusivity copyright law provides to an original artist.
Artists who are interested in releasing AI-generated artworks and books — such as Stephen Thaler’s AI-generated picture, "A Recent Entrance to Paradise," Claire Silver’s selected works or this entire sci-fi magazine — should look at the terms and conditions of each AI tool used, noting specifically what kind of intellectual property rights they or the company may be granted to work generated between human and bot.
The leader in news and information on cryptocurrency, digital assets and the future of money, CoinDesk is a media outlet that strives for the highest journalistic standards and abides by a strict set of editorial policies. CoinDesk is an independent operating subsidiary of Digital Currency Group, which invests in cryptocurrencies and blockchain startups. As part of their compensation, certain CoinDesk employees, including editorial employees, may receive exposure to DCG equity in the form of stock appreciation rights, which vest over a multi-year period. CoinDesk journalists are not allowed to purchase stock outright in DCG.