When I say AI is just another tool, people are quick to shut me down. They are quick to assume that, because of this technology, the artist will die. But that is not true. Most people are tunnel-visioned on college kids generating anime portraits or Unreal Engine-style 3D environments, or the capitalistically minded people willing to cut costs at all costs. And, for some reason, they tend to ignore actual artists creating meaningful work using AI in their creative process.
What gives art meaning is the intention behind it: the message and emotion the artist wants to deliver. I don't believe that using generative AI to communicate that message is inauthentic, as long as there is intention behind it. Without intention, even the famous works of many artists can be categorized as slop. And even if there is intention, but the message doesn't resonate with the audience, does that authenticity matter if everyone is going to ignore it anyway?
People also fail to recognize what technological innovation in previous decades has made possible in the arts: entirely new art forms (digital art, 3D modeling) and new creative processes. I think artists who are afraid of AI are complacent people who are actually less creative than they'd like to believe. They can't envision a new form of art or a creative process built around generative AI.
There are always going to be people who generate tons of mediocre slop because of the relationship art has to money. This existed even before AI: garbage ghostwritten Kindle-published books, plagiarized content, and bootlegs.
I think we just have to go through another phase of "digital art isn't real art" before some real innovator with a futuristic vision comes along and shows us how generative AI can be used in some new kind of art form that we can't even imagine yet.
Some artistic processes will die, just as they have in the past, but the artist will never die. Because an artist is someone who tries to create beauty from random bits and pieces in this miserable world. And they could do that through paints on a canvas or through code running on a machine.
"What gives art meaning is the intention behind it." -- Yes, this exactly! That's why I had such a strong experience listening to AI-generated music. I realized I had no problem with artists using AI to help bring their visions to life. AI will create new processes and practices, and that's how art evolves. My issue is with people who use AI to try to replace artists, copy their work without pay, or use it to create mindless content.
Preach!
Would there still be a sense of betrayal with AI art if there were no dishonesty or false pretenses in the first place? In your examples, it's not surprising audiences feel betrayal, because there's actual betrayal involved: they were explicitly or implicitly lied to, led to expect art created by humans, then had those expectations disrupted by the big reveal. This muddles the analysis because we don't know whether the feelings of betrayal come from the upset expectations or the nature of AI.
For a clean analysis we should: 1) assume complete transparency and examine how we'd feel about AI art presented as such, on its own terms, perhaps even in a situation where we sought it out (an AI art contest or AI art gallery), and 2) compare the dishonest AI art examples to dishonest human art examples, so we have a fair, intellectually honest baseline.
For example, how would you feel if your husband had told you he created the song entirely himself, with more traditional music production tools? Then, when you were done listening and being moved, he revealed it was made with Suno? I assume you would feel a sense of betrayal, not unlike the feeling of discovering that Théâtre D'opéra Spatial was made with Midjourney...
I focus on the feeling of betrayal because it's such a common experience when it comes to AI art. The fact is, there is a lot of dishonesty and false pretense, and it's hard to separate that from the art itself. The more people are duped, the more they mistrust AI. We also like to shame people who use AI art, which makes people more likely to lie about it.
It's interesting to consider alternative scenarios, however. Art forgeries are an obvious parallel. How does it feel to come home from a museum, thinking you've seen a real statue, only to find out it was a replica? There's a similar feeling of betrayal there.
If "art is connection across time and space," why is it that, when we're looking back in time to determine whether a certain image meets the criteria for art, we stop our search when we get to an AI model? The trail doesn't go cold there, so why abandon the search for connection prematurely? Why not go further back in time, to the works that made up the training corpus for that model, and then the flesh-and-blood humans who created those works?
When a human is moved by AI art, this is not a coincidence. It's different from, say, the artwork coming into existence by random chance when a strong gust of wind tips over some paint cans. That would be the true letdown. But with AI, the fact that the work moved you is not due to random chance. It's due to the fact that the world is full of the creative output of your fellow humans, and that latent within the patterns of some cross-section of that output was this work that spoke to you, waiting. AI can surface such latent works. It's an incredible tool for remixing and recombining human creative output at scale, stirring the memetic cultural stew like an industrial egg-beater (while our human brains are mere handheld whisks) and mining areas of the collective unconscious that perhaps no human artist has stumbled upon yet.
From the beginning I found this to be beautiful. AI doesn't destroy the connection across time and space; it extends it and changes its nature. The process changes from artist >> art >> audience to artist >> art >> AI >> art >> audience. It also changes from one-to-one, or few-to-one, to many-to-one (i.e., instead of one human artist connecting with a viewer through their art, millions of human artists connect to a viewer through the piece of AI art).
I see a general pattern of people ignoring the humanity of the training data whenever it's convenient for their argument, often in ways that are inconsistent. An artist might complain, "They stole our original human artwork to make AI," and then that same artist will say, "There's nothing human at all about AI or what it creates." It feels to me like you can't have it both ways.
The more I think about this, the more I think that art requires context to be meaningful. That's why responses to AI art vary so greatly.
If you are open to the broader idea that AI art builds on all of humanity's cumulative creations, then you might find beauty in it. Other people find this view too impersonal and mechanistic, reducing art to data points. It's either a tremendous amount of context or not nearly enough. Whichever view you hold, that will color the way you respond to an AI's output.
Thanks for the thoughtful response!
Kind of off-topic but perhaps of interest is the novel Manhunt (Gretchen Felker-Martin, 2022) where She-Who-Must-Not-Be-Named has an interesting role!
Interesting. I've never heard of this book before. Thanks!