No, I wasn't missing that. I decide I want to author a book about how to "do art", so I go to the public library and look through hundreds of art books.

In that case you're not using the original artworks; you're using thumbnails or page-sized plates of photographs of artworks that appear in those books, almost certainly with permission from the rights-holders. Since one of the purposes of art books in the first place is for students to learn from them, it can safely be presumed that the creators who gave permission for their works to appear in books like these are aware of and approve of that use of their material.
 
How is this functionally different from when humans do the same thing to train themselves?

What difference does that make?

Everyone keeps trying to drag the discussion back to this point. Many of you seem to feel that the closeness to which a machine replicates or simulates the human process of performing a task makes a profound difference in how the machine should be talked about and how issues around the manufacture and use of the machine should be treated. Another poster on the previous page began talking about AI programs having "a right" to access and learn from art just like humans do, as if we were talking about a person choosing and desiring to study and make art and willfully reaching out to explore others' artworks for self-enrichment, personal fulfillment, or potential economic opportunities, rather than a computer program that was written to perform a very specific task that runs when an operator gives it a command and then stops.

Backhoes exactly replicate the human process of digging a hole - they press the business end of their work-tool into the ground, lever against the earth to displace it, then pick up the displaced earth and deposit it somewhere else, leaving a hole. The operator wiggles a couple of two-axis joysticks and presses a couple of buttons and the machine is designed to take those abstract inputs and translate them into a complicated set of mechanical motions that ends up digging a hole in the ground in exactly the same way a human would dig a hole by hand, the only difference being that the excavator machine can do it faster and in larger volume than a human physically could with their hands or with smaller, simpler tools. Yet nobody ever seriously talks about backhoes as independent entities that have "a right to" dig holes; nobody has ever treated discussions about property and land-use rights or call-before-you-dig campaigns as if they're somehow mean and discriminatory against the poor heavy construction equipment that just wants to dig and be happy. We intuit that the backhoe is a tool, made by humans for a purpose; that when it works it's because an operator has turned it on and begun using it, and when the task is completed and the operator turns it off and leaves it in the storage yard until the next time they want to use it again, it doesn't internally reflect and ruminate upon a job well done and dream of electric sewer projects, it just sits there and rusts.

But people have a weird hangup about the human brain; they think of literally anything it does as ineffably profound, so that if someone designs a machine that manages to go about closely replicating the way the human brain goes about doing some specific task, we can't help but think that some kind of special breakthrough has just happened and we're suddenly ready to give the machine "rights" and wonder if we're hurting its feelings by quitting to the desktop when we're done playing with it.
 
I don't think it's about giving rights to machines. I mean the discussion here.
It's just that when an AI learns in a similar way to a human, nothing has really changed, and no change is necessary.
 
I don't think it's about giving rights to machines. I mean the discussion here.
It's just that when an AI learns in a similar way to a human, nothing has really changed, and no change is necessary.

That's not what's being argued at all though; licenses and rights to dictate restrictions on commercial use of art by individuals and companies have existed for a long, long time. My argument is that those same laws should apply to software companies, as they always have. But several people here seem to be arguing that if a software company is specifically producing an AI program, then suddenly that company's use of art is "different", it needs to be treated differently from any other company making any other product, suddenly the software company is just "an art teacher" and the AI program it's building should be treated like a human art student.
 
It's like companies training and selling artists. So, more or less, schools?
If a school says "find pictures of Picasso on the internet" and the students do, that seems like fair use. Is the school downloading them for the students so different? Hard to say.
 
It's like companies training and selling artists. So, more or less, schools?

Schools don't "sell" artists, they train them. Then the artist goes out and finds a job on their own. Sure, some schools offer networking opportunities or partnerships with certain employers, or host job fairs where employers can come and look for students; but even when that happens it's not as if the employer puts in an order for "an artist" and the school supplies one the same way that, say, a medical equipment company supplies a hospital with an X-ray machine. The artist still needs to make their own resume and a portfolio to impress potential employers along with whatever training certificate or diploma they have earned. Ultimately it is the student, and not the future employer, that is the customer of the art school, and ultimately any potential employer has to interview and negotiate a job offer with the artist, not the school that trained them.

An AI program isn't a customer of its developer; it is a tool that is marketed directly to end-users. If you want a piece of art made and are willing to accept an AI-generated work, DALL-E isn't going to see your want ad and email you a CV and its personal portfolio. You either have to go to the developer website to purchase and download the software, or you get a response from a human who plans to use the DALL-E tool to make the art and that human, not the AI program, is the one you'll be paying.
 
Schools don't "sell" artists, they train them. Then the artist goes out and finds a job on their own. Sure, some schools offer networking opportunities or partnerships with certain employers, or host job fairs where employers can come and look for students; but even when that happens it's not as if the employer puts in an order for "an artist" and the school supplies one the same way that, say, a medical equipment company supplies a hospital with an X-ray machine. The artist still needs to make their own resume and a portfolio to impress potential employers along with whatever training certificate or diploma they have earned. Ultimately it is the student, and not the future employer, that is the customer of the art school, and ultimately any potential employer has to interview and negotiate a job offer with the artist, not the school that trained them.

An AI program isn't a customer of its developer; it is a tool that is marketed directly to end-users. If you want a piece of art made and are willing to accept an AI-generated work, DALL-E isn't going to see your want ad and email you a CV and its personal portfolio. You either have to go to the developer website to purchase and download the software, or you get a response from a human who plans to use the DALL-E tool to make the art and that human, not the AI program, is the one you'll be paying.

A school is selling artists in a sense: it's selling the skills to the students. What is important is that it's a commercial business, and that the school doesn't hold copyright for the study material.

You could have one AI company that creates the software, and another AI company that does the learning process. That would bring it closer to the school analogy: teaching people for profit using images you don't own vs. teaching a neural network for profit using images you don't own.
 
Is there not a case to be made for "fair use" where AI is concerned?

There is almost certainly "a case to be made", but also a case to be made against it.

This seems somewhat similar:

https://en.wikipedia.org/wiki/Fair_use#Text_and_data_mining

The transformative nature of computer based analytical processes such as text mining, web mining and data mining has led many to form the view that such uses would be protected under fair use. This view was substantiated by the rulings of Judge Denny Chin in Authors Guild, Inc. v. Google, Inc., a case involving mass digitisation of millions of books from research library collections. As part of the ruling that found the book digitisation project was fair use, the judge stated "Google Books is also transformative in the sense that it has transformed book text into data for purposes of substantive research, including data mining and text mining in new areas".[52][53]

Text and data mining was subject to further review in Authors Guild v. HathiTrust, a case derived from the same digitization project mentioned above. Judge Harold Baer, in finding that the defendant's uses were transformative, stated that "the search capabilities of the [HathiTrust Digital Library] have already given rise to new methods of academic inquiry such as text mining."[54][55]
 
A school is selling artists in a sense: it's selling the skills to the students. What is important is that it's a commercial business, and that the school doesn't hold copyright for the study material.

Right - again, that's why the school licenses any kinds of materials it uses in that way.

Also - everyone who is involved in this conversation right now, I'm curious about what exactly it is that you think art schools do in their courses. Seriously; like, in so many words, what is your impression of what constitutes "study material" in an art class?

From the statements people are making here, it seems like a great many of you are under the impression that art schools are showing or directing students to existing works by established working artists and asking students to replicate or imitate them. Is that the case - is that what you really think happens in "art schools"? If not, then what exactly?

I'm not an art major by any stretch, but I have taken art classes before - and if any existing art pieces were used in the class at all, it was in the form of maybe a brief prologue featuring a handful of slides of historical or particularly famous examples of a particular technique - museum pieces and the like - and after this introduction, the instructor then breaks out the tools and starts live-demonstrating the technique themselves. Students learn a technique by watching the instructor at work; if they are meant to reproduce or imitate anything, it's the instructor's work.

Now, art teachers absolutely encourage students to go out of their own volition and enjoy and contemplate art and learn things if they can, but no instructor ever ended an art class by telling students that their homework was to go to the library and read a book about Monet, or to go to a museum and stare at Piero della Francesca paintings until they somehow osmote his "style". When it comes to "study material", just as with literally any other subject of study, there are textbooks and other art instruction books made expressly for that exact purpose, and of course all of their contents are either original to the authors or licensed with the understanding that those works would be used in a textbook for the purpose of teaching students.
 
If it helps to know where I am coming from - I've lectured and marked on a few modules of degree level "multi-media" courses at a university (it was a long time ago, when multi-media was still a buzzword, so long ago we even had a module about "edutainment") so I do know how materials are used. I've also supported myself as a commercial artist for a little while so I've an understanding of the sharp end of commercial art.

That knowledge and experience is informing a lot of my views about what we are seeing with nascent generative art AIs.

The art AIs are bringing creativity within the reach of many more people, something most people either never had the time to learn for themselves or couldn't afford.


When Olmstead posted: "They're software whose main purpose is to squeeze as much value as possible out of the common heritage of human culture for the sole benefit of the rich. And there are no laws to prevent the great theft, no recourse for artists to protect their work from being exploited for the enrichment of disgusting CEOs."

I wasn't being facetious when I reworded his statement, I profoundly disagree that this is only about making the 1%ers richer. In the past only the rich could afford to pay for art, if they wanted a lovely landscape of hills dotted with purple heather and sheep they could commission an artist to produce that to hang on their wall. Not rich? Well you can just go and hang, only the rich can have nice things. We are now seeing a way that lets everyone be creative in a way that used to be only for the very privileged.

Tools like this: https://clipdrop.co/stable-doodle mean that everyone can now have their "landscape of hills dotted with purple heather and sheep".


 
I would have thought concepts such as fair use and plagiarism depend to at least some degree on the capacity to understand what they are. However complex an AI might be, and how able it is to take off on its own, it's a machine of a sort, and as such I think those who run it and set it up ought to be responsible for what it produces.

If I built and programmed an automatic robotic machine that manufactured automobiles, and without knowing what it was doing it made a Ford, I think someone should be liable for something, even though the machine has no understanding of what it is or what it did.

It is what it is, and it's trained to do what it does, and it was invented, not born. You can't just say, "sorry, I didn't know it was loaded."
 
I would have thought concepts such as fair use and plagiarism depend to at least some degree on the capacity to understand what they are. However complex an AI might be, and how able it is to take off on its own, it's a machine of a sort, and as such I think those who run it and set it up ought to be responsible for what it produces.

If I built and programmed an automatic robotic machine that manufactured automobiles, and without knowing what it was doing it made a Ford, I think someone should be liable for something, even though the machine has no understanding of what it is or what it did.

It is what it is, and it's trained to do what it does, and it was invented, not born. You can't just say, "sorry, I didn't know it was loaded."

That’s missing the point that it is people controlling the AIs, it is their intent, their actions that should be the starting point.
 
I wasn't being facetious when I reworded his statement, I profoundly disagree that this is only about making the 1%ers richer. In the past only the rich could afford to pay for art, if they wanted a lovely landscape of hills dotted with purple heather and sheep they could commission an artist to produce that to hang on their wall. Not rich? Well you can just go and hang, only the rich can have nice things. We are now seeing a way that lets everyone be creative in a way that used to be only for the very privileged.

In the distant past, certainly. But that hasn't been the situation for decades, now. There are artists all over the web on sites like DeviantArt who will make your landscape to order for between $10 and $50 or so. They've been there for years. And I disagree with you vehemently; it is very much those people, the ones who make art for peanut commissions and live more or less at the poverty level, who are being replaced by AI art applications, not well-known artists whose reputations command exorbitant prices for commissions by rich clients.

It is a compelling narrative that AI is giving the common people access to a world they've formerly been priced or classed out of - that there's this whole bunch of working-class people that just loooove Nicolas Party so much that they wish they could afford to commission a personal painting from him, and that by letting them print out a digital painting that was definitely not made by Nicolas Party but looks kind of like something he might have painted and hang it on their wall, DALL-E is literally making their dreams come true.

But you can't genuinely believe that this pretty fantasy represents how art AI programs are actually being used by a substantial portion of their users today, let alone in the future. Pragmatically, surely you can intuit that the vast majority of those using AI specifically as a free alternative to obtaining art they would otherwise have to pay for are companies who could easily afford (and previously had no choice but) to hire graphic designers and digital artists for their websites and publicity projects, but now aren't because they don't have to anymore. And the remainder are largely people who don't want to have to shell out $35 for a portrait of their "fursona".
 
That’s missing the point that it is people controlling the AIs, it is their intent, their actions that should be the starting point.
I thought that was my point when I said the people who invent the machine must be responsible for what it does. People seem in some cases to be saying that because the AI's use of copyrighted material is not predictable, and not under visible control, it's nobody's personal responsibility. But I think if the inventors fail to predict the kind of thing the AI might do with its material, it's their failure. Maybe I'm misreading some of the arguments, but it seems to me that if an AI transgresses common understanding of creative rights, its owners and operators should be responsible. If it's impossible to do this, then perhaps the AI should not be released into the wild.
 
I thought that was my point when I said the people who invent the machine must be responsible for what it does. People seem in some cases to be saying that because the AI's use of copyrighted material is not predictable, and not under visible control, it's nobody's personal responsibility.

I think people are prone to misunderstanding the issue here.

I don't know who makes that kind of argument, but at least with regard to the kind of generative AI in existence today, the legal issues aren't about the "AI's use of copyrighted material". It's not as if there's a .png file that it looks up for inspiration. It's not "tracing" or directly copying someone else's work.

The issue lies with the fact that the AI is trained on copyrighted material. It's the developers who choose which data-sets to train the AI on. If those data-sets include works that are copyrighted, then the developers are quite likely in violation of copyright law, at least if they choose to make the result publicly available. Even if the original imagery or other copyrighted data no longer exists in any appreciable form, they would still have used it without the rights holders' permission. Essentially, it's immaterial how the AI functions at runtime.

Of course, one can question whether this is actually reasonable, since it's perfectly legal for people to be "inspired" by copyrighted media, to the point of essentially copying another person's "style". Is there really any fundamental practical difference between that and training an AI on the same media?
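The distinction described above - training images shape the model's parameters, but the finished model contains and consults no image files at runtime - can be illustrated with a toy sketch. This is purely hypothetical and illustrative, not a real diffusion model: "training" here just fits a pixel-wise mean and standard deviation, standing in for the far more complex parameter fitting real systems do.

```python
# Toy sketch (illustrative only): after "training", only numeric
# parameters remain; generation never looks up a training image.
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for copyrighted training works.
training_images = [rng.random((8, 8)) for _ in range(100)]

# "Training": distill the data into learned parameters.
weights = {
    "mean": np.mean(training_images, axis=0),
    "std": np.std(training_images, axis=0),
}
del training_images  # the originals are gone; only parameters survive

def generate():
    # "Generation" at runtime uses only the learned parameters.
    return weights["mean"] + weights["std"] * rng.standard_normal((8, 8))

sample = generate()
print(sample.shape)  # a new 8x8 image, not a stored training image
```

The legal question the post raises is exactly about the first half of this sketch - the moment the copyrighted works are fed in - not the second half, where they no longer exist.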
 
I thought that was my point when I said the people who invent the machine must be responsible for what it does. People seem in some cases to be saying that because the AI's use of copyrighted material is not predictable, and not under visible control, it's nobody's personal responsibility. But I think if the inventors fail to predict the kind of thing the AI might do with its material, it's their failure. Maybe I'm misreading some of the arguments, but it seems to me that if an AI transgresses common understanding of creative rights, its owners and operators should be responsible. If it's impossible to do this, then perhaps the AI should not be released into the wild.

You did indeed say that, sorry I was hard of understanding last night!
 
Reading an article about Zoe Thorogood, a twenty-four-year-old graphic artist who has just been nominated for five Eisners. (The Oscar of graphic arts.)

One comment in the article reminded me of this thread.

But it was only in 2019, aged barely 20, that Thorogood thought about turning her love of comics into a career. Shy and anxious, she attended a comics event in London organised by US publisher Image. She showed her portfolio and was invited to a dinner of comic creators where she met Kieron Gillen, a British writer who has worked extensively for Marvel on titles including Avengers and X-Men.

Gillen says: “She wasn’t quite what she is now, but she had already metabolised a bunch of influences into a style, and approached the page with this mixture of glamour, groundedness and a real macabre energy. She was clearly the real deal. It was a real ‘guitarist walks into a bar and makes the old hacks’ jaws drop’ moment.”
[highlighting mine]

So apparently some artists (or at least this one) don't see a problem with building on the styles of other artists to develop their own.

And I expect she probably didn't pay a lot of royalties for that 'bunch of influences', but I'm sure it's okay since she's a human.

At least until the 'gotta do sumpin' pols get done with whirlwind legislation to answer the popular outcry. (Because that always turns out well. :boggled:)
 
Reading an article about Zoe Thorogood, a twenty-four-year-old graphic artist who has just been nominated for five Eisners. (The Oscar of graphic arts.)

One comment in the article reminded me of this thread.



So apparently some artists (or at least this one) don't see a problem with building on the styles of other artists to develop their own.
And I expect she probably didn't pay a lot of royalties for that 'bunch of influences', but I'm sure it's okay since she's a human.

At least until the 'gotta do sumpin' pols get done with whirlwind legislation to answer the popular outcry. (Because that always turns out well. :boggled:)

Actually, I think very few artists haven't been at least heavily inspired by others.

And so have innovators of all kinds. The present cultural stage of humanity (whatever you might think of it) was not built by individuals from scratch.

Hans
 
Actually, I think very few artists haven't been at least heavily inspired by others.

And so have innovators of all kinds. The present cultural stage of humanity (whatever you might think of it) was not built by individuals from scratch.

Hans


This is what I've been pointing out all along. But somehow, when the same methods are used to train an AI it is ... different. For ... reasons.
 
