Can you explain how human works are not a pastiche of elements of things their creators have been exposed to?
To add to this question: our ethical codes and laws are written around the understanding that creating a new work requires effort and the lived experience of the person involved. If you claim something is your own, that’s because you’ve put in the labor both to develop the skillset for creative work and to create the specific work in question.
Plagiarism is the misdeed of appropriating someone else’s skillset and labor: you, as a human, don’t complete the labor necessary either to acquire the skillset or to create the work, but you leverage the skillset and work of someone else without their permission, and call the work your own.
It may be that plagiarism isn’t the precise term for what’s going on here–but the reason why it’s unethical is in every relevant way the same.
I am not a philosophy professor, nor is that my claim, so: no.
…until you can give us a set of metrics on what exactly “better than humans” would actually mean, then we can’t really take this anywhere.
What I will say is that I’m 100% confident that AI won’t be able to replace what a human writer for television can do right now for at least a generation.
An AI writer will never be able to observe an actor struggling with a line delivery, figure out what is wrong, and make the changes accordingly.
An AI writer will never be able to get a “gut feeling” why killing off a character might be the wrong choice for a story, then passionately argue with the other writers in the room until they all figure out what that “gut feeling” was.
An AI writer will never actually understand what it’s like to be a Black man in America, what it’s like to be a trans person living in an oppressive state, what it’s like to be a woman working for an abusive boss.
An AI writer will always be faster than a human writer. But on the list of things that it will be “better” than a human writer at? That’s probably it.
I disagree. The reason plagiarism is unethical is that you’re taking someone’s work, the exact work they did (or close enough), and calling it your own work. ChatGPT isn’t doing that. It’s taking the work of 1,000 people, identifying what makes those works good exemplars of language, and then making its own version of a work using the concepts identified.
ChatGPT is derivative, not plagiarized. Nobody ever wrote a Mr. Rogers vs. Bob Ross fistfight; there’s no person who can look at that thing and say, “Hey, I wrote that!” That script, even if it sucks, is unique, as unique as if you wrote your own version of such a fight.
Right. ChatGPT is looking at a billion examples of English and determining that x word is followed by y word z percent of the time. You can compare it to the US Census–nobody cares about the data of an individual person, it is the collected data of millions of people that is useful.
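The “x word is followed by y word z percent of the time” idea can be sketched with a toy bigram counter. (To be clear, this is a drastic simplification: modern systems like ChatGPT are neural networks, not bigram tables, but the statistical intuition being argued about here is the same. The tiny corpus below is invented for illustration.)

```python
from collections import Counter, defaultdict

# Toy sketch of the word-statistics intuition: count which word
# follows which across a corpus, then report follow-word frequencies.
# No individual sentence matters; only the aggregate counts do.
corpus = "the cat sat on the mat and the cat ran".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1  # tally each adjacent word pair

def follow_probs(word):
    """Fraction of the time each word follows `word` in the corpus."""
    counts = follows[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(follow_probs("the"))  # "cat" follows "the" 2/3 of the time, "mat" 1/3
```

Like the census analogy: each individual sentence contributes almost nothing, but the aggregated counts are what make the statistics useful.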
The fact that it’s taking the work of 1,000 people instead of 1 doesn’t make it ethically better. As I detailed, ownership of concepts has clear underpinnings. These underpinnings date back to Locke’s Second Treatise of Government and its justifications for all private property. The reason plagiarism is disliked is ethically identical to the reason that ChatGPT is disliked.
People can dislike something and still be wrong. I’m sure there are many Supernatural Sam/Dean incest porn writers on AO3 who are incensed that AI Techbros are Stealing Their Work! when what is actually happening is that they are providing a slight uptick in a weighting for the model choosing the word “member” after the word “turgid”. Lots of notes are mistaking themselves for a song.
What makes it ethically better is that the AI is not directly copying anyone’s work to produce something it calls its own. It is using other people’s work to learn what writing is, and we are all free to do that as much as we want.
Forgive me for not responding to the blatantly obvious.
Only for metaphorical values of “learn.” Again: the important part is that human beings have put in the labor to acquire skills of creation. Part of that labor involves learning from other artists, of course; but it’s labor. When someone uses technology to bypass that process and instead builds models off other folks’ labor, they’re committing the same ethical misstep as plagiarists, even if the particulars are a bit different.
That’s okay, I’m not responding to this. In fact, I never even saw it.
…well, AI isn’t a person though, is it. So it isn’t “free to do whatever it wants.” It’s software. It does what it is told.
The position of the WGA is that “minimum-basic-agreement covered material can’t be used to train AI.” Based on everything you’ve written in this thread, that’s a sensible position for the WGA to take, wouldn’t you agree?
ChatGPT, the collection of people, software, and equipment that forms the thing you can access online… is free to read legally available writings so that the system may learn what writing is.
I’m not sure what “minimum-basic-agreement covered material” is as a term of art, but I’ll throw out a couple of concepts and my take on it.
If that means that producers must prevent AI from accessing next season’s SNL broadcasts… that’s completely out of their power.
If that means producers can’t publish final scripts and/or draft scripts for the purpose of AI learning, that’s an agreement that can be made, but I’m not a fan of the concept. I also question the utility of this restriction as the finished product is going to be available regardless of this agreement.
Despite being broadly pro-union, I think unions go too far by using the collective bargaining process to hamper innovation. I would personally rather see unions focus on ensuring that the benefits of innovation get shared in significant fashion with the union workforce and allow innovation to happen.
As a parting note, this is a business negotiation, and the union should strive to get whatever they can get, because the producers are doing the same. I’d bet the producers are asking for way more things “I’m not a fan of”, so let’s not take my position on AI to mean too much.
…it can read it.
But there should be limitations (as there are with everything else) on what it can do with that intellectual property. Because an AI isn’t a human. It isn’t a person. And we can’t conflate a “computer reading something” with a “person reading something.” They aren’t the same thing.
AI scriptwriting software doesn’t have a hope in hell of ever becoming capable of writing a coherent script that goes beyond a couple of pages unless it has both been specifically designed to do so and has access to an insane amount of data. (And even then I don’t see that happening in our lifetime.) And it’s that access to the data, but more importantly what it does with that data, that is the issue here.
“The Minimum Basic Agreement (MBA) is the collective bargaining agreement that covers the benefits, rights, and protections for most of the work done by WGA members.”
That isn’t how a negotiation works. If it is literally impossible for the AI scriptwriting software to not access other people’s intellectual property, then the WGA will hold firm in rejecting any AMPTP position that insists on using that particular software.
Because the situation as you describe it is untenable. What you are asking of the writers here is effectively to give up all intellectual property rights to their work for no compensation to be used as part of a dataset to train an algorithm that is guaranteed to take away their job in the next couple of years. You would have to be bonkers to agree to that.
Whether or not you can see it, there are substantial ethical issues at play here regarding intellectual property, and those things will have to be addressed.
It means what it says it means. Any MBA covered material can’t be used to train AI. Not available before broadcast. Not available after broadcast.
I’m sorry, but what innovation is AI bringing to the table here?
We know that it will write scripts faster, as in orders of magnitude faster. But how will this help the creative process? How will that fit in with the existing creative pipeline? I’ve already outlined how removing writers from set has impacted what we are seeing on television at the moment. Less continuity from episode to episode, meandering plot lines that end with a sudden stop, odd disconnects with tone. How will effectively removing human writers from the process address that? Where is the innovation here?
If human writers are pushed out of the industry, that means the data-set used to feed the AI script-writing machines will increasingly consist of AI-written-scripts. It will start feeding itself. And we will end up with ungodly messes. And the only way to innovate out of this ungodly mess? Bring in human writers. It will eventually come full circle. (Assuming humanity lasts that long)
The actual answer of course is that there is no innovation. This is a regressive move that only has a single purpose, to remove a tiny line-item from the budget to help improve stock prices and to make the executives look good. It won’t make better or more interesting television and films. It won’t make life easier for actors and directors. We won’t be seeing new stories from diverse storytellers. The only innovation here is in terms of the budget.
They don’t have to agree to it; it’s the way the universe works. Their MBA-covered material is widely distributed worldwide, to literally anyone who can afford $15 a month or a rooftop antenna. You are asking people not only not to profit by passing the writers’ work off as their own (which is fine) but also not to allow that work to be “experienced” by an AI that the writers and producers have no relationship to.
Once you decide to profit off of your art, you lose a certain amount of control over it. The idea that an entity shouldn’t be allowed to watch your TV show to learn how TV shows work is nonsense. What you want is for producers to be barred from using an AI writer that may have seen a TV show before. It’s an OK negotiating point, but it isn’t an ethical quandary.
It’s OK for art to be made more efficient. We’re not going to go back to writing with quill pens instead of computers, because writer rooms would be simply bursting with work. If literally all that AI can do is make a 5 person writing job able to be done with 3 people, that’s worth doing.
It’s also worth paying each writer more, since they’re all more productive after the change.
…we aren’t talking about “experience.” Software can’t experience things.
We are talking about how that content is used.
Usage is the key part.
And there are limits on “how much control you lose”, and those limits are typically in the first instance set by the creator of that art. That’s how the system works. You can’t invent a new technology, pretend that it’s the same as a human, and then use someone else’s intellectual property without their permission. It’s like Uber pretending they are a “ride-sharing service” or a “technology company” when they are really just a taxi company.
It isn’t an “entity.” It isn’t alive. It isn’t “learning” the same way a human learns. It isn’t “experiencing” the same way a human experiences. It’s software. And yes, we can put limits on software.
The AI writer isn’t “seeing a TV show.” An AI writer isn’t human. Stop giving a piece of software human attributes.
It very much is.
An AI writer cannot make a 5-person writing job able to be done with 3 people. Where in the process are you imagining it fitting in the pipeline? How exactly are you imagining that it will work? How will it make it “more efficient?” Can you be precise here?
How will an AI writer make a writer more productive? The human writer will have to constantly be rewriting the AI generated scripts because they will be fundamentally unfilmable for all the reasons that have been outlined already. An AI generated script instantly puts the human writer at a disadvantage. It also makes the breakdown process much harder for everyone else on a production because the AI scriptwriter won’t be (without incredibly advanced programming) taking into consideration the needs of the Art Director, the Prop Master, the Director, the Director of Photography, Wardrobe, all of them dependent on the script in order to do their job. The AI models we are talking about are language models. But there is so much more to the script writing process than just a mastery of language.
What impact will an AI writer have on the other issues that you ignored in my last post which includes quality of scripts, diversity in the writing room, continuity of plot and tone? You are assuming an increase in productivity. But I don’t think that assumption is reasonable.
This is the key point that folks keep ignoring. We don’t weigh the “labor” of software the same way we weigh human labor, nor should we. Attempts to conflate human learning with machine “learning” are obfuscatory.
Our current system is set up because we know that acquiring the skills to create art is enough friction in the artistic milieu that not many people will do it, and the creators will therefore be able to obtain reasonable profits from their work. Short-circuiting that process through the crude mimicry of AI is a fundamental change to the dynamic, and pretending that it’s just the same as what’s always happened betrays a profound lack of understanding of either human creativity or of AI heuristics.
No, it is the assertion of an opinion framed as an objective fact that people keep disagreeing with. That the learning process of an artificial neural net is fundamentally different from the learning process of a natural neural net and that the distinction means anything except as a post-hoc rationalization for why you don’t like it is just, like, your opinion, man.
Not if your work is intended to be made available to virtually every human being in existence for the cost of a sandwich. All you legally have control over is whether other humans can copy your work and resell it for profit.
The word entity is often used for non living things, such as corporations and organizations, probably more often than it is used to describe individual humans. Let’s quit with the silly nitpicking over terminology.
It also doesn’t matter a whit that AI isn’t human, it’s owned by humans, made by humans, it’s a tool to be used by humans, and you will have to tell humans that they can’t let their computer analyze a TV show they paid to download, or a book they paid to read, or decode a message sent for free over the airwaves.
Best of luck with that, if it’s my computer and my AI, sue me, explain to the judge how I violated your copyright without ever selling a copy of anything you made. I’m allowed to analyze your art with a computer.
Then the WGA should have no problem with it.
Your claim is that AI will not streamline the writing process, won’t make writers more efficient, won’t make scripts better, won’t make shows or movies better, but will be enthusiastically adopted by producers anyway because they like making more expensive, hard-to-film, shitty shows?