I wonder if there are going to be a lot more k-dramas (even more than now) on Netflix.
Exactly the opposite. You are the “true believer” arguing that souls exist, that humans have something essential that means that they, and only they, will ever be “creative”. You are both giving too much credit to humans and too little to the potential of AI.
Ok. Five characters.
My argument is something different: if something is truly creative, we need to explore giving rights to it to protect its inherent worth. If it’s not truly creative, it’s a tool, and we need to treat it as such and examine how it affects everyone around it.
The AI isn’t doing any of this.
Let’s humanize it for a moment. I subscribe to Netflix and choose to carefully watch and analyze Stranger Things. I watch it over and over again, and create a detailed statistical analysis of the words spoken by the actors, the set directions I’m able to decipher, the costuming, and the special effects. I turn this analysis into a 200-page book. Who owns the analysis?
Netflix didn’t do the analysis. The writers didn’t do the analysis. Not a single phrase in the 200-page book matches a phrase from any of the show’s scripts, because it isn’t a script, it’s an analysis. I spent months watching the show, taking notes and doing mathematics to generate my analysis; it belongs to me. It is a product of my work, my effort, my long nights, and my unique personal take on the project.
When an AI does the same thing, albeit quicker, the analysis still does not belong to Netflix, or the writers. It belongs to the people who made the AI, the people who spent long nights programming a machine to analyze text. It is the product of their effort to do things that literally nobody else was able to do before them. You are calling those efforts plagiarism. As if the AI was doing something as trivial as that. It isn’t, it’s doing a unique and special analysis. An analysis not intended to generate copies of existing works, but to create legitimately unique works by developing a set of language rules that mimic natural human writing.
That doesn’t mean AI can replace people, but it isn’t pasting a picture to a book, or copying someone else’s text to my pathetic blog.
Even if this is true, and I make no claim that you’re wrong… cash is king. Hit shows, well-written shows, earn money. If HBO (or whatever their name will be next year) tries to go the all-AI route in their writing rooms, their shows will stink, and companies like Netflix, Prime, and Hulu will eat them for lunch. There’s tons of competition in TV and movies, everyone is trying to capture market share, and writing shit shows doesn’t do that.
Not when good shows are out there to kick your ass.
…don’t move the goalposts.
We are talking about your specific claim. What does “better than humans” mean?
I’ve talked a bit about David Simon here in this thread. You might think I’m a bit of a fan. But I’m not. I don’t like his writing. I don’t like his work. I don’t like his shows. You also mentioned the Big Bang Theory. And I’d much rather watch an episode of Big Bang than I would watch the Wire. Does that mean that Chuck Lorre is a better writer than David Simon? Nope. I prefer different things, that’s all.
Both writers work in different environments with different challenges, and as showrunners they each had multiple roles in addition to writing or contributing to scripts, including managing a cast and crew of hundreds. So what that means is:
No, my demand is not meaningless. And it still stands.
And no matter how advanced it gets, it won’t ever be able to replace what the writers do on a television production.
Yes, humans struggle. Which is why writers rooms evolved. It means you can bounce ideas off each other. Less experienced writers get mentored by more experienced ones. And a day on set can completely change a writer’s perspective.
Well, I mean, it was changing.
We aren’t just talking about “who writes the lines for the Black man in the movie.” That isn’t what diversity brings to the table. For one thing: we are talking about opportunities for marginalised people to get their voices and their stories heard. And smaller and shorter writers rooms, along with smaller budgets, mean that there will be fewer opportunities, and they will get snapped up by the independently wealthy (who more often than not will be white men), effectively shutting the door on marginalised people ever getting into a writers room.
And secondly, we aren’t talking about “a Black man writing for a Black actor.”
We are talking about marginalised folks being able to tell their stories. A white writer might write a story about slavery that centres the trauma. A Black writer might approach that same topic in a very different way, that doesn’t focus on the trauma but on the resilience of the human experience. The subtleties and nuances are things that the AI will not understand.
Did you come up with some “pretty decent scripts” though? Or did it come up with “some pretty decent dialogue?”
Because a single page from a script, written in 12-point Courier, is approximately one minute of screen time. For a twenty-two-minute episode of Friends, I would expect that your script would be anywhere from 20 to 25 pages long.
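The one-page-per-minute rule of thumb translates directly into simple arithmetic; here’s a minimal sketch, where the ±15% slack is my own assumption for illustration, not an industry figure:

```python
# Rule of thumb: one script page (12-point Courier) ≈ one minute of screen time.
def estimated_pages(runtime_minutes, pages_per_minute=1.0, slack=0.15):
    """Return a (low, high) page-count range for a given runtime."""
    base = runtime_minutes * pages_per_minute
    return round(base * (1 - slack)), round(base * (1 + slack))

low, high = estimated_pages(22)  # a 22-minute sitcom episode
print(f"Expect roughly {low}-{high} pages")
```

For a 22-minute episode this lands right around the 20-to-25-page range mentioned above.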
I’ll concede that an AI scriptwriter would be able to write individual lines of dialogue that might even be funny. But that’s an entirely different thing from writing a complete shootable script.
This isn’t about whether or not “sitcoms are deep.”
This is how the writers room for a sitcom actually works. It’s collaborative. They bounce ideas off each other. They get ideas from real life.
That “easy, simple entertainment, with light humor, petty conflicts, low stakes and easy resolutions” didn’t come out of thin air. Those stories exist because in real life, one day, one of the writers experienced a “petty conflict” and brought that experience to the writers room.
Nope. Today’s AI will very much not be churning out another season of Friends or another MCU movie. Well, they might, if the studios get their way. But they wouldn’t be any good. They would be utter trash.
Well no, that really isn’t the way it works in a lot of cases. Often a writer or a team of writers will bring a pitch to the studio, the studio accepts the pitch, it goes into production, a writers room gets convened, then the writers “write what they want.” It goes to the studio, then the studio will send back “notes”.
And even in situations where this isn’t the case, it isn’t a matter of the writers sticking rigidly to the plot outlines. They are (often) employees, but they are also writers. They aren’t robots. They aren’t “creating content.” They are writing stories.
The reason why the studios own the rights to the stories is because that is what was negotiated with the Guild. If the studios want to use the work created by the writers outside the scope of the current agreements, that’s something that has to be agreed to by the Guild and its membership. It isn’t a one-way street.
Well, no, this is the Writers Guild negotiating on behalf of its membership in an attempt to protect the film and television industry from self-destructing.
No, this isn’t correct. You don’t need that much training at all to be able to write a script. When I went through film school a few years ago, there were 20 people in my class, and all of us were capable of writing fairly decent scripts for a five-minute movie after a relatively basic level of instruction.
And a writer doesn’t need to have read an insane number of scripts in order to be a screenwriter. A handful might be enough to get started. What really sets a writer apart is the ability to tell stories, along with interesting life experiences and a vivid imagination.
An AI writer doesn’t have the ability to do these things. It can’t just “tell a story.” It hasn’t had life experiences. It doesn’t have an imagination. So to make up for those things it needs data. An insane amount of data. The two things are not comparable.
The WGA is not demanding the studios “just give up on all their intellectual property that they paid for.” That isn’t on the list of things the WGA is asking for.
It isn’t like this at all.
Of course not.
But let’s not pretend that a writer AI is just “reading someone else’s script.” It isn’t an actual artificial intelligence. It’s autocomplete on steroids.
They aren’t being fairly compensated for that work. Not at the moment. It’s one of the reasons they have gone on strike.
That depends entirely on what the Minimum Basic Agreement says. I’m not a lawyer, and I don’t really want to do a deep dive into what the MBA says. But the answer to your question lies there.
But this isn’t something that is on the table during this particular negotiation. So it really isn’t relevant.
Incorrect.
Writers who have been experimenting with AI have found that after a couple of pages of dialogue it starts to consistently “lose the plot.” It’s a language model. Not a database.
And here is an example of that. It couldn’t go more than six lines before it “forgot” that it was only allowed to use the word “fuck.” This isn’t something that will reliably “remember” that Betty was wearing a blue hat at the start of the scene, let alone that Johnny and Nina got married in episode 2 of season 3 but got divorced two seasons later.
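That “forgetting” is a direct consequence of the fixed-size context window: once a scene runs longer than the window, the earliest lines are simply no longer part of the model’s input. A toy sketch of the idea, where the token budget and word-count “tokeniser” are stand-ins for illustration, not any real model’s behaviour:

```python
# Sketch of a fixed context window: the model only "sees" the most recent
# max_tokens worth of the script, so early details fall out of view.
def visible_context(script_lines, max_tokens):
    """Keep only the most recent lines that fit in the token budget."""
    kept, used = [], 0
    for line in reversed(script_lines):
        tokens = len(line.split())  # crude proxy for real tokenisation
        if used + tokens > max_tokens:
            break
        kept.append(line)
        used += tokens
    return list(reversed(kept))

scene = [
    "INT. DINER - DAY. Betty enters wearing a blue hat.",
    "JOHNNY: Nice hat.",
    "BETTY: Thanks, it's new.",
    "NINA: What colour was Betty's hat again?",
]
window = visible_context(scene, max_tokens=20)
# The establishing line about the blue hat no longer fits in the window,
# so nothing the model generates can be conditioned on it.
```

Real models use far larger windows, but the failure mode is the same: anything that scrolls out of the window might as well never have been written.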
It’s why having a showrunner is important; it’s why having the writer on set makes a significant difference. They break the story. They know it inside out. There typically aren’t “several teams of writers” working on a single show. You’ve got a writers room that breaks the story together, then the writers (typically, but not always) get tasked with writing their own episodes. But even those individual episodes are still largely collaborative efforts. I’ve cited examples above on how this actually plays out on set. Go back and reread my posts.
I used the word “effectively” here for a reason. For starters I was talking about “innovation.” What innovation does writer AI bring to the table?
Humans write based on what they’ve experienced. That includes what they’ve read. But it also includes what they’ve seen. What they’ve heard. Their life, their loves, their heartache, births, deaths, marriages. Feelings. Smells. Tangibles. Intangibles. Their relationship with god, with religion, with spirituality.
This isn’t the industry “feeding itself”. This is the complete opposite of that.
It’s because what makes our stories interesting is what the storyteller brings to the table. The AI scriptwriter doesn’t bring anything to the table. And the old saying holds true here: “garbage in, garbage out.”
The models that are currently being suggested effectively change the role of the writer to that of an editor. Humans will still be a part of the process, but they will be there to facilitate a process whose creative direction will be set at the producer level and not, as it is now, by the showrunner and the writers room.
Then explain (outside of speed) where the innovation is here.
Except this isn’t a conversation about the car manufacturing process.
Nope.
The WGA aren’t making unreasonable demands. And the technology is nowhere near good enough now for the studios to be able to rely on people “writing prompts and editing the output” to put together an episode of television, let alone a season. Not without significant rewrites (not just editing).
This isn’t progress. It’s “progress.” The only thing that AI scriptwriters can do better than a human is be faster. But it won’t be good. Not on the scale that would be needed here.
That’s how the system works. In the first instance the limits are largely set by the creator. What happens here is that the studios and the WGA have negotiated an agreement under which intellectual property rights are transferred according to the terms of the MBA.
Because of the MBA.
The writers haven’t actually agreed to transfer those rights to the studio in the first place. AI wasn’t even a thing at the last negotiation.
Nope.
The unethical part is using people’s intellectual property without permission or compensation and using it to generate billions of dollars for a handful of corporations.
Nope. Not at all.
This isn’t about just writing lines of dialogue.
It’s about what writers for film and television actually do.
And there are things that writers for television do that a scriptwriting AI will never be able to do. That includes things like showrunning, which is literally running the entire production, managing hundreds of staff.
“Albeit quicker” deserves to be a lot more than an aside. Our current system is set up with the understanding that creating a derivative work is itself a significant act of labor. That labor is the cost you pay in order to obtain the derivative product. When you bypass that labor, our system starts to fall apart. Whether it’s plagiarism, copyright infringement, or AI, you’re using my work to create something from which you’ll profit, without putting in the labor that normally keeps such behavior rare.
AIs are new, and they require us to examine our old systems and safeguards and to come up with new structures.
Do you think that the people who created ChatGPT didn’t put any labor into it? It’s their labor that makes the thing work. It’s their labor that distinguishes ChatGPT from autocomplete. Their labor means I can ask a computer to tell me a story, and it tells me a story, a unique story that nobody has ever told before. That labor deserves to be valued, it’s not labor bypassed, it’s a new kind of labor we haven’t had before.
…I wasn’t talking about AI. I was refuting this:
I never claimed that it was. I was talking about legal ownership of intellectual property, and how profit isn’t the primary factor here when we talk about “what we have legal control over.”
Well here’s one of the problems. The writers of the show don’t actually know if the shows they are writing earn any money. And the metrics the studios release to show “if a show is a hit” or not are arbitrary and often change.
To get to this point, you would first have to have directors happy to cross the picket line to direct the AI generated script, you would have to have actors willing to cross the picket line, members of the IATSE would have to be willing to cross the picket line, and none of that is going to happen anytime soon. Certainly not by next year.
And even if they did, and these AI-written shows somehow got made and broadcast, we still wouldn’t know if those shows “did well” or not. The studios won’t release the metrics that matter, but will release the metrics that “show that, despite the bad reviews, the show did just great.”
No. Before you ask, nor do I think that the people who created pencils didn’t put any labor into them. But that still doesn’t mean you can buy a pencil and thereby avoid a charge of plagiarism.
Yep, going post-scarcity is a bitch for people used to living off scarcity.
I’m guessing my aspiring screenwriter son is glad he has a different kind of writing gig right now. It would have been tough being in the business right out of college under the circumstances.
The morons who don’t understand the creative process will know. They’re bean counters, it’s the only thing they DO know. When some VP says “let’s fire all the writers” his division had better start making money.
When that pencil doesn’t write a single phrase you’ve written, I’ll find it pretty easy to avoid a charge of plagiarism. AI is not copying your work, it is analyzing your work, which you don’t have the right to control.
…the morons that don’t understand the creative process won’t care. They will fiddle the metrics, they will fiddle the books, they will cancel unaired shows as tax write-offs, they will pull all of the shenanigans we’ve seen them do over the last couple of years. They will extract every last cent from the failing businesses until they collapse.
That isn’t how it works. Uber isn’t profitable. Spotify isn’t profitable. Lyft doesn’t make money. “Making money” isn’t the point. It’s vulture capitalism. It’s the “dotcom crash.” Play funny games, chase shiny objects, do everything at pace except set up a sustainable business model, and then when it falls over move on to the next big thing.
Here’s an interesting Forbes article on Common Crawl from 2017.
…here’s a ten-minute video featuring David Simon that makes effectively the same case that I’ve made in this thread. Except Simon makes the case more eloquently, as one would expect from a professional writer.
No it isn’t. No one judges scripts by how much time and effort it took to write them. There is zero value to the labor that went into a script. Some writers take years to write a script, and others can knock out great ones in a month or two. No one cares.
What matters is that good scripts are scarce. There is more demand for great scripts than there are writers to produce them. AI threatens to change that, and that’s what is scaring the WGA. Understandably so.
Still, how is that different from skilled weavers losing their careers to the automated loom? Or window knockers losing their jobs to the cheap alarm clock?
What if AI turns out to produce a better product than average human writers? Should the audience have to suffer sub-standard writing to protect jobs?
Anyway, the writers that are really under immediate threat are the armies of tech writers, business writers, workaday newspeople who do things like rewrite Reuters reports into stories, people who write everything from business plans to corporate reports, technical reports, prospectuses (prospectii?) etc. A lot of those people will soon be toast unless they are using their own specialized knowledge or experience in the job.
This isn’t quite true. It is analyzing a copy of your work, copied from the Internet or other databases into a large dataset which was then used to train the generative model. The copyright argument is that permission was not granted by the copyright owner for that copy to be made.
And then there’s the issue of what it does with that analysis. What it actually does is look for patterns to try and create a probabilistic model that will allow it to recreate that data. There is the argument that said model is essentially a lossy copy of the original data, similar to how a JPEG contains not the original pixels of an image, but instructions to recreate those pixels using an algorithm. And no one would argue that a JPEG copy is not a copy.
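The “probabilistic model as lossy copy” point can be made concrete with a deliberately tiny stand-in: a character-level bigram model stores only transition counts, never the text itself, yet recognisable fragments of its one-sentence “training set” come straight back out when you sample from it. This is a toy illustration of the concept, not how an LLM actually works:

```python
import random
from collections import defaultdict

# Toy "training": count character-to-character transitions in the source text.
source = "the quick brown fox jumps over the lazy dog"
transitions = defaultdict(list)
for a, b in zip(source, source[1:]):
    transitions[a].append(b)

# Toy "generation": sample each next character from the learned statistics.
def generate(start, length, seed=0):
    rng = random.Random(seed)
    out = start
    for _ in range(length - 1):
        choices = transitions.get(out[-1])
        if not choices:
            break
        out += rng.choice(choices)
    return out

# The model holds no copy of the sentence, only counts - yet fragments of
# the original are reproduced by following those statistics.
print(generate("t", 20))
```

Scaled up by many orders of magnitude, that is the crux of the legal question: at what point do stored statistics stop being “an analysis” and start being “a lossy copy”?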
You could argue that our brains also store a lossy copy of anything we observe. But then you have the pesky issue that humans have the rights of personhood, while software does not. If I completely memorize something word for word, that is not copyright infringement. But if software reproduces something word for word, then that is a (possibly illegal) copy.
This is something I think those who directly analogize from humans to “AI” miss. They tend to assume that the rules have to be the same for both.
Heck, the biggest difference is that the US Copyright office has said that AI-generated content is not eligible for copyright. There’s a huge downside to replacing your writers whose content you can own with content you cannot.
One big difference is that the automated loom does not attempt to replace the creativity of the skilled weavers. There is no risk that, by replacing the weavers, you wind up discouraging creative thinking by making it something you can’t make a living from.
I don’t think this will occur with LLMs like ChatGPT. But, conceptually, I don’t see any reason why we shouldn’t. The right thing to do would involve weighing the pros and cons. It is entirely possible that this “suffering” you describe is far, far less than the suffering these people would endure from losing their jobs, or than the harm to society of removing the rewards for their creativity.
Of course, that’s the right thing to do, not what would most likely happen. If the rich can make more money by using AI, they will. It won’t even matter if the quality of the content is better. Worse products are often more profitable.
Fortunately, as I mentioned above, the product of AIs is currently not eligible for copyright. While that wouldn’t decrease the quality of the writing, it would greatly reduce the value of the product. Heck, even without that, if the product can be created by everyone, then people have less reason to pay the big companies for it.
Making money is always the point. If a VP takes a division of Disney that is making money hand over fist, and turns it into a division that has to cook the books, that VP is going to be handed his golden parachute and told to leave. Corporations aren’t evil, they’re amoral. They will pay people who make them money all day long, as long as they keep the dollars rolling in, and send people to the bread line by the thousand if THAT brings the money in.