Writers Guild of America goes on strike (5/2/23); tentative deal reached (9/25/23); now accepted (10/9/23)

In the sense that both are using information about the world to train themselves on how to interact with it, yes, they are the same.

I’d say that interpreting that statement as you did is fairly absurd. I mean, two people looking at the same piece of artwork are not exactly the same either. Point is, the same action is happening within the neural network as happens in your neurons when you look at a painting and learn from it.

I’d be careful with that. There have been sentient entities that had no rights and were considered property in our past. We should endeavor not to recreate that.

You are a complex set of algorithms as well. Everything that you do is based on information processing that happens within your brain. Neurons fire, and you take an action.

This only is a legal standpoint, and I’ll once again remind you that the same was said about human beings a couple hundred years ago.

The context isn’t as clear-cut as some want to think. We are no more than complex information processing systems ourselves.

Yes, we think of ourselves as special because we are conscious. And we think that other things are not because we don’t think that they are. Consciousness is an emergent phenomenon, an illusion created by complex information processing systems that are able to think about their own thoughts.

Will there ever come a day that you will consider an AI to be a conscious entity with rights of its own, or will you insist that only a human can be?

Not really. If you take a picture, that is considered to be a copyrighted piece of art. You could have spent months studying the subject, setting up the lighting, and waiting for just the right moment, or you can simply point your camera at something and click. Either one is treated exactly the same legally and ethically. There doesn’t need to be any lived experience or skill set for creative work at all.

Or do you think that me snapping a picture of my dog is somehow unethical and shouldn’t be protected?

So, nearly everything is plagiarism? If I admit that I am inspired by someone else’s work, I have “stolen” their work and called it my own. And if I don’t admit that I am inspired by someone else’s work, then I’m a liar.

It’s only unethical in the sense that people keep insisting that it is. No one can say why it’s wrong, only that they don’t like it.

Can you give me a set of metrics on what it is exactly to say that one human writer is better than another? If you can, then we can use that to compare. If not, then your demand is meaningless.

If you mean AI generation, I don’t disagree, if you mean generation as 20-30 years, I completely disagree.

AI is advancing quickly, and there’s a whole lot of money behind it. Both the software and hardware are being improved and optimized for the task. What we have seen playing with ChatGPT 3.5 is nothing compared to what is coming. And it’s coming fast, much more quickly than you think.

Never is a long, long time. And you are talking about things that human writers struggle with as well. If you see a movie with a Black man in it, chances are the lines that Black man is reading were written by a white man. That’s changing, fortunately, but it still doesn’t mean that you have to have had an experience to write about it. A Black man may be writing for a Black actor but still have never experienced the things the story calls for. No one has ever been an Asgardian god helping defeat monsters, but someone has to write Thor’s lines.

While I disagree strongly with the “That’s probably it” conclusion, I will also say that speed is a quality that shouldn’t be dismissed.

I want to be careful here, as people seem to get upset if you say things about pop culture and entertainment, but I played with ChatGPT for a while, and it comes up with some pretty decent scripts with the right prompt. Of course @Darren_Garrison’s example was surreal; his prompt was surreal. But I fed it more reasonable prompts that are more in line with the sorts of plots you might see in a Friends episode, and they came out pretty good. I’d bet that if you showed them to a fan of Friends, along with actual scripts from the show, and asked them which were written by people and which by AI, they’d do little better than random chance at distinguishing them.

And there’s nothing wrong with that. Sitcoms aren’t supposed to be deep, they are supposed to be easy, simple entertainment, with light humor, petty conflicts, low stakes and easy resolutions. They are an escape from the stress of life, where people don’t want to have to think hard to understand the plot or to get the joke.

So, while today’s AI may not be able to write the next The Wire or Life of Pi, it can very well churn out another season of Friends or the next MCU movie, and people will watch with the same mental disengagement they currently do.

Let’s say a human has a perfect eidetic memory. They don’t have to put in any work at all. They just look at something and it becomes part of their memory. Are they plagiarizing whenever they create something?

The fact that you use a hippocampus to create memories and an AI uses GPUs doesn’t make it fundamentally different.

And a writer isn’t free to do whatever they want. They are employees, and write what they are told to write.

It’s certainly a position they’d love to win, but it’s completely unreasonable as a demand. It is telling the studios that they don’t have the rights to use their own intellectual property. They paid writers to write, and should now own what they paid for.

This would be like factory workers demanding that you couldn’t watch how they did their jobs to automate them. The factory workers would have loved that, it would have been job security, but it’s an absurd demand.

And a person doesn’t have a hope in hell of writing a script unless they have training and have read an insane number of scripts.

Agreed that that is a large part of the sticking point. And it’s one that the WGA is not going to win. The production companies are not going to just give up on all their intellectual property that they paid for.

It’s like if the UAW demanded that the car manufacturers not automate the production lines.

Is it possible for a writer to write a script without ever having read someone else’s script? And the studios aren’t even looking to access other people’s intellectual property, but their own, which they paid for.

Disney, for example, has a massive amount of IP. The writers would like them not to be allowed to use that as training data. That’s unreasonable to me, and I’m sure it’s unreasonable to Disney as well.

They are compensated; they are paid for their work. If someone pulls out a Friends script that the studio paid the writers for, but that was never filmed, should they be allowed to produce it?

The AMPTP would be bonkers to agree to that. Just as Ford would be bonkers to agree to workers’ demands that their jobs not be automated.

You’ve given your opinion, but it doesn’t actually fit the facts. An AI would be more consistent than human writers, who forget about plot elements or character traits. Many shows are made with several teams of writers writing different episodes, creating massive continuity issues between them. There is no reason to think that an AI would have any of those issues.

The only way that human writers get completely removed from the process is if they make unreasonable demands and refuse to budge, leaving the studios no choice but to rely entirely on AI.

And humans write based on what they’ve read. The industry has been feeding itself since the first time someone wrote something down.

That’s your opinion, but I see no reason as to why that would be true at all.

Well, I don’t see any reason why humans would have been removed from the process in the first place, but sure, humans adding in what it is that they want to see would be a continuing source of innovation for AI writing systems.

No, that’s not a settled fact, not by a long shot.

Same can be said for automating the car manufacturing process. It’s just removing a tiny line item from the budget. It doesn’t make better or more interesting cars, it doesn’t make life easier for workers or drivers.

But, you say that as if producing things with fewer resources isn’t a useful thing in and of itself.

If the studios are stupid enough to give in to these demands, they will go under once studios that don’t agree to these demands hire “scabs” to write prompts and edit the output.

I’m as worried about people being out of a job as anyone else. But the time to worry about that was with the invention of the loom and textile mills that put seamstresses and weavers out of business. It’s a little late in the game to start trying to stop progress now.

That’s rarely been the case. If you are a writer for a studio, the studio owns the intellectual property of what you create while being paid by them. It is the writers that are trying to take that control from the studios in this negotiation.

Can you explain exactly how you learn, exactly how you experience things?

Sure we can. We can do lots of things, and often do, even if they are bad ideas.

So, you are saying that there is no place for AI assistance in writing?

The same question could have been asked about how Lotus 1-2-3 could let one person do the job of dozens, and yet it did, and tons of accountants got laid off.

Computers and automation have been replacing jobs for as long as they have existed. No one really liked losing their job, but this is the first time I’ve heard it claimed that it is unethical to do so.

And you are assuming that what you’ve seen with people playing with ChatGPT 3.5 is the limit of what AI is capable of. You don’t allow for development of both AI and the humans who work with it.

Is art about making money off of it, or is it about people getting to experience it?

Exactly: if AI can’t do as well as human writers, then they have nothing to worry about. It’s the fact that the WGA disagrees with the assessment that AI won’t be as good as or better than human writers that is causing this dispute in the negotiations in the first place.

I wonder if the posters who think it will never be as good would like to contact the WGA and tell them not to worry about it.

The assertion is that software lacks human rights. This is an opinion?

Yes!

The rest of your really really long snip-and-respond is not in a format I care to engage with.

Fair enough.

It’s a complex issue, and like most complex issues, it has simple answers. The only problem is that those answers are wrong.

If that is what you meant, then you quoted the wrong passage. What you quoted was a claim about AIs not learning the same way humans learn.

That was one part of the bit I quoted, of course, but not the whole thing. The assertion is that AIs differ from humans in relevant ways. To go all Big Lebowski on this would be unworthy of a freshman college dorm-room discussion.

The assertion that AIs don’t have human rights is a very weird non sequitur red herring.

It’s really not–but you’re ignoring half of my post each time to respond to the other half in a way that’s pretty exhausting to keep up with, so I don’t think I’ll go around this carousel much more.

The other half of your posts consist of making statements that you think are fact but are actually either wrong or a matter of opinion or unknown.

And all of your arguments could have been applied to the automated loom, which took skilled human labor and mechanized and commodified it. Intricate patterns could be stitched without years of training! Unethical!

It’s also crazy to think that AI wouldn’t make a human writer more productive. I AM a human writer, and it’s easy to see where it makes you more productive. You can use the AI for character bios, as a research assistant, for setting up script formats and boilerplate and all the tedious stuff writers have to do.

“Hey ChatGPT, my character is a mountain climber. What equipment would he be using to scale a building?”

“Hey ChatGPT, I need a character who comes from Turkmenistan. Give me a likely town for him to grow up in. What kind of job might he have had? What’s in the school curriculum there? What is the average home like?”

“Hey ChatGPT, I’m writing a page, and it’s just not coming out the way I want. Here’s the page - give me five alternatives in different voices and styles.”

“Hey, ChatGPT, check my script for inconsistencies with characters, their histories, and motivations.”

“Hey ChatGPT, I’m blocked. I don’t know where to go next in my story. Write five examples of the next page of the story, with different plot elements engaged.”

“Hey ChatGPT, go through my story and create a Dramatis Personae for me, with the name of each character and a sentence describing them.”

Won’t it be awesome when all those WGA writers do all this stuff manually, while their competitors focus on better scripts, or more scripts in the same time and for the same money?

You don’t even have to use a word that the AI gives you for it to accelerate your writing.

BTW, do the WGA writers agree to forego the use of AI themselves? Because my guess is that most of those writers are already either experimenting with it or using it heavily. Or is this only something the suits are forbidden to use?

They say that they aren’t ruling out the use of AI. But they are also saying that they don’t want the AI to have any sort of training on the scripts they are writing or have written.

So, if they get their way, they could use AI, it would just be hobbled and useless.

I don’t know if I would use it for that other than novelty. I write for my own personal enjoyment of the process, and since the last writing list I was on shut down nearly 20 years ago, I haven’t even shared it with anyone.

However, I also write a bunch of correspondence with my landlord, lawyer, CPA and other professional services, and I’ve found that ChatGPT, even if it doesn’t write quite as well as I would have, certainly writes a whole lot faster with far less effort on my part. Some don’t seem to think that saving humans time and effort is worthwhile, though.

The rest of your post seemed to be complaining that reducing the barrier to entry for creating “art” is a bad thing because it is unfair to the highly trained (rather elitist), and that people are claiming that things are as they have always been (nobody is claiming that; we are inside a major disruptive technology-based shift in human culture; they happen sometimes; technology creates niches and technology takes niches away, and then it becomes the new normal).

The idea that it’s elitist to prioritize the labor of people who have skill over the resources of the people who have capital is the most damning indictment of late-stage capitalism that I’ve seen all day. But it does help to clarify your position, so there’s that.

That isn’t remotely my position. My position is that I like the idea of powerful, unrestricted generative AI being in the hands of anyone and everyone who wants it, for free. You seem to see AI as a tool for capitalism, when I see the same AI as practically communism, putting the means of production into the hands of everyone.

A link I posted in the big Great Debates thread:

A leaked report from Google. The gist is: “Other AI companies are kicking our ass, but that doesn’t matter, because open source is gonna kick all our asses.”

Or at least a nominal fee. My fear is that it gets priced out of the range of the everyday consumer. $20 a month is reasonable. $20,000 a month is not.

And this is why how the laws around this get written are important. The greatest gift to late-stage capitalism would be to lock down AI so that you can only train on IP that you own. Then those who own tons of IP like Disney have massive amounts to use to train their AI, and the rest of us have none. The messed up thing is that those who decry “late-stage capitalism” and those who will push for laws that give them this gift are pretty much the same people.

As I said, it’s a complex issue, and those seeking a simple answer will come up with the wrong one.

See the paper I posted. Already text and image generating AIs that rival the commercial versions are available to download and run absolutely free, and they are improving at a more rapid pace than the commercial versions. Generative AI right now can be had for zero dollars a month, provided that you have a good enough computer to run it. And the open source versions are finding ways to lower the hardware requirements.

I’m afraid I find this position hopelessly naive, akin to how early Internet adopters thought the WWW would be the great leveler.

We’re not talking about an abstract utopia. We’re talking about the very real world where capitalists are resisting calls by workers to limit the use of this technology, because the workers believe it’ll put a lot of them out of work and further concentrate wealth in the hands of the company owners. This is not remotely a situation in which power is being distributed.

That doesn’t mean that a professionally produced system couldn’t have some advantages. Maybe it will, maybe it won’t. But my point was that something between free and $20 a month is something that the public can use.

If the laws are such that you can’t use publicly available information, that you have to own the IP in order to use it for training, then it’s going to be much more expensive.

My perspective is not from the technical side but the legal one: the laws could enrich the IP holders at the cost of the public, if those who say that AI has to learn under different rules than a human does get their way.

…what does this even mean?

This is the basis of how copyright and intellectual property works. This doesn’t change just because of AI.

Incorrect. Legally, you have control over how your work can be used. Again, usage is the correct term to be using here. People can’t just (legally) copy your work and give it away for free. People can’t just copy your work and use it to illustrate their picture book. People can’t just copy your work and use it on their blog posts. Profit isn’t the primary factor here. Not everything is Fair Use, or Fair Dealing.

When we are talking about ascribing human characteristics to a piece of software, I’ll quibble over the terminology all day.

And it is a tool that will be used by humans to exploit the work of thousands of creatives, without their explicit permission and without compensation, to generate billions of dollars of revenue for a handful of corporations.

This is a thread about the Writers Guild of America going on strike, and you appear to be going WILDLY off topic. We are talking about the AMPTP, and unlike you the membership of the AMPTP will make a metric shitload of profit out of this. What you do in the comfort of your own home is an entirely different matter.

Nonsense. I’ve laid out the reasons why the WGA have a problem here.

It won’t.

It won’t.

It won’t.

It won’t.

Yep.

Because, by and large, they are complete and utter morons completely disconnected from the creative process. They think they know better. Which is why they’ve been making cuts to the writers rooms for the last three years.

But we’ve seen this play out multiple times already. We saw it with the pivot to video. We saw it with NFTs. We’ve seen it with companies with over 400 million active users that fail to make a consistent profit. We saw it with the failure of Vice and Buzzfeed News, with the failure of Meta, with the soon-to-be failure of Twitter. The people in charge of these companies are not smart. They chase shiny objects. AI is a shiny object. It certainly could be helpful for writers, and the WGA is leaving the door open here. But it won’t revolutionize script writing. Because script writing is more than just words on the page.

Yeah, after being suspended from this thread, I’m giving up. Even though I’m basically arguing that you need a soul to be truly creative, I feel like I’m an atheist arguing with true believers.