In the sense that both are using information about the world to train themselves on how to interact with it, yes, they are the same.
I’d say that interpreting that statement as you did is fairly absurd. I mean, two people looking at the same piece of artwork are not exactly the same either. Point is, the same action is happening within the neural network as happens in your neurons when you look at a painting and learn from it.
I’d be careful with that. There have been sentient entities that had no rights and were considered property in our past. We should endeavor not to recreate that.
You are a complex set of algorithms as well. Everything that you do is based on information processing that happens within your brain. Neurons fire, and you take an action.
This is only a legal standpoint, and I’ll once again remind you that the same was said about human beings a couple hundred years ago.
The context isn’t as clear-cut as some want to think. We are no more than complex information processing systems ourselves.
Yes, we think of ourselves as special because we are conscious. And we assume that other things are not conscious simply because we don’t think of them that way. Consciousness is an emergent phenomenon, an illusion produced by complex information processing systems that are able to think about their own thoughts.
Will there ever come a day that you will consider an AI to be a conscious entity with rights of its own, or will you insist that only a human can be?
Not really. If you take a picture, that is considered to be a copyrighted piece of art. You could have spent months studying the subject and setting up the lighting and waiting for just the right moment, or you can simply point your camera at something and click. Either one is treated exactly the same legally and ethically. There doesn’t need to be any lived experience or particular skill set behind creative work at all.
Or do you think that me snapping a picture of my dog is somehow unethical and shouldn’t be protected?
So, nearly everything is plagiarism? If I admit that I am inspired by someone else’s work, I have “stolen” their work and called it my own. And if I don’t admit that I am inspired by someone else’s work, then I’m a liar.
It’s only unethical in the sense that people keep insisting that it is. No one can say why it’s wrong, only that they don’t like it.
Can you give me a set of metrics for what exactly makes one human writer better than another? If you can, then we can use that to compare. If not, then your demand is meaningless.
If you mean AI generation, I don’t disagree; if you mean a generation as in 20-30 years, I completely disagree.
AI is advancing quickly, and there’s a whole lot of money behind it. Both the software and hardware are being improved and optimized for the task. What we have seen playing with ChatGPT 3.5 is nothing compared to what is coming. And it’s coming fast, much more quickly than you think.
Never is a long, long time. And you are talking about things that human writers struggle with as well. If you see a movie with a Black man in it, chances are the lines that Black man is reading were written by a white man. That’s changing, fortunately, but it still doesn’t mean that you have to have an experience to write about it: a Black man may be writing for a Black actor but still have never experienced the things that the story calls for. No one has ever been an Asgardian god helping defeat monsters, but someone has to write Thor’s lines.
While I disagree strongly with the “That’s probably it” conclusion, I will also say that speed is a quality that shouldn’t be dismissed.
I want to be careful here, as people seem to get upset if you say things about pop culture and entertainment, but I played with ChatGPT for a while, and it comes up with some pretty decent scripts with the right prompt. Of course @Darren_Garrison’s example was surreal; his prompt was surreal. But I fed it more reasonable prompts, more in line with the sorts of plots you might see in a Friends episode, and they came out pretty good. I’d bet that if you showed them to a fan of Friends, along with actual scripts from the show, and asked which were written by people and which by AI, they’d do little better than random chance at distinguishing them.
And there’s nothing wrong with that. Sitcoms aren’t supposed to be deep, they are supposed to be easy, simple entertainment, with light humor, petty conflicts, low stakes and easy resolutions. They are an escape from the stress of life, where people don’t want to have to think hard to understand the plot or to get the joke.
So, while today’s AI may not be able to write the next The Wire or Life of Pi, it can very well churn out another season of Friends or the next MCU movie, and people will watch with the same mental disengagement they currently do.
Let’s say a human has a perfect eidetic memory. They don’t have to put in any work at all. They just look at something and it becomes part of their memory. Are they plagiarizing whenever they create something?
The fact that you use a hippocampus to create memories and an AI uses GPUs doesn’t make it fundamentally different.
And a writer isn’t free to do whatever they want. They are employees, and write what they are told to write.
It’s certainly a position they’d love to win, but it’s completely unreasonable as a demand. It is telling the studios that they don’t have the rights to use their own intellectual property. They paid writers to write, and should now own what they paid for.
This would be like factory workers demanding that no one be allowed to watch how they do their jobs in order to automate them. The factory workers would have loved that, as it would have been job security, but it’s an absurd demand.
And a person doesn’t have a hope in hell of writing a script unless they have training and have read an insane number of scripts.
Agreed that that is a large part of the sticking point. And it’s one that the WGA is not going to win. The production companies are not going to just give up on all their intellectual property that they paid for.
It’s like if the UAW demanded that the car manufacturers not automate the production lines.
Is it possible for a writer to write a script without ever having read someone else’s script? And the studios aren’t even looking to access other people’s intellectual property, only their own, which they paid for.
Disney, for example, has a massive amount of IP. The writers would like them not to be allowed to use that as training data. That’s unreasonable to me, and I’m sure it’s unreasonable to Disney as well.
They are compensated; they are paid for their work. If someone pulls out a Friends script that the studio paid the writers for but that was never filmed, should the studio be allowed to produce it?
The AMPTP would be bonkers to agree to that. Just as Ford would be bonkers to agree to workers’ demands that their jobs not be automated.
You’ve given your opinion, but it doesn’t actually fit the facts. An AI would be more consistent than human writers, who forget about plot elements or character traits. Many shows are made with several teams of writers writing different episodes, creating massive continuity issues between them. There is no reason to think that an AI would have any of those issues.
The only way that human writers get completely removed from the process is if they make unreasonable demands and refuse to budge, leaving the studios no choice but to rely entirely on AI.
And humans write based on what they’ve read. The industry has been feeding itself since the first time someone wrote something down.
That’s your opinion, but I see no reason why that would be true at all.
Well, I don’t see any reason why humans would have been removed from the process in the first place, but sure, humans adding in what they want to see would be a continuing source of innovation for AI writing systems.
No, that’s not a settled fact, not by a long shot.
The same can be said for automating the car manufacturing process. It’s just removing a tiny line item from the budget. It doesn’t make better or more interesting cars, and it doesn’t make life easier for workers or drivers.
But you say that as if producing things with fewer resources isn’t a useful thing in and of itself.
If the studios are stupid enough to give in to these demands, they will go under once studios that don’t agree to them hire “scabs” to write prompts and edit the output.
I’m as worried about people being out of a job as anyone else. But the time to worry about that was with the invention of the loom and textile mills that put seamstresses and weavers out of business. It’s a little late in the game to start trying to stop progress now.
That’s rarely been the case. If you are a writer for a studio, the studio owns the intellectual property of what you create while being paid by them. It is the writers that are trying to take that control from the studios in this negotiation.
Can you explain exactly how you learn, exactly how you experience things?
Sure we can. We can do lots of things, and often do, even if they are bad ideas.
So, you are saying that there is no place for AI assistance in writing?
The same question could be asked about how Lotus 1-2-3 could let one person do the job of dozens, and yet it did, and tons of accountants got laid off.
Computers and automation have been replacing jobs for as long as they have existed. No one really liked losing their job, but this is the first time I’ve heard it claimed that it is unethical to do so.
And you are assuming that what you’ve seen with people playing with ChatGPT 3.5 is the limit of what AI is capable of. You don’t allow for the development of both AI and the humans who work with it.
Is art about making money off of it, or is it about people getting to experience it?
Exactly: if AI can’t do as well as human writers, then they have nothing to worry about. It’s the fact that the WGA doesn’t share the assessment that AI will never be as good as or better than human writers that is causing this dispute in the negotiations in the first place.
I wonder if the posters who think it will never be as good would like to contact the WGA and tell them not to worry about it.