Consider other scenarios:
A company produces an operating system. A copy costs hundreds or thousands of dollars, and you are strictly locked into their terms of use and their idea of how the operating system should work. Then someone comes along and creates a clone of the operating system that is 100% free, and you are free to rewrite it if you have a different idea of how it should work. Would people want that? Yes, they would. Literally billions of people use that clone operating system right now, largely in a version known as “Android”, vastly more than the number who use the paid version called “Unix”.
Imagine being able to use free applications like LibreOffice and GIMP instead of paying for Microsoft Office and Photoshop.
Imagine being able to design and print your own newsletter without having to pay a graphics designer, typesetter, and printing press.
Imagine being able to print your own plastic replacement parts for broken ones instead of paying exorbitant prices for a factory part.
Is this a world people want to live in? Yes. And it is the world we do live in.
You are looking at AI as a tool for big businesses to do things more cheaply. And that can happen. But AI is also a tool that gives billions of other people the ability to do things on their own: things good enough to suit their purposes that, while maybe not the same quality as the professional version, can be made for little to no cost. Do I want to live in that world? Yes, I do.
No, I am looking at the fact that people don’t want any kind of constraint applied to the scenario I described, perhaps often because they characterise it as being more like the scenario you described.
Both things (and more) happen, but we don’t have to treat both things the same.
You’re making AI into a “tool” for people to use.
It is not.
“AI” is a tool to use people
The clearest proof of this is that AI is fed (millions of) illegal copies of copyrighted works.
Even if “teaching” AI comes under fair use (a different discussion), the copies they used weren’t obtained legally. I am ok with schools using copies of books that weren’t paid for to teach kids. I am not ok with multi-billion-dollar corporations doing so to “teach” their products.
My heartfelt and sincere opinion on that is “legal, schmegal”.
You simping for the likes of Altman?
You a Musk fanboi?
Part of the problem is that nobody really knows for sure if it is or isn’t illegal, because it’s a new thing; neither of the two analogies people commonly use is very exact; that is:
It should be illegal because it’s just like making copies.
It should be legal because it’s just like looking around an art gallery.
(it’s not either of these things, exactly)
Even the notion of copying the materials into a separate temporary dataset for the training process is a bit grey, since, when you look at copyrighted images on the web, your browser is downloading a temporary copy to your local machine (or else it could not display them).
I carefully avoided making personal attacks on you. I see that you lack the same restraint.
There are two factors that go into training an AI. One is access to the training data. The other is access to the computer time. At the moment, the training data is free but the computer time is very expensive, restricting training to those with deep pockets. But innovations in methods and incremental improvements in microchips make that computer time cheaper as time goes on, making training potentially available to a larger number of people. Except that you want to make the training data extremely expensive, forcing it to be even more tightly limited to the multi-billion-dollar companies you oppose. And the AIs will continue to be made by people/places willing to continue ignoring the copyright issue, while compliant people/places will be attempting to catch up with one leg and one arm tied behind their back. Can’t use AIs made in the US of A? Then I’ll use the ones that will continue to be made in Russia, or China, where they aren’t being held down.
However you look at it, they did not legally acquire the books to begin with, making the whole discussion moot.
I’m not saying they should buy rights to the books they are using. Just one (1) copy.
Sure, but that feels a bit like saying an invading army should have wiped their boots on the way across the border.
Requiring AI training processes to buy copies of the work is correct, but also sort of trivial in the bigger picture of all the impacts.
I mostly agree, but it is illustrative of the whole mindset of the people involved.
You comparing them to an invading army is telling.
Agree - they have absolute disregard for the thing they are feeding on, except insofar as it can be mass produced and sold.