I'm missing something about AI training

The difference is that you and I read/watch/listen to something and that leaves an impression.
We can then reproduce the impression it left on us. (Depending on our memory and our skill, that reproduction will be more or less literal.)
Our interpretation will always be personal.

A computer doesn’t interpret stuff, it copies stuff.

And then there is the issue that they didn’t get their learning material from legal sources. They just torrented a huge library. Which is dubious when I do it, but really not OK when a multi-billion-dollar company does it.

They copied stuff that was copyrighted, and even if they didn’t just pump the material into their computer, they would be at fault for the way they acquired the material in the first place.

The “specialness” of humans is not magic, and in this context it is specific to humans being unique in being able to be creative. We can accept that a human is able to be creative because we see them create. Understanding the process (how billions of firing neurons end up producing what they do) is not required. Why should we demand to understand the process for a different form of intelligence?

It is a silly bar. And I am not so sure that gen AI isn’t a bit chaotic itself. It is a massively nonlinear system. There may be attractor basins for outputs, but the paths to get there are likely different each run.
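One concrete source of that run-to-run variation (a minimal sketch, not any particular vendor's implementation): generative models typically sample each output token from a probability distribution, so two runs seeded differently can diverge even though every arithmetic step in between is fully deterministic.

```python
import math
import random

def sample_next_token(logits, temperature, rng):
    """Softmax-with-temperature sampling over a tiny toy vocabulary.

    Higher temperature flattens the distribution (more varied
    outputs); temperature near zero approaches plain argmax.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw a token index according to the probabilities.
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# Toy "model": fixed logits over a 4-token vocabulary.
logits = [2.0, 1.0, 0.5, 0.1]

# Two runs with different seeds can wander down different paths,
# even though the arithmetic itself is completely deterministic.
run_a = [sample_next_token(logits, 1.0, random.Random(1)) for _ in range(5)]
run_b = [sample_next_token(logits, 1.0, random.Random(2)) for _ in range(5)]
```

With the same seed the sequence is exactly reproducible; change the seed and the sampled path through the token space typically differs from the first step onward, which is roughly what "same attractor, different trajectory" looks like in practice.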

@Darren_Garrison - cool. I’d count it.

@Mangetout I doubt AI has that as an intent … or any intent. Some users do. And some have indeed attempted similar in art before. Yup. Failed.

I don’t think it’s necessary to assign AI uniquely human attributes to admit they aren’t simply copying existing works.

Is the intent of AI to outcompete artists to destruction? Cite?

It seems more like the intent of (the people who created) AI is to give artists an incredibly powerful tool.

The intent of AI itself is completely lacking - AI has no agency of its own.

AI (the algorithms) doesn’t have any intent or maybe it does - that’s philosophy, but also irrelevant. The intent of the people making it is to obtain return on their (enormous) investment by selling a thing that replaces human work. Thus, in any way that matters, the intent of AI is to outcompete humans.

It’s a popular sentiment among tech bros that AI will reduce the need for highly trained artists, musicians, authors, etc., because AI will allow anyone to produce high-quality work in those fields. And not just those: computer coding, law, and many other fields as well.

Of course nobody really sees it as “destroying art” or whatever, but there is very much a sentiment of “we don’t need experts anymore because AI will make an expert of everyone”. Elon Musk himself has said “synthetic data is the future”, i.e. there’s no more need to pay “content creators”.

Not talking about the philosophical theory of mind as pertaining to AI. Talking about the reason it’s here.

I said the same thing in my post:

Is the intent of the steam engine to outcompete humans? When we invented the steam engine, did we create the same amount of stuff as before and just slaughter the excess population?

Or did we keep everyone employed and just make orders of magnitude more stuff?

AI doesn’t replace people. It is a tool that people can use to make themselves more efficient and productive, including artists.

That’s a lovely analogy, and apart from the fact that the introduction of mechanisation in the industrial revolution did create hardship - no, we didn’t keep everyone employed, and when they complained about it we deported or imprisoned them (cite) - there is a thing happening here at a vastly different scale to that.

That’s an incredibly rose-tinted view of the thing and seems to ignore the things experts are predicting (and the news is already reporting) about significant layoffs in many industries.

And this isn’t just in tech or administrative roles; it’s happening in creative roles too.

No, we can’t, because that’s fundamentally not what creative means. My plastic molding machine can create lots of things, but we don’t say it’s creative. In fact, that’s the definition of creative: we say a work is creative if its creation involved the human attribute of creativity, like a piece of music, to distinguish it from the crate full of plastic knick-knacks the molding machine produced, which did not.

If you are assigning the human attribute of creativity to the collection of assembly instructions that make up your AI model, then you need to explain what is lacking in the collection of assembly instructions that run my molding machine that means it can’t be creative.

Just saw this. Gift link.

Pretty pertinent.

The reason it is here is manifold. Some of it is just scientific ambition. More of it is to make money. Like most technological advancement, it will likely mostly make money by increasing productivity. Jobs will change, but past experience predicts that they won’t go away.

Ah, matter settled then! No other intelligence - be it another earthly species, alien, or machine - can be creative, by definition, since your definition requires it to be human.

Yeah I’m done with this silly part of the discussion.

Is this truly like those past examples? I’m not sure it is. This is happening at significant scale, in a world with a lot more people in it, who are more dependent on continued employment in the industrialised world than ever before.

All we can say is that every past prediction that “this time is different and will result in a lasting decrease in jobs overall” has been wrong. The experts I’ve heard bet more on increased productivity than on fewer workers as the path to greater profitability.

Of course no one knows for sure.

You could say exactly the same thing at the dawn of the Industrial Revolution. It’s happening at a never before seen scale, in a world that’s never had so many people in it, all of whom are more dependent on continued employment in the feudal system than ever before.

Hell, you could say the same about the Agrarian Revolution. It happened at a never before seen scale, in a world that had never had so many people in it, all of whom were more dependent on continued employment in the hunter-gatherer system than ever before.

So what does make it not copying?

I have a computer program (again, a series of simple assembly instructions that take their input and process it in a completely deterministic way). I encode some copyrighted works as binary 0s and 1s and use them as the input to my program, along with some parameters; it spits out a new work based on the input. For program A (my fancy audio mixing software), everyone agrees the thing my program spits out is a simple copy; for some reason, though, for program B (your AI model), the thing it spits out is apparently an original work, not a copy.

Why? Program B is no less a series of simple deterministic computer instructions than program A. They are both just dumb automatons that take their inputs and set 1s and 0s in an entirely predictable way.

It’s a pretty safe bet. It’s just the Jevons paradox:

Making labor more efficient and productive through the use of AI will not result in huge decreases in the “consumption of labor”, i.e. employment.
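A toy constant-elasticity sketch of that argument (all numbers hypothetical, not an empirical model): whether total labor demanded falls or rises after a productivity gain depends on how strongly demand for the output responds to its falling effective price.

```python
def labor_demand(productivity, base_demand=100.0, elasticity=1.5):
    """Toy Jevons-style model with made-up parameters.

    The effective price of a unit of output falls as productivity
    rises; quantity demanded follows a constant-elasticity demand
    curve; hours of labor needed is quantity divided by output per
    hour.
    """
    price = 1.0 / productivity
    quantity = base_demand * price ** (-elasticity)
    return quantity / productivity

hours_before = labor_demand(1.0)  # baseline: 100 hours demanded
hours_after = labor_demand(2.0)   # AI doubles output per hour
```

With elasticity above 1, doubling productivity actually increases total hours demanded (the Jevons-style outcome); with elasticity below 1, it reduces them - which is exactly the empirical question the thread is arguing over.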

One thing that’s different from before is that the goal of the solution being developed is for it to be general purpose.
If 20% of the workforce are laid off, and by some remarkable circumstance they all land firmly on their feet and find profitable employment, that employment just becomes the next target for application of the general-purpose thing where we make more money by deleting humans from the equation.
None of the previous cases was quite like that.

So is your brain, for a certain way of looking at it. Just input in, output out. Like a Chinese Room. Consciousness is just an illusion haphazardly slapped on top, and the illusion sometimes fails.