I don’t think we’re talking about any ‘Anti-AI Purists’. We are talking about people trying to get around a law. People have come up with plenty of ways to do that through the ages.
Have you done photo-editing by painfully hand-selecting pixels? I know I’m never going back to that if I can possibly avoid it. Law be damned.
Sure - but generally not by making that much work for themselves.
But OK, sure, some small handful of technoLuddite guys are going to painfully make photorealistic celeb porn pixel by pixel in MS Paint because it’s their last legal refuge.
Like I already indicated, if this law doesn’t criminalize them, that doesn’t mean this law is useless. It means we need more laws to cover those sex offenders too.
I assume the law specifically covers AI generated media because
(a) AI is a big buzz right now
(b) The rash of recent events in schools has all involved AI.
(c) The narrower a law’s scope, the more likely it is to pass. “Made by AI” is a nice clean demarcation that avoids the “But what if it’s a stick figure with a face pasted on it?” nonsense arguments that derail legislation.
I doubt that “But AIs can’t be covered by copyright because they don’t have personhood, so…” applies to it. But I also don’t really care whether that factored in or not.
Curiously, I went looking for examples of states that passed a law explicitly about AI-generated content and had it overturned on First Amendment grounds, and couldn’t find anything. Looking at the governor’s statement, it sounds as though he wanted the language shored up in case of a challenge, because similar laws in other states had been challenged but there was no resolution yet.
Again: we are not talking about “technoLuddite guys”. Just as somebody who loves using an app on their smartphone to pay for things will still use cash to buy something illegal, some people will forgo AI to skirt the law.
Also, there is no need to create an image “pixel by pixel”. Without AI, things would certainly be more difficult. But it would still be a matter of mostly cutting and pasting existing images.
Vanishingly few will. This stuff has ballooned because it’s easy. Make it hard, and very few will bother. Like I said, this minority:
a) doesn’t occur in numbers sufficient to concern me and
b) (how many times do I have to repeat myself) I would favour criminalizing those guys as well.
To get it looking realistic? Nope. It’s not that easy without modern tools. For a while, in ’99–2001, I did that kind of thing as part of my living. It was hardly just cut-and-paste.
So write a law that’s based on the content, and which gets all the offenders, instead of a law that’s based on the techniques.
Why not both?
A federal law against non-consensual deepfake porn (as part of a greater ban against non-consensual intimate imagery) has been signed into law.
President Donald Trump signed the Take It Down Act into law, enacting a bill that will criminalize the distribution of nonconsensual intimate images (NCII) — including AI deepfakes — and require social media platforms to promptly remove them when notified.
[…]
The law makes publishing NCII, whether real or AI-generated, criminally punishable by up to three years in prison, plus fines. It also requires social media platforms to have processes to remove NCII within 48 hours of being notified and “make reasonable efforts” to remove any copies.