Ok? You’re saying this as if it’s somehow in conflict with what I said, but I can’t see how it is.
If there are just relationships, no goalposts need to be moved, and the proof I gave in the other thread applies unchanged.
Even if it doesn’t, I’ve given a general argument against the possibility of conscious AI before.