Ideological Turing Test

In this blog, a strong atheist writes about her debates with her devoutly Catholic boyfriend. In keeping with those explorations, she’s decided to host an ideological Turing test:

How valid an exploratory device is this?

I find it immensely useful to argue political, economic, and legal issues from both perspectives with the goal of sounding “plausibly supply-side” or whatever. At the very least it makes you aware of the weaknesses of your opponent’s arguments, and at best it makes you re-adjust the underpinnings of your beliefs in these areas.

However, I’m not sure that it translates well to the spiritual or religious spheres, primarily because both the positive effects and the underlying beliefs are not merely logical but emotional. Rather than honestly evaluating the basis of thought, it seems much more akin to play-acting.

Or, to put it differently, passing a Turing test in economics means I at least understand the arguments of my opponents. Passing a Turing test in Christianity certainly doesn’t mean I “understand” Christ.

It sounds just like an assigned-position debate, only with fewer rules. Those seem to work.

An assigned debate position does not have the Turing test component to it.

The original Turing Test, if passed, tells you there’s perhaps something new and interesting in the world: an artificial intelligence. It’s not a given that such a thing is possible. The topics of conversation during a Turing Test are almost irrelevant. What matters is that the machine behaves indistinguishably from an intelligent human.

On the other hand, an atheist who can convincingly pass as Christian, or vice-versa, is merely telling us that he’s familiar with the opposition’s arguments, and perhaps that he has some acting skills. I don’t find that any more useful or interesting than if everyone on both sides just acted as themselves, presenting their cases honestly.

So what?

Seriously. This goes to the heart of the original Turing test, which assumes there is no meaningful difference between a computer that “has” consciousness and a computer that only “behaves” as if it has consciousness.

Well, at best I’d guess it could disprove any notion that one must have an emotional awakening, to be “born-again”, in order to convincingly appear to be a born-again Christian.

The value of this, I don’t know offhand. I’d kinda rather give a theist my “eighth-day” challenge, in which I produce a copy of Genesis identical in all respects to a conventional bible but with creation taking seven days and God resting on the eighth, and ask the believer which version I should choose, and why.

Understood. And for the economic arguments, for example, I think this line of inquiry works - a “fake” Keynesian is just as good as a real one. But I don’t think this is true for religion.

If God exists, then there is a difference between a true believer and one pretending to be one. Because convincing another human you believe is a far cry from convincing an omniscient being.

I guess one could try to argue that if there is no difference between the true believer and the pretender (in word or deed) then that is somehow evidence of the falsity of the “true religious experience”, but I’m not sure I would try to make that argument.

And I don’t think it’s helpful for the pretender, because logical arguments for God or religion are never the compelling ones - every conversion story I’ve ever heard is based on an emotional response, not a logical conclusion. Folks might deduce themselves out of religious belief, but I’ve never heard of someone deducing themselves into one.

A notable version of this, I guess, is the nonbeliever who comes to believe his own claims: typically a cynical con-artist who makes up lies, finds those lies gather credulous and worshipful followers, then gradually succumbs to delusions of grandeur. L. Ron Hubbard and (arguably) Joseph Smith being examples.

I think it would be an interesting experiment. A non-believer might be aware of all of the believer’s positions but that doesn’t mean he would present them the same way a believer would, even if he was trying to do so. (I’m reminded of the classic SF story The Moon Moth by Jack Vance.)

And I think it could produce useful results. Suppose we found that people who were genuine believers behaved differently from people who were pseudo-believers. It would be useful to be able to tell who was only faking their beliefs in public debates.

Not certain what the point of this is. Certainly, arguing a side opposite your own is good discipline - if nothing else, it teaches the person doing it the true strengths and weaknesses of their opponent. But what is added by the “Turing test” aspect?

A quick note: you quoted Jas09 in your post, but attributed it to me.

Fair enough. I’d meant to quote you and him with some kind of brilliant insight that tied your posts together with references to Deep Blue and John Henry but my train of thought derailed.

I follow that blog and considered this an intriguing idea, but after seeing the actual implementation I don’t really think it’s worth much. One of the things about the actual Turing test is that it happens in real time (or thereabouts), whereas here the participants have essentially unlimited time to go and look up any answers they need. There’s also no ability to ask follow-up questions, and the actual questions presented are scripted in advance. That makes it more of a research challenge than a test of what it’s trying to accomplish.

I agree that the ability to convincingly argue your opponents’ philosophical position means you probably have a better understanding of the issue than they do, if they can’t articulate your position in return; but this particular effort isn’t going to prove anything one way or the other.

I agree. The ability to step outside one’s own worldview and see things from another person’s perspective is a fundamental part of human wisdom, and one that’s been much neglected in recent generations. However, I also agree with Ambrosio Spinola that the particular constraints put in place by this blogger undermine the exercise.

No harm done . . . and, I’d be curious to hear how John Henry fared against Deep Blue.

(A total wipeout, I bet.)

Well, Deep Blue said “P-K4” and then John Henry hit it with a 20lb sledgehammer.

One thing I’d want to be sure of: Are the participants trying to pass as a deep, insightful (atheist/theist), or just a typical one? There are plenty of atheists who don’t really contribute much beyond mocking the “magical sky fairy” or the like, and plenty of theists who don’t contribute much beyond “I know because the Bible tells me so”. Either of those would, I think, be pretty easy to mimic. The more well-reasoned ones, though, not necessarily so much.

Then again, though, the best advocates of both sides already tend to be the ones most familiar with the other’s arguments. So maybe the most difficult to mimic would be the ones in between.

Ah! Ivan’s Gambit. Well played.

It is an interesting concept, but the test might run into problems in some areas based on Jonathan Haidt’s research.

http://www.ted.com/talks/jonathan_haidt_on_the_moral_mind.html

With the liberal/conservative difference partially based on how the world is viewed, a liberal’s limited ability to understand the Loyalty perspective of a conservative could impair his ability to truly mimic one.

In the same sense, mimicking a True Scotsman believer might be tough for a True Scotsman atheist, and vice versa. I could try to parrot the outspoken atheist, but if you made me keep going I would probably fumble at some point.