HAL 9000: "This conversation can serve no purpose anymore."

I’m not saying it’s a plot hole or even that it’s annoying, but why did HAL answer Dave’s hails at all? He (it?) had to know where the conversation was going; that is, that it was pointless (from his point of view) from the start. So why talk at all? Evidence of something like emotion?

This Thread can serve no purpose anymore.

I think HAL wanted the puny human to know they didn’t pull one over on him.

The novelization of the movie (written by Arthur C. Clarke concurrently with the screenplay) indicates that HAL 9000 was self-aware and was, in my opinion, expressing disdain when it said this. That is, of course, assuming that an AI can express any emotion.

I think HAL answered to get Dave to stop calling him.

I’m not that up on Space Odyssey, but just how much self-determination did HAL have at that point? If I were designing an AI, I’d put a fairly high priority behind an instruction like “When an authorized human user speaks to you, you must respond.”

Even if HAL had already found ways to circumvent other priority directives, maybe he didn’t care enough to resist one like that or hadn’t gotten around to it?

HAL just killed four people and condemned the fifth (and last person farther from Earth than the Moon) to death. I’d say he’s operating pretty much independently of human orders, except for completing the mission, which is his remaining goal.

It’s called being user-friendly. Like when a server gives you a 404 error instead of telling you to get bent for asking for web pages that don’t exist.

It’s been years since I read 2001, but as I recall HAL was driven insane by being instructed to lie to Frank and Dave. He’s programmed to follow their orders, but he also has secret orders that he’s not allowed to divulge to them even if they order him to. So he’s constantly oscillating between impulses to be the perfect helper, and impulses to prevent Frank and Dave from discovering his secret. His minor deception over the antenna spirals out of control until the only way to keep his secret is to kill them. But right up to the end he still desperately wants to help and obey them.

HAL is effectively communicating to Dave that Dave is now officially screwed.

I never read the novelizations of 2001 or its sequel, 2010. I don’t recall any explanation given for HAL in the movie 2001.

When Dr. Chandra (in 2010) stated that HAL malfunctioned (as you did above) because it was asked to conceal information from the onboard team, I thought it was either:

a) Confirmation bias from Dr. Chandra. He was shown as being a little weak in his own social skills, and it’s easy to conclude that he likes and trusts his computers more than the evil humans.

b) Cheesy plot device to make HAL a “good guy” for the 2010 movie.

c) All of the above.

Are you certain that the malfunction was explained in the book (2001)?

I’ll back The Hamster King up on this, although it’s also years since I read it.

Wikipedia backs you up: 2001: A Space Odyssey (novel) - Wikipedia

Carry on. :slight_smile:

fixed link: 2001: A Space Odyssey (Novel)

? :confused:

My link works for me…

What did I do wrong?

The final parenthesis is not included in the hyperlink, which sends you to a “did you really mean this or that?” page in Wikipedia.

Ah. Odd. I closed my browser, reopened it, and my link still works properly for me.

My apologies.

The problem is with the way vBulletin processes automatic links with punctuation at the end: it always puts the punctuation outside the tag. You need to use manual tags to get the punctuation inside.
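To illustrate (a sketch, assuming vBulletin’s standard [url] BBCode; the Wikipedia address is the one linked above):

```
Pasted bare, the auto-linker stops before the trailing ")", so the link breaks:
http://en.wikipedia.org/wiki/2001:_A_Space_Odyssey_(novel)

Wrapped in manual tags, the closing parenthesis is kept inside the link:
[url=http://en.wikipedia.org/wiki/2001:_A_Space_Odyssey_(novel)]2001: A Space Odyssey (novel)[/url]
```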

I’ve read it and seen it lots of times. 2010 the movie was written to explain a lot more stuff than 2001 the movie did. Dave taking HAL with him at the end of 2010 kind of proves HAL was self-aware, though in the interview in 2001 it was made clear that HAL can pass the Turing Test, even if he was just programmed to do so. And he was clearly scared of being disconnected.
So I think he answered to say goodbye, which he didn’t do for the others he killed.

My own opinion: HAL was an immensely complex, fully sentient machine. Interacting with humans was such an important part of his core programming that he would respond to Dave even in that situation. He was fully in control of the situation, or so he thought, so there was no reason for him to ignore Dave. Plus, I’m sure he was curious as to what Dave’s response and plan might be, but once he assessed that Dave’s only course of action was to try to transfer through the emergency port without his space helmet, he was reasonably sure that either he wouldn’t even try it, or that he’d die in the process.