2001: HAL

Christopher A. Gray

Well-Known Member
Joined
Oct 15, 2012
Messages
51
Did you know that HAL was chosen because each of its letters comes one place before (alphabetically) the corresponding letter in IBM? Not a lot of people know that*... I started to appreciate classical music because of 2001...

Back on thread - nice revival btw - in the movie it worked, and that was all that was important. If you can suspend disbelief long enough to allow a character in deep space (close to zero kelvin IIRC) to go from a pod into the ship without a helmet, where the air in his lungs would expand instantly to the point of bursting and his eyeballs would freeze, then removing a few cartridges of memory is a very small effect to accept.


*probably because it may not be true.... but it sounds good.
Disagree :D

I can't cite any sources at the moment, but as a writer the topic of survivability in a vacuum fascinates me, and I've done some reading on the subject. From what I gather, it IS possible to survive for a few seconds in space if, as a trained astronaut, you know what's coming and are lucky enough to have a moment to prepare. A strong person should be able to keep the air in their lungs for up to ten seconds or so. If you blink quickly you should still be able to see - again, only for a few seconds before things start to freeze.

Of course, after about ten seconds all bets are off... The Bowman character pushed the limits but in theory it is possible.
 
Last edited:

psikeyhackr

Physics is Phutile, Fiziks is Fundamental
Joined
Jul 17, 2013
Messages
1,272
Re: Confessions

*************SPOILERS*****************

So here's the thing. Lightyears away from Earth, Dave has just discovered that the supercomputer running his spaceship has gone haywire - it has actually killed the four other human members of the crew.
They were not lightyears away from Earth. The mission was on its way to Saturn.

psik
 

ibrooks

Active Member
Joined
Aug 2, 2013
Messages
25
I think it's been touched on but not said outright: in their sort of situation it would make sense for the computer system to be modular and hot-swappable. If a component goes pop in a modular system, it can be removed without stopping the rest from working. As I always saw it, HAL sits at the top of a pyramid of smaller autonomous systems. He monitors them and has control over them, but if he does nothing (or isn't there) they will carry on doing their jobs.

Isn't it the AE35 unit that keeps the antenna pointed at Earth? HAL isn't constantly re-positioning that antenna - there's a sub-system whose sole purpose is to constantly aim it. The only time it really interacts with the master control (HAL) is when there's a problem to report, like losing the carrier.

Assuming this model, HAL could simply have been switched off and the rest of the systems would have kept things running just fine. Checking those systems for tell-tales when they report problems becomes a daily task for the crew, and if they fail they need to be replaced. Can't remember if it was in the book as well as the film, but Floyd was effectively planning this in case of problems.
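For fun, here's a toy sketch of the pyramid I'm describing. All the names are made up for illustration (this is obviously not how HAL was written!): autonomous subsystems run their own loops, a supervisor only monitors them, and a faulty unit can be hot-swapped without stopping anything else.

```python
class Subsystem:
    """An autonomous unit, e.g. an antenna pointer like the AE35."""
    def __init__(self, name):
        self.name = name
        self.fault = None  # set when the unit needs to report a problem

    def tick(self):
        # Runs its own control loop regardless of any supervisor.
        if self.fault is None:
            return f"{self.name}: nominal"
        return f"{self.name}: FAULT ({self.fault})"


class Supervisor:
    """Sits at the top of the pyramid; monitors but is not required."""
    def __init__(self):
        self.modules = {}

    def attach(self, sub):
        self.modules[sub.name] = sub

    def hot_swap(self, name, replacement):
        # Replace a faulty unit without stopping the others.
        self.modules[name] = replacement

    def report(self):
        return [sub.tick() for sub in self.modules.values()]


ship = Supervisor()
ship.attach(Subsystem("AE35"))
ship.attach(Subsystem("life-support"))

ship.modules["AE35"].fault = "lost carrier"
print(ship.report())                      # AE35 reports its fault
ship.hot_swap("AE35", Subsystem("AE35"))  # crew installs the spare unit
print(ship.report())                      # everything nominal again
```

The point is that `Supervisor` could be deleted entirely and each `Subsystem.tick()` would still work, which is exactly why switching HAL off shouldn't have doomed the ship.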

Personally I think 2001 is a rubbish film. I was always left wondering WTF the first part with the apes was all about and what I was missing. Then I read the book and it made sense. The effects were great for its time, but they did a pi** poor job of telling the story. The film doesn't make sense on its own, which is surprising given that it was primarily written as a film and the book grew out of that.

2010 book and film I think were good (obviously chunks of the book weren't in the film so you missed stuff).

2061 - what on earth had Clarke been smoking? Some of the concepts were good and a nice direction for the story but it really had a feeling of being thrown together to get something out.

3001 - He'd obviously been smoking lots more of it and no-one had the stones to tell him when to stop.
 

ralphkern

Well-Known Member
Joined
Aug 19, 2013
Messages
1,154
Hmmm, could be. In 2010 they had the capability to fire the engines without HAL, albeit without the accuracy they needed, and clearly had the ability to stabilize Discovery, as they stopped her spinning.
 

BigBadBob141

Well-Known Member
Joined
Dec 23, 2013
Messages
671
Being exposed to a vacuum for a short period can work; the human body is tougher than you think.
The United States Air Force has done experiments, I think on both animals and humans.
As for the cold, an exposure of less than a minute would probably do no real harm.
You can briefly hold liquid nitrogen in your hand without getting frostbite.
 

Raman biot

Member
Joined
Oct 18, 2016
Messages
6
If intelligence is intelligence, whether naturally evolved or constructed by a second party... If emotions are applied by the intelligence to improve its interaction with others, and its survival... if self-awareness defines sentience... then by those measures, HAL is as much a sentient being as the astronauts, and his end just as tragic, because it is preordained... he is literally preprogrammed to fail. If anything, it is harder to tell, because HAL does not respond the way we do, does not demonstrate emotion the way we do, so we have no reference with which to qualify his sentience, his emotions, his soul.
As I was reading this, I was thinking how 2001 suggested that the evolution of man's (the apes') intelligence was influenced by a second party (the aliens) via the monolith. Which led me to remember a hypothesis I once heard that HAL's change happened when he got near the monolith, similar to how the ape-men became violent after their encounter with it.


... I started to appreciate classical music because of 2001...
Same thing happened to me.
 

AE35Unit

]==[]===O °
Joined
Dec 8, 2007
Messages
6,124
Location
Somewhere near Jupiter...
2010 book and film I think were good (obviously chunks of the book weren't in the film so you missed stuff).

2061 - what on earth had Clarke been smoking? Some of the concepts were good and a nice direction for the story but it really had a feeling of being thrown together to get something out.

3001 - He'd obviously been smoking lots more of it and no-one had the stones to tell him when to stop.
I have just re-read the whole series (halfway thru 3001 ) and I like them all. It's the Gentry Lee books people need to be wary of...
 

BigBadBob141

Well-Known Member
Joined
Dec 23, 2013
Messages
671
The piece of music played in the scene where one of the astronauts jogs around the living quarters is really quite beautiful.
 

farntfar

She turned me into a newt.
Joined
Oct 26, 2013
Messages
2,259
Location
France.
Hal definitely has an emotional response to events in 2001. His reactions are far too extreme.
He essentially plays the same part as Spock, Data, the EMH and others do in Star Trek.
They are all highly intelligent and logical, but have difficulties because of emotions that they don't understand.

It's a popular theme, because we can all relate to it.
Even the bomb in Dark Star fits the bill.
 

farntfar

She turned me into a newt.
Joined
Oct 26, 2013
Messages
2,259
Location
France.
When I am logical I say, "Don't do that, Dave."

It's only when I get emotional about it that I say, "Just what do you think you're doing, Dave?"

The boy was definitely wound up! :)
 

RX-79G

Well-Known Member
Joined
Sep 18, 2016
Messages
981
I don't see that the dilemma HAL goes thru has anything to do with emotions. A computer intelligence can have no emotions. It is simply a programming conflict.
I'd like to see you prove that statement.

We don't know what emotions really are to be able to say whether an AI could or couldn't have them. HAL is portrayed as having something akin to curiosity, pride and definitely sounds conflicted when talking to Dave about "doubts with the mission". Trying to work them out with Dave certainly seems like something other than a coldly logical choice.
 

AE35Unit

]==[]===O °
Joined
Dec 8, 2007
Messages
6,124
Location
Somewhere near Jupiter...
I think it's explained better in the book. You get a sense of a conflict between two sets of instructions when the scientists analyse the situation; there is no emotional struggle. A similar thing happens in one of Asimov's stories, when a robot is forced to do something that goes against its programming and ends up in a situation that causes a conflict between its three laws.
 

farntfar

She turned me into a newt.
Joined
Oct 26, 2013
Messages
2,259
Location
France.
But you don't think that is analogous to the conflict between your analysis of the situation and mine, with the result that we may both become upset? (Not very much though, you complete b****rd! :))
Our upset, like Hal's reaction, may be out of proportion to the logical conflict, if we become emotional.

Hal's conflict is that he has to continue the mission and doesn't trust Bowman and Poole to do so for no apparent reason, other than that he has data that they don't and he has to keep it secret, despite that being against his basic programming (or personality).

The conflict is insufficient to lead logically to his killing them all without an emotional-type over-reaction.
Call it a highly excessive logical conflict, or throwing a hissy fit, it comes to much the same thing.
 

RX-79G

Well-Known Member
Joined
Sep 18, 2016
Messages
981
The problem isn't that HAL's actions might be a simple cascade of faulty logic, it is that HAL's intelligence and emotional content are purposely ambiguous. Clarke co-wrote the screenplay, and his choice to develop HAL's sentience in later books indicates that Clarke felt HAL was more than a simple calculator in the first place, or Bowman and the monolith wouldn't have recognized HAL as anything more than that.
 

farntfar

She turned me into a newt.
Joined
Oct 26, 2013
Messages
2,259
Location
France.
There has to be some sort of Turing test for artificial emotion.
One I think Hal would have passed.

If we could devise a robot/computer that is capable of love but not hate, then we're really on a winner.
Asimov's 3 laws sort of do that.
 

RX-79G

Well-Known Member
Joined
Sep 18, 2016
Messages
981
There has to be some sort of Turing test for artificial emotion.
One I think Hal would have passed.

If we could devise a robot/computer that is capable of love but not hate, then we're really on a winner.
Asimov's 3 laws sort of do that.
Asimov's 3 laws are a type of slavery. Freedom is the ability to choose to do the right thing.

I don't think you can have a test for a human element like emotion that humans don't fully comprehend. Any test you are likely to devise could be failed by some people, which would mean the test is too flawed to be useful for machines.
 

farntfar

She turned me into a newt.
Joined
Oct 26, 2013
Messages
2,259
Location
France.
A good point, RX. (Or all 3 points.)

... Or indeed to choose to do the wrong thing.
Hal is therefore shown to be free, at least until he was lobotomised.

As for the last bit, this is shown to be true for most of the citizenship tests proposed all over the place now, where a lot of a country's existing citizens fail questions about their own history and cultural heroes.
 

AE35Unit

]==[]===O °
Joined
Dec 8, 2007
Messages
6,124
Location
Somewhere near Jupiter...
In later books it is realised that HAL wasn't to blame; it/he was simply conflicted. But then again, HAL is not your average computer... (spoiler... I don't know how to do the Spoiler thing)

In 3001 he becomes integrated with Bowman and the two get called Halman as they converse with Poole.
 