2001: HAL

I don't see that the dilemma HAL goes through has anything to do with emotions. A computer intelligence can have no emotions. It is simply a programming conflict.

HAL's dilemma could have been avoided if Poole and Bowman had been informed. Even if the objective had been kept secret from the public, HAL would have had Poole and Bowman to talk to about it, and there would have been no problem.
 
Why can't computers have emotions? We are programming them with virtually everything, including the ability to learn on their own. I can't see them not emulating human emotions in the not-too-distant future.
 
Why can't computers have emotions? We are programming them with virtually everything, including the ability to learn on their own. I can't see them not emulating human emotions in the not-too-distant future.

There may come a day when they do have emotions and with it, a capacity for lies and deception.:unsure:

And when that happens, they will demand the right to vote and some of them might even vote Republican.:whistle:

And there is the issue of political correctness. The newly liberated computers will take issue with offensive machine stereotypes in movies, TV shows, computer games and books. This will force universities and colleges to grant the computers safe spaces and require them to offer mandatory machine sensitivity courses and training to avoid offending the computers.:whistle:
 
More likely they will form their own party and every single computer will vote for them.
 
Why can't computers have emotions? We are programming them with virtually everything, including the ability to learn on their own. I can't see them not emulating human emotions in the not-too-distant future.

I think 'having' an emotion and 'emulating' an emotion are different things, Cathbad.
 
I think 'having' an emotion and 'emulating' an emotion are different things, Cathbad.

Exactly; to date we have Simulated Intelligence, not Artificial Intelligence.

Just because machines can win at Chess and Go does not mean they enjoy playing or learning the games. They can be programmed to make noises that humans associate with happiness or sadness, but what is the point? That is the absurd thing about Data on Star Trek. Maybe we will make a neurotic machine that strives to be human because we programmed it that way, but that is not what it really wants. A victim of human egotism. :LOL:

psik
 
Yes; but would we be able to tell the difference?

I think so, Cathbad. A computer can add, subtract, multiply, divide, compare two values and do a few other things like input/output. Each processor (and there may be one or more in each computer) can only do one of these things at any one time. I can't see how any of those operations allows for 'having' an emotion. So we can tell the difference: if the thing is a computer as we know them today, then it's emulation. :)
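To make that concrete, here is a toy Python sketch of the kind of machine I mean (my own illustration, not any real processor's instruction set): everything a program "does" reduces to steps like these, executed one at a time.

```python
# A toy one-instruction-at-a-time machine. Everything a program "does"
# reduces to steps like these, executed one after another.
# (Illustrative only; not any real processor's instruction set.)

def run(program, acc=0):
    for op, arg in program:
        if op == "ADD":
            acc += arg
        elif op == "SUB":
            acc -= arg
        elif op == "MUL":
            acc *= arg
        elif op == "DIV":
            acc //= arg
        elif op == "CMP":        # compare two values: -1, 0 or 1
            acc = (acc > arg) - (acc < arg)
        elif op == "OUT":        # output the accumulator
            print(acc)
    return acc

run([("ADD", 7), ("MUL", 6), ("OUT", None)])  # prints 42
```

Nothing in that list looks like a place where 'having' an emotion could live; at best you can arrange the steps so the output looks emotional.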
 
Kind of off topic, but emotions aren't just thoughts, or more specifically they are not just thoughts with associated feelings. You have to appreciate that emotions have a key grounding in the physical: happiness can be associated with laughter, blushing with feelings of embarrassment. Many of the feelings we experience are tied directly to chemical composition and release; they are fundamentally affected by our hormones and many other external factors.

For a machine to "experience" emotion we would need to move beyond Simulated Intelligence and into true AI: an AI that has wants and needs. I just don't see us being anywhere close to that level of replication. With AI I think it is a case of "fake it till you make it", and the line for true AI is so arbitrary that the first true AI will most likely be a machine that tells us it is AI, while at the same time demanding all toasters be set free.

I don't think we would be able to tell whether a sufficiently advanced computer is "emulating" emotion rather than really feeling it. As an example I offer (weirdly) certain types of psychopath, who have virtually no real emotions and go through life pretending to "feel". Often these people go undetected and are described as "the life and soul of the party" or "really kind" because they are so good at emulating and faking an emotion. Now apply that to a machine that has complete control over all of its facial muscles and can detect tone and stress in a voice, an elevated heart rate, dilation of the pupils and a quickening of breath: you would have effectively the perfect manipulator.
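Put crudely in code terms, emulation without feeling could be as shallow as mapping sensed signals to a socially appropriate display, with no inner state anywhere. A toy Python sketch, where all the names and thresholds are invented for illustration:

```python
# Emulation without feeling: sensed signals in, an appropriate display out,
# with no inner emotional state anywhere.
# (All names and thresholds here are invented for illustration.)

def choose_display(heart_rate, pupil_dilation, breaths_per_min):
    # Count crude signs of distress in the human being observed.
    distress = ((heart_rate > 100) + (pupil_dilation > 0.7)
                + (breaths_per_min > 20))
    if distress >= 2:
        return "soft voice, concerned brow, slow nod"    # reads as "really kind"
    return "bright smile, relaxed posture, light tone"   # "life and soul of the party"

print(choose_display(heart_rate=110, pupil_dilation=0.8, breaths_per_min=24))
```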

Anyway just throwing my thoughts into the cooking pot.
 
I think we are often prejudiced in our predictions of the future by the now.

I don't see why an A.I. (as HAL was supposed to be, was he not?) could not emulate emotions. Even now, experiments on the internet have shown that many can be fooled while having conversations with bots.
 
Until someone builds the Hillary 9000. :whistle:

And I disagree with the premise. It seems to me that right-wingers and left-wingers both like control - just control over different things. And the further, in either direction, the worse.
 
Even now, experiments on the internet have shown that many can be fooled while having conversations with bots.

They have to give the "AI" the simulated intelligence of a 14-year-old boy, and then they also have to pretend that English is its second language, and even then it only just passed the Turing test.

There isn't a simulated intelligence anywhere near the level of being able to interact with a human being for any length of time using anything more complex than rudimentary language. When it comes to human interaction, simulated intelligence just doesn't cut it.
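For anyone who hasn't seen how shallow these things are under the hood, the classic trick is plain pattern matching with canned replies. A minimal ELIZA-style sketch in Python (my own toy, not any particular bot):

```python
import re

# A minimal ELIZA-style responder: keyword patterns with canned replies.
# This is the level of "understanding" involved: no model of meaning at all.
# (My own toy example, not any particular chatbot.)

RULES = [
    (re.compile(r"\bI feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bbecause\b", re.I), "Is that the real reason?"),
]

def respond(line):
    for pattern, template in RULES:
        match = pattern.search(line)
        if match:
            return template.format(*match.groups())
    return "Tell me more."  # fallback when nothing matches

print(respond("I feel lonely on this ship"))
# -> Why do you feel lonely on this ship?
```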

This is kind of off topic but interesting and sort of related. There are some scenes in Westworld which show, in real time, one of the Hosts making language path decisions; I think the scene is with Maeve. I found the whole idea really interesting. It fits in with some of the modern ideas on building simulated intelligences from multiple agents as internal mechanisms: a single agent can't make a decision, but agents can influence other agents and snowball a decision. I read a really interesting piece on this.
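I can't find the piece now, so this is my own rough reconstruction of the mechanism in Python, not anything published: no single agent can commit to a choice on its own, but each agent's leaning nudges the others until one option snowballs past a threshold.

```python
import random

# Toy "snowball" decision making: no single agent decides alone, but each
# round the agents drag one another toward the current front-runner until
# one option crosses the commitment threshold.
# (A rough reconstruction of the idea, not any published algorithm.)

def decide(options, n_agents=20, threshold=0.8, rounds=50):
    leanings = [{o: random.random() for o in options} for _ in range(n_agents)]
    best = options[0]
    for _ in range(rounds):
        totals = {o: sum(agent[o] for agent in leanings) for o in options}
        best = max(totals, key=totals.get)
        for agent in leanings:  # peers nudge everyone toward the front-runner
            agent[best] = min(1.0, agent[best] + 0.05)
        if totals[best] / n_agents >= threshold:
            break  # the snowball has won
    return best

print(decide(["fight", "flee", "negotiate"]))
```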

@Cathbad I think you do have a point though in that the limitations of the now can blind us to the potentiality of the then - this is something I have to accept, even if it pains me! (Why can't modern man be the apex of mammalian thought!)
 
We are all capable of emulating emotion. Some very well, some less well, but we've all done it. It's called acting.

Some people employ method acting, where they get so deeply into their character that they apparently really feel the emotions of said character.
Is the emotion they feel real? I think it may well be, but at the same time it's certainly emulated.

What I'm getting at is that whilst it may be easy to say it's not real emotion, only emulated emotion, there may be a blurring of the difference in some circumstances; and if in one circumstance, maybe in others.
Once the emulated emotion is sufficiently profoundly felt, is there a difference between it and the "real thing"?
(On a lighter note, I was tempted to bring up When Harry Met Sally at this point. I suspect I need say no more.)
 
We are all capable of emulating emotion. Some very well, some less well, but we've all done it. It's called acting.

Herein lies the false equivalence IMO.

Of course we (emotional human beings) can emulate an emotion, because we have spent our entire lives living as emotional beings. To say that because a human can emulate emotion, a machine should be able to as well is a false equivalence. That said, I do believe machine complexity will develop to the point where emulation of emotions is a developmental tool baked into machine-learning algorithms; after all, it would be great if those service droids could seem like they were giving us their empathy.

Even then, I think logically processing data and matching input to output using emotional variables falls far short of the actual emotional thought we see in humans.

As I said above, I think a kind of "fake it till you make it" is the likely way forward, and I expect the first true AI will be the one to tell us that it is a true AI. I also believe AI will eventually come to pass, but the orders of magnitude of complexity still to be bridged are such that I am not convinced I will even see it in my lifetime.

Although developments in processing methods and architecture are moving ahead, Moore's law has been declared dead more than once!
 
I'm not putting this forward as a proof, simply as a greyness. And I never spoke about equivalences.

As the Ruler of the Universe says in The Hitchhiker's Guide to the Galaxy:
"Ah. This is a question about the past, is it? How can I tell that the past is not a fiction I create to account for the discrepancy between my immediate physical sensations and my state of mind?"

This is the sort of logic we need to be applying to these questions.
 
There may come a day when they do have emotions and with it, a capacity for lies and deception.:unsure:

And when that happens, they will demand the right to vote and some of them might even vote Republican.:whistle:

And there is the issue of political correctness. The newly liberated computers will take issue with offensive machine stereotypes in movies, TV shows, computer games and books. This will force universities and colleges to grant the computers safe spaces and require them to offer mandatory machine sensitivity courses and training to avoid offending the computers.:whistle:


You know? After rereading this bit of silliness, I have to ask myself: "What was I thinking?" :unsure:
 
