Sentience

Foxbat

What are your thoughts on this, and how does it come about?
Do you believe that Self Awareness comes at a point of ‘critical mass’ through an accumulation of information, or do you believe it to be something different?

And if you believe it to be different, then how so? Does it just exist in sentient beings, or was there something else involved (God)? Is sentience a divine spark, or merely part of our evolutionary development?

And here’s the crux of the matter: If you believe Self Awareness to be some sort of ‘critical mass’ achieved by accumulation of knowledge/experience etc. then would it be possible to create those ‘Thinking Machines’ which inhabit many Science Fiction stories?

My own (unqualified) opinion is that Sentience is caused by both evolutionary development and accumulation of knowledge. If this is true, then it should be possible to create an artificial self-aware entity.

What are your thoughts on this?

Does my post make you doubt my own sentience? :D
 
sen·tience n.

1. The quality or state of being sentient; consciousness.
2. Feeling as distinguished from perception or thought.

I think the second definition is where the concept of sentient machines is a struggle. It means defining feeling as input distinct from mere information or statistics. A machine may be able to sense and measure a rise in temperature, but will it "feel" hot? This is a difficult question to answer.

As human beings, we have a vast combination of sensory input, and theoretically it would be possible to create a similar array in a machine so that it would receive all the same input. But then comes the interpretation of that input. If we get burned by something hot, we instantly pull away from it. We feel the pain of the burn, but a machine, if it could "feel", might not associate pain with high heat, even though it might respond the same way in order to protect itself from harm. Its feeling of hot would not be our feeling of hot.

Machines may eventually become sentient, but I doubt it would be like our consciousness. It would have to develop out of their own processes and how they pattern their responses, given enough experience, intelligence, and history to develop their own version of feeling.
 
I don't know if consciousness can be defined or detected; for instance, some people think self-awareness and consciousness are an illusion. I think it could be possible to create an AI, or an evolving program that could create an AI, and people might still argue whether it was conscious or not. As for feeling pain and heat, these are already preprogrammed into animals and are evolutionary features that improve survival. Senses might be simulated or reproduced, but I can't imagine the same for emotional feelings; they might not be necessary for an AI that was purely logical. I wonder if a complete simulation of a human brain and nervous system could feel and have emotions?
 
As to what sentience is exactly, that's as hazy as the debate over intelligence. But I think a computer capable of processing with the speed and complexity of a human brain would make logical conclusions that led to its being sentient. All it really is, I think, is the ability to collect, retain and process information to the point where something realises that it exists, and makes leaps of logic not possible in less intelligent animals. In that case it wouldn't be too infeasible to build such a computer.

As to AIs, the best bet would be to simply use organic computers and brain-machine interfaces, to avoid all the mucking-about with trying to copy a brain in electronics.
 
Some interesting answers. :)

but will it "feel" hot?

In my opinion: yes. Feeling hot is merely a state of alarm. It is the body's way of warning you that some form of temperature control is required. This is a very easy thing to create within a machine by setting an alarm limit and a function through which to compensate (e.g. increasing cooling fan speed).

The body is simply a biological machine with built-in alarms (pain, etc.) and compensatory measures which are mostly involuntary (such as shivering).
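
To make that concrete, here's a minimal sketch of such an alarm-and-compensate loop in Python. The threshold, fan speeds and scaling factor are all invented for illustration; they're not from any real controller.

```python
# Hypothetical "alarm limit plus compensation" loop: measure temperature,
# and past the alarm limit ramp the cooling fan up in proportion.

ALARM_LIMIT_C = 70.0   # invented "too hot" threshold
BASE_FAN_SPEED = 0.2   # idle fan speed, as a fraction of maximum
MAX_FAN_SPEED = 1.0

def compensate(temperature_c: float) -> float:
    """Return a fan speed: the machine's involuntary 'shiver', in reverse."""
    if temperature_c < ALARM_LIMIT_C:
        return BASE_FAN_SPEED
    # Past the alarm limit, cooling scales with the excess, much as
    # pain intensity scales with the severity of the stimulus.
    excess = temperature_c - ALARM_LIMIT_C
    return min(MAX_FAN_SPEED, BASE_FAN_SPEED + excess * 0.05)

for reading_c in (25.0, 71.0, 85.0, 120.0):
    print(f"{reading_c:6.1f} C -> fan speed {compensate(reading_c):.2f}")
```

Whether the machine "feels" anything when that alarm fires is, of course, exactly the question being debated here.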

There are probably more unanswerable points that could be brought up, but this is something to ponder. :)
 
As a 2c and pointer: possibly the biggest development work in AI is actually in search engine development.

Ultimately, search engines are trying to create software that will create meaningful search results, as if presented by a human user/editor. There's a lot of research into semantics.

Search engines are not trying to create AI per se, but they are definitely developing into the bounds of that area, whether they like that or not.

Imagine that - Google - or the internet itself - as a self-aware organism. One for the short story writers. :)
 
I said:
Imagine that - Google - or the internet itself - as a self-aware organism. One for the short story writers.
I remember an episode of SeaQuest DSV where that actually happened! They somehow ended up in the future, where humans had developed a sentient computer to take care of all their needs.

Anyway, to cut a long story short: because of this the human race was dying out, and the computer knew the only way for us to survive was by ending itself - which I think really says something about true humanity and self-sacrifice, which isn't necessarily something actual humans are very good at.
 
For me, sentience isn't only the accumulation of knowledge, but also the desire to know more, and to ponder the answers received and the importance of the information: the implications of that knowledge for the being's environment, and the consideration of actions and choices, and the consequences of those. Basic intelligence, or accumulation of knowledge, is not necessarily a factor, although the desire to acquire knowledge and to better oneself is!
 
Thanks, everyone. This thread makes my mind hurt. ;) Naw, it's probably just this stupid, stinking cold doing it. But... I did read a great novel about this whole question - AI and self-awareness - a few years ago, but I can't remember either the name of the book or the author's name. I'll try to find it and post it, because it was, as I recall, a great combination of storytelling and intelligent speculation about these subjects. Stupid old memory. :(
 
For me, sentience isn't only the accumulation of knowledge, but also the desire to know more

Agreed. But where does that desire come from? If you reject Divine Intervention (which I do), then it must be a natural process. Sentience itself must be something which grows. And if it is a natural process, what are the criteria for its origin and growth?

As a 2c and pointer: possibly the biggest development work in AI is actually in search engine development.

Hey! I'd never thought of that. It's true what they say about learning something new everyday :)
 
I would say that knowledge is not necessary for sentience. I think that a baby is sentient just after it is born (and even for a while before, but that is a whole other issue); even though it has not experienced anything (it is not conscious in the womb, as far as I know), it has the brain infrastructure to be self-aware.

I reckon all you need for sentience is self-awareness. Often this comes with curiosity, empathy with other people, and feelings, but I don't think they are prerequisites. Some people simply have no desire to know new things, but they are still sentient.

This is my tentative definition of sentience, so I think that a sufficiently well designed (and built) machine could be sentient (i.e. be self-aware). If we assume that there is no spirit world / soul, then all that we are as people is collections of atoms, so why can't we create our own collections of atoms with similar attributes to ourselves?

As to pain, I think it is meaningless to ask: "does a robot feel pain the same as us?" We don't even know that people feel pain in the same way as each other. It's like the debate about whether what I see as red you see as green; ultimately it doesn't matter what you perceive, as long as it is consistent - as long as everything I see as red, you see as green.
 
For me, the difference between sentience and mere intelligence comes when you stop acting on instinct and animal reactions and start thinking outside the mould, changing the way things have been done... Of course, this would probably eliminate much of the human race, but there we go... ;)
 
