Posted: Mar 11, 2011 [ # 16 ]
Senior member
Total posts: 623
Joined: Aug 24, 2010
Hans Peter Willems - Mar 11, 2011: C R Hunt - Mar 11, 2011: I think you might be mixing two different semantic meanings of experience here. The first meaning is “multiple encounters with a situation” and the second is “sensory and/or other multiple inputs associated with a particular situation.”
I don’t see a distinct difference between the two. For me it is ‘to experience something’. Either multiple times, having different experiences, or in particular having a specific experience involving certain situations and maybe specific ‘sensors’.
The way in which we encounter a situation is determined by our sensory input (the second definition of experience), but the encounter itself can also be deemed an experience (the first definition). This is the difference between saying “A bot doesn’t need experience with text in order to interpret it” and “a bot doesn’t need a nose in order to read text”. The bot can still have an experience with text without smelling it. (The second definition.) But the bot isn’t going to learn about text if it doesn’t have textual training, or experiences. (The first definition.)
What senses are necessary in order to fully experience an event is a separate argument. (The answer is likely to be highly specific to the event itself.) My point was that Victor’s initial post seemed to entangle these two meanings in a way that set up a (false) ultimatum: either a bot doesn’t have experiences, or it necessarily must have senses. That ultimatum is simply a semantic trap built on the definition of “experience”, and I was trying to point that out.
Posted: Mar 11, 2011 [ # 17 ]
Senior member
Total posts: 623
Joined: Aug 24, 2010
Jan Bogaerts - Mar 11, 2011:
Our nervous system is no different from electronic sensors; both just report a value back to the brain. So far there is no difference. However, the real ‘magic’ happens when our brain references that value against ‘previously stored values’, either learned (experienced) before or readily embedded in our instincts. This ‘calibrated’ frame of reference (e.g. the ‘danger threshold’) gives us a feeling, thus interacting with our emotions. What we finally experience is how a certain temperature makes us feel. Then we react to that.
That’s basically also how I understand it.
I agree. The key to experiencing pain/pleasure/etc. is not simply in the recording of a pain value, but in how that changes the state of our minds (the ‘calibrated’ frame of reference). A well-designed bot brain is capable of this as much as a human brain.
Of course, this all feels vaguely artificial if there aren’t physical reasons for the bot to behave the way it does. If it says it’s cold when the temperature is cold for a human, but perfectly fine for the computer housing the bot, then the whole thing seems a bit contrived. Or at the very least, the bot has poor ‘instincts’. But there are hypersensitive people too.
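To make the ‘calibrated frame of reference’ idea concrete, here is a minimal sketch in Python. Everything in it (the class name, thresholds, and ‘feeling’ labels) is invented purely for illustration; the only point is that the raw reading gets referenced against stored values, and it is the resulting state change, not the reading itself, that stands in for the feeling.

class ThermalSense:
    # Toy "calibrated frame of reference" for a temperature sensor.
    # The thresholds stand in for instincts / learned values; the numbers
    # are arbitrary and only serve the illustration.

    def __init__(self, comfort=(18.0, 26.0), pain=40.0):
        self.comfort_low, self.comfort_high = comfort
        self.pain_threshold = pain
        self.state = "neutral"   # the bot's current "feeling"
        self.history = []        # raw readings, i.e. mere recording

    def sense(self, celsius):
        # Storing the value alone is not yet an experience...
        self.history.append(celsius)
        # ...the experience is the state change caused by referencing the
        # reading against the calibrated frame.
        if celsius >= self.pain_threshold:
            self.state = "pain"
        elif celsius > self.comfort_high:
            self.state = "uncomfortably hot"
        elif celsius < self.comfort_low:
            self.state = "uncomfortably cold"
        else:
            self.state = "comfortable"
        return self.state

bot = ThermalSense()
for reading in (21.0, 31.5, 44.0):
    print(reading, "->", bot.sense(reading))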
Posted: Mar 11, 2011 [ # 18 ]
Senior member
Total posts: 974
Joined: Oct 21, 2009
I consider:
1) the bot itself
2) the input
3) language understanding
4) its knowledge base (data and rules)
5) previous conversations
6) feedback: “out of band” signals of success or failure.
The ‘goal’ of the bot is proper interpretation. It could ‘experience’ feedback via an out-of-band ‘signal’ that tells it whether it understood properly.
Different parsing approaches could then even be changed, weighted by that feedback.
Basically, the bot’s “environment” would be those items listed.
I’m not sure whether that OOB feedback necessarily needs to be grounded, though.
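A rough sketch of how that out-of-band feedback could re-weight competing parsing approaches; the strategy names, weights, and update factors below are all hypothetical placeholders, not a description of any existing bot.

import random

# Hypothetical parsing strategies; in a real bot these names would map
# to actual parser implementations.
strategies = {"keyword_match": 1.0, "grammar_parse": 1.0, "statistical": 1.0}

def pick_strategy():
    # Choose a strategy with probability proportional to its current weight.
    names = list(strategies)
    weights = [strategies[n] for n in names]
    return random.choices(names, weights=weights, k=1)[0]

def apply_feedback(strategy, understood):
    # Out-of-band feedback: nudge the weight up on success, down on failure.
    factor = 1.1 if understood else 0.9
    strategies[strategy] = max(0.1, strategies[strategy] * factor)

# One round: interpret the input with a chosen strategy, then learn from
# the OOB success/failure signal.
chosen = pick_strategy()
apply_feedback(chosen, understood=True)
print(chosen, strategies)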
Hum… we’re at 4/5, 80% say a computer can experience. Very interesting!!!
Posted: Mar 11, 2011 [ # 19 ]
Administrator
Total posts: 3111
Joined: Jun 14, 2010
Wow! So much to process here! It’s going to take me DAYS to get through all of this!
First things first. Let’s not confuse “experiencing pain” with “responding to pain”. Let’s take the analogy of “hot and cold” between a human and a computer that has a temperature sensor. If you place me in a small room where the temperature can be raised and lowered (I’ll volunteer, since I don’t want anyone else getting injured), I’ll “experience” heat as the room is warmed. As the room gets hotter and hotter, I’ll feel more and more heat, and at some point, that heat becomes painful. If this continues, I’ll get burned, and damage will occur.
Now, let’s look at the same scenario with a computer. As the ambient temperature in the room increases, the sensor reports the changes in temperature, and the computer stores the readings in a temporary buffer; thus, the computer “experiences” the heat. Once the temperature in the room goes above the computer’s “safe operating tolerance”, damage begins to occur. This is a direct analogue of “pain”, since pain is an indication of damage. So, in my opinion, a computer can be made to “experience” pain. Now whether it can respond to pain is a completely different story.
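Dave’s scenario reduced to code, as a sketch only: the tolerance value, the buffer size, and the sensor-reading function are placeholders, since a real machine would get these from its hardware.

from collections import deque

SAFE_OPERATING_MAX = 70.0            # placeholder tolerance, degrees Celsius
recent_readings = deque(maxlen=100)  # the "temporary buffer"

def read_ambient_temperature():
    # Placeholder for a real hardware sensor query.
    return 72.5

def monitor_once():
    temp = read_ambient_temperature()
    recent_readings.append(temp)      # the machine "experiences" the heat
    if temp > SAFE_OPERATING_MAX:     # beyond tolerance: damage territory,
        return "pain"                 # i.e. an indication of damage
    return "ok"

print(monitor_once())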
I’ll now go back through the posts, and try to catch up.
Posted: Mar 11, 2011 [ # 20 ]
Senior member
Total posts: 494
Joined: Jan 27, 2011
Dave Morton - Mar 11, 2011: If you place me in a small room where the temperature can be raised and lowered (I’ll volunteer, since I don’t want anyone else getting injured), I’ll “experience” heat as the room is warmed. As the room gets hotter and hotter, I’ll feel more and more heat, and at some point, that heat becomes painful. If this continues, I’ll get burned, and damage will occur.
Dave, I differ from your view because, imo, the part where you ‘experience the heat’ is already a ‘reaction’ to the sensory input. As I described (but you say you haven’t read everything here yet), experiencing happens after the sensor value has already been stored; what you are feeling is the reaction to the sensor level in relation to your ‘frame of reference’ (i.e. preprogrammed values).
Posted: Mar 11, 2011 [ # 21 ]
Senior member
Total posts: 623
Joined: Aug 24, 2010
I agree, Hans. When you have a mental reaction to heat, you have responded to it by changing the state of your mind. This is what I consider to be “experiencing” heat. It is not an objective statement of physical reality. (Both you and the computer are in a hot room.) And it is not a statement of data recording. (Both you and the computer are writing the temperature information to memory.) It is a statement of the fact that objective reality is influencing the way you behave (if only by changing your brain function). This is more than simply recording a temperature, assuming the computer would record the temperature no matter how hot or cold it gets.
But I think this is a semantic distinction. One could define “experience” as simply recording some information, if one wanted. Then “reacting” to heat would be what I consider “experiencing” it. The key thing is to define your terms clearly and be self-consistent.
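One way to see the distinction in code terms, with everything below invented for illustration: a pure logger records the value the same way no matter what it is, while an ‘experiencing’ system also lets the value alter an internal state that shapes its later behaviour.

def pure_logger(log, temperature):
    # Pure recording: exactly the same action no matter what the value is.
    log.append(temperature)
    return log

class ExperiencingBot:
    def __init__(self):
        self.log = []
        self.chattiness = 1.0   # stands in for "the way you behave"

    def sense(self, temperature):
        self.log.append(temperature)   # it records, just like the logger...
        if temperature > 35.0:         # ...but the value also changes its state,
            self.chattiness *= 0.5     # which changes how it behaves afterwards.

bot = ExperiencingBot()
for t in (20.0, 38.0):
    bot.sense(t)
print(bot.log, bot.chattiness)   # the 38.0 reading left a mark on the bot's state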
Posted: Mar 12, 2011 [ # 22 ]
Senior member
Total posts: 494
Joined: Jan 27, 2011
CR, great reply and, as I see it, spot on.
Posted: Mar 12, 2011 [ # 23 ]
Administrator
Total posts: 3111
Joined: Jun 14, 2010
@CR:
So if, as a result of increasing temperature in the room, the computer causes a cooling fan to activate, then what? And as the temperature increases, I remove my shirt? To my way of thinking, given your premise, both myself and the computer have reacted to the present situation, and performed an action to prevent or reduce damage. Are we “experiencing” the heat now?
I agree that this is a matter of semantics, but I also see this as a distinction without a difference.
Posted: Mar 12, 2011 [ # 24 ]
Senior member
Total posts: 494
Joined: Jan 27, 2011
Dave Morton - Mar 12, 2011: So if, as a result of increasing temperature in the room, the computer causes a cooling fan to activate, then what? And as the temperature increases, I remove my shirt? To my way of thinking, given your premise, both myself and the computer have reacted to the present situation, and performed an action to prevent or reduce damage. Are we “experiencing” the heat now?
Good question, Dave.
I would say that the only difference is that you are aware of what happens, which makes it an experience. The computer is not aware, because there is no interaction between the sensing of the temperature and any ‘awareness system’ that changes the ‘state of mind’ of the computer (as CR eloquently pointed out).
Now think about this scenario instead: you are sleeping and it is so hot that, while you are asleep, your preservation algorithm kicks in and you throw off your blanket to cool down a bit. When you wake up (returning to a conscious state), you discover you have thrown off the blanket, but you have no memory of doing so because there was no conscious experience related to the action.
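That sleeping scenario could be sketched as two separate paths: a reflex loop that acts on the sensor value directly, and an episodic memory that only gets written while ‘conscious’. All names and thresholds below are made up for the illustration.

class Sleeper:
    def __init__(self):
        self.awake = False
        self.blanket_on = True
        self.episodic_memory = []   # only written to while conscious

    def reflex(self, temperature):
        # Preservation path: runs whether conscious or not.
        if temperature > 30.0 and self.blanket_on:
            self.blanket_on = False                       # the blanket gets thrown off...
            if self.awake:
                self.episodic_memory.append("threw off blanket")
            # ...but while asleep nothing reaches episodic memory.

s = Sleeper()
s.reflex(32.0)            # happens during sleep
s.awake = True            # waking up
print(s.blanket_on)       # False: the action did happen
print(s.episodic_memory)  # []: yet there is no memory of doing it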
Posted: Mar 12, 2011 [ # 25 ]
Administrator
Total posts: 3111
Joined: Jun 14, 2010
It’s my thought that, as humans, we all have a somewhat biased viewpoint in this matter, since we have no true frame of reference from the viewpoint of a computer/AI entity to work with. All we can do is make assumptions based on our imaginations. I used to have some very detailed and “energetic” discussions with a good friend of mine over whether or not a spoon has objections to its use while we eat a bowl of cereal. That may sound like a very silly argument to have, but it points out that we just “don’t know” what “the other side” is thinking/feeling/experiencing. That spoon may have been completely happy and content as a diffuse quantity of soil, until it was ripped from the earth, torn asunder by the process of ore refining, heated in a smelter, pounded in a forge into the shape it now holds, and constantly battered, dropped, immersed in strange liquids, scraped across someone’s teeth, etcetera and so on for years on end. I would think that, if put in the spoon’s position, I would certainly have any number of strenuous objections to such treatment. But, then again, I’ll never know.
Posted: Mar 12, 2011 [ # 26 ]
Senior member
Total posts: 697
Joined: Aug 5, 2010
It’s my thought that, as humans, we all have a somewhat biased viewpoint in this matter, since we have no true frame of reference from the viewpoint of a computer/AI entity to work with. All we can do is make assumptions based on our imaginations
You’re probably right about that.
I would mostly agree with you, Dave, in that things can be simulated, except for the ‘sensing of pain’ part when something is broken. If the computer were to have sensors that triggered when a part fails, yes, then it could have a simulation of pain. But most of the time, in a computer, when a part fails, it simply stops working and that’s the only clue it has that something is wrong. So, there is no more signal. A broken nerve still gives a signal.
Posted: Mar 12, 2011 [ # 27 ]
Senior member
Total posts: 494
Joined: Jan 27, 2011
Jan Bogaerts - Mar 12, 2011: But most of the time, in a computer, when a part fails, it simply stops working and that’s the only clue it has that something is wrong. So, there is no more signal.
That is because there is no process taking place that makes the computer aware of the failure. But it is easy to program responses to that failure. In real-time software applications (like process control in a chemical plant) there are actually different responses programmed for good-signal, wrong-signal and no-signal. Sensors in this field report a different signal for ‘something’s wrong’ than for ‘no signal’.
Jan Bogaerts - Mar 12, 2011: A broken nerve still gives a signal.
That is an engineering flaw in our design; the human machine is tailored to having all parts in place and to reacting to that. When a part goes missing and there is no longer a signal, the software, having no proper response algorithm for that situation, starts acting erratically (so yes, it’s a bug). We see the same in real-time systems: a binary ‘0’ is represented by zero voltage and a binary ‘1’ by +5 volts. Strange things can happen if a sensor breaks and outputs a voltage somewhere between those two states, if that is not handled up front in the software design.
Btw, a broken nerve does NOT give a signal at its point of breaking. However, as many nerve endings share the same main nerve strand, at the brain’s end there is almost always a signal from the nerve endings that are still working. It is a known fact that several nerve endings use the same nerve strand and somehow encode their signals so the brain can distinguish between them.
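The real-time pattern described earlier in this post (distinct handling for good-signal, wrong-signal and no-signal) can be sketched as a simple classifier over the sensor voltage. The 0 V / +5 V logic levels are the ones mentioned above; the tolerance bands are invented for the example.

def classify_signal(voltage):
    # Classify a raw sensor voltage the way a real-time system might:
    # good-signal, wrong-signal, or no-signal. The bands are illustrative.
    if voltage is None:
        return "no-signal"                 # nothing on the line at all
    if 4.5 <= voltage <= 5.5:
        return "good-signal: logic 1"      # nominal +5 V
    if 0.0 <= voltage <= 0.5:
        return "good-signal: logic 0"      # nominal 0 V
    return "wrong-signal"                  # stuck between states: likely a broken sensor

for reading in (0.1, 5.0, 2.7, None):
    print(reading, "->", classify_signal(reading))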
Posted: Mar 12, 2011 [ # 28 ]
Senior member
Total posts: 623
Joined: Aug 24, 2010
Dave Morton - Mar 12, 2011: @CR:
So if, as a result of increasing temperature in the room, the computer causes a cooling fan to activate, then what? And as the temperature increases, I remove my shirt? To my way of thinking, given your premise, both myself and the computer have reacted to the present situation, and performed an action to prevent or reduce damage. Are we “experiencing” the heat now?
I agree that this is a matter of semantics, but I also see this as a distinction without a difference.
Good question. I guess what it comes down to is: is a decentralized response to a stimulus still an experience of that stimulus?
For example, suppose I’m asleep and start to sweat. There is no change in the conscious part of my brain, but my cerebellum goes to work telling my sweat glands to start working. So there is a change in my mental state, as well as a physical one, and yet nothing involving my frontal lobe. (When people discuss consciousness, I substitute the words “frontal lobe activity”; this is where all our meta-cognition takes place.) So is meta-cognition necessary for an experience?
Posted: Mar 12, 2011 [ # 29 ]
Senior member
Total posts: 328
Joined: Jul 11, 2009
Interesting thread to read, though I couldn’t understand a lot of it.
Can’t you just make up a name like ‘Artificial Experience’ and then add bits on as you progress? It seems to me that is what is happening with ‘Artificial Intelligence’ anyway…
Or am I wrong? Does everything have to measure up to what a human is capable of?
If it does, then please excuse me!
Posted: Mar 12, 2011 [ # 30 ]
Senior member
Total posts: 697
Joined: Aug 5, 2010
Artificial Experience
That’s a good one.
Edit: The next thing that popped into my mind, though, was this: if I were to take some LSD, wouldn’t that also be an artificial experience (I hope)?
Or am I wrong? Does everything have to measure up to what a human is capable of?
I don’t think so. We have to start somewhere, don’t we?