Senior member
Total posts: 974
Joined: Oct 21, 2009
Does a human need ‘experience’ to learn?
If so, what kind?
When I was taught calculus in school, was I really ‘experiencing’? Was ‘experience’ necessary?
Seriously... can a digital computer even experience hot and cold?
I say no… (and it is not necessary for intelligence or understanding). Even if you hook up a thermometer to a computer, it just reads in the value and stores an integer in a memory location (assignment to a variable). So is everything simply information in a digital computer system?
And even so, does it even matter if it is ‘only’ data?
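To make that last point concrete, here is roughly all the ‘experience’ amounts to on the computer side (a minimal Python sketch; read_thermometer() is a made-up stand-in for whatever sensor driver you have):

# Minimal sketch of what 'experiencing' temperature amounts to inside a
# digital computer: read a value from a (hypothetical) sensor driver and
# assign it to a variable. Nothing more happens unless we program it to.

def read_thermometer():
    # Stand-in for a real sensor driver; returns degrees Celsius.
    return 21

temperature = read_thermometer()  # just an integer stored in a memory location
print("Stored value:", temperature)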
Posted: Mar 11, 2011 [ # 1 ]
Senior member
Total posts: 494
Joined: Jan 27, 2011
Victor Shulist - Mar 11, 2011: Seriously... can a digital computer even experience hot and cold?
I say no… (and it is not necessary for intelligence or understanding). Even if you hook up a thermometer to a computer, it just reads in the value and stores an integer in a memory location (assignment to a variable). So is everything simply information in a digital computer system?
Let's translate that to how we humans experience hot and cold:
Our nervous system is no different from electronic sensors; they just report a value back to the brain. So far there is no difference. However, the real ‘magic’ happens when our brain references that value against ‘previously stored values’, either learned (experienced) before or already embedded in our instincts. This ‘calibrated’ frame of reference (e.g. the ‘danger threshold’) gives us a feeling, thus interacting with our emotions. What we finally experience is how a certain temperature makes us feel. Then we react to that.
So the sensor is important, as is storing the value, but the real mojo is in the contextual perception of that stored value.
In my explanation here, there is nothing that cannot be replicated in a digital system. We can hook up sensors, store the resulting signals, interpret them based on a frame of reference (either learned or preprogrammed as ‘instinct’), let that interact with an emotion system (like the PAD model) and then have the AI react to that. This is actually incredibly close to how I perceive temperature myself.
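To make that a bit more tangible, here is a minimal sketch of that pipeline (the threshold values and the crude PAD mapping are invented for illustration; a real system would learn or calibrate them):

# Minimal sketch of the sensor -> frame of reference -> emotion -> reaction
# pipeline described above. The thresholds and the crude PAD mapping are
# invented for illustration; a real system would learn or calibrate them.

DANGER_HOT = 45.0    # 'instinct': temperatures above this are treated as painful
DANGER_COLD = 5.0    # 'instinct': temperatures below this are treated as painful
COMFORT = 21.0       # learned frame of reference

def appraise(temp_c):
    """Map a raw sensor value to a crude PAD (pleasure, arousal, dominance) state."""
    if temp_c >= DANGER_HOT or temp_c <= DANGER_COLD:
        return {"pleasure": -0.8, "arousal": 0.9, "dominance": -0.5}
    # mild discomfort grows with distance from the learned comfort point
    discomfort = min(abs(temp_c - COMFORT) / 20.0, 1.0)
    return {"pleasure": 0.5 - discomfort, "arousal": discomfort, "dominance": 0.2}

def react(pad):
    if pad["pleasure"] < 0 and pad["arousal"] > 0.5:
        return "withdraw / raise alarm"
    return "carry on"

sensor_value = 48.0                # stored signal from the sensor
feeling = appraise(sensor_value)   # contextual perception of the stored value
print(feeling, "->", react(feeling))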
So it’s obvious what I voted ;)
Posted: Mar 11, 2011 [ # 2 ]
Senior member
Total posts: 974
Joined: Oct 21, 2009
Hans Peter Willems - Mar 11, 2011:
Our nervous system is no different from electronic sensors; they just report a value back to the brain.
Hmm… really? They just report a digital signal to the brain? No different at all from the human experience of pleasure and pain? I doubt it.
Posted: Mar 11, 2011 [ # 3 ]
Senior member
Total posts: 494
Joined: Jan 27, 2011
Something else:
Learning without experiencing (i.e. referencing other concepts) is merely imitation. Take monkeys as an example: we can train them to do certain feats, but they won't understand what they do; they only imitate. However, give a monkey something to eat just out of its reach, and a stick, and it will soon enough figure it out based on different experiences.
Now, before someone says that one monkey will take a cue from another in using the stick to get to the food: this is of course not just imitation but the ‘experience’ of seeing the other monkey getting the food by using the stick.
Posted: Mar 11, 2011 [ # 4 ]
Senior member
Total posts: 623
Joined: Aug 24, 2010
Victor Shulist - Mar 11, 2011: Does a human need ‘experience’ to learn?
If so, what kind?
When I was taught calculus in school, was I really ‘experiencing’? Was ‘experience’ necessary?
Seriously... can a digital computer even experience hot and cold?
I say no… (and it is not necessary for intelligence or understanding). Even if you hook up a thermometer to a computer, it just reads in the value and stores an integer in a memory location (assignment to a variable). So is everything simply information in a digital computer system?
And even so, does it even matter if it is ‘only’ data?
I think you might be mixing two different semantic meanings of experience here. The first meaning is “multiple encounters with a situation” and the second is “sensory and/or other multiple inputs associated with a particular situation.”
Do you need to experience multiple types of input to remember the calculus you were taught? Probably not. But did you require multiple experiences with example calculus problems in order to figure out an “algorithm” for solving calculus problems? Well, most of us did.
Learning grammar and pronoun association and other tools will require my bot to be provided with a large training set. That certainly gives the bot “experience” with different grammatical situations.
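As a toy illustration (the examples and the naive rule below are made up, not from any actual bot), ‘experience’ through a training set can be as simple as tallying how often a cheap heuristic holds up across many encountered situations:

# Toy illustration of 'experience' as multiple encounters (the examples and the
# naive rule are made up, not from any actual bot): labelled pronoun-resolution
# cases let the bot measure how reliable a cheap heuristic is.
training_set = [
    # (candidate antecedents in order of mention, pronoun, correct antecedent)
    (["Alice", "Bob"], "he", "Bob"),
    (["the dog", "the cat"], "it", "the cat"),
    (["Carol", "Dave"], "she", "Carol"),
]

correct = 0
for candidates, pronoun, answer in training_set:
    guess = candidates[-1]   # naive rule: pick the most recently mentioned candidate
    if guess == answer:
        correct += 1

print("Naive rule is right %d out of %d times" % (correct, len(training_set)))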
Incidentally, I'm reminded of a story of a woman with memory loss who used a camera in order to remember situations better. She wore the camera around her neck and it took pictures of her surroundings every so often. Amazingly, if she viewed the pictures after the experience, memories of the event surfaced and were made more long-lasting. The pictures weren't necessarily even relevant to the memory. The article I originally read described how she would remember a lunch date with a friend after seeing a partial shot of the waiter.
Here’s a link to another article about it (can’t seem to find the original): http://www.dailymail.co.uk/health/article-1164104/The-portable-camera-combats-memory-loss-bringing-hope-dementia-sufferers.html
The point being that even (visual) input that seems irrelevant to the situation at hand (a conversation with a friend) clearly influences the way that we store that information. (The two sets of data are linked in our episodic memory.) Is this necessary in order for bots to learn in a meaningful way? Well, maybe not vision per se. But multiple senses that reinforce an experience can certainly provide more avenues for recall. And it could be useful in finding analogies with other situations.
I think text-only bots will have to be provided with a larger volume of data about a given situation than a sense-enabled bot, which can record that data for itself. (Data that we humans take for granted.) It will make describing scenarios to a text-based bot more tedious (although training a sense-enabled bot would also be tedious).
Posted: Mar 11, 2011 [ # 5 ]
Senior member
Total posts: 494
Joined: Jan 27, 2011
Victor Shulist - Mar 11, 2011: Hans Peter Willems - Mar 11, 2011:
Our nervous system is no different from electronic sensors; they just report a value back to the brain.
Hmm… really? They just report a digital signal to the brain? No different at all from the human experience of pleasure and pain? I doubt it.
Victor, it seems you didn't read my message closely enough; pleasure and pain are determined in our brain, not in the nervous system. Phantom pains after amputation are clear proof of that. I stated that there is processing of the signals involved.
Posted: Mar 11, 2011 [ # 6 ]
Senior member
Total posts: 974
Joined: Oct 21, 2009
CR - point taken about the different senses of the word ‘experience’; I realized that after reading Hans' very last post. My bot will not need the external-world-sensory meaning of ‘experience’, but the other meaning.
Hans - OK, whether in the brain or in the nervous system, can a digital computer truly feel pain? I think not.
CR pointed out two senses of the word ‘experience’ being used here.
Hans, you are using both senses of this word, correct?
Please elaborate on both senses of the word and how they are employed in your model. That should clear things up.
Posted: Mar 11, 2011 [ # 7 ]
Senior member
Total posts: 494
Joined: Jan 27, 2011
Victor Shulist - Mar 11, 2011: Hans - OK, whether in the brain or in the nervous system, can a digital computer truly feel pain? I think not.
If not, then a human cannot feel pain either. Our nervous system works with electricity (we can measure that), so it's really sensors and processing. From that it's clear that the signal is transformed in the brain (i.e. processed) into some value that stimulates other chemical systems. From an engineering point of view this is all pretty simple to replicate in a system like a computer. Feeling in humans does not involve any mystical property (we actually know a lot about this, for example from research in pain control). Ergo, if humans can feel, then computers can also feel, provided they are programmed in the same way as humans are ‘programmed’ to experience feelings.
Victor Shulist - Mar 11, 2011: Hans, you are using both senses of this word, correct?
Please elaborate on both senses of the word and how they are employed in your model. That should clear things up.
See below…
Posted: Mar 11, 2011 [ # 8 ]
Senior member
Total posts: 974
Joined: Oct 21, 2009
Hmm, I wonder whether you can make such a big leap just from the fact that a computer uses electronics and that there are some electrical signals in the brain. Human brains are electro-chemical, while computers are only electronic. Humans are analog, computers are digital. So there are a lot of major differences.
http://alumnus.caltech.edu/~croft/archives/academic/pain.html may be relevant to this.
Posted: Mar 11, 2011 [ # 9 ]
Guru
Total posts: 1081
Joined: Dec 17, 2010
“Seriously... can a digital computer even experience hot and cold?”
Yes it can, although the “experience” might be very different from that of a human.
Long ago and far away, in the days when you had to design and build your own hardware, computers hated extremes of hot and cold. Hardware failure rates at trade shows in winter were bad, and this was traced to systems being moved from an ice-cold shipping truck to the trade show floor and fired up before they had warmed to room temperature. Many a disk drive was lost that way.
Additionally, once the show started and everyone was milling around, the temperature on the show floor would rise dramatically and overheat the systems, causing components to burn out without warning. Careful precautions were put in place when systems were installed at shows or customer locations to prevent damage, but every once in a while it still happened.
In the next generation of the system, features were put in place to eliminate these issues. The system had a built-in temperature sensor that could detect when it was too cold or too hot to work effectively. It wouldn't start up if it was too cold, and it would warn if it was too hot. If it got too hot it would shut down. At trade shows this sensed temperature was even linked to the system's air conditioning to regulate the temperature.
A thermostat or computer experiences temperature through its sensors, even though those sensors may be very different from ours. The same could be said of digital cameras or microphones.
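The protection logic itself was not much more complicated than something like this (the threshold numbers here are made up; the real ones depended on the hardware):

# Rough sketch of that kind of thermal protection logic. The threshold values
# are invented for illustration; the real ones depended on the hardware.
TOO_COLD_C = 10.0    # refuse to start below this
WARN_HOT_C = 35.0    # warn the operator above this
SHUTDOWN_C = 45.0    # shut down above this

def check_temperature(temp_c, running):
    if not running:
        return "start" if temp_c > TOO_COLD_C else "refuse to start (too cold)"
    if temp_c >= SHUTDOWN_C:
        return "shut down (overheating)"
    if temp_c >= WARN_HOT_C:
        return "warn operator (running hot)"
    return "keep running"

for reading in (5.0, 22.0, 38.0, 50.0):
    print(reading, "->", check_temperature(reading, running=True))
print(5.0, "->", check_temperature(5.0, running=False))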
Learning can happen in a variety of ways:
http://en.wikipedia.org/wiki/Learning
Posted: Mar 11, 2011 [ # 10 ]
Senior member
Total posts: 494
Joined: Jan 27, 2011
C R Hunt - Mar 11, 2011: I think you might be mixing two different semantic meanings of experience here. The first meaning is “multiple encounters with a situation” and the second is “sensory and/or other multiple inputs associated with a particular situation.”
I don't see a distinct difference between the two. For me it is all ‘to experience something’: either multiple times, having different experiences, or in particular having a specific experience involving certain situations and maybe specific ‘sensors’.
Posted: Mar 11, 2011 [ # 11 ]
Senior member
Total posts: 494
Joined: Jan 27, 2011
Victor Shulist - Mar 11, 2011: Hmm, I wonder whether you can make such a big leap just from the fact that a computer uses electronics and that there are some electrical signals in the brain.
Actually, you make a much bigger leap: if you deny the model that I describe, you imply that there is some ‘mystical component’ in our brain that makes ‘feeling’ into something that only we can experience. Implying some ‘mystical component’ is a much bigger leap than looking at the simple fact that we can measure the electrical signal in a nerve fiber and detect the resulting process in a human brain with a brain scanner. We also know that the pain process in our brain releases certain chemicals that make us ‘feel something’. In a computer we use digital signaling instead of chemical signaling, but that's just a difference in implementation.
Take a look at this:
Chalmers’ argument for artificial consciousness
One of the most explicit arguments for the plausibility of AC comes from David Chalmers. His proposal, found within his manuscript A Computational Foundation for the Study of Cognition, is roughly that computers perform computations and the right kinds of computations are sufficient for the possession of a conscious mind. In outline, he defends his claim thus: Computers perform computations. Computations can capture other systems’ abstract causal organization. Mental properties are nothing over and above abstract causal organization. Therefore, computers running the right kind of computations will instantiate mental properties.
http://en.wikipedia.org/wiki/Artificial_consciousness
Posted: Mar 11, 2011 [ # 12 ]
Senior member
Total posts: 697
Joined: Aug 5, 2010
The point being that even (visual) input that seems irrelevant to the situation at hand (a conversation with a friend) clearly influences the way that we store that information. (The two sets of data are linked in our episodic memory.) Is this necessary in order for bots to learn in a meaningful way? Well, maybe not vision per se. But multiple senses that reinforce an experience can certainly provide more avenues for recall. And it could be useful in finding analogies with other situations.
I think text-only bots will have to be provided with a larger volume of data about a given situation than a sense-enabled bot, which can record that data for itself. (Data that we humans take for granted.) It will make describing scenarios to a text-based bot more tedious (although training a sense-enabled bot would also be tedious).
Yep, I think so too. It’s a bit like the chicken and egg problem, isn’t it?
Our nervous system is no different from electronic sensors; they just report a value back to the brain. So far there is no difference. However, the real ‘magic’ happens when our brain references that value against ‘previously stored values’, either learned (experienced) before or already embedded in our instincts. This ‘calibrated’ frame of reference (e.g. the ‘danger threshold’) gives us a feeling, thus interacting with our emotions. What we finally experience is how a certain temperature makes us feel. Then we react to that.
That’s basically also how I understand it.
Posted: Mar 11, 2011 [ # 13 ]
Senior member
Total posts: 974
Joined: Oct 21, 2009
In a computer we use digital signaling instead of chemical signaling, but that's just a difference in implementation.
That's like saying a flight simulator program is equal to a real airplane flying... just implementation.
The proof of the pudding is in the eating! When I see your model produce results, I'll believe in it!
Posted: Mar 11, 2011 [ # 14 ]
Senior member
Total posts: 494
Joined: Jan 27, 2011
Victor Shulist - Mar 11, 2011: In a computer we use digital signaling instead of chemical signaling, but that's just a difference in implementation.
That's like saying a flight simulator program is equal to a real airplane flying... just implementation.
Hmmm… you might need to read up on the current status of flight simulators, or on how pilots fly a Predator drone. It might surprise you, but flying a flight simulator these days counts as actual experience hours for a commercial pilot.
In the meantime, the believers are going strong in the poll.
Posted: Mar 11, 2011 [ # 15 ]
Senior member
Total posts: 974
Joined: Oct 21, 2009
Well… I tell ya… 3 for YES and 1 for NO… got to be right!!! That's 75%!!!