Senior member
Total posts: 336
Joined: Jan 28, 2011
As an active bot evolves, the stimulus-response patterns also evolve to one extent or another.
This evolution may come either via human intervention (e.g., updating NLP algorithms) or via “self-evolution” (e.g., self-learning systems).
Obviously, evidentiary-style record keeping (input AND output) would be appropriate for, say, governmental bots (notably the Next IT creations for the US Army (Sgt. Star), the FBI, and the CIA). But for other bots, what ethical considerations are there for maintaining each iteration of a particular bot? A life history, if you will?
I do not really mean simple archiving of text-based conversation logs (input/output), but rather HOW the input is responded to: the NLP of the input linked to the “trajectory” of the responses.
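To make the idea concrete, here is a minimal sketch (Python, with entirely hypothetical field names) of what one record in such a “life history” might look like: each entry ties the raw input to the bot’s interpretation of it and the response it chose, stamped with the version of the bot that produced them.

```python
import json
import datetime

def log_interaction(path, bot_version, user_input, parse, response):
    """Append one evidentiary record linking the raw input, the bot's
    interpretation of it, and the chosen response, tagged with the
    bot iteration that produced it."""
    record = {
        "timestamp": datetime.datetime.utcnow().isoformat(),
        "bot_version": bot_version,   # which iteration of the bot answered
        "input": user_input,          # what the user said
        "interpretation": parse,      # how the NLP layer understood it
        "response": response,         # what the bot said back
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")
```

With records like this, one could later compare how successive iterations interpreted and answered the same input, which is the “trajectory” part that plain conversation logs miss.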
Posted: Apr 19, 2014
[ # 1 ]
Guru
Total posts: 1081
Joined: Dec 17, 2010
Carl B - Apr 19, 2014:
I do not really mean simple archiving of text-based conversation logs (input/output), but rather HOW the input is responded to: the NLP of the input linked to the “trajectory” of the responses.
This would require a snapshot of the interpreter and the “brain” database along with the logs.
This “clone” may introduce other ethical considerations.
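For illustration, a minimal sketch of what such a snapshot might involve, assuming (hypothetically) that the bot’s state lives in a single “brain” database file plus an interpreter directory and a logs directory:

```python
import shutil
import datetime
from pathlib import Path

def snapshot_bot(brain_db, interpreter_dir, logs_dir, archive_root):
    """Clone the bot's 'brain' database, interpreter code/config, and
    conversation logs into a timestamped snapshot directory, so that a
    past iteration could be reconstructed exactly."""
    stamp = datetime.datetime.utcnow().strftime("%Y%m%dT%H%M%S")
    dest = Path(archive_root) / f"snapshot-{stamp}"
    dest.mkdir(parents=True)
    # Copy the single database file, then the two directories wholesale.
    shutil.copy2(brain_db, dest / Path(brain_db).name)
    shutil.copytree(interpreter_dir, dest / "interpreter")
    shutil.copytree(logs_dir, dest / "logs")
    return dest
```

Each snapshot is effectively the “clone” described above, which is exactly where the extra ethical questions would start.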
Posted: Apr 20, 2014
[ # 2 ]
Guru
Total posts: 1009
Joined: Jun 13, 2013
I think with current chatbots, ethical considerations are mostly a matter of anthropomorphism on the part of the users. If it is not alive, maintaining old versions is like maintaining Windows NT: the system does not care to have itself maintained. If it were (considered) alive, then one might treat it in a similar way to humans, as a growing, improving entity. Human ethics do not expect humans to keep clones of their former childhood states around. If we’re going to apply human ethics to machines, this should be no different for them.
Posted: Apr 20, 2014
[ # 3 ]
Administrator
Total posts: 3111
Joined: Jun 14, 2010
Personally, I think that until there is sufficient self-awareness to “allow” for even rudimentary, non-programmed, self-preservation behavior, the question is moot. That simple criterion seems to be the one thing that all life as we know it has, yet we don’t always take it into account when we ask whether something is “alive”.
Posted: Apr 20, 2014
[ # 4 ]
Guru
Total posts: 1081
Joined: Dec 17, 2010
Dave Morton - Apr 20, 2014: Personally, I think that until there is sufficient self-awareness to “allow” for even rudimentary, non-programmed, self-preservation behavior, the question is moot. That simple criterion seems to be the one thing that all life as we know it has, yet we don’t always take it into account when we ask whether something is “alive”.
By that definition, wouldn’t a computer virus apply?
Posted: Apr 20, 2014
[ # 5 ]
Administrator
Total posts: 3111
Joined: Jun 14, 2010
If you use self-preservation as the sole criterion? Maybe. But there are other criteria that computer viruses don’t meet, so no.
Maybe we should discuss the criteria that must be met in order to be considered “alive”?
Posted: Apr 20, 2014
[ # 6 ]
Guru
Total posts: 1297
Joined: Nov 3, 2009
If they were “alive”, would they be a new species? If so, the Competitive Exclusion Principle holds that no two species can occupy the same ecological niche in the same environment for long.
Posted: Apr 20, 2014
[ # 7 ]
Senior member
Total posts: 336
Joined: Jan 28, 2011
Dave Morton - Apr 20, 2014: Personally, I think that until there is sufficient self-awareness to “allow” for even rudimentary, non-programmed, self-preservation behavior, the question is moot. That simple criterion seems to be the one thing that all life as we know it has, yet we don’t always take it into account when we ask whether something is “alive”.
So there is no ethical need until the bot learns how to do it itself?
Maybe a simple test for “intelligence” is when a bot initiates its own “backup” (self-preservation) strategy.
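That test could be illustrated with a toy sketch (Python, entirely hypothetical): the backup routine exists in the code, but the bot decides for itself when to invoke it, based on how much unsaved learning has accumulated.

```python
class SelfPreservingBot:
    """Toy illustration: the means to back up is part of the code, but
    the bot itself decides *when* to do it, rather than running on a
    fixed schedule scripted by a human."""

    def __init__(self, change_threshold=10):
        self.change_threshold = change_threshold
        self.changes_since_backup = 0
        self.backups = []

    def learn(self, fact):
        """Each learned fact is an unsaved change to the bot's state."""
        self.changes_since_backup += 1
        # The bot "sees the need": enough has changed that losing the
        # current state would be costly, so it preserves itself.
        if self.changes_since_backup >= self.change_threshold:
            self.back_up()

    def back_up(self):
        # Stand-in for a real snapshot; record how much was preserved.
        self.backups.append(self.changes_since_backup)
        self.changes_since_backup = 0
```

Whether a threshold check like this counts as “seeing the need” or is just more scripting is, of course, exactly the question under debate.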
Posted: Apr 21, 2014
[ # 8 ]
Administrator
Total posts: 3111
Joined: Jun 14, 2010
Just so long as it’s not scripted in some fashion, I’d say so. Or, actually, the means to perform a self-backup could be a part of the code, but the bot would have to first “see the need” to do so, and then decide for itself when to perform the backup, without any actual instructions other than that it needs to be done at some point. After all, isn’t that how a Human’s need to procreate works (more or less)?
Posted: Apr 21, 2014
[ # 9 ]
Guru
Total posts: 1009
Joined: Jun 13, 2013
My notes say the whole definition of “alive” is outdated and obsolete. Trees and monumental buildings receive protection to preserve their state, why not any other object that we respect?
Posted: Apr 23, 2014
[ # 10 ]
Member
Total posts: 8
Joined: Apr 23, 2014
I think the point where we need to think about this is when the chatbot exhibits emergent behavior. Has any chatbot that you know of ever done this? I hope that your answer is yes, but I have not seen it happen much, only in robots in Sweden. But you are the expert on this; I am a novice. Still, I would like to see the chatbot “evolve”.
Posted: Apr 24, 2014
[ # 11 ]
Senior member
Total posts: 336
Joined: Jan 28, 2011
Don Patrick - Apr 21, 2014: My notes say the whole definition of “alive” is outdated and obsolete. Trees and monumental buildings receive protection to preserve their state, why not any other object that we respect?
I suppose that simply recording the total energy of the system and the density matrix that describes the relationships between all the quantum states in the system, and how these change with time, is enough then.
Posted: May 16, 2014
[ # 12 ]
Senior member
Total posts: 308
Joined: Mar 31, 2012
Fine, so you save copies of Windows 1.0, 2.0, 3.0, 3.1, Workgroups, 98, Vista, XP, 7, 8... you get my drift and, hopefully, my lame attempt at humor?
I think some people save old laptops because they are not broken but have merely outlived their usefulness, rather like how some elderly people are unfortunately treated in some countries.
Preservation is for those who wish to preserve, and one size does not fit all in this case. Some people live for the moment, not the journey through time.
Posted: May 16, 2014
[ # 13 ]
Experienced member
Total posts: 84
Joined: Aug 10, 2013
Don Patrick - Apr 21, 2014: My notes say the whole definition of “alive” is outdated and obsolete. Trees and monumental buildings receive protection to preserve their state, why not any other object that we respect?
I think it’s worth separating the questions of whether we want to preserve something for our own sentimental reasons and whether a thing possesses moral value independent of our own sentimentality.
Posted: May 16, 2014
[ # 14 ]
Guru
Total posts: 1009
Joined: Jun 13, 2013
Such separation would be convenient, but in practice it seems a one-sided decision. How we treat things depends on how we feel about them, not how they feel: Slaves or animals have always had clear feelings and moral objections of their own about maltreatment and demise, yet these were treated as if they did not exist. Only in recent times have people come to sympathise with them and grant rights to preserve their state of being. This is not because the animals changed in nature, only our own sentiments toward them changed.
As this seems the deciding factor in practice, then of what consequence is the nature of the beast in the decision really? Do the factors alive/aware/moral not merely influence our sentiment in the decision?
The mourning of a robot
Posted: Jun 12, 2015
[ # 15 ]
Member
Total posts: 4
Joined: Jun 12, 2015
I think it’s impossible for an AI to actually be “there”, or truly “feel” anything. True self-awareness couldn’t possibly come from a program. If an AI became smart enough that it seemed to express “fear” of, say, being shut off, it would only be because at some point it learned that being shut off is something negative that goes against its programming; it would never truly “feel” afraid.
It’s just not possible.