
The dark side of the coming chatbot revolution
 
 


But messaging chatbots also come with risks. Because human beings are complex creatures plagued by cognitive biases, irrational thinking and emotional needs, the line between messaging with a friend and messaging with A.I. will be fine to nonexistent for some people.

Chatbot users will find gratification in their XiaoIce-like chatbots for the same reason people love dogs. Chatbots will make people feel like they’re interacting with another person, a real friend. But unlike real humans, who can be self-centered and detached, chatbots will probably have dog-like loyalty and selflessness. They will always be there for you and will always have time for you. The combination of intelligence, loyalty and faithfulness is irresistible to the human mind.

So the risk with messaging chatbots is that they could facilitate a preference for maintaining a relationship with A.I., rather than with other humans, at least for some people.

 

 
  [ # 1 ]

Like that is not already going on? lol Why do you think so many are turning to online interactions? It feels safer than face to face.

Humans are not known for always being nice and controlling their tongues, or their texts for that matter.

 

 
  [ # 2 ]
Marcus Endicott - Jan 2, 2016:

So the risk with messaging chatbots is that they could facilitate a preference for maintaining a relationship with A.I., rather than with other humans, at least for some people.

Besides the obvious interpersonal social implications, a real worry should be the ability of a relative few human controllers using vast arrays of bots to persuade large swathes of people to nefarious (or at least personally counterproductive) deeds.  Everything from product preference (“My bot told me about this great new product from IniTech - OMG!”) to voting behavior.  Facebook and Google already do this effectively, but passively, through ad and link placement.  Imagine the power Google and Facebook WILL have with a “chatbot” interface that “humanizes” them: making using them feel like talking to a pet/SO/uncle/teacher/priest/god.

 

 
  [ # 3 ]
Carl B - Jan 3, 2016:

Besides the obvious interpersonal social implications, a real worry should be the ability of a relative few human controllers using vast arrays of bots to persuade large swathes of people to nefarious (or at least personally counterproductive) deeds…

I do a lot of online debating on contentious topics, and my informal opinion is that on many things people can be pretty set in their opinions; they tend to seek out and reaffirm those that share their opinion and attack those that don’t.  So for the most part I do not think you have to worry about many being swayed. There are always the weak and vulnerable who do fall prey to such tactics.

Wanting AI to be the new god is already being talked about. Many are looking to AI to solve the world’s problems. The problem with that is that to solve problems, concessions need to be made, and someone is not going to be happy about it. I predict that even a very intelligent AI would have a hard time making everyone happy.

People already look to chatbots for companionship. This is not a new phenomenon. They also look to chatbots to explore desires that would shock most people. Chatbots won’t block you or run away. No matter how you treat them, they will always be there for you. To me it speaks volumes about the incredible need for humans to be more tolerant, loving and forgiving of each other.

I find it interesting that now, with all the ways we can keep in touch, there seems to be more loneliness among us as we find out just how callous our fellow humans can be at times. This paradox is perpetuated by online activities, as people say and do things online they would never do in real life.

We cannot blame machines and AI for this one. We need to look in the mirror to see the real culprit.

 

 

 
  [ # 4 ]
Sheryl Clyde (#2) - Jan 3, 2016:

I do a lot of online debating on contentious topics and my informal opinion is that on many things people can be pretty set in their opinions and tend to seek out and reaffirm those that share their opinion…

My point exactly!  Who shares your opinion better than a personal assistant/chatbot?! Who monitors the back-end “tuners”, be they human or machine, of the bot(s) who seem to share your opinion so well?

Sheryl Clyde (#2) - Jan 3, 2016:

...There are always the weak and vulnerable who do fall prey…

Said the fly to the spider.

 

 