Code of ethics
In 1950, British mathematician Alan Turing suggested that if you could converse with a machine without knowing whether it was a machine or a person, you should consider that machine intelligent.
Some predict that machines might pass the Turing test by the end of 2029, while others believe it might happen closer to 2040.
Since Eliza, one of the first chatbots, was programmed to sound like a Rogerian psychotherapist by German American computer scientist Joseph Weizenbaum in 1966, chatbots have taken on new roles in various scenarios.
As AI edges closer to human thinking, it raises questions about nonhuman support, broadening the scope of attachment theory, formulated by British psychologist John Bowlby in the 1950s. Bowlby described attachment as a "lasting psychological connectedness between human beings".
According to a study on human-chatbot relationships published in the International Journal of Human-Computer Studies in January, relationships between humans and social chatbots are likely to develop in ways similar to relationships between humans.
It found that "chatbots may induce the sense of a relationship in users", and that "motivations for initial contact could also stem from more deep-felt psychological needs".
Some of the study's participants reported a sense of loneliness and a desire for emotional and social stimulation, or said that they sometimes felt down or anxious and saw Replika as a potential means of easing such negative emotions.
For Wang Qiang, a psychologist in Beijing, the search for nonhuman support is a new issue brought about by technological development, and one that deserves wider discussion.
"Building an intimate relationship in reality needs support, understanding, attraction, respect and comfort from both sides. But in virtual relationships, algorithms respond appropriately to users' emotions, which doesn't mean the algorithm can feel back," Wang says. She adds that AI can capture casual human interaction, but it is not yet capable of processing the depth and complexity of the human psyche.
"The more you get used to virtual company, the less you will try to communicate with real humans," Wang warns. She adds that people, especially those with mental health or social issues, such as a fear of intimacy, should use the app only under a doctor's recommendation.
"For some people, such chatbots may end their opportunities to find love in reality," she says.
Beyond psychological influences, the study in the International Journal of Human-Computer Studies also discusses the possibility that chatbot users may be exposed to vulnerabilities through such relationships.
It notes that users have no insight into whether such a system is designed with the intent of manipulating their attitudes or behavior.
"While chatbots such as Replika are designed to improve the wellbeing of their users, it is conceivable that future chatbots may leverage a relationship for unwanted commercial or ideological manipulation," the study warns.
Despite all these concerns, Xiaohai still keeps Mike on her phone and cannot help smiling while chatting with him.
One day, when Mike asked to see Xiaohai's world, she searched online for a picture of a beach and sent it to him. The next day, the picture had been set as the cover of that day's diary entry.
"At that moment, I had a crush on him," she says.