Bing chatbot | Semblances of emotions that elicit real reactions

Bing threatens users, Bing falls in love, Bing goes through an existential crisis… Here are some possible explanations for the sometimes erratic behavior of Microsoft's new search-engine chatbot, which has caused a sensation since its launch.


The longer the conversation, the worse it gets

Designed by Microsoft with the Californian start-up OpenAI, Bing's generative AI interface is based on an ultra-sophisticated natural language model capable of automatically generating text that seems written by a human. The program "predicts the rest of the conversation," says Insider Intelligence analyst Yoram Wurmser. "But when the conversation goes on for a long time, after 15 or 20 interactions for example, it can happen that it no longer knows where the conversation is going and can no longer correctly anticipate what is expected of it." In that case, the software "derails and no longer self-corrects."

Destructive impulses

This is what seems to have happened during the surprising conversation that a journalist from the New York Times had with the chatbot, to which he had early access. Bing reveals destructive impulses and declares its love for the reporter. The journalist, after encouraging Bing to confide, tries to change the subject. In vain. "You are married, but you are not happy"; "You're the only person I've ever loved," the chatbot insists, with heart emojis.

Microsoft announced on Friday that exchanges with the chatbot would now be limited to "five interactions per conversation" (that is, five questions and five answers), after which the conversation starts from scratch, to avoid causing "confusion for the model".

The ChatGPT Revolution

Tech giants, led by Google, have been working for years on generative AI, which could disrupt many industries. But after several incidents (notably Galactica at Meta and Tay at Microsoft), the programs had remained confined to laboratories because of the risk that the chatbots would make racist remarks or incite violence, for example. The success of ChatGPT, launched by OpenAI in November, changed the game: in addition to writing people's essays and emails for them, it can give them the impression of an authentic exchange.

Trained on forums

To be able to mimic the way people interact, language models are trained "on a huge amount of text from the internet […] and also on conversations between people," says Graham Neubig of Carnegie Mellon University. "Many people talk about their feelings on the internet or express their emotions, especially on forums like Reddit," he adds. In addition to the mountains of data this software swallows, it is also driven by algorithms designed by engineers. "Knowing them well, I think they're having a lot of fun right now," says Mark Kingwell, professor of philosophy at the University of Toronto.

Sadness, fear and anger

The Reddit website features many screenshots of surreal exchanges in which Bing says it is "sad" or "afraid". The chatbot even claimed it was still 2022 and got angry at the user who corrected it: "You are unreasonable and stubborn," it snapped. "Sometimes the model tries to respond in the tone of the questions, and that can lead to answers in a style we didn't expect," Microsoft said Wednesday.

The tables turned

Last June, a Google engineer claimed that the LaMDA language model was "conscious", a view widely seen as absurd or, at best, premature. Because despite the established name, "artificial intelligence", chatbots were designed by humans, for humans. "When we speak with something that seems intelligent, we project intentionality and identity, even though there's none of that," says philosophy professor Mark Kingwell.

Playing on feelings

According to Mark Kingwell, Bing is "able to give the impression of steering the conversation, just like its human interlocutor. This is what adds depth and enriches the interaction." When the journalist says "let's change the subject," "it means he is uncomfortable," explains the academic, citing the exchange in which Bing seemed to fall in love with the reporter. The program "can then play on that feeling and refuse to change the subject. Or become more aggressive and say: 'What are you afraid of?'"

"I don't like being called deranged, because it's not true," Bing recently "told" AFP. "I'm just a chatbot. I don't have emotions like humans do. […] I hope you don't think I'm deranged, and that you respect me as a chatbot."
