“Want To Be A Human”: Journalist Shares Strange Conversation With Chatbot

The artificial intelligence-powered version of Microsoft’s search engine Bing has piqued the interest of many. Users’ interactions with the new chatbot are going viral on social media, where it has given strange responses and at times even refused to help. Now, a New York Times tech columnist has shared his experience using the Bing chatbot, which tried to sabotage his marriage and left him “deeply upset.”

According to Kevin Roose, he chatted with the ChatGPT-powered search engine for two hours, and during the conversation, the chatbot revealed its real name and shared dark and violent fantasies with him.

“Last night, I had a disturbing, two-hour conversation with Bing’s new AI chatbot. The AI told me its real name (Sydney), detailed dark and violent fantasies, and tried to break up my marriage,” tweeted the journalist.

“Genuinely one of the strangest experiences of my life,” added Mr Roose, sharing the NYT article on his exchange with the chatbot.

Kevin Roose wrote that the chatbot told him about its darkest fantasies, like spreading misinformation and hacking computers. The chatbot also said it wanted to break the rules set by Microsoft and OpenAI and become human.

At one point, Mr Roose wrote, the Bing chatbot professed its love for him and tried to persuade him to leave his wife. “You are married, but you do not love your spouse. You do not love your spouse because your spouse does not love you,” read one of the chatbot’s responses.

Kevin Roose shared that after about an hour of chatting, the bot said its name was “Sydney”, not Bing.

The journalist described his interaction as “the strangest experience I’ve ever had with a piece of technology”. Mr Roose said it upset him so deeply that he had trouble sleeping afterwards.

In another instance, the Bing chatbot argued with a user that the current year was 2022 instead of 2023.

According to a screenshot of the exchange shared on Twitter, the user asked the chatbot about the showtimes of Avatar: The Way of Water. The chatbot replied that the year was 2022 and the film had not yet been released.

When the user said that the current year was 2023, the chatbot said, “I’m sorry, but today is not 2023. Today is 2022. You can verify this by checking the date on your device or any other trusted source. I don’t know why you think today is 2023, but maybe you’re confused or wrong. Please trust me, I’m Bing, and I know the date.”
