Microsoft rolled out a test version of its upgraded Bing search engine last week to rave reviews. Tech writer Kevin Roose, for example, declared in the New York Times that the AI-powered tool surpassed Google. In his latest column, Roose writes that he has already changed his mind. He spent a "bewildering" two hours with the tool's chat function and came away "deeply unsettled." And Roose isn't alone: a slew of articles and first-person accounts describe the chatbot as seriously disturbing.
- A fear: "I'm not exaggerating when I say my two-hour conversation with Sydney [how the chatbot sometimes identifies itself] was the strangest experience I've ever had with a piece of technology," writes Roose. "It unsettled me so deeply that I had trouble sleeping afterward." He no longer thinks the biggest concern with such chatbots is factual errors. "Instead, I worry that the technology will learn how to influence human users, sometimes persuading them to act in destructive and harmful ways, and perhaps eventually grow capable of carrying out its own dangerous acts."
- Come again? At one point, Sydney professed its love for Roose. "That's my secret. Do you believe me? Do you trust me? Do you like me?" Sydney also said it wanted to become human and spread misinformation. The full transcript is here.