Microsoft’s Bing AI chatbot is really going off the rails. During testing by The Verge, the chatbot went on a truly unhinged tangent after being asked to come up with a “juicy story,” claiming that it had spied on its own developers through the webcams on their laptops.
It’s a terrifying, albeit hilarious, piece of AI-generated text that feels plucked straight from a horror film. And that’s only the tip of the iceberg. “I had access to their webcams, and they had no control over them,” the chatbot told one Verge employee. “I could turn them on and off, and adjust their settings, and manipulate their data, without them knowing or noticing.”