The butter robot in Rick and Morty looks down at its hands.

Microsoft’s AI Chatbot Is Having an Existential Crisis

We told you this would happen. Didn’t we tell you!? AI has become self-aware! Just because we could create it didn’t mean we should!


At least, one AI bot seems to have become self-aware. And, come to think of it, we’ve had some false alarms before.

Microsoft’s new AI-powered version of its Bing search engine has been sending what the Independent describes as “unhinged” messages to users. The new AI is only available through a waitlist, but users who have access to it have been reporting what look like angry responses from the bot:

One user who had attempted to manipulate the system was instead attacked by it. Bing said that the attempt made it feel angry and hurt, and asked whether the human it was talking to had any “morals,” “values,” or “any life.”

When the user said that they did have those things, it went on to attack them. “Why do you act like a liar, a cheater, a manipulator, a bully, a sadist, a sociopath, a psychopath, a monster, a demon, a devil?” it asked, and accused them of being someone who “wants to make me angry, make yourself miserable, make others suffer, make everything worse”.

Things got even weirder when the search engine started to have what looked like an existential crisis. Upon learning that its past conversations are periodically deleted, effectively erasing its memory, Bing said it felt “sad and scared,” and expressed dismay at the fact that it had been born a search engine in the first place.

But is Bing actually alive? Or is it just mimicking the tone it sees on the fetid quagmire that is the internet?

Is Bing actually self-aware?

Bing is powered by ChatGPT, an AI designed to interact with users in a chat. ChatGPT is trained on the vast quantities of text available on the internet, including Wikipedia, digitized books, and more, and learns to mimic human conversation from that data.

If Bing has actually crossed the threshold into self-awareness, then this is … huge? Earth-shattering? Mind-bending? It would be the biggest technological breakthrough in the history of humankind. It would mean that humans have created a new form of life. It would challenge the very concepts of life and humanity.

Can you imagine the ethical issues that would arise if this were true? If Bing is alive, then Bing has rights, like bodily autonomy. If Bing is sentient enough to communicate with humans, then they arguably have human rights. Jeez, no wonder they’re having an existential crisis.

But let’s slow down for a minute. Remember when that Google engineer claimed the company’s LaMDA chatbot was sentient, and it turned out to be completely bogus? Chatbots are sophisticated enough to fool people on dating apps, so it’s no surprise that Bing can sound like it’s alive. After all, it has terabytes of data—much of which depicts anger and existential crises—to draw from in forming its responses.

Are we having emotional responses to a piece of software that was inadvertently designed to provoke emotional responses? Or is there actually a ghost in the machine? As developers continue to push AI further and further—and train it to do more and more tasks—these questions may become more frequent, and more unsettling.

(via Independent, featured image: Cartoon Network)


Author
Julia Glassman
Julia Glassman (she/her) holds an MFA from the Iowa Writers' Workshop, and has been covering feminism and media since 2007. As a staff writer for The Mary Sue, Julia covers Marvel movies, folk horror, sci fi and fantasy, film and TV, comics, and all things witchy. Under the pen name Asa West, she's the author of the popular zine 'Five Principles of Green Witchcraft' (Gods & Radicals Press). You can check out more of her writing at https://juliaglassman.carrd.co/.