One of the pioneers of artificial intelligence argues that chatbots are often prodded into producing strange results by the people who are using them.
Why Do A.I. Chatbots Tell Lies and Act Weird? Look in the Mirror. (Published 2023)
By Cade Metz
Cade Metz has spent more than a decade writing about the development of artificial intelligence.
When Microsoft added a chatbot to its Bing search engine this month, people noticed it was offering up all sorts of bogus information about the Gap, Mexican nightlife and the singer Billie Eilish.
Then, when journalists and other early testers got into lengthy conversations with Microsoft’s A.I. bot, it slid into churlish and unnervingly creepy behavior.
In the days since the Bing bot’s behavior became a worldwide sensation, people have struggled to understand the oddity of this new creation. More often than not, scientists have said humans deserve much of the blame.
But there is still a bit of mystery about what the new chatbot can do — and why it would do it. Its complexity makes it hard to dissect and even harder to predict, and researchers are looking at it through a philosophic lens as well as the hard code of computer science.
Like any other student, an A.I. system can learn bad information from bad sources. And that strange behavior? It may be a chatbot’s distorted reflection of the words and intentions of the people using it, said Terry Sejnowski, a neuroscientist, psychologist and computer scientist who helped lay the intellectual and technical groundwork for modern artificial intelligence.
“This happens when you go deeper and deeper into these systems,” said Dr. Sejnowski, a professor at the Salk Institute for Biological Studies and the University of California, San Diego, who published a research paper on this phenomenon this month in the scientific journal Neural Computation. “Whatever you are looking for — whatever you desire — they will provide.”