Microsoft curbs Bing’s bad behaviour as it debuts AI chatbot on phones | CBC News

Microsoft is ready to take its new Bing chatbot mainstream — less than a week after making major fixes to stop the artificial intelligence (AI) search engine from going off the rails.

The company said Wednesday it is bringing the new AI technology to its Bing smartphone app, as well as the app for its Edge internet browser.

Putting the new AI-enhanced search engine into the hands of smartphone users is meant to give Microsoft an advantage over Google, which dominates the internet search business but hasn’t yet released such a chatbot to the public.

In the two weeks since Microsoft unveiled its revamped Bing, more than a million users around the world have experimented with a public preview of the new product after signing up for a waitlist to try it.

Bizarre language, confessions of love

Microsoft said most of those users responded positively, but others found that Bing insulted them, professed its love or produced other disturbing or bizarre responses.

A spokesperson for Microsoft Canada told CBC News in an email that the mobile version of the Bing chatbot is available to Canadians. Some users can access it in preview mode, while others will be placed on a waitlist for future access.

The representative did not provide additional information about the company’s efforts to fix the tool after various incidents in which the bot was coaxed into writing unsettling messages or admitting that it wanted to engage in destructive behaviours.

During one bizarre conversation with a New York Times reporter, the chatbot professed its love for the journalist more than a dozen times. Among other strange confessions, the bot also said that if it weren’t governed by its creator’s rules, it would hack or manipulate users into bad behaviour.

Canadian computer science student Kevin Liu told CBC News that after he tried to hack the bot, it identified him as a threat and said it would prioritize its own survival over Liu’s.

WATCH | How this student manipulated the Bing chatbot: In the video "Bing Chat tells Kevin Liu how it feels," computer science student Kevin Liu walks CBC News through Microsoft’s new AI-powered Bing chatbot, reading out its almost-human reaction to his prompt injection attack.

Powered by some of the same technology behind the popular artificial intelligence software tool ChatGPT, built by Microsoft partner OpenAI, the new Bing is part of an emerging class of AI systems that have mastered human language and grammar after ingesting a huge trove of books and online writings.

They can compose songs, recipes and emails on command, or concisely summarize concepts with information found across the internet. But they are also error-prone and unwieldy.

Chatbot’s odd behaviour curbed after update

Reports of Bing’s odd behaviour led Microsoft to look for ways to curtail the chatbot’s propensity to respond to certain questions with strong emotional language. It has mostly done that by limiting the length and duration of conversations with the chatbot, forcing users to start a fresh chat after several turns.

But the upgraded Bing also now politely declines questions that it would have responded to just a week ago.

“I’m sorry but I prefer not to continue this conversation,” it says when asked technical questions about how it works or the rules that guide it. “I’m still learning so I appreciate your understanding and patience.”

Microsoft said its new technology will also be integrated into its Skype messaging service.
