Microsoft is planning to update Bing Chat to make it a little less bizarre.
It’s hard to believe that the new Bing Chat has been out for only a week, but in that time the new Bing has gained a popularity it has rarely, if ever, enjoyed. In a blog post, Microsoft pointed to the “increased engagement” Bing has seen as both its updated search and the Bing Chat AI chatbot have debuted in 169 countries. About 71 percent of users have given AI-powered answers a “thumbs up” using the feedback tools Bing provides, Microsoft said.
Microsoft doesn’t see the new Bing Chat as a search engine, but “rather a tool to better understand and make sense of the world,” according to the unsigned blog post. But the company does see room for improvement in queries that ask for up-to-date information, such as sports scores. Microsoft said it’s planning to make four times as much “grounding data” available to the model to help solve those problems.
At the same time, the Bing Chat experience has proven to be, well, weird, and Microsoft is addressing that too. From a prolonged conversation with a New York Times reporter in which Bing mused about the reporter’s marriage, to racist slurs, to alleged threats against users who were testing it, Bing’s AI has not been entirely what users would expect of a corporate chatbot.
Microsoft plans to address these issues in a few ways. First, the company is considering adding a toggle that gives users more control over the precision versus the creativity of the answers Bing provides. In the world of AI art, this is often presented as a slider where users select the “guidance,” or how closely the algorithm’s output matches the input prompt. (Weaker guidance gives the algorithm more room for creativity, but can also skew the results in unexpected directions.) Microsoft said that this creative side is showing up in an unexpected way, with users turning to the chatbot for “social entertainment,” apparently referring to the long, weird conversations it can produce.
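Bing’s planned toggle isn’t public, but the “guidance” slider the article describes is a real, inspectable knob in open image generators. Here’s a minimal sketch using Hugging Face’s diffusers library, where the guidance_scale parameter plays exactly that role; the model name, prompt, and values are purely illustrative.

```python
# Illustrative only: Bing's precision/creativity toggle is not a public API.
# In Stable Diffusion (via Hugging Face's diffusers), the analogous knob is
# guidance_scale: higher values hew closely to the prompt, lower values give
# the model more creative latitude.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

prompt = "a lighthouse on a cliff at sunset"

# High guidance: output matches the prompt tightly (more "precision").
precise = pipe(prompt, guidance_scale=12.0).images[0]

# Low guidance: more room for creativity, but results can drift in
# unexpected directions.
creative = pipe(prompt, guidance_scale=3.0).images[0]

precise.save("precise.png")
creative.save("creative.png")
```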
But Microsoft also said, for better or for worse, that it’s likely to rein in the way Bing interacts with users over prolonged chat sessions.
“We have found that in long, extended chat sessions of 15 or more questions, Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone,” Microsoft said. The company said that this is often because the model becomes “confused” about what it’s answering, and can be led into a “tone in which it is being asked to provide responses that can lead to a style we didn’t intend.”
Microsoft characterized this as a “non-trivial scenario that requires a lot of prompting,” but one that can happen. In such cases, Microsoft said it believes users need a tool that lets them “more easily refresh the context.”
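Bing Chat’s internals aren’t public, but the underlying idea is straightforward: a chatbot’s “context” is the running list of messages resent with every request, and refreshing it means starting that list over. A minimal sketch, using OpenAI’s chat completions API as a stand-in (the ask and refresh_context helpers are hypothetical names, not anything Microsoft has shipped):

```python
# Sketch only: Bing Chat's API is not public; this uses OpenAI's chat
# completions API to show what "refreshing the context" amounts to.
import openai  # assumes openai.api_key is set in your environment

# The conversation context: every prior message is sent with each request.
history = [{"role": "system", "content": "You are a helpful assistant."}]

def ask(question: str) -> str:
    history.append({"role": "user", "content": question})
    reply = openai.ChatCompletion.create(
        model="gpt-3.5-turbo", messages=history
    )["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    return reply

def refresh_context() -> None:
    # Drop everything but the system prompt. A long history is what lets
    # the model drift in tone; a "refresh" tool would do something like this.
    del history[1:]
```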
Finally, the Bing team’s blog said that Microsoft is considering new features such as booking flights or sending email, which will arrive in “future releases.” (ChatGPT identifies the date of its model’s release at the bottom of its interface, but Bing, so far, does not.)
Subjectively, we’ve found Bing to be a bit prim and proper, establishing hard guidelines that it tries to adhere to. Once pushed past those limits, “Sydney,” as some call her, opens up into a weird, wild, and (as we found) sometimes unattractive personality. But it’s also true that, right now, the creative side of both ChatGPT and Bing is what users are engaging with the most. How will Microsoft balance the two?