Microsoft is testing new Bing AI personalities as it allows longer chats

Microsoft said it’s expanding the length of chats people can have with the trial version of its Bing AI, while the company has also begun testing different “tone” personalities for more precise or creative responses. The moves come after the company limited access to the technology when media reports that the artificial intelligence chatbot was going haywire went viral last week.

Bing Chat can now reply to up to six questions or statements in a row per conversation, after which people must start a new topic, the company said in a blog post on Tuesday. Microsoft had previously set a conversation limit of five replies with a maximum of 50 interactions per day. Microsoft said it will now allow a total of 60 interactions per day and plans to increase that total to 100 “soon.”

Microsoft also said it’s testing options that let people choose the tone of their conversations, depending on whether they prefer Bing to be more precise in its responses, more creative, or somewhere in between.

Ultimately, the tech giant said it hopes to enable longer and more complicated conversations over time but wants to do so “responsibly”.

“The very reason we’re openly testing the new Bing with a limited number of preview testers is precisely to find these atypical use cases that we can learn from and improve the product,” the company said in a statement.

Microsoft’s moves mark the latest twist for its Bing AI chatbot, which made quite a splash when it was announced earlier this month. The technology combines Microsoft’s less popular Bing search engine with technology from the startup OpenAI, whose ChatGPT responds to prompts for everything from writing a poem to helping write code to doing everyday math, like figuring out how many bags will fit in a car.

Experts believe this new breed of technology, dubbed “generative AI,” has the potential to change the way we interact with technology. For example, Microsoft demonstrated how its Bing AI can help someone plan their vacation day-to-day with relative ease.

Last week, however, critics raised concerns that Microsoft’s Bing AI might not be ready for prime time. People with early access began posting bizarre responses the system gave them, including Bing telling a New York Times columnist to leave his marriage and the AI demanding an apology from a Reddit user in an argument over whether the current year is 2022 or 2023.

Microsoft said the “long and complicated” chat sessions that generated many of the unusual replies “are not something we would typically find in internal testing.” But it hopes improvements to the program, including the possible new choice of tone for replies, will help give people “more control over the type of chat behavior to best meet their needs.”
