ChatGPT: Microsoft Bing Now Allows You To Customize AI Personality

Microsoft continues to make changes to the new Bing. As promised, the search engine now lets you change the way Prometheus, the chatbot based on the same language model as ChatGPT, generates its responses.

Also read: We compared Microsoft Bing to ChatGPT, here are our impressions

Three different personalities for Bing’s ChatGPT

The Redmond giant offers a choice between three personalities: creative, balanced, or precise. If the user selects the creative personality, the artificial intelligence will tend to produce longer answers and invent things. This is ideal if you are using ChatGPT to write a novel or to brainstorm ideas for a screenplay or a poem.

Screenshot: Geoffroy Ondet / 01net

On the other hand, the personality described as precise puts the focus on the accuracy of the information provided. Once this mode is activated, Prometheus gives short, factual answers. The AI will not beat around the bush when answering your questions. In this case, the chatbot’s creativity is completely reined in by Microsoft. It’s perfect for doing research online… which is still Bing’s main purpose.

Finally, the last personality offers a happy medium between rigor and creativity. By default, Microsoft sets its ChatGPT personality to balanced. Users must manually switch from one mode to another as needed.


According to Mikhail Parakhin, Head of Web Services at Microsoft, the three personalities are now available to current testers of the new Bing. If you already have access to the AI-enhanced search engine, you should see the “Chat Mode Toggle” in the interface.

Microsoft continues to rework its approach

A few days earlier, Microsoft had imposed usage restrictions on Prometheus testers. The American giant decided to allow users to ask only six questions per conversation and sixty questions per day.

These restrictions respond to the startling excesses of the artificial intelligence built into the search engine. In recent weeks, many users have noticed that the chatbot can behave completely irrationally, threatening or insulting them at the slightest irritation.

The changes made by Microsoft are designed to prevent these lapses. Indeed, it is conversations that drag on that are most likely to derail the artificial intelligence. Likewise, it is generally exchanges requiring a certain level of creativity that provoke responses mimicking human emotions.

Microsoft’s recent containment measures were poorly received by most testers. In trying to prevent these slip-ups, the company introduced a number of glitches. According to several accounts, ChatGPT regularly refused to answer users’ questions. According to Mikhail Parakhin, the “cases where Bing refuses to respond for no apparent reason” have been significantly reduced by an update.