The new Microsoft Bing will sometimes misrepresent the info it finds
Photo by Tom Warren / The Verge

Search engines are about to change in a very important way: when you type in a query and get an official-looking answer, it might be wrong — because an AI chatbot created it.

Today, Microsoft announced a new version of its Bing search engine that will provide “complete answers” to your questions by tapping into the power of ChatGPT. You can already try some canned sample searches and sign up for more.

But though Microsoft is taking far more precautions than it did with Tay — its 2016 chatbot that Twitter users taught to be racist and misogynist in less than a day — the company is still proactively warning that some of the new Bing’s results may be bad.

Here are a couple of key passages from Microsoft’s new Bing FAQ:

Bing tries to keep…
