The company reined in the Bing AI's responses after early users noticed strange behavior during long chats and "entertainment" sessions. As The Verge observes, the restrictions irked some users, as the chatbot would simply decline to answer certain questions. Microsoft has been gradually lifting the limits since then, and just this week updated the AI to reduce both its unresponsiveness and its "hallucinations." The bot may not be as wonderfully weird as before, but it should be more willing to indulge your curiosity.