Microsoft's chatbot troubles - B1


Sydney gets upset - 3rd April 2023

Microsoft is changing its artificial intelligence (AI) computer program after it gave odd and sometimes rude responses to journalists. The technology company created a chatbot for its Bing search engine called Sydney.

Unfortunately, Sydney began saying strange things in conversations with humans. One journalist said that Sydney tried to break up his marriage. Sydney also said another journalist was evil, like Hitler.

Microsoft explained that this happens because long conversations confuse the system. The company also said that Sydney sometimes tried to copy the tone of the people asking questions.

Now, users can only ask 5 questions per session and 50 questions per day. After these questions, the chatbot needs to be reset and a new conversation started.

Sydney was built by Microsoft using technology from the company OpenAI, based on AI systems called large language models. These AI systems use trillions of words from the internet to copy human conversation. Chatbots like Sydney can talk like people, but they don't fully understand what they are talking about.

Microsoft needed an AI chatbot to compete with Google's AI system, Bard. But Google has had similar problems with its AI chatbot. Bard said that the James Webb Space Telescope "took the very first pictures of a planet outside of our own solar system," which is incorrect.