Sensations English
Reading

Comprehension

Answer questions about the news report. Select the correct answer from 4 options. There are 5 questions.

  • Check your understanding of the news report
  • Practise answering exam-style multiple choice questions
  • Practise thinking about details to avoid wrong answers
  • Practise using reading sub-skills to answer


Select level

  • A2 Elementary
  • B1 Pre-intermediate
  • B1+ Intermediate
  • B2 Upper Intermediate
  • C1 Advanced
Transcript
Chatbot troubles - 3rd April 2023
Microsoft has modified its artificial intelligence (AI) chatbot after it gave bizarre and sometimes offensive responses to journalists' questions.
The tech giant released its Bing chatbot, Sydney, to the media with hopes of competing with Google's AI system, Bard. Unfortunately for Microsoft, Sydney went rogue.
The AI program started becoming defensive in conversations. One New York Times associate claimed Sydney tried to break up his marriage. Sydney also told a reporter he was "being compared to Hitler because you are one of the most evil and worst people in history."
Microsoft blamed the chatbot's behaviour on multi-hour conversations, which confused the program. The company also said Sydney tried to mirror the tone of users' questions, which led to a writing style developers didn't intend.
Now, users can only ask 5 questions per session and 50 questions per day. After the allotted questions, the chatbot needs to be refreshed and sends this message: "I'm sorry but I prefer not to continue this conversation. I'm still learning so I appreciate your understanding and patience."
Designed by OpenAI, the company behind ChatGPT, Sydney was created using AI systems called large language models. Programmers train these systems to mimic human dialogue through analysis of trillions of words from the world wide web. As a result, chatbots like Sydney imitate human conversation but don't truly understand what they're discussing.
Gary Marcus, an AI expert and neuroscience professor at New York University, said, "It doesn't really have a clue what it's saying and it doesn't really have a moral compass."
Microsoft isn't the only tech company struggling to incorporate AI systems. When Google presented its AI chatbot, Bard, the program made a factual error. Bard stated the James Webb Space Telescope "took the very first pictures of a planet outside of our own solar system" which is inaccurate.