The chief constable of one of Britain’s largest police forces has admitted that Microsoft’s Copilot AI assistant made a mistake in a football (soccer) intelligence report. The report, which led to Israeli football fans being banned from a match last year, included a nonexistent match between West Ham and Maccabi Tel Aviv.
Copilot hallucinated the game, and West Midlands Police included the error in its intelligence report. “On Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot [sic],” says Craig Guildford, chief constable of West Midlands Police, in a letter sent to the Home Affairs Committee earlier this week. In December, Guildford had denied that West Midlands Police used AI to prepare the report, blaming “social media scraping” for the error.
Maccabi Tel Aviv fans were banned from a Europa League match against Aston Villa last November after the Birmingham Safety Advisory Group deemed the match “high risk” following “violent clashes and hate crime offences” at a previous Maccabi match in Amsterdam.
As Microsoft warns at the bottom of its Copilot interface, “Copilot may make mistakes.” This is a particularly high-profile one, though. We tested Copilot recently, and my colleague Antonio G. Di Benedetto found that Microsoft’s AI assistant often “got things wrong” and “made stuff up.”
We’ve reached out to Microsoft for comment on why Copilot made up a football match that never existed, but the company didn’t respond in time for publication.
