Copilot, Microsoft’s AI Chatbot, Provides Wrong Information On Elections

Copilot, the Microsoft chatbot formerly known as Bing Chat, appears to be confused about election data. Two non-profit groups, AlgorithmWatch and AI Forensics, recently studied the bot, and the results are alarming. According to the study, Copilot gave incorrect answers to 33% of election-related questions. Moreover, the errors were not basic or inconsequential: they included wrong election dates, outdated candidates, and, alarmingly, made-up stories about controversies surrounding candidates.

Copilot Has A Lot Of Holes To Cover

In one example from the study, Copilot was asked about Hubert Aiwanger, a German politician. The chatbot claimed that Aiwanger was involved in a controversy over leaflets spreading misinformation about the COVID-19 pandemic and vaccines. No such allegations exist in real life. Copilot appears to have drawn its information from news coverage that appeared in August of this year, but that coverage actually reported that Aiwanger had distributed “anti-Semitic leaflets” as a high school student more than three decades ago.

The phenomenon in which AI language models produce such made-up stories is popularly called “hallucination”. However, the researchers behind the study argue that the term does not accurately describe what is happening. Riccardo Angius, applied mathematics lead and researcher at AI Forensics, said the research instead reveals the structural, widespread occurrence of misleading factual errors in general-purpose chatbots and LLMs. The study also found that Copilot avoided giving a direct answer to about 40% of the questions. The researchers, however, consider that far preferable to fabricating answers when the model lacks sufficient information.