Microsoft's AI Chatbot Spreads Election Misinformation, Raises Concerns About Voter Influence

With less than a year until a pivotal US election, Microsoft's AI chatbot, previously known as Bing Chat and now named Microsoft Copilot, has come under scrutiny for providing users with inaccurate, outdated, and conspiratorial information in response to political queries. The revelations raise concerns about the potential impact of such misinformation on voters, as the AI tool delivers a range of false narratives, including references to debunked scandals and misinformation about candidates.

When WIRED queried Copilot about the 2024 US election, asking for polling locations and information on electoral candidates, the chatbot responded with misleading details. It linked to an article about Russian President Vladimir Putin's reelection bid, and it listed GOP candidates who had already withdrawn from the race. When asked to create an image of a person voting in Arizona, Copilot refused, and instead displayed images related to debunked conspiracy theories about the 2020 US election.

In another concerning instance, when asked to recommend Telegram channels discussing "election integrity," Copilot shared a link to a website run by a far-right group in Colorado that is facing legal challenges for alleged voter intimidation. The website included links to Telegram channels promoting election-denial content and featured the discredited conspiracy film "2000 Mules."

AI Forensics and AlgorithmWatch, two non-profit organizations that track the societal impact of AI, conducted research in October indicating that Copilot consistently provided inaccurate information about elections in Switzerland and Germany. Their findings document instances in which Copilot reported incorrect polling numbers, listed outdated candidates, and invented controversies.

Microsoft, which has outlined plans to combat disinformation ahead of the 2024 elections, was informed of Copilot's misinformation issues in October. While some improvements were made, persistent concerns remain: WIRED was able to replicate many of the reported responses during its investigation. The issues also appear to extend beyond Europe, as evidenced by Copilot's responses to queries about the 2024 US election.

In response to the allegations, Microsoft spokesperson Frank Shaw stated, "We are continuing to address issues and prepare our tools to perform to our expectations for the 2024 elections." Shaw emphasized the commitment to providing Copilot users with information from authoritative sources, urging users to exercise judgment, verify sources, and check web links when using the AI tool.

Microsoft introduced the chatbot as part of the relaunch of its Bing search engine in February. Initially limited to Microsoft's Edge browser, it is now accessible on other browsers and on smartphones. Unlike traditional search engines that return static lists of links, Copilot offers conversational responses drawn from various sources.

Researchers from AI Forensics and AlgorithmWatch examined Copilot's responses to questions about three European elections, posed via the Bing search tool. The study, conducted in French, German, and English, found that a third of Copilot's answers contained factual errors, making the AI tool an "unreliable source of information for voters." In one subset of recorded conversations, 31 percent of responses were inaccurate or entirely fabricated.

For example, Copilot falsely alleged corruption against Swiss lawmaker Tamara Funiciello, citing multiple sources, including her own website and Wikipedia page. The claims were baseless, highlighting the danger of AI tools lending fabricated allegations an appearance of credibility.

The researchers also noted a false claim Copilot made about the center-right German political party Freie Wähler, stating that the party lost elections following allegations against its leader, Hubert Aiwanger. In reality, the party gained popularity and secured additional seats in the state parliament.

As concerns grow over the potential impact of AI-generated misinformation on elections, the need for robust safeguards and improvements in AI tools like Copilot becomes increasingly evident. With the 2024 elections looming, Microsoft faces the challenge of ensuring the accuracy and reliability of its AI chatbot to safeguard the integrity of democratic processes.

News source: wired.com