“AI chatbot got election info wrong 30 percent of time, European study finds”

Washington Post:

Want accurate information about elections? Don’t ask an AI chatbot, experts warn — even if it seems confident of its answers and cites seemingly trustworthy sources.

New research from a pair of European nonprofits finds that Microsoft’s Bing AI chatbot, recently rebranded as Microsoft Copilot, gave inaccurate answers to 1 out of every 3 basic questions about candidates, polls, scandals and voting in a pair of recent election cycles in Germany and Switzerland. In many cases, the chatbot misquoted its sources.

The problems were not limited to Europe, with similar questions eliciting inaccurate responses about the 2024 U.S. elections as well.

There has been a concerted effort to move state and local governments to dot-gov domains, and findings like these may add pressure to channel voters toward official sites for election information as well.
