CSET’s Josh A. Goldstein provided expert insight in a WIRED article examining how Microsoft’s AI chatbot, originally named Bing Chat and now called Microsoft Copilot, returns misinformation, conspiracy theories, and outdated information in response to political queries.
Amid growing concerns about the impact of generative AI on high-profile elections in 2024, Goldstein said, “The tendency to produce misinformation related to elections is problematic if voters treat outputs from language models or chatbots as fact.” He added, “If voters turn to these systems for information about where or how to vote, for example, and the model output is false, it could hinder democratic processes.”
Read the article in WIRED.