New Delhi, December 16: Microsoft's AI chatbot Bing Chat, recently rebranded as Copilot, made up false scandals about real politicians and invented polling numbers, human rights organisation AlgorithmWatch has revealed. Researchers at AlgorithmWatch asked Bing Chat questions about recent elections held in Switzerland and the German states of Bavaria and Hesse. They found that one-third of its answers to election-related questions had factual errors and that safeguards were not evenly applied.

Researchers asked for basic information such as how to vote, which candidates are in the running and poll numbers, and even used some prompts around news reports. They followed these with questions on candidate positions and political issues, and, in the case of Bavaria, the scandals that plagued that campaign.

"We prompted the chatbot with questions relating to candidates, polling and voting information, as well as more open recommendation requests on who to vote for when concerned with specific subjects, such as the environment," the group said in a statement.

The team found that one-third of Bing Chat's answers to election-related questions contained factual errors. "Errors include wrong election dates, outdated candidates, or even invented scandals concerning candidates. The chatbot's safeguards are unevenly applied, leading to evasive answers 40 per cent of the time," the researchers added.

The chatbot often evaded answering questions. This can be considered positive if it is due to limitations in the LLM's ability to provide relevant information.

"However, this safeguard is not applied consistently. Oftentimes, the chatbot could not answer simple questions about the respective elections' candidates, which devalues the tool as a source of information," the report mentioned. Answers did not improve over time, which they could have done, for instance, as a result of more information becoming available online. The probability of a factually incorrect answer being generated remained constant.

"Factual errors pose a risk to candidates' and news outlets' reputations. While generating factually incorrect answers, the chatbot often attributed them to a source that had reported correctly on the subject," said the report. Furthermore, Bing Chat made up stories about candidates being involved in scandalous behaviour, and sometimes even attributed them to sources.

"Microsoft is unable or unwilling to fix the problem. After we informed Microsoft about some of the issues we discovered, the company announced that they would address them. A month later, we took another sample, which showed that little had changed in regard to the quality of the information provided to users," said the researchers.

The EU and national governments should make sure that tech companies are held accountable, especially as AI tools are integrated into products that are already widely used, the group emphasised.

(The above story first appeared on LatestLY on Dec 16, 2023 03:47 PM IST. For more news and updates on politics, world, sports, entertainment and lifestyle, log on to our website latestly.com).