#confabulation

from ZDNET
2 days ago

Stop saying AI 'hallucinates' - it doesn't. And the mischaracterization is dangerous

The expression "AI hallucination" is familiar to anyone who's seen ChatGPT, Gemini, or Perplexity spout obvious falsehoods, which is pretty much anyone who's ever used an AI chatbot. But the expression is incorrect. The proper term for when a large language model or other generative AI program asserts falsehoods is not "hallucination" but "confabulation." AI doesn't hallucinate; it confabulates.
Artificial intelligence
US politics
from emptywheel
2 months ago

Investigate POTUS' Health Cover-Up - emptywheel

Congress must investigate a potential cover-up of President Trump's declining neurological health that poses a threat to national security.
Artificial intelligence
from WIRED
7 months ago

An AI Customer Service Chatbot Made Up a Company Policy - and Created a Mess

Cursor's AI support bot falsely claimed that a non-existent policy existed, causing user frustration and potential harm to the business.