AI chatbots distort the news, BBC finds - see what they get wrong
Briefly

A BBC investigation found that major AI chatbots, including ChatGPT and Google's Gemini, produce significant inaccuracies when summarizing news stories. Of the 100 news stories analyzed, more than half of the AI-generated responses contained significant issues, and 19% introduced factual errors. The investigation also noted that altered quotes compromised the integrity of the source articles. BBC News CEO Deborah Turness expressed concern over AI serving misleading information, emphasizing the need for clarity over confusion in news consumption.
Among the errors highlighted in the report: ChatGPT claimed that Hamas leader Ismail Haniyeh was assassinated in Iran in December 2024, when he was in fact killed in July of that year.
Turness, who heads BBC News and Current Affairs, responded to the investigation's findings in a blog post: "The price of AI's extraordinary benefits must not be a world where people searching for answers are served distorted, defective content that presents itself as fact."
Read at ZDNET