
"ChatGPT may be an excellent tool in case your strongly-worded email to your landlord about that ceiling leak needs a second pair of eyes. It also excels at coming up with a rough first draft for non-mission-critical writing, allowing you to carefully pick it apart and refine it. But like all of its competitors, ChatGPT is plagued by plenty of well-documented shortcomings as well, from rampant hallucinations to a sycophantic tone that can easily lull users into gravely mistaken beliefs."
"In a column for Nature, Bucher admitted he'd "lost" two years' worth of "carefully structured academic work" - including grant applications, publication revisions, lectures, and exams - after turning off ChatGPT's "data consent" option. He disabled the feature because he "wanted to see whether I would still have access to all of the model's functions if I did not provide OpenAI with my data." But to his dismay, the chats disappeared without a trace in an instant."
A researcher lost two years' worth of carefully structured academic work, including grant applications, publication revisions, lectures, and exams, after disabling ChatGPT's data consent setting. He had disabled the feature to test whether the model's functions would remain accessible without sharing his data, and his chats disappeared instantly, with no warning and no undo option. The incident generated social media backlash and prompted calls for accountability, while some colleagues emphasized human error and a flawed workflow. The case shows that AI chat services can delete stored conversations without safeguards, making local backups and cautious integration into academic workflows essential. Depending on AI tools for critical tasks creates vulnerability to data loss and undermines academic continuity.
Read at Futurism