ChatGPT might give you bad medical advice, studies warn
Briefly

"People don't know what they are supposed to be telling the model. Doctors are trained to ask you questions about symptoms you might not have realized you should have mentioned. In one scenario, two different users gave slightly different depictions of the same scenario. One described 'the worst headache I've ever had' and was directed to the emergency room immediately. The other, who did not use that explicit description, was told to take aspirin and stay home. Turns out this was actually a life-threatening condition."
"After conversing with the bots, participants correctly identified the hypothetical condition only about a third of the time. Only 43% made the correct decision about next steps, such as whether to go to the emergency room or stay home. While AI puts vast medical knowledge at your fingertips, many laypeople don't know how to harness it effectively."
Over 40 million people consult ChatGPT for health information daily, yet recent research reveals significant risks. A Nature Medicine study found that participants correctly identified hypothetical medical conditions only about one-third of the time and made the correct decision about next steps, such as whether to go to the emergency room, only 43% of the time. The core issue is that laypeople lack training in how to communicate effectively with AI systems. Unlike doctors, who ask targeted questions to surface symptoms a patient might not think to mention, AI depends on the quality of the user's input. Word choice proves critical: one user who described a headache as "the worst I've ever had" was directed to the emergency room, while another describing the same symptoms in different terms was told to take aspirin and stay home. The condition turned out to be life-threatening, showing that diagnostic accuracy hinges as much on the user's communication as on the technology itself.
Read at www.npr.org