The More Scientists Work With AI, the Less They Trust It
Briefly

"Anxiety over security and privacy were up 11 percent from last year, while concerns over ethical AI and transparency also ticked up. In addition, there was a massive drop-off in hype compared to last year, when buzzy AI research startups dominated headline after headline. In 2024, scientists surveyed said they believed AI was already surpassing human abilities in over half of all use cases. In 2025, that belief dropped off a cliff, falling to less than a third."
"While more studies are needed to show how widespread this phenomena is, it's not hard to guess why professionals would start to have doubts about their algorithmic assistants. For one thing, those hallucinations are a serious issue. They've already caused major turmoil in courts of law, medical practice, and even travel. It's not exactly a simple fix either; in May, testing showed that AI models were hallucinating more even as they technically became more powerful."
Scientists' trust in AI declined in 2025 even as their use of it rose: concern about hallucinations grew from 51 percent in 2024 to 64 percent in 2025, while AI adoption among researchers climbed from 45 to 62 percent. Anxiety about security and privacy rose 11 percent from the prior year, and worries about ethical AI and transparency also increased. The share of use cases in which surveyed scientists believed AI had already surpassed human abilities fell from more than half in 2024 to less than a third in 2025, and greater technical understanding correlated with lower trust. Hallucinations have already caused major problems in courts, medicine, and travel, and testing in May showed models hallucinating more even as they became technically more powerful.
Read at Futurism