EchoPrime, a video-based vision-language model, analyses echocardiogram footage and generates a written report of cardiac form and function. Its findings were published in Nature (volume 650, pages 970-977) in February 2026, under the title 'Comprehensive echocardiogram evaluation with view-primed vision-language AI.'
In a head-to-head comparison with five experienced physicians, each with more than a decade of practice, DeepRare achieved higher accuracy across the board: it correctly identified the disease on its first suggestion in 64.4 per cent of cases, against 54.6 per cent for the doctors. Allowed three suggestions instead of one, it reached the correct diagnosis 79 per cent of the time, versus 66 per cent for the human specialists.
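The first-suggestion versus three-suggestion comparison above is the standard "top-k accuracy" metric: a case counts as correct if the true diagnosis appears anywhere in the model's k highest-ranked suggestions. A minimal sketch, using entirely hypothetical diagnoses and ground-truth labels:

```python
def top_k_accuracy(ranked_predictions, truths, k):
    """Fraction of cases where the true label appears in the top-k ranked suggestions."""
    hits = sum(1 for ranked, truth in zip(ranked_predictions, truths)
               if truth in ranked[:k])
    return hits / len(truths)

# Hypothetical ranked differential diagnoses for four cases,
# each with ground truth "A".
preds = [["A", "B", "C"], ["B", "A", "C"], ["C", "A", "B"], ["A", "C", "B"]]
truth = ["A", "A", "A", "A"]

print(top_k_accuracy(preds, truth, 1))  # 0.5 — "A" ranked first in 2 of 4 cases
print(top_k_accuracy(preds, truth, 3))  # 1.0 — "A" always within the top three
```

By construction, top-3 accuracy can never be lower than top-1 accuracy, which is why both the AI system and the physicians improved when given three guesses.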
AI plays an important role, but not by fixing fragmented data on its own. The work of organizing, connecting, and interpreting healthcare information still belongs to people and the systems they build. Where AI helps is after that foundation is in place: by bringing the right information forward at the right time, reducing the effort it takes to find what matters, and supporting better decisions in the moment of care.