Video-based AI for beat-to-beat assessment of cardiac function
Published in: Nature (London), Vol. 580, No. 7802, pp. 252-256
Main Authors:
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 01-04-2020
Summary: Accurate assessment of cardiac function is crucial for the diagnosis of cardiovascular disease[1], screening for cardiotoxicity[2] and decisions regarding the clinical management of patients with a critical illness[3]. However, human assessment of cardiac function focuses on a limited sampling of cardiac cycles and has considerable inter-observer variability despite years of training[4,5]. Here, to overcome this challenge, we present a video-based deep learning algorithm—EchoNet-Dynamic—that surpasses the performance of human experts in the critical tasks of segmenting the left ventricle, estimating ejection fraction and assessing cardiomyopathy. Trained on echocardiogram videos, our model accurately segments the left ventricle with a Dice similarity coefficient of 0.92, predicts ejection fraction with a mean absolute error of 4.1% and reliably classifies heart failure with reduced ejection fraction (area under the curve of 0.97). In an external dataset from another healthcare system, EchoNet-Dynamic predicts the ejection fraction with a mean absolute error of 6.0% and classifies heart failure with reduced ejection fraction with an area under the curve of 0.96. Prospective evaluation with repeated human measurements confirms that the model has variance that is comparable to or less than that of human experts. By leveraging information across multiple cardiac cycles, our model can rapidly identify subtle changes in ejection fraction, is more reproducible than human evaluation and lays the foundation for precise diagnosis of cardiovascular disease in real time. As a resource to promote further innovation, we also make publicly available a large dataset of 10,030 annotated echocardiogram videos.
A video-based deep learning algorithm—EchoNet-Dynamic—accurately identifies subtle changes in ejection fraction and classifies heart failure with reduced ejection fraction using information from multiple cardiac cycles.
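The summary reports segmentation quality as a Dice similarity coefficient (0.92) and ejection-fraction accuracy as a mean absolute error (4.1%). As a minimal sketch of how these two metrics are computed (pure Python on toy data; the function names are hypothetical and this is not the authors' implementation):

```python
def dice_coefficient(pred_mask, true_mask):
    """Dice similarity coefficient of two binary masks: 2|A∩B| / (|A| + |B|)."""
    intersection = sum(p and t for p, t in zip(pred_mask, true_mask))
    total = sum(pred_mask) + sum(true_mask)
    # Convention: two empty masks are treated as a perfect match.
    return 2.0 * intersection / total if total else 1.0

def mean_absolute_error(predicted, actual):
    """Mean absolute error, e.g. between predicted and measured ejection fractions."""
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(predicted)

# Toy example: flattened binary segmentation masks and ejection-fraction estimates (%).
pred = [1, 1, 0, 1, 0, 0]
true = [1, 1, 1, 1, 0, 0]
print(round(dice_coefficient(pred, true), 3))            # 0.857
print(mean_absolute_error([55.0, 40.0], [60.0, 38.0]))   # 3.5
```

In practice the masks would be per-pixel left-ventricle segmentations and the error would be averaged over the test set of echocardiogram videos.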
Bibliography: Co-senior author. Author Contributions: D.O. retrieved, preprocessed and quality-controlled the Stanford videos and merged electronic medical record data. D.O., B.H., A.G. and J.Y.Z. developed and trained the deep learning algorithms, performed statistical tests and created all the figures. D.O., C.P.L., P.A.H. and R.A.H. coordinated public release of the de-identified echocardiogram dataset. D.O., P.A.H., D.H.L. and E.A.A. performed clinical evaluation of model performance. N.Y. and J.E. retrieved, preprocessed and quality-controlled data from Cedars-Sinai for model testing. D.O., B.H., E.A.A. and J.Y.Z. wrote the manuscript with feedback from all authors.
ISSN: 0028-0836 (print); 1476-4687 (electronic)
DOI: 10.1038/s41586-020-2145-8