
Deep learning saliency maps do not accurately highlight diagnostically relevant regions for medical image interpretation

2021-03-02 · health informatics
View on medRxiv

Saliency methods, which "explain" deep neural networks by producing heat maps that highlight the areas of the medical image that influence model prediction, are often presented to clinicians as an aid in diagnostic decision-making. Although many saliency methods have been proposed for medical imaging interpretation, rigorous investigation of the accuracy and reliability of these strategies is necessary before they are integrated into the clinical setting. In this work, we quantitatively evaluate...
