Breast cancer histopathological image classification using attention high‐order deep network
Abstract
Computer‐aided classification of pathological images is of great significance for breast cancer diagnosis. In recent years, deep learning methods for breast cancer pathological image classification have made breakthrough progress and become the mainstream in this field. To capture more discriminative deep features from breast cancer pathological images, this work introduces a novel attention high‐order deep network (AHoNet) that simultaneously embeds an attention mechanism and a high‐order statistical representation into a residual convolutional network. AHoNet first employs an efficient channel attention module, with no dimensionality reduction and local cross‐channel interaction, to obtain locally salient deep features of breast cancer pathological images. Second‐order covariance statistics of these features are then estimated via matrix power normalization, which provides a more robust global feature representation. We extensively evaluate AHoNet on the public BreakHis and BACH breast cancer pathology datasets. Experimental results show that AHoNet attains patient‐level classification accuracies of 99.29% and 85% on the BreakHis and BACH databases, respectively, demonstrating performance competitive with state‐of‐the‐art single models on this medical imaging application.
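The two components described in the abstract, efficient channel attention and matrix-power-normalized covariance pooling, can be sketched as below. This is a minimal NumPy illustration under our own assumptions, not the authors' implementation: the ECA kernel here is a fixed averaging filter standing in for a learned 1-D convolution, and the function names are hypothetical.

```python
import numpy as np

def eca_attention(feat, k=3):
    """Efficient channel attention sketch: global average pooling gives one
    descriptor per channel; a 1-D convolution of size k across channels
    (no dimensionality reduction, only local cross-channel interaction)
    feeds a sigmoid gate that reweights the feature map.
    feat: (C, H, W) feature map."""
    c = feat.shape[0]
    desc = feat.mean(axis=(1, 2))             # (C,) channel descriptors
    padded = np.pad(desc, k // 2, mode="edge")
    kernel = np.ones(k) / k                   # fixed kernel; learned in the real model
    conv = np.array([padded[i:i + k] @ kernel for i in range(c)])
    gate = 1.0 / (1.0 + np.exp(-conv))        # sigmoid gate per channel
    return feat * gate[:, None, None]

def mpn_cov(feat, alpha=0.5):
    """Second-order pooling sketch: channel covariance matrix followed by
    matrix power normalization (alpha = 1/2 is the matrix square root).
    feat: (C, H, W) -> (C, C) normalized covariance representation."""
    c = feat.shape[0]
    x = feat.reshape(c, -1)                   # (C, N) with N = H * W
    x = x - x.mean(axis=1, keepdims=True)     # center spatial samples
    cov = x @ x.T / x.shape[1]                # (C, C) covariance
    w, v = np.linalg.eigh(cov)                # symmetric eigendecomposition
    w = np.clip(w, 0, None) ** alpha          # power applied to eigenvalues
    return v @ np.diag(w) @ v.T

rng = np.random.default_rng(0)
fmap = rng.standard_normal((8, 4, 4))         # toy 8-channel feature map
pooled = mpn_cov(eca_attention(fmap))
print(pooled.shape)                           # (8, 8) global representation
```

The (C, C) matrix from `mpn_cov` is flattened and fed to the classifier in architectures of this kind; the power normalization tempers dominant eigenvalues, which is what makes the global representation more robust than raw covariance.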