A Multi-scale Graph Network with Multi-head Attention for Histopathology Image Diagnosis

Hematoxylin-eosin (H&E) staining plays an essential role in brain glioma diagnosis, but reading pathology images and generating diagnostic reports is tedious and laborious work. Pathologists need to navigate and combine extremely large images at different scales and to quantify multiple aspects of the tissue for subtyping. In this work, we propose an automatic diagnosis algorithm that identifies cell types and severity in H&E slides in order to classify five major subtypes of glioma from whole-slide pathology images. The proposed method features a pyramid graph structure and an attention-based multiple-instance learning strategy. We claim that our method not only improves classification accuracy by utilizing multi-scale information, but also helps identify high-risk patches. We summarize patches from multiple resolutions into a graph structure: the nodes of the pyramid graph are feature vectors extracted from image patches, and nodes are connected according to their spatial adjacency. We then feed the graph into the proposed model, which combines self-attention and graph convolutions. We use a multi-head self-attention architecture in which identical self-attention blocks are stacked in parallel; as shown in Transformer networks, the multiple attention maps capture comprehensive activation patterns from different subspace representations. Using the proposed method, we obtain 71% accuracy for glioma subtyping, and the multi-resolution attention maps it generates can help locate proliferation and necrosis across the whole pathology slide.
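
For concreteness, below is a minimal, hypothetical PyTorch sketch of the kind of pipeline the abstract describes: multi-head self-attention over patch-graph nodes, a simple graph convolution over the spatial adjacency, and attention-based multiple-instance pooling into a slide-level prediction. The class name, layer sizes, and the adjacency normalization are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionMIL(nn.Module):
    """Hypothetical sketch: multi-head self-attention over patch-graph nodes,
    a simple graph convolution, and attention-based MIL pooling."""

    def __init__(self, in_dim=512, hidden_dim=256, num_heads=4, num_classes=5):
        super().__init__()
        self.proj = nn.Linear(in_dim, hidden_dim)
        # Multi-head self-attention: identical attention blocks run in parallel.
        self.self_attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        # Graph convolution approximated as neighborhood averaging + linear map.
        self.gcn = nn.Linear(hidden_dim, hidden_dim)
        # Attention-based MIL pooling: score each node, weight-sum into a slide embedding.
        self.mil_attn = nn.Sequential(nn.Linear(hidden_dim, 128), nn.Tanh(), nn.Linear(128, 1))
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, x, adj):
        # x:   (num_patches, in_dim) patch feature vectors (graph nodes)
        # adj: (num_patches, num_patches) adjacency from spatial/scale neighborhoods
        h = self.proj(x).unsqueeze(0)                # (1, N, hidden)
        h, _ = self.self_attn(h, h, h)               # multi-head self-attention
        h = h.squeeze(0)
        # Row-normalize adjacency (with self-loops) and aggregate neighbors.
        a = adj + torch.eye(adj.size(0))
        a = a / a.sum(dim=1, keepdim=True)
        h = F.relu(self.gcn(a @ h))                  # graph convolution
        # Attention-MIL pooling over nodes -> slide-level embedding.
        w = torch.softmax(self.mil_attn(h), dim=0)   # (N, 1) patch weights
        slide = (w * h).sum(dim=0)                   # (hidden,)
        return self.classifier(slide), w             # logits and patch attention


# Usage with dummy data: 200 patch features of dimension 512 and a random adjacency.
feats = torch.randn(200, 512)
adj = (torch.rand(200, 200) > 0.95).float()
logits, patch_attn = GraphAttentionMIL()(feats, adj)
```

The returned per-patch attention weights are what would be mapped back onto the slide to highlight high-risk regions such as proliferation and necrosis.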


