Lesion-Aware Segmentation Network for Atrophy and Detachment of Pathological Myopia on Fundus Images

Pathological myopia, which can lead to vision loss, is widely regarded as one of the most common threats to visual health worldwide. Identifying retinal lesions such as atrophy and detachment is valuable because it provides ophthalmologists with a quantified reference for accurate diagnosis and treatment. However, lesion segmentation on fundus photographs remains challenging due to the wide variability of the data and the complexity of lesion shapes: fundus images can differ markedly from one another because they are acquired by different devices under different conditions, and false-positive predictions on negative samples are hard to avoid. In this paper, we propose a novel lesion-aware segmentation network to segment atrophy and detachment on fundus images. Building on the standard paired encoder-decoder architecture, we introduce three innovations. First, the network is made aware of lesion presence through an additional classification branch. Second, a feature fusion module is integrated into the decoder so that the output node fully absorbs features at multiple scales. Third, the network is trained with an objective function, called the edge overlap rate, that improves the model's sensitivity to lesion edges. The proposed network won the PALM challenge at ISBI 2019 by a large margin, which can be taken as evidence of its effectiveness; our team, PingAn Smart Health, led the leaderboards on all metrics within the lesion segmentation task. Permission to use the dataset outside the PALM challenge was granted by the sponsor.
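
The abstract only names the three components (classification branch, decoder feature fusion, edge overlap rate loss) without publishing code or formulas. The sketch below is an illustrative reconstruction under our own assumptions: the class and function names, channel widths, and the Dice-style soft edge overlap term are guesses for exposition, not the authors' implementation.

```python
# Illustrative sketch only: module names, widths, and the edge loss form are
# assumptions; the paper's actual architecture and loss may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LesionAwareSegNet(nn.Module):
    """Encoder-decoder with an auxiliary lesion-presence classification branch
    and multi-scale feature fusion in the decoder."""

    def __init__(self, in_ch=3, num_classes=1, widths=(32, 64, 128, 256)):
        super().__init__()
        self.encoders = nn.ModuleList()
        prev = in_ch
        for w in widths:
            self.encoders.append(nn.Sequential(
                nn.Conv2d(prev, w, 3, padding=1), nn.BatchNorm2d(w), nn.ReLU(inplace=True),
                nn.Conv2d(w, w, 3, padding=1), nn.BatchNorm2d(w), nn.ReLU(inplace=True)))
            prev = w
        # Classification branch: predicts whether the lesion is present at all,
        # which helps suppress false positives on negative samples.
        self.cls_head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                      nn.Linear(widths[-1], 1))
        # Decoder blocks that fuse upsampled deep features with skip features.
        self.decoders = nn.ModuleList()
        ws = list(widths)
        for i in range(len(ws) - 1, 0, -1):
            self.decoders.append(nn.Sequential(
                nn.Conv2d(ws[i] + ws[i - 1], ws[i - 1], 3, padding=1),
                nn.BatchNorm2d(ws[i - 1]), nn.ReLU(inplace=True)))
        self.seg_head = nn.Conv2d(widths[0], num_classes, 1)

    def forward(self, x):
        skips = []
        for i, enc in enumerate(self.encoders):
            x = enc(x)
            skips.append(x)
            if i < len(self.encoders) - 1:
                x = F.max_pool2d(x, 2)
        cls_logit = self.cls_head(x)  # lesion present / absent
        for j, dec in enumerate(self.decoders):
            skip = skips[-(j + 2)]
            x = F.interpolate(x, size=skip.shape[-2:], mode="bilinear", align_corners=False)
            x = dec(torch.cat([x, skip], dim=1))  # fuse features across scales
        seg_logit = self.seg_head(x)
        return seg_logit, cls_logit


def soft_edge_overlap_loss(pred_logit, target, eps=1e-6):
    """A plausible edge-sensitive term: Dice-style overlap between soft edge
    maps, with edges approximated by a morphological gradient (max-pooling).
    The paper's 'edge overlap rate' is not defined here and may differ."""
    pred = torch.sigmoid(pred_logit)

    def edges(m):
        dilated = F.max_pool2d(m, 3, stride=1, padding=1)
        eroded = -F.max_pool2d(-m, 3, stride=1, padding=1)
        return dilated - eroded

    pe, te = edges(pred), edges(target)
    inter = (pe * te).sum(dim=(2, 3))
    return (1 - (2 * inter + eps) /
            (pe.sum(dim=(2, 3)) + te.sum(dim=(2, 3)) + eps)).mean()


if __name__ == "__main__":
    net = LesionAwareSegNet()
    img = torch.randn(2, 3, 256, 256)
    seg, cls = net(img)          # seg: (2, 1, 256, 256), cls: (2, 1)
    mask = (torch.rand(2, 1, 256, 256) > 0.5).float()
    loss = soft_edge_overlap_loss(seg, mask)
```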