Inception Capsule Network for Retinal Blood Vessel Segmentation and Centerline Extraction

Automatic segmentation and centerline extraction of retinal blood vessels from fundus image data is crucial for the early detection of retinal diseases. We have developed a novel deep learning method for segmentation and centerline extraction of retinal blood vessels that combines the Capsule network with the Inception architecture. Compared to state-of-the-art deep convolutional neural networks, our method has far fewer parameters due to its shallow architecture, and it generalizes well without data augmentation. We performed a quantitative evaluation on the DRIVE dataset for both vessel segmentation and centerline extraction. Our method achieved state-of-the-art performance for vessel segmentation and outperformed existing methods for centerline extraction.
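The Capsule network component mentioned above relies on the "squash" nonlinearity introduced by Sabour et al., which maps each capsule's pose vector to a length in [0, 1) that can be read as an existence probability. The abstract does not give implementation details, so the following NumPy sketch is purely illustrative of that standard building block, not the authors' code:

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Capsule "squash" nonlinearity: rescales each capsule vector s so that
    # its length lies in [0, 1) while its direction is preserved:
    #   v = (|s|^2 / (1 + |s|^2)) * (s / |s|)
    # eps avoids division by zero for all-zero capsules.
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    scale = sq_norm / (1.0 + sq_norm)
    return scale * s / np.sqrt(sq_norm + eps)

# Example: two capsules with 4-dimensional pose vectors.
caps = np.array([[1.0, 0.0, 0.0, 0.0],    # short vector -> length ~0.5
                 [10.0, 0.0, 0.0, 0.0]])  # long vector  -> length near 1
lengths = np.linalg.norm(squash(caps), axis=-1)
```

Long input vectors are squashed to lengths close to 1 and short ones toward 0, which is what lets a capsule's vector length act as a confidence score for the presence of a feature (here, a vessel structure).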