Show simple item record

dc.contributor.author  Shanto, Abdullah Al Alam
dc.contributor.author  Esha, Mumtahina
dc.contributor.author  Alam, Sadia
dc.date.accessioned  2026-04-09T06:10:21Z
dc.date.available  2026-04-09T06:10:21Z
dc.date.issued  2025-12
dc.identifier.uri  https://ar.iub.edu.bd/handle/11348/1068
dc.description.abstract  Retinal diseases, including Age-related Macular Degeneration (AMD), Diabetic Retinopathy (DR), Cataract, and Myopia, are among the most prevalent causes of irreversible vision loss worldwide. Early and accurate diagnosis is vital for effective treatment and management, yet manual assessment of retinal fundus images is time-consuming, subject to inter-observer variability, and often challenging due to subtle pathological features. Motivated by these clinical demands, we experimented with the classification of retinal diseases using deep learning models, starting from conventional deep learning methods and advancing towards a bilateral ensemble solution. The first phase of our experiments used convolutional neural networks and vision transformers for multi-class classification of individual ocular fundus images. These conventional approaches showed promise, but they were limited in their capacity to extract the complex features specific to each disease and restricted to analysing a single eye, which can make their diagnostic accuracy unreliable, since single-eye diagnosis may miss correlations between the two eyes of the same individual. These limitations were evident in experimental scenarios where both eyes presented disease cues and the method could not capture the complementary information. In view of these limitations, the next stage of experimentation employed various ensemble learning strategies, which sought to build a richer feature set by combining the predictions of several state-of-the-art deep learning architectures. A comparative analysis of decision ensemble and feature ensemble techniques found that, although the ensemble methods improved overall classification over conventional models, certain problems remained.
Most importantly, these ensemble strategies still treated the images from each eye as separate entities, ignoring valuable information present in both eyes of a subject affected by the same eye disease. Our major contribution, based on these insights, is the design and validation of a custom bilateral ensemble framework for retinal disease classification. The framework is distinctive in combining corresponding fundus images from both eyes of a subject through a ConvNeXt-XLarge backbone with preprocessing and a bilateral feature fusion technique. By using the disease features of both eyes together, the proposed framework showed promising performance for multi-class disease detection. Comprehensive experiments on the publicly available OIA-ODIR dataset demonstrated that the bilateral ensemble approach overcomes the limitations of conventional methods, delivering improved and more reliable results. The research establishes a foundation for advanced deep learning in eye disease detection. Our work demonstrates that systematic improvements to data presentation and system architecture can lead to better diagnostic performance and greater potential for artificial-intelligence-based systems in medicine. These findings support future computational ophthalmology research and the development of practical diagnostic tools for global eye health.  en_US
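The abstract contrasts decision-level ensembling with the proposed bilateral feature fusion. The following is a minimal sketch of both ideas, not the thesis implementation: the class labels, feature dimensions, and toy linear head are illustrative assumptions, and the ConvNeXt-XLarge backbone is not reproduced here.

```python
# Hedged sketch: (1) decision ensemble via soft voting over per-model class
# probabilities, and (2) bilateral fusion, which concatenates per-eye feature
# vectors so a single classifier sees both eyes of a subject jointly.
# All names and numbers below are illustrative, not taken from the thesis.

CLASSES = ["AMD", "DR", "Cataract", "Myopia", "Normal"]

def soft_vote(per_model_probs):
    """Decision ensemble: average class-probability vectors across models."""
    n_models = len(per_model_probs)
    n_classes = len(per_model_probs[0])
    return [sum(p[c] for p in per_model_probs) / n_models
            for c in range(n_classes)]

def fuse_bilateral(left_feats, right_feats):
    """Bilateral fusion: one joint representation for both eyes of a subject."""
    return list(left_feats) + list(right_feats)

def linear_head(features, weights, bias):
    """Toy linear classifier over the fused features: one score per class row."""
    return [sum(w * f for w, f in zip(row, features)) + b
            for row, b in zip(weights, bias)]

# Decision ensemble: two hypothetical backbones vote on one eye's image.
cnn = [0.10, 0.55, 0.20, 0.10, 0.05]   # CNN leans towards DR
vit = [0.15, 0.40, 0.30, 0.10, 0.05]   # ViT also leans towards DR
avg = soft_vote([cnn, vit])
print(CLASSES[max(range(len(avg)), key=avg.__getitem__)])  # -> DR

# Bilateral fusion: per-eye feature vectors are concatenated, so the head
# can exploit cross-eye correlations a single-eye model never sees.
fused = fuse_bilateral([0.2, 0.7], [0.9, 0.1])  # length-4 joint vector
```

Soft voting still judges each eye in isolation; the fusion step is what lets cross-eye disease cues influence a single prediction, which is the gap the bilateral framework targets.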
dc.language.iso  en  en_US
dc.publisher  IUB  en_US
dc.subject  Retinal Disease  en_US
dc.title  Ensemble Deep Learning for Retinal Disease Detection from Fundus Images: A Bilateral Ensemble Approach for Retinal Disease Classification  en_US
dc.type  Thesis  en_US




Copyright © 2002-2021  IUB Academic Repository.
Maintained by  Library Information Technology (LIT)