Jimma University Open Access Institutional Repository

Wheat Disease Identification: A Multi-Modal Approach using Deep Learning

dc.contributor.author Abdurezak Yisihak
dc.contributor.author Getachew Mamo
dc.contributor.author Tesfu Mekonen
dc.date.accessioned 2024-10-07T08:28:30Z
dc.date.available 2024-10-07T08:28:30Z
dc.date.issued 2024-06
dc.identifier.uri https://repository.ju.edu.et//handle/123456789/9289
dc.description.abstract Wheat, as a global staple crop, is vital for food security, yet it remains highly susceptible to various diseases that threaten its yield. Traditional methods of disease identification, which rely largely on visual inspection, are inadequate and often result in misdiagnosis and delayed intervention. This research aims to transform wheat disease detection by developing and implementing an innovative deep learning system that integrates multi-modal data. Our study focuses on four prevalent wheat diseases: Brown Rust, Yellow Rust, Powdery Mildew, and Septoria. By building a hybrid model that combines Convolutional Neural Networks (CNNs) for image-based features and Feedforward Neural Networks (FNNs) for environmental variables, we aim to enhance the accuracy of disease identification. Data collection encompasses RGB images of both healthy and diseased wheat parts alongside crucial environmental data (altitude, temperature, humidity, and precipitation) collected from diverse Ethiopian regions: Holeta (highland), Jimma (midland), and Kemissie (lowland). A total of 5,012 samples were collected, of which around 3,000 belong to the diseased classes and the remainder to the healthy and invalid classes. This comprehensive dataset allows us to evaluate the hypothesis that "RGB images alone are not sufficient for disease identification in AI systems, and that considering environmental factors significantly improves accuracy." Utilizing the Keras Functional API, we integrate these diverse inputs to generate a unified output, showcasing the model's capability to handle complex, multi-modal data (a minimal sketch of this two-branch design is given after the item record below). The experimental design comprises three main experiments: a unimodal model with environmental data only, a unimodal model with image data only, and the multi-modal model. The multi-modal model achieves 98% test accuracy, outperforming the unimodal frameworks, which reach 85.82% with the FNN (multi-layer perceptron) and 94.41% with the CNN architecture. Analysis of the experimental results validates this hypothesis and establishes a robust, adaptive model capable of accurately diagnosing wheat diseases across varied environmental conditions. The findings have the potential to transform current practices in crop disease management, emphasizing the importance of integrating multi-modal data for more reliable and timely interventions. Future work should focus on expanding the dataset to include more diverse environmental conditions and wheat varieties by integrating with IoT. This will enhance the model's generalizability and ensure its applicability across different regions and climates. en_US
dc.language.iso en_US en_US
dc.title Wheat Disease Identification: A Multi-Modal Approach using Deep Learning en_US
dc.type Thesis en_US
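
The abstract describes fusing a CNN branch for RGB images with an FNN branch for environmental variables through the Keras Functional API. The following is a minimal illustrative sketch of such a two-branch design, not the thesis's actual architecture: the 224x224 input size, the layer widths, and the six-class output (four diseases plus healthy and invalid) are assumptions made for demonstration.

from tensorflow.keras import layers, Model

# CNN branch for RGB wheat images (input size is assumed)
image_in = layers.Input(shape=(224, 224, 3), name="rgb_image")
x = layers.Conv2D(32, 3, activation="relu")(image_in)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(64, 3, activation="relu")(x)
x = layers.MaxPooling2D()(x)
x = layers.GlobalAveragePooling2D()(x)

# FNN (multi-layer perceptron) branch for the four environmental
# variables: altitude, temperature, humidity, precipitation
env_in = layers.Input(shape=(4,), name="environment")
y = layers.Dense(32, activation="relu")(env_in)
y = layers.Dense(16, activation="relu")(y)

# Fuse both modalities and predict one of six assumed classes
# (Brown Rust, Yellow Rust, Powdery Mildew, Septoria, Healthy, Invalid)
z = layers.concatenate([x, y])
z = layers.Dense(64, activation="relu")(z)
out = layers.Dense(6, activation="softmax", name="disease_class")(z)

model = Model(inputs=[image_in, env_in], outputs=out)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

Training such a model requires pairing each image with the environmental record of the site it was collected from, e.g. model.fit([image_array, env_array], labels, ...).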

