Abstract

Plant diseases are a major cause of degraded fruit quality and yield losses. These losses can be significantly reduced through early detection and timely treatment, particularly in developing countries. To this end, an expert system based on a deep learning model is proposed, in which expert knowledge, particularly that of plant pathologists, is recursively learned by the system and delivered to the target field environment through a smartphone application. In this paper, a robust disease detection method is developed based on convolutional neural networks (CNNs), whose powerful feature extraction capabilities are leveraged to detect diseases in images of fruits and leaves. The feature extraction pipelines of several state-of-the-art pretrained networks are fine-tuned to achieve optimal detection performance. A novel dataset is collected from peach orchards and extensively augmented using both label-preserving and non-label-preserving transformations. The augmented dataset is used to study the effect of fine-tuning the pretrained networks' feature extraction pipelines as opposed to keeping the network parameters unchanged. The CNN models, particularly EfficientNet, exhibit superior performance on the target dataset once their feature extraction pipelines are fine-tuned. The optimal model achieves 96.6% average accuracy, 90% sensitivity and precision, and 98% specificity on the test set of images.
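The sketch below illustrates the transfer-learning setup the abstract describes: an ImageNet-pretrained EfficientNet backbone whose feature extraction pipeline is either kept frozen or fine-tuned, preceded by simple label-preserving augmentation. It is a minimal illustration under assumptions (TensorFlow/Keras, EfficientNetB0, 224x224 inputs, a hypothetical NUM_CLASSES), not the authors' implementation.

import tensorflow as tf

NUM_CLASSES = 5          # assumption: number of disease/healthy classes in the dataset
IMG_SHAPE = (224, 224, 3)

def build_model(fine_tune_backbone: bool) -> tf.keras.Model:
    # ImageNet-pretrained feature extraction pipeline (classification head removed)
    backbone = tf.keras.applications.EfficientNetB0(
        include_top=False, weights="imagenet", input_shape=IMG_SHAPE)
    # Frozen backbone = fixed feature extractor; unfrozen = fine-tuned pipeline
    backbone.trainable = fine_tune_backbone

    inputs = tf.keras.Input(shape=IMG_SHAPE)
    # Label-preserving augmentation layers (active only during training)
    x = tf.keras.layers.RandomFlip("horizontal")(inputs)
    x = tf.keras.layers.RandomRotation(0.1)(x)
    # Keep BatchNorm layers in inference mode, as is common practice when fine-tuning
    x = backbone(x, training=False)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)

    model = tf.keras.Model(inputs, outputs)
    # Use a smaller learning rate when fine-tuning to avoid destroying pretrained weights
    lr = 1e-5 if fine_tune_backbone else 1e-3
    model.compile(optimizer=tf.keras.optimizers.Adam(lr),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Comparing a frozen feature extractor against a fine-tuned pipeline (train_ds/val_ds
# are hypothetical tf.data datasets of images and integer labels):
# frozen = build_model(fine_tune_backbone=False)
# tuned  = build_model(fine_tune_backbone=True)
# frozen.fit(train_ds, validation_data=val_ds, epochs=10)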

Details

Title
EfficientNet-Based Robust Recognition of Peach Plant Diseases in Field Images
Author
Haleem Farman; Jamil Ahmad; Bilal Jan; Yasir Shahzad; Muhammad Abdullah; Atta Ullah
Pages
2073-2089
Section
ARTICLE
Publication year
2022
Publication date
2022
Publisher
Tech Science Press
ISSN
1546-2218
e-ISSN
1546-2226
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2604986261
Copyright
© 2022. This work is licensed under the Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/).