Detection of Rice Diseases: Leaf Blast, Bacterial Leaf Blight, and Brown Spot Using Image Enhancement and Faster Region-Based Convolutional Neural Network
Rice diseases such as bacterial leaf blight, leaf blast, and brown spot remain major constraints on food security and rural livelihoods across Southeast Asia, causing significant yield losses each year. In Indonesia, particularly in Lamongan, East Java, these pathogens threaten smallholder productivity and disrupt national rice supply chains. This study aims to improve automated rice disease detection under real agricultural conditions by integrating image preprocessing techniques with a deep learning-based detection framework. The main contribution is a hybrid pipeline that applies RGB-to-grayscale conversion and contrast stretching before model training, mitigating the low contrast and noise common in field-acquired image datasets. The enhanced images are then processed by a Faster Region-Based Convolutional Neural Network (Faster R-CNN) with a ResNet-50 backbone to localize and classify disease symptoms. Experiments on a dataset of 1,500 annotated rice leaf images achieved high detection performance, with accuracies of 97.37% for leaf blight, 94.12% for blast, and 95.24% for brown spot. Compared with the baseline Faster R-CNN model, the proposed approach improved classification accuracy from 0.8906 to 0.9297, reduced false negatives from 0.439 to 0.1998, increased foreground classification accuracy from 0.55 to 0.78, and decreased total loss from 0.839 to 0.6493. These results show that combining RGB-to-grayscale conversion with contrast stretching substantially enhances feature representation, yielding higher detection accuracy, lower error rates, and more stable training behavior. Overall, the proposed framework provides a robust and reliable approach to rice disease identification and offers strong potential for practical deployment in precision agriculture systems.
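The two preprocessing steps named in the abstract can be sketched as follows. This is a minimal pure-Python illustration, not the authors' implementation (which presumably operates on full images via a library such as OpenCV); the function names and toy pixel values are illustrative. Grayscale conversion uses the standard ITU-R BT.601 luminance weights, and contrast stretching linearly remaps the observed intensity range onto the full [0, 255] scale.

```python
def rgb_to_gray(pixel):
    """Convert an (R, G, B) pixel to luminance using ITU-R BT.601 weights."""
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b

def contrast_stretch(gray, lo=0.0, hi=255.0):
    """Linearly map the observed intensity range of `gray` onto [lo, hi]."""
    mn, mx = min(gray), max(gray)
    if mx == mn:
        return [lo] * len(gray)  # flat image: nothing to stretch
    scale = (hi - lo) / (mx - mn)
    return [lo + (v - mn) * scale for v in gray]

# Toy example: a low-contrast 2x2 grayscale patch flattened to a list.
# Intensities span only [100, 130]; stretching expands them to [0, 255].
patch = [100.0, 110.0, 120.0, 130.0]
stretched = contrast_stretch(patch)
```

After stretching, the darkest pixel maps to 0 and the brightest to 255, so low-contrast lesion boundaries become easier for the downstream detector to separate from healthy tissue.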
[1] Food and Agriculture Organization of the United Nations, World Food and Agriculture – Statistical Yearbook 2021. Rome, Italy: FAO, 2021.
[2] S. Savary, L. Willocquet, S. J. Pethybridge, P. Esker, N. McRoberts, and A. Nelson, “The global burden of pathogens and pests on major food crops,” Nat. Ecol. Evol., vol. 3, pp. 430–439, 2019, doi: 10.1038/s41559-018-0793-y
[3] P. Skamnioti and S. J. Gurr, “Against the grain: Safeguarding rice from rice blast disease,” Trends Biotechnol., vol. 27, no. 3, pp. 141–150, 2009, doi: 10.1016/j.tibtech.2008.12.002
[4] D. O. Niño-Liu, P. C. Ronald, and A. J. Bogdanove, “Xanthomonas oryzae pathovars: Model pathogens of a model crop,” Mol. Plant Pathol., vol. 7, no. 5, pp. 303–324, 2006, doi: 10.1111/j.1364-3703.2006.00344.x
[5] Ministry of Agriculture, Plant Protection Annual Report, 2022.
[6] A.-K. Mahlein, “Plant Disease Detection by Imaging Sensors - Parallels and Specific Demands for Precision Agriculture and Plant Phenotyping,” Plant Dis., vol. 100, no. 2, pp. 1–11, 2016, doi: 10.1094/PDIS-03-15-0340-FE
[7] S. Sladojevic, M. Arsenovic, A. Anderla, D. Culibrk, and D. Stefanovic, “Deep neural networks based recognition of plant diseases by leaf image classification,” Comput. Intell. Neurosci., vol. 2016, 2016, doi: 10.1155/2016/3289801
[8] S. P. Mohanty, D. P. Hughes, and M. Salathé, “Using deep learning for image-based plant disease detection,” Front. Plant Sci., vol. 7, 2016, doi: 10.3389/fpls.2016.01419
[9] E. C. Too, L. Yujian, S. Njuki, and L. Yingchun, “A comparative study of fine-tuning deep learning models for plant disease identification,” Comput. Electron. Agric., vol. 161, pp. 272–279, 2019, doi: 10.1016/j.compag.2018.03.032
[10] X. Zhang, Y. Qiao, F. Meng, C. Fan, and M. Zhang, “Identification of maize leaf diseases using improved deep convolutional neural networks,” IEEE Access, vol. 6, pp. 30370–30377, 2018, doi: 10.1109/ACCESS.2018.2844405
[11] Y. Lu, S. Yi, N. Zeng, Y. Liu, and Y. Zhang, “Identification of rice diseases using deep convolutional neural networks,” Neurocomputing, vol. 267, pp. 378–384, 2017, doi: 10.1016/j.neucom.2017.06.023
[12] G. Zhou, W. Zhang, A. Chen, M. He, and X. Ma, “Rapid detection of rice disease based on FCM-KM and Faster R-CNN fusion,” IEEE Access, vol. 7, pp. 143190–143206, 2019, doi: 10.1109/ACCESS.2019.2943454
[13] P. K. Sethy, N. K. Barpanda, A. K. Rath, and S. K. Behera, “Rice false smut detection based on Faster R-CNN,” Indones. J. Electr. Eng. Comput. Sci., vol. 19, no. 3, pp. 1590–1595, 2020, doi: 10.11591/ijeecs.v19.i3.pp1590-1595
[14] S. P. and J. Sil, “Rice disease identification using pattern recognition techniques,” in Proc. 11th Int. Conf. Comput. Inf. Technol., Khulna, Bangladesh, 2008, pp. 420–423, doi: 10.1109/ICCITECHN.2008.4803079
[15] S. Ramesh and D. Vydeki, “Recognition and classification of paddy leaf diseases using optimized deep neural network with Jaya algorithm,” Inf. Process. Agric., vol. 7, no. 2, pp. 249–260, 2020, doi: 10.1016/j.inpa.2019.09.002
[16] A. Fuentes, S. Yoon, S. C. Kim, and D. S. Park, “A robust deep-learning-based detector for real-time tomato plant diseases and pests recognition,” Sensors, vol. 17, no. 9, 2017, doi: 10.3390/s17092022
[17] K. P. Ferentinos, “Deep learning models for plant disease detection and diagnosis,” Comput. Electron. Agric., vol. 145, pp. 311–318, 2018, doi: 10.1016/j.compag.2018.01.009
[18] J. G. A. Barbedo, “Plant disease identification from individual lesions and spots using deep learning,” Biosyst. Eng., vol. 180, pp. 96–107, 2019, doi: 10.1016/j.biosystemseng.2019.02.002
[19] S. Coulibaly, B. Kamsu-Foguem, D. Kamissoko, and D. Traore, “Deep neural networks with transfer learning in millet crop images,” Comput. Ind., vol. 108, pp. 115–120, 2019, doi: 10.1016/j.compind.2019.02.003
[20] A. Picon et al., “Deep convolutional neural networks for mobile capture device-based crop disease classification in the wild,” Comput. Electron. Agric., vol. 161, pp. 280–290, 2019, doi: 10.1016/j.compag.2018.04.002
[21] W. H. Zeng et al., “Identification of maize leaf diseases using SKPSNet-50 Convolutional Neural Network Model,” Sustain. Comput. Informatics Syst., vol. 35, 2022, doi: 10.1016/j.suscom.2022.100695
[22] G. O. and E. Mwebaze, “Machine learning for plant disease incidence and severity measurements from leaf images,” in Proc. IEEE ICMLA, 2016, pp. 158–163, doi: 10.1109/ICMLA.2016.0034
[23] G. Polder et al., “Automatic detection of tulip breaking virus (TBV) using a deep convolutional neural network,” IFAC-PapersOnLine, vol. 52, no. 30, pp. 12–17, 2019, doi: 10.1016/j.ifacol.2019.12.482
[24] M. Shafay et al., “Recent advances in plant disease detection: Challenges and opportunities,” Plant Methods, vol. 21, no. 1, 2025, doi: 10.1186/s13007-025-01450-0
[25] A. Y. Ashurov et al., “Enhancing plant disease detection through deep learning,” Front. Plant Sci., vol. 15, 2024, doi: 10.3389/fpls.2024.1505857
[26] S. U. Khan et al., “A review on automated plant disease detection: motivation, limitations, challenges, and recent advancements for future research,” 2025, doi: 10.1007/s44443-025-00040-3
[27] A. Upadhyay et al., “Deep learning and computer vision in plant disease detection,” Artif. Intell. Rev., vol. 58, 2025, doi: 10.1007/s10462-024-11100-x
[28] L. C. Ngugi, M. Abelwahab, and M. Abo-Zahhad, “Recent advances in image processing techniques for automated leaf pest and disease recognition,” Inf. Process. Agric., vol. 8, no. 1, pp. 27–51, 2021, doi: 10.1016/j.inpa.2020.04.004
[29] P. Sridhar and P. Angamuthu, “Enhancing image based classification for crop disease detection using a multiclass SVM approach with kernel comparison,” Sci. Rep., vol. 15, art. no. 40055, 2025, doi: 10.1038/s41598-025-23568-w
[30] T. W. Mew, “Current status and future prospects of research on bacterial blight of rice,” Annu. Rev. Phytopathol., vol. 25, pp. 359–382, 1987, doi: 10.1146/annurev.py.25.090187.002043
[31] D. W. Utami et al., “The pathogenicity and genetic diversity of Indonesian blast pathogen from wide host ranges of rice sub-species,” J. Plant Pathol., vol. 107, pp. 661–673, 2025, doi: 10.1007/s42161-024-01815-9
[32] S. Ren, K. He, R. Girshick, and J. Sun, “Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 39, no. 6, pp. 1137–1149, 2017, doi: 10.1109/TPAMI.2016.2577031
[33] B. Liu, W. Zhao, and Q. Sun, “Study of object detection based on Faster R-CNN,” in Proc. 2017 Chinese Automation Congress (CAC), Jinan, China, 2017, pp. 6233–6236, doi: 10.1109/CAC.2017.8243900
[34] Z. Zou, K. Chen, Z. Shi, Y. Guo, and J. Ye, “Object detection in 20 years: A survey,” Proc. IEEE, vol. 111, no. 3, pp. 257–276, 2023, doi: 10.1109/JPROC.2023.3238524
[35] Y. Amit, P. Felzenszwalb, and R. Girshick, “Object detection,” in Computer Vision: A Reference Guide, Cham, Switzerland: Springer International Publishing, 2021, pp. 875–883, doi: 10.1007/978-3-030-63416-2_660
[36] M. Tan and Q. V. Le, “EfficientNet: Rethinking model scaling for convolutional neural networks,” in Proc. Int. Conf. Machine Learning (ICML), 2019, pp. 6105–6114, doi: 10.48550/arXiv.1905.11946
[37] Z. Liu et al., “Swin Transformer: Hierarchical vision transformer using shifted windows,” in Proc. IEEE/CVF Int. Conf. Computer Vision (ICCV), Montreal, QC, Canada, 2021, pp. 9992–10002, doi: 10.1109/ICCV48922.2021.00986
[38] T. Ahad et al., “Comparison of CNN-based architectures for rice disease classification,” Artif. Intell. Agric., vol. 9, pp. 22–35, 2023, doi: 10.1016/j.aiia.2023.07.001
[39] C. K. Sunil, C. D. Jaidhar, and N. Patil, “Systematic study on deep learning-based plant disease detection or classification,” Artif. Intell. Rev., vol. 56, pp. 14955–15052, 2023, doi: 10.1007/s10462-023-10517-0
[40] M. H. Saleem, J. Potgieter, and K. M. Arif, “Plant disease detection and classification by deep learning,” Plants, vol. 8, no. 11, p. 468, 2019, doi: 10.3390/plants8110468
[41] A. G. Sebastián et al., “Enhancing plant disease detection: Incorporating Advanced CNN Architectures for Better Accuracy and Interpretability,” Int. J. Comput. Intell. Syst., 2025, doi: 10.1007/s44196-025-00835-2
[42] F. Martinelli, R. Scalenghe, S. Davino et al., “Advanced methods of plant disease detection: A review,” Agronomy for Sustainable Development, vol. 35, pp. 1–25, 2015, doi: 10.1007/s13593-014-0246-1
[43] C. Zhang et al., “Lightweight multi-scale CNN for rice disease recognition,” 2023, doi: 10.32604/cmc.2023.027269
[44] A. S. V. and S. F. Sayyad, “Enhancing agricultural sorting systems through image-based classification and machine learning,” Multimed. Tools Appl., 2026, doi: 10.1007/s11042-026-21155-3
[45] Y. Xu et al., “FDViT: Improve the hierarchical architecture of vision transformer,” in Proc. IEEE/CVF Int. Conf. Computer Vision (ICCV), Paris, France, 2023, pp. 5927–5937, doi: 10.1109/ICCV51070.2023.00547
[46] A. Dosovitskiy et al., “An image is worth 16×16 words: Transformers for image recognition at scale,” arXiv preprint arXiv:2010.11929, 2020, doi: 10.48550/arXiv.2010.11929
Copyright (c) 2026 Monika Faswia Fahmi, Deni Tri Laksono, Achmad Fiqhi Ibadillah, Dedi Tri Laksono (Author)

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.





