Abstract: To address the challenges of accuracy and efficiency in rice disease detection under natural conditions, a lightweight detection model, YOLO-RD, was presented based on an improved YOLO v5s framework. The model was optimized and successfully deployed on edge computing devices, enabling a portable device for fast rice disease detection. In the proposed model, GhostNet was first integrated to reduce computational complexity and the number of parameters. The lightweight Shuffle Attention mechanism and the dynamic detection head DyHead were then employed to enhance feature extraction and adaptive detection capabilities, particularly for complex disease features. Furthermore, the standard CIoU loss function was replaced by Shape-IoU to improve detection performance in challenging environments by focusing on shape-based regression. Experimental results demonstrated that YOLO-RD achieved a mean Average Precision (mAP) of 94.2% while significantly reducing computational complexity and model size: compared with the baseline model, computation, parameter count, and weight file size were reduced by 44.4%, 43.2%, and 41.3%, respectively. In addition, the model outperformed detection models such as YOLO 11n, YOLO v8n, and YOLO v5n in terms of accuracy. When deployed on a Raspberry Pi 4B edge computing device, YOLO-RD achieved an inference time of 1.97 s per image, meeting the requirements for real-time application. These findings suggested that YOLO-RD offered an efficient and robust solution for intelligent rice disease detection in practical agricultural scenarios.
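To illustrate the lightweighting idea referenced in the abstract, the sketch below shows a minimal PyTorch-style Ghost module of the kind used in GhostNet, in which part of the output feature map is produced by cheap depthwise convolutions rather than full convolutions. This is a generic, assumed formulation for illustration only (the module name, channel ratio, and SiLU activation are our choices), not the authors' exact implementation inside YOLO-RD.

```python
import math
import torch
import torch.nn as nn


class GhostModule(nn.Module):
    """Illustrative Ghost module: a small primary convolution plus cheap
    depthwise 'ghost' convolutions, concatenated to reach the target width."""

    def __init__(self, in_ch, out_ch, kernel_size=1, ratio=2, dw_size=3, stride=1):
        super().__init__()
        init_ch = math.ceil(out_ch / ratio)      # channels from the primary conv
        cheap_ch = init_ch * (ratio - 1)         # channels from cheap depthwise ops
        self.out_ch = out_ch
        self.primary = nn.Sequential(
            nn.Conv2d(in_ch, init_ch, kernel_size, stride, kernel_size // 2, bias=False),
            nn.BatchNorm2d(init_ch),
            nn.SiLU(inplace=True),
        )
        self.cheap = nn.Sequential(
            # Depthwise convolution generates the "ghost" feature maps cheaply.
            nn.Conv2d(init_ch, cheap_ch, dw_size, 1, dw_size // 2, groups=init_ch, bias=False),
            nn.BatchNorm2d(cheap_ch),
            nn.SiLU(inplace=True),
        )

    def forward(self, x):
        y1 = self.primary(x)
        y2 = self.cheap(y1)
        # Concatenate primary and ghost features, then trim to the requested width.
        return torch.cat([y1, y2], dim=1)[:, : self.out_ch]


if __name__ == "__main__":
    x = torch.randn(1, 64, 80, 80)
    m = GhostModule(64, 128)
    print(m(x).shape)  # expected: torch.Size([1, 128, 80, 80])
```

Because roughly half of the output channels come from depthwise operations, a block of this form uses noticeably fewer multiply-accumulate operations and parameters than a standard convolution of the same output width, which is the kind of saving the abstract attributes to the GhostNet integration.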