In the ever-evolving landscape of agricultural technology, a recent study published in *Agriculture Communications* has shed light on the capabilities of various You Only Look Once (YOLO) object detection algorithms in the challenging environment of commercial orchards. Led by Ranjan Sapkota from Cornell University’s Biological and Environmental Engineering department, the research provides a comprehensive evaluation of YOLOv8 through YOLOv12, offering insights that could revolutionize fruitlet detection and counting in agriculture.
The study, which systematically assessed every configuration of YOLOv8 through YOLOv12, revealed significant differences in performance. YOLOv12l emerged as a standout in recall, achieving a remarkable 0.900, while YOLOv10x and YOLOv9 GELAN-c reported the highest precision scores of 0.908 and 0.903, respectively. Notably, YOLOv9 GELAN-base and GELAN-e achieved the highest mean Average Precision at 50% Intersection over Union (mAP@50) of 0.935, closely followed by YOLO11s and YOLOv12l.
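For readers who want to run a similar head-to-head comparison on their own data, the sketch below shows one way to collect precision, recall, and mAP@50 for several YOLO variants using the Ultralytics Python API. It is a minimal illustration, not the study's evaluation pipeline: the dataset file `fruitlet.yaml` is hypothetical, and the weight names follow Ultralytics naming conventions rather than the exact checkpoints trained by the authors.

```python
# Minimal sketch: comparing precision, recall, and mAP@50 across YOLO variants
# with the Ultralytics Python API. "fruitlet.yaml" is a hypothetical dataset
# config, and the weight names may differ from the study's checkpoints.
from ultralytics import YOLO

candidates = ["yolov8n.pt", "yolov9c.pt", "yolov10x.pt", "yolo11s.pt", "yolo12l.pt"]

for weights in candidates:
    model = YOLO(weights)
    # val() runs the standard detection evaluation on the dataset's validation split
    metrics = model.val(data="fruitlet.yaml", imgsz=640, verbose=False)
    print(
        f"{weights:12s} "
        f"precision={metrics.box.mp:.3f} "
        f"recall={metrics.box.mr:.3f} "
        f"mAP@50={metrics.box.map50:.3f}"
    )
```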
One of the most compelling findings was the superior accuracy of YOLO11n in counting validation, with Root Mean Square Error (RMSE) values ranging from 4.51 to 4.96 and Mean Absolute Error (MAE) values from 3.85 to 7.73 across four apple varieties. This level of counting accuracy is a game-changer for orchard management, enabling farmers to make data-driven decisions that can significantly impact yield and profitability.
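Both metrics compare per-image predicted counts against ground-truth counts: RMSE penalizes large miscounts more heavily, while MAE reports the average miscount in fruitlets per image. The short sketch below computes both; the count arrays are illustrative placeholders, not data from the study.

```python
# Minimal sketch of the counting-validation metrics: RMSE and MAE between
# predicted and ground-truth fruitlet counts per image. The example counts
# below are made up for illustration, not values from the study.
import numpy as np

def count_errors(predicted: np.ndarray, ground_truth: np.ndarray) -> tuple[float, float]:
    """Return (RMSE, MAE) over per-image fruitlet counts."""
    diff = predicted.astype(float) - ground_truth.astype(float)
    rmse = float(np.sqrt(np.mean(diff ** 2)))
    mae = float(np.mean(np.abs(diff)))
    return rmse, mae

# Example: counts for a handful of images of one apple variety (placeholder data).
pred = np.array([42, 37, 55, 48, 61])
truth = np.array([45, 35, 58, 47, 66])
rmse, mae = count_errors(pred, truth)
print(f"RMSE={rmse:.2f}  MAE={mae:.2f}")
```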
“The ability to accurately detect and count fruitlets in real-time can transform how we manage orchards,” said Sapkota. “This technology allows for precise monitoring of fruit development, optimizing the use of resources and ultimately improving the bottom line for growers.”
The study also highlighted the importance of sensor-specific training, with the use of Intel RealSense sensors further improving detection performance. Additionally, YOLO11n demonstrated the fastest inference speed at 2.4 milliseconds per image, making it an ideal candidate for real-time applications in the field.
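Per-image latency figures like this are typically obtained by timing repeated forward passes after a warm-up phase. The sketch below illustrates the idea with the Ultralytics API and a synthetic frame; it times the full predict() call, including pre- and post-processing, so its output will generally be higher than a model-only inference figure such as the 2.4 ms reported here. The weight file name and repeat counts are assumptions for illustration, not the study's benchmarking protocol.

```python
# Minimal sketch of measuring per-image inference latency with a YOLO model,
# assuming the Ultralytics API. "yolo11n.pt" and the loop counts are
# illustrative choices, not the study's benchmarking setup.
import time
import numpy as np
from ultralytics import YOLO

model = YOLO("yolo11n.pt")
frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a camera frame

# Warm up so one-off initialization cost is excluded from the timing.
for _ in range(10):
    model.predict(frame, verbose=False)

times_ms = []
for _ in range(100):
    start = time.perf_counter()
    model.predict(frame, verbose=False)
    times_ms.append((time.perf_counter() - start) * 1000.0)

print(f"median end-to-end latency: {np.median(times_ms):.1f} ms")
```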
The implications of this research are vast. As agricultural automation continues to advance, the ability to accurately detect and count fruitlets can streamline operations, reduce labor costs, and enhance overall efficiency. This technology could be particularly beneficial for large-scale commercial orchards, where manual counting is time-consuming and prone to human error.
Looking ahead, the findings suggest that YOLO-based algorithms, particularly the newer versions, hold immense potential for agricultural applications. As Sapkota noted, “The continuous evolution of these algorithms, coupled with advancements in sensor technology, will further enhance their accuracy and efficiency, paving the way for smarter, more sustainable farming practices.”
In conclusion, this study not only provides a detailed comparison of YOLO-based algorithms but also underscores their transformative potential in the agriculture sector. As the industry moves towards greater automation and precision, these findings will undoubtedly shape future developments, offering growers powerful tools to optimize their operations and achieve better outcomes.

