In the heart of China’s tea-growing regions, a technological revolution is brewing, one that could reshape the way tea gardens are managed and monitored. Researchers have developed a sophisticated deep learning framework designed to recognize fine-grained picking actions in tea gardens, addressing long-standing challenges in agricultural behavior detection.
The study, published in the journal *Agriculture*, introduces a novel approach to tackling the complexities of tea plantation environments. “Existing behavior detection systems often struggle with dense vegetation, variable lighting, and diverse human–machine interactions,” explains lead author Ru Han from the College of Smart Agriculture at Nanjing Agricultural University. “Our framework is specifically tailored to overcome these obstacles, providing a robust solution for real-world agricultural settings.”
The research team constructed a large-scale, annotated dataset comprising 12,195 images across seven behavior categories, collected from both field and web sources. This dataset captures a diverse range of geographic, temporal, and environmental conditions, helping the model generalize across scenarios.
To enhance detection accuracy, the researchers integrated an Efficient Multi-scale Attention (EMA) mechanism, Complete Intersection over Union (CIoU) loss, and Atrous Spatial Pyramid Pooling (ASPP) into the YOLOv5 model. This combination resulted in a 73.6% mean Average Precision (mAP@0.5), representing an 11.6% relative improvement over the baseline model. “This significant enhancement in detection accuracy under complex tea garden conditions opens up new possibilities for operational monitoring and intelligent management,” Han notes.
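To make the loss term concrete, below is a minimal PyTorch sketch of the published CIoU formulation (Zheng et al., 2020), which penalizes center-point distance and aspect-ratio mismatch on top of plain IoU. This is a generic reimplementation for illustration, not the authors’ code, and the (x1, y1, x2, y2) box format is an assumption.

```python
# Illustrative CIoU loss; a generic sketch, not the paper's implementation.
import math
import torch

def ciou_loss(pred, target, eps=1e-7):
    """CIoU loss for boxes in (x1, y1, x2, y2) format, shape (N, 4)."""
    # Intersection area
    x1 = torch.max(pred[:, 0], target[:, 0])
    y1 = torch.max(pred[:, 1], target[:, 1])
    x2 = torch.min(pred[:, 2], target[:, 2])
    y2 = torch.min(pred[:, 3], target[:, 3])
    inter = (x2 - x1).clamp(0) * (y2 - y1).clamp(0)

    # Union area and plain IoU
    w1, h1 = pred[:, 2] - pred[:, 0], pred[:, 3] - pred[:, 1]
    w2, h2 = target[:, 2] - target[:, 0], target[:, 3] - target[:, 1]
    union = w1 * h1 + w2 * h2 - inter + eps
    iou = inter / union

    # Squared center distance, normalized by the diagonal of the
    # smallest enclosing box (the DIoU penalty term)
    cw = torch.max(pred[:, 2], target[:, 2]) - torch.min(pred[:, 0], target[:, 0])
    ch = torch.max(pred[:, 3], target[:, 3]) - torch.min(pred[:, 1], target[:, 1])
    c2 = cw ** 2 + ch ** 2 + eps
    rho2 = ((pred[:, 0] + pred[:, 2] - target[:, 0] - target[:, 2]) ** 2 +
            (pred[:, 1] + pred[:, 3] - target[:, 1] - target[:, 3]) ** 2) / 4

    # Aspect-ratio consistency term
    v = (4 / math.pi ** 2) * (torch.atan(w2 / (h2 + eps)) -
                              torch.atan(w1 / (h1 + eps))) ** 2
    with torch.no_grad():
        alpha = v / (1 - iou + v + eps)

    return 1 - (iou - rho2 / c2 - alpha * v)
```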
In addition to the enhanced YOLOv5 model, the team proposed an SE-Faster R-CNN model by embedding Squeeze-and-Excitation (SE) channel attention modules into Faster R-CNN and applying anchor box optimization strategies. This model further boosts performance in complex scenarios, demonstrating the framework’s versatility and effectiveness.
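The SE module itself is compact: it pools each feature map to a single value, passes the result through a small bottleneck, and uses the output to reweight channels. A minimal PyTorch sketch of the standard SE block (Hu et al., 2018) follows; the reduction ratio of 16 is the conventional default, and where exactly the authors insert the block in Faster R-CNN is an assumption not detailed here.

```python
# Standard Squeeze-and-Excitation block; a sketch of the published design,
# not the authors' exact module placement or hyperparameters.
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation channel attention (Hu et al., 2018)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)   # squeeze: global spatial context
        self.fc = nn.Sequential(              # excitation: per-channel weights
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                          # reweight channels in place
```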
To support practical deployment, the team also built a lightweight visual interface for real-time image- and video-based detection. The interface allows the system to slot into existing agricultural workflows, delivering real-time insights and actionable data.
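As a rough illustration of what such a real-time loop involves, the sketch below runs a camera feed through the stock Ultralytics YOLOv5 hub interface with generic pretrained weights. The paper’s trained weights and actual interface are not reproduced here, so the model name, confidence threshold, and video source are all placeholders.

```python
# Minimal real-time detection loop; assumes the public Ultralytics YOLOv5
# hub API and generic 'yolov5s' weights, not the paper's trained model.
import cv2
import torch

model = torch.hub.load('ultralytics/yolov5', 'yolov5s')  # placeholder weights
model.conf = 0.5                     # confidence threshold (assumed value)

cap = cv2.VideoCapture(0)            # webcam; swap in a video file path
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)   # model expects RGB
    results = model(rgb)
    annotated = cv2.cvtColor(results.render()[0], cv2.COLOR_RGB2BGR)
    cv2.imshow('tea-garden detection', annotated)
    if cv2.waitKey(1) & 0xFF == ord('q'):          # quit on 'q'
        break
cap.release()
cv2.destroyAllWindows()
```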
The commercial implications of this research are substantial. By enabling precise behavior recognition in tea gardens, the framework can optimize labor allocation, improve operational efficiency, and enhance overall productivity. “This technology has the potential to revolutionize the way tea gardens are managed, leading to more sustainable and profitable agricultural practices,” Han explains.
As the agriculture sector continues to embrace smart technologies, the development of robust behavior recognition systems will play a crucial role in shaping the future of the industry. This research not only addresses current challenges but also paves the way for further advancements in agricultural monitoring and management.
With the framework’s proven effectiveness, robustness, and real-time potential, it is poised to become a valuable tool for tea garden operators and agricultural professionals worldwide. As the technology continues to evolve, it will undoubtedly contribute to the ongoing transformation of the agriculture sector, driving innovation and sustainability in the years to come.

