In the vast expanses of agricultural landscapes, particularly in large-scale monocultures such as maize fields, monitoring and managing crop damage caused by wildlife have long been a formidable challenge. However, a study led by Sebastian Banaszek from the Institute of Geodesy and Cartography in Warsaw, Poland, offers a promising solution. Published in the journal Sensors, the research introduces a semi-automated, cost-effective method for detecting wildlife-induced crop damage using RGB imagery acquired from unmanned aerial vehicles (UAVs).
The method, designed for non-specialist users, is fully integrated within the QGIS platform, a popular open-source geographic information system. Banaszek and his team calculated three vegetation indices—Excess Green (ExG), Green Leaf Index (GLI), and Modified Green-Red Vegetation Index (MGRVI)—based on a standardized orthomosaic generated from RGB images collected via UAV. “This approach allows for a detailed and accurate assessment of vegetation vigor, which is crucial for identifying areas affected by wildlife,” Banaszek explains.
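All three indices are derived solely from the red, green, and blue bands of the orthomosaic. The short Python sketch below illustrates that calculation; the file name, band order, and use of NumPy with rasterio are assumptions made for the example, not details of the study's QGIS plugin.

```python
# Illustrative sketch (not the authors' plugin code): computing the three RGB
# vegetation indices named in the study from a three-band orthomosaic.
# The file name and band order (R, G, B) are assumptions for this example.
import numpy as np
import rasterio

with rasterio.open("maize_orthomosaic.tif") as src:
    r, g, b = (src.read(i).astype("float64") for i in (1, 2, 3))

eps = 1e-9  # guards against division by zero on dark or no-data pixels

# Excess Green: emphasizes green reflectance relative to red and blue
exg = 2 * g - r - b

# Green Leaf Index
gli = (2 * g - r - b) / (2 * g + r + b + eps)

# Modified Green-Red Vegetation Index
mgrvi = (g**2 - r**2) / (g**2 + r**2 + eps)
```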
The process involves an unsupervised k-means clustering algorithm that divides the field into five vegetation vigor classes. Within each class, 25% of the pixels with the lowest average index values are preliminarily classified as damaged. A dedicated QGIS plugin enables drone data analysts (DDAs) to adjust index thresholds interactively, based on visual interpretation. This flexibility ensures that the method can be tailored to specific field conditions and user expertise.
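The clustering-and-threshold step can be prototyped in a few lines outside QGIS. The sketch below, continuing from the index arrays computed above, uses scikit-learn's k-means and a per-class 25% quantile cut; feeding all three indices to the clusterer and averaging them without rescaling are simplifying assumptions, not details taken from the paper.

```python
# Minimal sketch of the clustering-and-threshold step described above; the
# exact features fed to k-means and the per-class statistic are assumptions.
import numpy as np
from sklearn.cluster import KMeans

# One row per pixel, three index values per row; mask out invalid pixels.
features = np.stack([exg.ravel(), gli.ravel(), mgrvi.ravel()], axis=1)
valid = ~np.isnan(features).any(axis=1)

# Five vegetation-vigor classes, as in the study.
labels = np.full(features.shape[0], -1, dtype=int)
labels[valid] = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(features[valid])

# Within each class, flag the 25% of pixels with the lowest average index
# value as preliminarily damaged (a real workflow would likely rescale the
# indices to a common range before averaging).
mean_index = features.mean(axis=1)
damaged = np.zeros(features.shape[0], dtype=bool)
for c in range(5):
    in_class = labels == c
    threshold = np.quantile(mean_index[in_class], 0.25)
    damaged[in_class] = mean_index[in_class] <= threshold

damage_mask = damaged.reshape(exg.shape)  # preliminary binary damage mask
```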
The effectiveness of the proposed procedure was validated on a 50-hectare maize field, where 7 hectares of damage (14% of the area) were identified. The results showed a high level of agreement between the automated and manual classifications, with an overall accuracy of 81%. “The highest concentration of damage occurred in the ‘moderate’ and ‘low’ vigor zones, which aligns with our expectations and field observations,” Banaszek notes.
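For readers unfamiliar with the metric, overall accuracy is simply the proportion of reference pixels whose automated label matches the manual one. The toy example below illustrates the calculation on made-up labels, not on the study's validation data.

```python
# Toy illustration of the agreement metric: overall accuracy is the share of
# reference pixels whose automated label matches the manual label.
# These arrays are placeholders, not the study's validation data.
import numpy as np

automated = np.array([1, 0, 1, 1, 0, 0, 1, 0])  # 1 = damaged, 0 = intact
manual    = np.array([1, 0, 0, 1, 0, 1, 1, 0])

overall_accuracy = np.mean(automated == manual)
print(f"Overall accuracy: {overall_accuracy:.0%}")  # 75% for this toy sample
```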
The final products of this method include vigor classification maps, binary damage masks, and summary reports in HTML and DOCX formats with visualizations and statistical data. These outputs provide farmers and agricultural managers with valuable insights for decision-making and resource allocation.
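As one illustration of how such an output could be produced, the short sketch below writes the binary damage mask from the earlier example as a single-band GeoTIFF georeferenced to the orthomosaic; the file names and the use of rasterio are assumptions for the example, not the plugin's actual export code.

```python
# Hedged sketch of producing one of the listed outputs: saving the binary
# damage mask as a single-band GeoTIFF that shares the orthomosaic's
# georeferencing. File names and library choice are assumptions.
import rasterio

with rasterio.open("maize_orthomosaic.tif") as src:
    profile = src.profile.copy()

profile.update(count=1, dtype="uint8", nodata=0)

with rasterio.open("damage_mask.tif", "w", **profile) as dst:
    dst.write(damage_mask.astype("uint8"), 1)
```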
The implications of this research are significant for the agricultural sector, particularly in the context of precision agriculture and wildlife population management. By offering a repeatable, cost-effective, and field-operable alternative to multispectral or AI-based approaches, this method can be easily integrated into existing agricultural practices. “This method not only enhances our ability to monitor and manage crop damage but also contributes to sustainable agriculture by minimizing the environmental impact,” Banaszek adds.
As the agricultural industry continues to evolve, the adoption of such innovative technologies will be crucial for optimizing crop yields and ensuring food security. The research published in Sensors by Banaszek and his team represents a significant step forward in this direction, paving the way for more efficient and sustainable agricultural management practices.