North Dakota Researchers Use AI to Protect Strawberries from Frost

In the heart of North Dakota, where the plains meet the sky, Sunil GC, a researcher in the Department of Agricultural and Biosystems Engineering at North Dakota State University, is pioneering a new approach to safeguarding one of the state’s most delicate crops: strawberries. His work, recently published in ‘Smart Agricultural Technology’, explores the use of multispectral imaging and AI-based computer vision to assess freeze damage in strawberries. This isn’t just about saving strawberries; it’s about harnessing technology to create more resilient crops in an era of climate change.

Imagine this: a sudden temperature drop in early fall or late spring, and your entire strawberry crop is at risk. This is a reality that farmers face more frequently due to climate change. Traditional methods of assessing freeze damage are time-consuming and prone to human error. Enter GC’s research, which explores both feature engineering and deep learning approaches to make these assessments more consistent and accurate.

GC and his team used multispectral red, green, and near-infrared (RGNIR) images of strawberry plants grown in a greenhouse setting. From these images they extracted various vegetative indices, such as the modified chlorophyll absorption ratio index (MCARI), modified transformed vegetation index (MTVI), and normalized difference vegetation index (NDVI). These indices play a significant role in classifying frost damage. “Vegetative indices provide a quantitative measure of plant health and stress,” GC explains. “By analyzing these indices, we can better understand the extent of frost damage and develop strategies to mitigate it.”
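To make the index computation concrete, here is a minimal Python sketch of how an index such as NDVI can be derived from an RGNIR image and summarized into per-plant features. The band ordering, the synthetic image, and the feature summary are illustrative assumptions rather than the study’s actual pipeline; MCARI and MTVI follow the same pattern of band arithmetic with their own coefficients.

```python
# Illustrative sketch (not the authors' code): computing a vegetative index
# from an RGNIR image. Band order and image layout are assumptions.
import numpy as np

def compute_ndvi(image: np.ndarray) -> np.ndarray:
    """Compute NDVI from an image with bands assumed ordered (R, G, NIR).

    NDVI = (NIR - Red) / (NIR + Red), giving values in [-1, 1];
    healthy vegetation trends toward 1, stressed or damaged tissue lower.
    """
    red = image[..., 0].astype(np.float64)
    nir = image[..., 2].astype(np.float64)
    return (nir - red) / (nir + red + 1e-9)  # small epsilon avoids divide-by-zero

def index_features(image: np.ndarray) -> np.ndarray:
    """Summarize a per-pixel index map into simple per-plant features."""
    ndvi = compute_ndvi(image)
    return np.array([ndvi.mean(), ndvi.std(),
                     np.percentile(ndvi, 25), np.percentile(ndvi, 75)])

# Example: a synthetic 64x64 RGNIR image stands in for a real greenhouse capture.
rng = np.random.default_rng(0)
fake_image = rng.uniform(0.0, 1.0, size=(64, 64, 3))
print(index_features(fake_image))
```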

The team trained machine learning (ML) and deep learning (DL) models on these images. For ML models, 80% of the 493 images were used for training and 20% for testing. For DL models, the data was divided into training, validation, and testing sets using a 70:15:15 ratio. The results were impressive. Support vector machine (SVM) with backward feature elimination outperformed other machine learning algorithms, achieving an F1-score of 87%. Among deep learning models, DenseNet-169 achieved the highest F1-score of 93%.
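For readers curious how a pipeline like the best-performing machine learning model might be wired up, the sketch below pairs backward feature elimination (here approximated with scikit-learn’s recursive feature elimination driven by a linear SVM) with an SVM classifier and the article’s 80/20 split. The synthetic feature table, the binary label scheme, the retained-feature count, and the hyperparameters are placeholders, not the study’s settings.

```python
# Illustrative sketch: SVM with backward feature elimination on index features.
# Synthetic data stands in for the paper's 493-image feature table.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.feature_selection import RFE
from sklearn.svm import SVC
from sklearn.metrics import f1_score

rng = np.random.default_rng(42)
n_images, n_features = 493, 12          # 493 images; 12 hypothetical index features
X = rng.normal(size=(n_images, n_features))
y = rng.integers(0, 2, size=n_images)   # 0 = healthy, 1 = frost-damaged (placeholder labels)

# 80/20 train/test split, mirroring the article's ML protocol.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Feature elimination wrapped around a linear SVM, followed by the final classifier.
# The number of retained features is an assumption for illustration.
model = make_pipeline(
    StandardScaler(),
    RFE(estimator=SVC(kernel="linear", C=1.0), n_features_to_select=6),
    SVC(kernel="rbf", C=1.0, gamma="scale"),
)
model.fit(X_train, y_train)
print("F1-score:", f1_score(y_test, model.predict(X_test)))
```

DenseNet-169, the strongest deep learning model in the study, would instead be trained directly on the images using the 70:15:15 split rather than on extracted index features.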

But what does this mean for the future of agriculture? GC’s work shows that the integration of RGNIR images with computer vision and AI algorithms can be effective in classifying frost damage. This technology could revolutionize how farmers monitor and protect their crops, leading to more resilient and sustainable agricultural practices. As GC puts it, “This technology has the potential to transform the way we approach crop management. By providing accurate and timely assessments of frost damage, we can help farmers make informed decisions and reduce crop losses.”

This research is a significant step forward in the field of agritech. It opens the door to more precise, data-driven approaches to agriculture, which could have far-reaching implications for the energy sector as well. As we strive for more sustainable and efficient farming practices, technologies like these will be crucial in shaping the future of agriculture and energy.
