As the global population surges and climate change tightens its grip, traditional agricultural practices are under immense pressure. By 2050, global food production will need to rise by an estimated 70% to meet demand. Enter “Agriculture 4.0,” a wave of innovation that includes the Internet of Things (IoT), artificial intelligence (AI), unmanned aerial vehicles (UAVs), and robotics, all aimed at boosting efficiency and sustainability. However, these technologies often operate in isolation, creating data silos and operational gaps.
A groundbreaking review by Professor Lauren Genith Isaza Domínguez and colleagues from Universidad de Los Llanos in Colombia suggests that augmented reality (AR) could be the key to unifying these disparate systems. Published in *Frontiers of Agricultural Science and Engineering*, the review explores how AR can act as a “central interface,” integrating IoT, UAVs, agricultural robots, edge computing, and AI into a cohesive, data-driven smart agricultural system.
AR technology overlays digital information onto the real-world environment, offering farmers real-time data through smart glasses or mobile devices. For instance, when UAVs or ground sensors detect crop anomalies, the AR interface can guide farmers to the affected area. Edge AI then analyzes the data and provides specific recommendations for action, whether that means manual intervention or deploying agricultural machinery.
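The review does not publish code, but the sense-analyze-recommend loop it describes can be sketched roughly as follows. Everything here is an illustrative assumption, not material from the review: the zone identifier, the NDVI and moisture thresholds, and the shape of the overlay payload are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    field_zone: str        # hypothetical zone identifier from a UAV or ground sensor
    ndvi: float            # normalized difference vegetation index, 0..1
    soil_moisture: float   # volumetric water content as a fraction, 0..1

def build_ar_overlay(reading: SensorReading,
                     ndvi_threshold: float = 0.4,
                     moisture_threshold: float = 0.15) -> dict:
    """Map a sensor reading to a payload an AR client could render.

    Thresholds are illustrative defaults, not values from the review.
    """
    alerts = []
    if reading.ndvi < ndvi_threshold:
        alerts.append("possible crop stress: inspect manually")
    if reading.soil_moisture < moisture_threshold:
        alerts.append("low soil moisture: consider irrigation")
    return {
        "zone": reading.field_zone,
        "highlight": bool(alerts),   # the AR view highlights the zone if any alert fired
        "recommendations": alerts,
    }

overlay = build_ar_overlay(SensorReading("A3", ndvi=0.32, soil_moisture=0.10))
```

In a deployed system the analysis step would run on an edge device and the payload would be streamed to smart glasses or a phone; this sketch only shows the mapping from raw readings to an actionable overlay.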
Current applications of AR in agriculture are already showing promising results. An AR-integrated strawberry harvesting system achieves a 93% success rate in identifying ripe fruits, while an AR-based irrigation system uses sensor data and machine learning to optimize water usage. Additionally, AR integrated with UAVs allows farmers to monitor large-scale farms, view crop health, and conduct collaborative diagnostics, enhancing decision-making efficiency.
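The review does not specify which machine-learning method the irrigation system uses, so the data-driven decision it describes can only be sketched under assumptions: a plain least-squares line stands in for the unspecified model, and the training data below is invented for illustration.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b, in pure Python."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical historical data: soil moisture (fraction) vs. the minutes of
# irrigation that restored the target moisture level in that zone.
moisture = [0.10, 0.15, 0.20, 0.25, 0.30]
minutes  = [40, 30, 20, 10, 0]

a, b = fit_linear(moisture, minutes)

def recommend_minutes(current_moisture: float) -> float:
    """Predict irrigation time; clamp at zero so we never water a wet field."""
    return max(0.0, a * current_moisture + b)
```

The AR interface would then surface `recommend_minutes(...)` for the zone the farmer is looking at; the design choice of a clamped regression is only one simple way to turn sensor history into a recommendation.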
Despite these advancements, challenges remain. Real-time data processing capabilities are often insufficient, AI models struggle with real-world adaptability, and there is a lack of standardized communication between heterogeneous devices. The stability of gesture-based AR interactions in complex environments, the technical framework for multi-UAV collaborative operations, and the development of low-cost AR solutions for resource-constrained regions are all areas that require further innovation.
To address these issues, the authors propose several future directions, including building low-latency data pipelines, developing explainable AI interaction interfaces, advancing collaborative control between UAV swarms and AR, designing lightweight edge AI models, and enhancing data privacy through federated learning. The review underscores that AR should be seen not just as a visualization tool but as an integrated interface connecting on-site agricultural sensing, analysis, decision-making, and execution.
As AR technology matures and integrated solutions are verified, it holds the potential to revolutionize agriculture, particularly for resource-limited smallholder farmers. By lowering the barrier to adopting smart-agriculture applications, AR could help farmers worldwide meet the challenges of a rapidly changing world.

