
Advancements in Industry-Agriculture 5.0: Utilizing Unmanned Ground and Aerial Vehicles for Sustainable Precision Agriculture


1.
Introduction

With the advent of the Fourth Industrial Revolution, the integration of technology and agriculture has given birth to what is now termed Industry-Agriculture 5.0 (IA5.0). The evolution of contemporary farming practices is defined by a profound shift, placing technological innovation at the core of efforts to achieve sustainability, precision, and heightened productivity. Within this transformative landscape, the deployment of Unmanned Ground Vehicles (UGVs) and Unmanned Aerial Vehicles (UAVs) emerges as a cornerstone in the realization of IA5.0.

In modern agriculture, the advent of IA5.0 is marked by the critical role of technological advancement in ensuring sustainable and precise farming practices. This study explores the use of UGVs and UAVs as integral components of Industry-Agriculture 5.0, with a focus on their applications in sustainable precision agriculture. UGVs and UAVs have become indispensable tools for the modern farmer, facilitating data-driven decision-making, optimized resource management, and increased crop yields. In this context, they perform a diverse range of tasks, including soil and crop monitoring, pest control, irrigation management, and yield prediction. Their ability to gather real-time data through sensors and imaging technologies equips farmers with valuable insights, enhancing the efficiency and sustainability of agricultural operations.

Robotics is rapidly finding its place in everyday life with the advancement of technology. Agricultural robots have been developed by various universities and institutes, and efforts to commercialize these robots are ongoing. One of the main reasons for the delayed adoption of robots in the agriculture sector is the high level of variability in agricultural practices, stemming from factors like crop types, soil, water, and climate conditions. These variations significantly impact the success of robotic technologies, limiting their widespread use in the agricultural sector.

In this study, we explore the integration of UGVs and UAVs into the agricultural landscape. These autonomous vehicles, equipped with state-of-the-art sensors and imaging technologies, have emerged as indispensable tools for modern agriculture. Their role extends beyond mechanization, offering farmers the means to make data-informed decisions, optimize resource allocation, and maximize crop yields. Such capabilities underline the potential for sustainable and precise farming practices in an era marked by growing demand for agricultural products.

This article explores diverse tasks and applications undertaken by these intelligent machines, including but not limited to soil and crop monitoring, pest management, irrigation control, and predictive analytics for yield optimization. The real-time data they collect empowers farmers with invaluable insights, fostering a more efficient and sustainable agricultural ecosystem.

This study not only underscores the significance of incorporating UGVs and UAVs into IA5.0 but also examines the broader implications of this integration. Beyond improving agricultural efficiency, it offers prospects for environmental conservation, resource preservation, and global food security. However, the journey is not without its challenges and uncertainties, which we will address in our comprehensive exploration of this technology.

The integration of UGVs and UAVs into IA5.0 represents a critical shift towards data-driven, environmentally responsible farming. It embodies the promise of a more sustainable and productive future for the agricultural sector, offering grounds for optimism in addressing the pressing challenges facing the global food supply chain in the 21st century.

Using advanced technologies, robots have the potential to turn pesticide spraying into a precision farming application. Given the need for targeted pesticide application in agricultural pest control and the opportunities presented by modern technology, an innovative pest control system has been developed for precise pesticide spraying. The primary goal of this study is to design and implement a precision pesticide spraying system that utilizes a synchronized combination of UGVs and UAVs, offering an alternative to traditional spraying methods.

2.
Literature Review

The growing body of contemporary literature underscores the critical need for research in this ever-evolving field. Agriculture 5.0 integrates advanced technologies such as Artificial Intelligence (AI), Internet of Things (IoT), and Machine Learning (ML) to support agricultural productivity, sustainability, and informed decision-making. This transformation automates diverse tasks, encompassing crop protection, yield prediction, and pest control, with the assistance of UGVs and UAVs. However, despite its promising potential, Agriculture 5.0 faces challenges related to adaptability and concerns surrounding data privacy [1,2].

Traditional approaches alone fall short in providing comprehensive protection for agricultural fields against pests and diseases. It is evident that relying solely on conventional spraying techniques not only diminishes the efficiency of crop yields but also has detrimental effects on agricultural ecosystems. These methods, characterized by their excessive chemical usage, bear adverse consequences for soil quality, plant health, and the vitality of plant roots. Furthermore, traditional methods prove inadequate when applied to the field of precision farming. A comparison between conventional and precision farming applications is presented in Table 1, illustrating the stark disparities between these two approaches [3,4].

In traditional farming, the field is perceived as a homogeneous arable site, leading to uniform practices in nutrient management, plant protection, sowing, and machine operation.

Plant stock is managed as a unified entity, with decision-making influenced by relatively limited data. In contrast, precision farming treats the arable site as heterogeneous at the field level, using GPS and point-based sampling for precise nutrient and plant protection management. Sowing practices are plant species- and variety-specific, while machine operation is adjusted to the site’s unique conditions. Plant stock is organized into homogeneous blocks, and decision-making benefits from a wealth of data, resulting in more informed and efficient practices.

Agricultural technology encompasses a wide array of functions, spanning primary operations, crop monitoring, precision agriculture, livestock management, irrigation control, weed management, soil sampling, harvesting, fertilizer application, and data collection. These technological advancements introduce flexibility, minimize environmental footprints, hold the potential to reduce labor demands, and can result in substantial long-term cost efficiencies. Effective integration with other advanced technologies is critical for farm management, amplifying overall efficiency and sustainability in contemporary agriculture [5].

Open data resources (Table 2) play a key role in advancing smart agriculture by providing valuable information to farmers and researchers. These openly accessible datasets encompass a wide range of agricultural aspects, from crop production and climate monitoring to soil quality and pest control. This data-driven approach serves as a catalyst for empowering farmers to optimize their farming techniques while supporting the agriculture sector. This study explores the diverse landscape of open data sources in the scope of smart agriculture, highlighting their significance in enhancing sustainable farming practices and informed decision-making within the agricultural sector.

Table 1.

Comparison of Precision Farming and Conventional Farming Practices

Feature | Precision Farming | Conventional Farming
Efficiency | Increased efficiency through technology | Lower efficiency
Resource Conservation | Minimizes resource overuse | Resource-intensive
Data-Driven Decision-Making | Informed decisions through data analysis | Experience-based decisions
Labor Requirements | Reduced labor needs (automation) | Labor-intensive
Environmental Impact | Reduced environmental impact | Environmental concerns
Initial Costs | Higher initial costs | Lower initial costs
Complexity | Complex data management | Simplicity and familiarity
Sustainability | Enhanced sustainability | Reduced sustainability
Table 2.

Open Data Resources in the Field of Smart Agriculture [6]

Data Source | Description
USDA National Agricultural Statistics Service (NASS) Data | Crop production, livestock statistics, and agricultural information in the United States
European Space Agency’s Sentinel Data | High-resolution satellite imagery for crop monitoring and land use
NOAA Climate Data | Weather and climate data, including forecasts, rainfall, and temperature
Global Open Data for Agriculture and Nutrition (GODAN) | Open datasets related to agriculture, food security, and nutrition
NASA Earthdata | Remote sensing data for monitoring climate, soil moisture, and vegetation
FAO Data | Agricultural data and information on global food production and trade
IoT Sensor Data | Sensor data, such as soil moisture and temperature, from IoT devices in fields
Crop and Soil Databases | Information on crop yields, soil quality, and nutrient levels
Agricultural Research Institutions | Data from research institutions on crop trials, pest monitoring, and experiments
OpenWeatherMap | Weather data, including current conditions, forecasts, and historical records
Government Agricultural Portals | Information on farming practices, subsidies, and agricultural policies
Community-Generated Data | User-generated data sharing experiences, practices, and insights

Embracing the era of Agriculture 5.0 involves the integration of transformative technologies like artificial intelligence (AI), the Internet of Things (IoT), and machine learning (ML) into the agricultural landscape. The overarching objective is to not only bolster productivity and sustainability but also to refine the very essence of decision-making processes within the sector. Agriculture 5.0 seeks to usher in a new era of automation and scalable technological solutions, effectively mitigating risks, fortifying sustainability, and empowering predictive decision-making in agriculture [7, 8–10].

In the field of crop protection, AI and robotics have wrought a revolution, introducing automation into monitoring and enabling early pest detection while delivering precisely targeted treatments for enhanced yields and sustainable farming practices. Semantic recognition now swiftly identifies threats, and sophisticated algorithms, bolstered by machine learning capabilities, facilitate timely interventions, ultimately enhancing crop health [7, 11–14].

Recent years have witnessed the application of various ML techniques for crop yield prediction, including support vector regression, k-nearest neighbor methods, and more [7, 15–18]. Deep learning excels at extracting key characteristics to enhance precise yield estimation, even in resource-constrained rural areas [7, 19].
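To make this yield-prediction workflow concrete, the following minimal C++ sketch implements k-nearest-neighbor regression: the yield of a new season is estimated as the average yield of the k most similar historical records. The feature choices (seasonal NDVI and rainfall) and all numeric values are illustrative assumptions and are not drawn from the cited studies.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <iostream>
#include <utility>
#include <vector>

// One historical record: two illustrative features and the observed yield (assumed values).
struct Sample {
    double ndvi;      // mean seasonal vegetation index (assumed feature)
    double rainfall;  // seasonal rainfall in mm (assumed feature)
    double yield;     // observed yield in t/ha
};

// Predict yield as the mean of the k nearest historical samples (Euclidean distance).
double knnPredict(const std::vector<Sample>& data, double ndvi, double rainfall, std::size_t k) {
    std::vector<std::pair<double, double>> distYield;  // {distance, yield}
    for (const auto& s : data) {
        distYield.push_back({std::hypot(s.ndvi - ndvi, s.rainfall - rainfall), s.yield});
    }
    std::sort(distYield.begin(), distYield.end());      // closest records first
    k = std::min(k, distYield.size());
    double sum = 0.0;
    for (std::size_t i = 0; i < k; ++i) sum += distYield[i].second;
    return sum / static_cast<double>(k);
}

int main() {
    // Illustrative training records, not measured field data.
    std::vector<Sample> history = {
        {0.62, 410.0, 7.1}, {0.55, 350.0, 6.2}, {0.70, 480.0, 8.0},
        {0.48, 300.0, 5.4}, {0.66, 430.0, 7.5},
    };
    std::cout << "Predicted yield: " << knnPredict(history, 0.60, 400.0, 3) << " t/ha\n";
    return 0;
}
```

In practice the features would be standardized before computing distances so that no single variable dominates, and the records would come from measured field data rather than the illustrative values used here.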

An exhaustive analysis featured in references [7, 20] examined the challenges associated with the global implementation of smart agri-robotic solutions. These difficulties encompass factors such as crop diversity, the adaptability of robotics, and complex decision-making. The proposed solution involved a transformative shift, embracing advanced technologies like machine learning, computer vision, and sensor networks to transform smart agri-robotics.

Intelligent robot systems in ecological agriculture were explored, with a focus on their role in pest control and sustainable farming. The integration of robotics, sensors, AI, and imaging technology fosters precise pest management, reduces pesticide usage, and enhances efficiency, all while addressing considerations like cost and compatibility with existing agricultural practices [7,21].

The essential role of robotics, IoT, and AI in automating agriculture was highlighted, encompassing applications such as automated harvesting, precision agriculture, and smart irrigation systems. These technologies increase efficiency, productivity, and sustainability, with a specific emphasis on data analytics and machine learning for intelligent decision-making. Challenges in this area include cost, scalability, interoperability, and data privacy [7,22].

Furthermore, an overview of machine learning and emerging technologies in precision crop protection was provided, with a resolute aim to drive Agriculture 5.0. The authors focused on the capacity of these technologies to enhance agriculture, promote sustainability, and optimize crop protection practices [7,23].

As the demand for efficient monitoring and surveillance continues to rise across diverse settings, including industrial facilities and public spaces, the adoption of thermal UAV technology has gained substantial attention [24]. Thermal imaging, as illustrated in Figure 1, is now frequently integrated into agricultural drones, providing a unique aerial perspective of fields and livestock.

Figure 1.

Thermal measurements and their results in vineyards [7,25]

Drones equipped with thermal cameras enable fast and comprehensive monitoring of extensive agricultural areas [25]. Moreover, with the ongoing advancement of thermal imaging technology, its applications in agriculture are continually expanding. By exploring these agricultural applications of thermal imaging, farmers and agricultural professionals can harness the full potential of this technology to make well-informed decisions, enhance productivity, and promote sustainable farming practices.

Both UAVs and UGVs are part of the growing trend of smart farming (Figure 2), where technology and data-driven approaches are used to optimize agricultural practices. These unmanned vehicles have the potential to increase efficiency, reduce costs, and promote sustainability in agriculture. Sustainability, both in terms of resource utilization and long-term agricultural viability, is a core focus in the adoption of these technologies. Table 3 provides a comparison between UAVs and UGVs in agriculture. The table serves as a valuable reference point for farmers and researchers seeking to evaluate the most suitable unmanned vehicle for their specific smart farming needs.

Figure 2.

UAVs and UGVs in agriculture [7,25]

Table 3.

Comparison of UAVs and UGVs in Agricultural Farm Operations

Feature | UAVs (Unmanned Aerial Vehicles) | UGVs (Unmanned Ground Vehicles)
Mobility | Operate in the air, offering a bird’s-eye view of the entire field. | Operate on the ground, moving at ground level and typically covering less area.
Data Collection | Useful for aerial imaging, crop monitoring, and capturing visual and multispectral data. | Suited for close-up inspections, soil sampling, and carrying sensors at ground level.
Field Coverage | Can cover large areas efficiently and quickly, making them ideal for large farms. | Better for smaller, precision tasks and for navigating through tighter spaces.
Real-time Monitoring | Can provide real-time data and immediate insights from above. | May require additional data processing or time to convey ground-level information.
Accessibility | Easily access difficult-to-reach areas and fields with various terrains. | Limited by obstacles and may face challenges on uneven terrain.
Crop Health Assessment | Efficient for monitoring crop health, identifying stress, and assessing overall field conditions. | Useful for proximity analysis, detecting plant diseases, and assessing soil quality.
Data Accuracy | Offers high-resolution imaging and data collection capabilities, especially with advanced sensors. | May have limitations in capturing high-resolution data, depending on the UGV’s setup.
Cost | UAVs can be more cost-effective for large-scale surveillance of vast areas. | UGVs can be cost-effective for specific, targeted tasks and smaller plots.
Versatility | Well-suited for scouting and monitoring tasks but not for physical intervention or soil manipulation. | Can perform tasks like weeding, planting, and soil sampling directly.
Limitations | Affected by weather conditions and regulations related to airspace. | Constrained by the limitations of ground mobility and potential obstacles.

UAVs and UGVs have revolutionized agricultural practices, each offering distinct advantages. UAVs excel in large-scale data collection and crop monitoring from above, providing valuable insights for precision farming on expansive fields. UGVs, on the other hand, demonstrate effectiveness in proximity tasks, enabling targeted, ground-level activities such as planting, weeding, and localized crop health assessment. Decisions regarding the use of UAVs or UGVs in agriculture should be based on the specific needs and objectives of the farm, as both technologies contribute significantly to optimizing efficiency and sustainability in modern farming.

In recent years, the application of machine learning models has rapidly transformed the agricultural landscape, leading to data-driven precision farming. The integration of advanced data analytics and machine learning techniques into agro-ecosystems has opened possibilities for optimizing crop management, resource utilization, and sustainability.

From predicting crop yields and identifying pest outbreaks to enhancing soil quality and enabling autonomous farming, these models play a crucial role in revolutionizing the way we cultivate and manage agricultural resources. By exploring these diverse applications, we gain insights into how technology and data-driven approaches are reshaping the agricultural sector, paving the way for more efficient, productive, and environmentally sustainable practices.

3.
Materials and Method

This experimental study involves the design and manufacturing of the vineyard robot, which is then tested for functionality in laboratory conditions and in field conditions.

3.1.
Mechanical Design, Analysis, and Manufacturing

The ASRobot designs were created using Inventor and SolidWorks software, covering sketching, solid modeling, assembly, static/dynamic analysis, technical drawings, and manufacturing drawings. The robot has been designed to be lightweight for extended operation and easy maneuverability, and its dimensions have been chosen to make it suitable for use in all vineyards. Additionally, a wheeled version of the robot was designed by modifying only the walking components, without changing the main chassis (Figures 3 and 4). These additional modifications do not result in any extra cost or time delay in terms of the electronic systems and software.

Figure 3.

Design of the Tracked Robot Platform

Figure 4.

Isometric and Three-Dimensional View of the Wheeled Robot Platform

3.2.
Electronic Design, Analysis, and Manufacturing

This phase encompasses the creation and optimization of electronic circuits that govern the robot’s functionality, ensuring efficient and precise control. The electronic circuits of the robot were created and simulated using Proteus software. Circuits that simulated successfully in Proteus were manufactured as PCBs and tested on the motors. The designed circuits included a single-motor driver circuit, a two-motor driver circuit (for driving the locomotion motors), and a three-motor driver circuit (for controlling the motors of the robot arm used for the sensing system) (Figures 5–7). Motor control was based on the signal received from the joystick. When the system was set to automatic mode, the joystick was disconnected, and control was shifted to RTK GPS and Stereo Vision-based control. Additionally, a general input-output circuit was designed, and the PCB for this circuit, which simulated successfully, was manufactured.
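The mode-switching behavior described above (joystick commands in manual mode, RTK GPS and Stereo Vision control in automatic mode) can be outlined as in the following C++ sketch; the structure, function names, and numeric values are illustrative assumptions rather than the robot’s actual control code.

```cpp
#include <iostream>

// Normalized speeds for the two drive motors (-1..1), as sent to the motor driver boards.
struct DriveCommand {
    double left;
    double right;
};

enum class ControlMode { Manual, Automatic };

// Placeholder for the joystick interface: in the real system the two axes
// would be read from the input circuit and mapped to motor speeds.
DriveCommand readJoystick() { return {0.4, 0.4}; }

// Placeholder for the RTK GPS route follower: steer toward the next waypoint.
DriveCommand followRtkRoute() { return {0.6, 0.5}; }

// Placeholder for stereo-vision obstacle handling: stop when an obstacle is detected ahead.
DriveCommand applyStereoVision(DriveCommand cmd, bool obstacleAhead) {
    if (obstacleAhead) { cmd.left = 0.0; cmd.right = 0.0; }
    return cmd;
}

// One control cycle: the active mode decides which command source drives the motors.
DriveCommand controlStep(ControlMode mode, bool obstacleAhead) {
    if (mode == ControlMode::Manual) {
        return readJoystick();                     // joystick drives the motors directly
    }
    DriveCommand cmd = followRtkRoute();           // automatic: follow the planned route
    return applyStereoVision(cmd, obstacleAhead);  // stereo vision can override near obstacles
}

int main() {
    DriveCommand c = controlStep(ControlMode::Automatic, /*obstacleAhead=*/false);
    std::cout << "left=" << c.left << " right=" << c.right << "\n";
    return 0;
}
```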

Figure 5.

Circuit for Motor Control

Figure 6.

Circuit for Controlling Two Motors for Locomotion

Figure 7.

Circuit for Controlling Three Motors of the Robot Arm

3.3.
Software Development

The software development phase was carried out using two different approaches. In the initial phase of the project, coordinates, including latitude, longitude, and elevation, were obtained for controlling the robot using SAFAR software. After acquiring RTK GPS data, route planning within the vineyard was performed manually.

These coordinates were compared with SAFAR coordinates. Software development for both RTK GPS–controlled movement and Stereo Vision-based control of the robot was accomplished using Visual C++.

The robot platform was operated in manual and automatic modes within the vineyard (Figures 8–12). It functioned automatically through two different methods. The planned system operates based on machine vision and RTK GPS coordinate control. Preliminary software work and simulation testing for the RTK GPS-controlled system are depicted in Figures 13–17.
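As a minimal illustration of how an RTK GPS fix can be turned into a steering decision along the planned route, the following C++ sketch converts the offset to the next waypoint into a local distance and bearing; the coordinates, the 0.5 m waypoint-switch radius, and the flat-earth approximation are assumptions suitable only for the short distances within a vineyard row, not the project’s actual navigation code.

```cpp
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

// A waypoint of the manually planned vineyard route, in decimal degrees.
struct Waypoint { double lat, lon; };

// Approximate east/north offsets in metres from the current fix to a waypoint.
// A flat-earth approximation is adequate over the short distances of a vineyard row.
void localOffset(double lat, double lon, const Waypoint& wp, double& east, double& north) {
    const double kPi = 3.14159265358979323846;
    const double metresPerDegree = 111320.0;
    north = (wp.lat - lat) * metresPerDegree;
    east  = (wp.lon - lon) * metresPerDegree * std::cos(lat * kPi / 180.0);
}

int main() {
    const double kPi = 3.14159265358979323846;

    // Illustrative route and current RTK fix (not coordinates from the trials).
    std::vector<Waypoint> route = {{37.856900, 27.858300}, {37.857000, 27.858300}};
    double lat = 37.856858, lon = 27.858279, headingDeg = 0.0;  // heading: 0 deg = north
    std::size_t target = 0;

    double east = 0.0, north = 0.0;
    localOffset(lat, lon, route[target], east, north);
    double distance   = std::hypot(east, north);
    double bearingDeg = std::atan2(east, north) * 180.0 / kPi;  // bearing to the waypoint
    double steerDeg   = bearingDeg - headingDeg;                // turn needed by the robot

    if (distance < 0.5) ++target;  // within 0.5 m of the waypoint: advance to the next one
    std::printf("distance = %.2f m, steer = %.1f deg, next waypoint index = %zu\n",
                distance, steerDeg, target);
    return 0;
}
```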

Figure 8.

Determination of Agricultural Land [Safar]

Figure 9.

Drawing and Defining Field Boundaries [Safar]

Figure 10.

Determination of the Route, Adding Robots and Equipment [Safar]

Figure 11.

Running the Simulation [Safar]

Figure 12.

System Operation Through Integration [Safar]

Figure 13.

Office-Based Testing of Distance Measurement Using Stereo Imaging

Figure 14.

Results of Topography Determination Using Stereo Imaging Studies

Figure 15.

RTK-GPS Route Determination and Planning Study

Figure 16.

Simulation of Robot Maneuvering on RTK-GPS Route Determination and Planning

Figure 17.

Simulation of Robot Maneuvering on RTK-GPS Route Determination and Planning - Route Numbers

Table 4.

Machine Learning Models and Their Use Cases in Agro-Ecosystems

Machine Learning Model | Use Case in Agro-Ecosystems
Linear Regression | Crop yield prediction, soil quality assessment
Decision Trees | Pest detection, crop disease identification
Random Forest | Crop classification, yield forecasting
Support Vector Machines | Weed detection, precision agriculture
Neural Networks | Weather forecasting, crop monitoring, yield prediction
K-Nearest Neighbors | Soil health assessment, precision agriculture
Naive Bayes | Disease risk assessment, crop disease identification
Clustering (K-Means) | Crop grouping, yield pattern analysis
Principal Component Analysis (PCA) | Dimensionality reduction for feature extraction
Long Short-Term Memory (LSTM) | Time-series data analysis, weather prediction
Convolutional Neural Networks (CNN) | Image-based pest and disease detection
Reinforcement Learning | Autonomous farming, autonomous machinery control

Running the simulation in Safar is a crucial step in testing the functionality and performance of the robot in a virtual environment. This phase allows us to explore various scenarios and assess how the robot responds to different conditions and challenges.

3.4.
Field Studies and Vineyard Trials

The following procedures were planned for the vineyard trials at the Manisa Viticulture Research Station. Within the scope of the research, the grape varieties “Sultani Çekirdeksiz”, “Siyah Kiomio”, “Crimson Seedless”, and “Cardinal” were grafted onto 1103 Paulsen rootstock.

3.5.
Method

The research was conducted with three replications, following a randomized complete block design. Each replication comprised six grapevines. To mitigate potential side effects resulting from various treatment applications, edge effect rows were deliberately left at the beginning and end of the experimental setup.

Figure 18.

Location Images of the Manisa Viticulture Research Station

This strategic arrangement aimed to ensure a comprehensive and accurate assessment of the treatments applied. The vines used in the experiment were tube-grown and planted in rows with a row spacing of 2.00 meters and an inter-row spacing of 3.00 meters on land owned by the Manisa Viticulture Research Station Directorate. Figure 19 depicts the experimental design for this study.

Figure 19.

Experimental Design

The study involved water stress applications at three different levels: full irrigation (control), stress 1, and stress 2. Accordingly, the available soil moisture content in the plant root zone was maintained at 75–100% for the control group, 50–75% for stress 1, and 25–50% for stress 2 applications. Subsurface drip irrigation was used as the irrigation method in the study. Drip irrigation laterals were buried 40 cm deep. For each application, a water meter and valve were installed to control water, and the supplied water was distributed to the plants through 20 mm diameter pipes. The pipes were equipped with pressure-regulated emitters with a flow rate of 3.5 liters per hour, and the emitter spacing was set at 40 cm. Laterals were placed at the row center, separately for each application (three in total).
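From the stated emitter flow rate (3.5 L/h), emitter spacing (40 cm), and one lateral per 3.00 m row, the nominal application rate and the run time required for a given irrigation depth can be estimated as in the sketch below; the 15 mm target depth is an assumed example value, and the one-lateral-per-row spacing is inferred from the layout described above.

```cpp
#include <cstdio>

int main() {
    // Lateral and emitter parameters taken from the trial description.
    const double emitterFlow_Lph  = 3.5;   // pressure-regulated emitter flow (L/h)
    const double emitterSpacing_m = 0.40;  // emitter spacing along the lateral (m)
    const double lateralSpacing_m = 3.00;  // one lateral per row, rows 3.00 m apart (inferred)

    // Flow delivered per metre of lateral and the resulting gross application rate.
    const double flowPerMetre_Lph     = emitterFlow_Lph / emitterSpacing_m;   // 8.75 L/h per m
    const double applicationRate_mmph = flowPerMetre_Lph / lateralSpacing_m;  // 1 L/m^2 = 1 mm

    // Illustrative irrigation event: depth of water to be applied (assumed value).
    const double requiredDepth_mm = 15.0;
    const double runTime_h        = requiredDepth_mm / applicationRate_mmph;

    std::printf("Application rate: %.2f mm/h, run time for %.0f mm: %.1f h\n",
                applicationRate_mmph, requiredDepth_mm, runTime_h);
    return 0;
}
```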

Soil moisture content was monitored throughout the season using profile probes (Figure 20). Profile Probes typically consist of multiple sensors at various depths along a probe, or probe rods that are inserted into the soil. These sensors measure parameters like soil moisture content, electrical conductivity, and temperature.

Figure 20.

Profile Probe

In this context, for each replication of the treatments to be applied, an access tube for the profile probe was installed, and soil moisture at the effective root depth was regularly monitored throughout the season. Based on the measured values, the amount of available soil moisture at the effective root depth was calculated, and irrigation was performed accordingly. Irrigation was carried out as follows:

  • -

    Full irrigation (control): Irrigation would be triggered when the available soil moisture at the effective root depth was reduced by 25%, maintaining soil moisture levels between 75-100%.

  • -

    Stress 1 application: Irrigation would be triggered when the available soil moisture at the effective root depth was reduced by 50%, maintaining soil moisture levels between 50-75%.

  • -

    Stress 2 application: Irrigation would be triggered when the available soil moisture at the effective root depth was reduced by 75%, maintaining soil moisture levels between 25-50%.

Soil moisture was maintained within the specified levels for each treatment throughout the season. For this purpose, soil moisture was continuously monitored using profile probes, and irrigation was adjusted accordingly. Additionally, leaf water potential values were measured using a Scholander Pressure Chamber (Figure 21) to determine the timing of irrigation and to observe the plant’s water status under stress conditions.
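A minimal sketch of the irrigation trigger logic implied by the three treatment bands is given below; the probe reading used in the example is an assumed value, and in the trials the decision was made from profile-probe measurements at the effective root depth.

```cpp
#include <cstdio>

// The three water stress treatments used in the trial.
enum class Treatment { Control, Stress1, Stress2 };

// Lower limit of plant-available soil moisture (fraction of 0..1) for each treatment;
// irrigation is triggered when the profile-probe reading falls to this level.
double lowerLimit(Treatment t) {
    switch (t) {
        case Treatment::Control: return 0.75;  // 25% depletion allowed (75-100% band)
        case Treatment::Stress1: return 0.50;  // 50% depletion allowed (50-75% band)
        case Treatment::Stress2: return 0.25;  // 75% depletion allowed (25-50% band)
    }
    return 0.75;
}

// Decide whether to irrigate from the available-moisture fraction at the effective root depth.
bool shouldIrrigate(Treatment t, double availableMoistureFraction) {
    return availableMoistureFraction <= lowerLimit(t);
}

int main() {
    const double reading = 0.48;  // illustrative probe-derived available moisture (48%)
    std::printf("Control: %d, Stress 1: %d, Stress 2: %d\n",
                static_cast<int>(shouldIrrigate(Treatment::Control, reading)),
                static_cast<int>(shouldIrrigate(Treatment::Stress1, reading)),
                static_cast<int>(shouldIrrigate(Treatment::Stress2, reading)));
    return 0;
}
```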

Figure 21.

A portable Scholander Pressure Chamber

The trials were conducted in a designated area at the Manisa Viticulture Research Institute. Soil analysis results for the selected area are provided in Table 5.

Table 5.

Soil Analysis Results of the Trial Area

Depth (cm) | Saturation (%) | Texture | Total Salinity (%) | pH | CaCO3 (%) | Total N (%)
0–30 | 30.00 | Sandy | 0.0113 | 7.80 | 5.60 | 0.11
30–60 | 31.00 | Loamy | 0.0159 | 7.87 | 4.80 | 0.11
60–90 | 37.00 | Loamy | 0.0282 | 7.88 | 8.00 | 0.09
Condition | | | Non-Saline | Slightly Alkaline | High | Medium
4.
Results

The creation of a vineyard robot prototype stands as a key achievement of this research. The subsequent performance testing of the vineyard robot prototype serves as a crucial step in unlocking the advantages of automation and robotics in agriculture, ultimately enhancing productivity and supporting sustainable farming practices.

The prototype underwent a comprehensive process, including design, manufacturing, and thorough testing across diverse parameters. These tests were instrumental in refining the robot’s capabilities and proactively addressing potential challenges, aligning its performance with the predetermined project objectives. The intricacies of the robot platform, including detailed mechanical analysis and specifications, are presented in Figures 22 through 46, offering a comprehensive insight into the robot’s design and functionality.

Figure 22.

The design of the robot platform, its center of gravity, and dimensional specifications

Figure 23.

Dynamical Simulation Results of the Robot (Power)

Figure 24.

Dynamical Simulation Results of the Robot (Energy)

Figure 25.

Dynamical Simulation Results of the Robot (Motor Force)

Figure 26.

Static Analysis Results of the Robot (Stress)

Figure 27.

Static Analysis Results of the Robot (Displacement)

Figure 28.

Static Analysis Results of the Robot (Strain)

Figure 29.

Dynamic Analysis Results of the Robot (Stress)

Figure 30.

Dynamic Analysis Results of the Robot (Displacement)

Figure 31.

Dynamic Analysis Results of the Robot (Strain)

Figure 32.

The version operated with improved detection cards for calipers and ultrasonic sensors

Figure 33.

LIDAR detection

Figure 34.

RTK-GPS to be used for guiding the robot

Figure 35.

The Standard GPS SPP graph

Figure 36.

Google Maps output (37°51’24.7”N 27°51’29.8”E 37.856858, 27.858279)

Figure 37.

Tracking Screen

Figure 38.

Analysis Screen

Figure 39.

Observation Screen

Figure 40.

RTK GPS positioning results 1

Figure 41.

RTK GPS positioning results 2

Figure 42.

Vineyard robot detection and control software

Figure 43.

Vineyard robot detection and control software - Sensor data

Figure 44.

Lidar measurement - Unobstructed conditions

Figure 45.

Lidar measurement - Obstructed conditions

Figure 46.

Lidar measurement - Two obstacles present

In this study, the process of obtaining and recording real-time kinematic global position data is explained, using the Piksi RTK GPS device acquired from Swift Navigation (Figure 34). RTK GPS, or Real-Time Kinematic Global Positioning System, is a highly precise satellite navigation technology used for applications that require centimeter-level accuracy. It is widely utilized in fields such as surveying, precision agriculture, construction, and autonomous vehicle navigation.

The RTK GPS data was obtained in two stages. In the first stage, standard GPS data was acquired for both Piksi devices, verifying the accuracy of the devices.

In the second stage, RTK GPS data was obtained by comparing the satellite signals acquired by the Piksi devices capable of wireless communication. The tests conducted in these stages and the processes of obtaining position data are described below.

A Single Point Position (SPP) test is a basic positioning method using Global Navigation Satellite Systems (GNSS), such as the Global Positioning System (GPS) or other GNSS constellations like GLONASS, Galileo, or BeiDou.

In an SPP test, a receiver uses signals from multiple satellites to determine its position with a single receiver, without the use of differential corrections, reference stations, or real-time kinematic (RTK) corrections.

The purpose of this test is to measure the accuracy of Piksi RTK GPS devices in determining positions within standard GPS limits. GPS data obtained for each Piksi device was collected, and this data was verified using the Google Maps application. This test was conducted separately for each Piksi device at two different locations. The data and coordinates obtained from this test are presented in table and graphic formats below.

The SPP graph obtained using the Piksi RTK GPS device, used similarly to a standard GPS device, is presented in Figure 35.

The results of the latitude and longitude obtained after the SPP test were entered in degrees into the Google Maps application, and the output is provided in Figure 36.

In this stage, the RTK GPS test was conducted using the Known Baseline Initialization method. In the test, one RTK GPS device was designated as the Piksi Base Station (Piksi 1 - Ground Station), and the other GPS device was designated as the Piksi Rover (Piksi 2 - Mobile).

The steps for obtaining RTK GPS data were as follows: Initially, both Piksi devices were positioned very close to each other and activated. Subsequently, each Piksi began to acquire standard GPS data by tracking at least 4 satellites.

Table 6.

Location Data Comparison Table

Parameter | Single Point Position (SPP) location data obtained with Piksi RTK GPS | Location data obtained using Google Maps
Latitude (Decimal) | 37.8568583853 | 37.856858
Longitude (Decimal) | 27.8582785723 | 27.858279
Latitude (Degrees Minutes Seconds) | 37° 51’ 24.6902” N | 37° 51’ 24.7” N
Longitude (Degrees Minutes Seconds) | 27° 51’ 29.8029” E | 27° 51’ 29.8” E
Distance between the two acquired points (Decimal) | 0.00000039 | -0.00000043
Distance between the two acquired points (Degrees Minutes Seconds) | 0° 0’ 0.0014” N | 0° 0’ 0.0015” W
Distance between the two acquired points (Meters) | 0.06455 m
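For reference, the separation between two fixes given in decimal degrees can be computed with the haversine formula, as in the following C++ sketch; the exact figure depends on the Earth model and projection used, so the result is illustrative and may differ slightly from the tabulated 0.06455 m.

```cpp
#include <cmath>
#include <cstdio>

// Great-circle distance in metres between two fixes given in decimal degrees (haversine formula).
double haversineMetres(double lat1, double lon1, double lat2, double lon2) {
    const double kPi = 3.14159265358979323846;
    const double kEarthRadius_m = 6371000.0;  // mean Earth radius
    const double toRad = kPi / 180.0;
    const double dLat = (lat2 - lat1) * toRad;
    const double dLon = (lon2 - lon1) * toRad;
    const double a = std::sin(dLat / 2) * std::sin(dLat / 2) +
                     std::cos(lat1 * toRad) * std::cos(lat2 * toRad) *
                     std::sin(dLon / 2) * std::sin(dLon / 2);
    return 2.0 * kEarthRadius_m * std::asin(std::sqrt(a));
}

int main() {
    // SPP fix from the Piksi device and the coordinate read back from Google Maps (Table 6).
    double d = haversineMetres(37.8568583853, 27.8582785723, 37.856858, 27.858279);
    std::printf("Separation between the two fixes: %.3f m\n", d);
    return 0;
}
```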

The satellite signals obtained by Piksi devices with a sufficient signal strength were transmitted between them using wireless communication (RF transceiver). The RTK GPS data obtained as a result of comparing satellite positions and the steps of determining the location are explained and shown in the following graphics.

  • a)

    Tracking – Monitoring Satellites: At least 4 satellites with a signal strength of 3 or higher are being tracked.

  • b)

    Solution – Both Piksi devices begin to obtain position solution results approximately 30 seconds after they start tracking satellites in single point position mode.

  • c)

    Observation – Using the Piksi console, Piksi 1 was designated as the Base Station, and Piksi 2 as the Rover. Both Piksi devices were equipped with 3DR radio antennas (3D RF transceiver connection) and positioned very close to each other. Piksi 1 and Piksi 2 accurately sent and received satellite data to and from each other, which was then verified through two separate screens.

  • d)

    RTK GPS positioning results: Piksi 1 (Base Station) is indicated by a red cross (+), while Piksi 2 (Rover) is indicated by an orange cross (+). The movement of Piksi 2 relative to Piksi 1 can be observed in the east, west, north, and south directions.

  • e)

    In the process of determining the locations where the field vehicle will navigate in the study, the positioning data required can be obtained from the Piksi 2 (Rover) RTK GPS device and saved as an Excel (.csv) file.
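Step (e) above mentions saving the rover positions as a .csv file; a minimal C++ sketch of such a logger is shown below, with an illustrative file name, column layout, and sample fixes that are not measurements from the trials.

```cpp
#include <cstdio>
#include <vector>

// One RTK solution from the rover (Piksi 2): time stamp and position (illustrative layout).
struct RtkFix {
    double t;       // seconds since logging started
    double lat;     // latitude, decimal degrees
    double lon;     // longitude, decimal degrees
    double height;  // height in metres
};

// Write the collected fixes to a .csv file so the route can be built from them later.
bool writeCsv(const char* path, const std::vector<RtkFix>& fixes) {
    std::FILE* f = std::fopen(path, "w");
    if (!f) return false;
    std::fprintf(f, "time_s,latitude_deg,longitude_deg,height_m\n");
    for (const auto& fx : fixes) {
        std::fprintf(f, "%.2f,%.10f,%.10f,%.3f\n", fx.t, fx.lat, fx.lon, fx.height);
    }
    std::fclose(f);
    return true;
}

int main() {
    // Illustrative fixes, not measurements from the trials.
    std::vector<RtkFix> fixes = {
        {0.0, 37.8568583853, 27.8582785723, 62.10},
        {1.0, 37.8568590000, 27.8582790000, 62.11},
    };
    return writeCsv("rover_route.csv", fixes) ? 0 : 1;
}
```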

As a result, it has become evident that Piksi RTK GPS is a reliable and effective tool for acquiring the desired coordinate data, underscoring its importance in precise positioning applications and data collection. Vineyard robot detection and control software (Figure 42) is available, along with a sensor data graph (Figure 43). Lidar measurements are presented in the following figures (Figures 44–46).

This study marks a significant advancement in the field of agricultural robotics, characterized by the successful development of versatile robotic systems, a control architecture, and the realization of agricultural robots featuring both wheeled and tracked traction units. Within the scope of this study:

  • -

    An autonomous and manually operable robotic system has been developed.

  • -

    An original and universal robotic control architecture has been designed for agricultural robots.

  • -

    The design, manufacturing, and development processes of the agricultural robot with wheeled traction units have been successfully completed.

  • -

    The design, manufacturing, and development processes of the agricultural robot with tracked traction units have been successfully completed.

In conclusion, this study has successfully accomplished the development of a versatile and adaptable robotic system that can operate autonomously and manually. A novel and universal robotic control architecture was designed, enabling efficient control of the agricultural robots. Both wheeled and tracked traction units for agricultural robots were designed, manufactured, and improved with positive outcomes.

Figure 47.

Robot Control Architecture

Figure 48.

Wheel-Based Agricultural Robot

Figure 49.

Tracked Agricultural Robot

5.
Conclusion

In this experimental study, significant milestones have been achieved. These accomplishments encompass the development of an innovative autonomous and manually operable robotic system, the design of a universally applicable robotic control architecture tailored for the agricultural robotics field, and the successful completion of the design, manufacturing, and development processes for agricultural robots equipped with wheeled traction units. The choice of a control architecture depends on the specific requirements of the robot’s tasks and its level of autonomy.

Different types of robots, such as autonomous vehicles, or agricultural robots, may utilize various control architectures tailored to their unique functionalities and operational environments.

In parallel, this experimental study has also realized the same comprehensive development processes for agricultural robots employing tracked traction units. These achievements collectively mark a substantial advancement in the field of agricultural robotics, showcasing the successful integration of versatile control systems and the realization of robust robotic platforms suitable for agricultural applications.

Recent years have seen a significant transformation in agriculture, shifting towards “Industry-Agriculture 5.0,” marked by the integration of information and communication technology. The use of IoT sensors, optimization techniques, and machine learning algorithms has driven smart farming, aiming to improve agricultural productivity while reducing costs and environmental impact. These systems collect vast amounts of real-time data for optimization and in-depth analysis. Smart farming techniques enabled by intelligent agricultural technology are leading the way.

Drones utilize scanning systems along predefined routes, enabling extensive coverage of agricultural fields. The effectiveness of this process is heavily reliant on computer vision technology. The captured images serve a multitude of purposes, including the generation of multi-spectral data for in-depth crop analysis, continuous monitoring of plant growth, identification of weeds, assessment of disease prevalence, and evaluation of overall crop health and drought impact. This approach has proven to be reasonably accurate in estimating the overall crop yield, showcasing the versatility and efficacy of drone technology in modern agriculture.

Future research should prioritize the optimization of IoT integration, advanced data analytics, and the incorporation of Artificial Intelligence (AI) into decision-making processes. Efforts to enhance security and scalability in agricultural technologies are crucial areas for exploration, along with delving into supply chain management to further strengthen agricultural efficiency. Key practices include crop rotation, diversity, reduced tillage, and integrated pest management. Implementing these approaches is crucial for ecological sustainability and food security amid a growing global population, and these eco-friendly measures are indispensable for shaping a resilient and sustainable future.

DOI: https://doi.org/10.14313/jamris-2025-034 | Journal eISSN: 2080-2145 | Journal ISSN: 1897-8649
Page range: 35 - 51 | Submitted on: Oct 16, 2023 | Accepted on: Dec 4, 2023 | Published on: Dec 24, 2025

© 2025 Ismail Bogrekci, Pinar Demircioglu, published by Łukasiewicz Research Network – Industrial Research Institute for Automation and Measurements PIAP
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.