
Introducing robotics in agriculture can increase productivity and reduce costs and waste. A robot's capabilities can be raised to or above the human level, enabling it to perform the same tasks as a human but with higher precision and repeatability and with little to no fatigue. This paper develops an algorithm for detecting peach trunks in orchard rows, serving as an autonomous navigation and anti-collision auxiliary system for a terrestrial robotic rover for agricultural applications. The approach is based on computer vision, specifically an object detection model built on Convolutional Neural Networks. The algorithm uses the TensorFlow framework and is deployed on a Raspberry Pi 4. The core of the model is an SSD MobileNet 640×640 detector with transfer learning from weights pretrained on the COCO 2017 dataset. A total of 89 pictures were captured for the model's dataset, of which 90% were used for training and the remaining 10% for testing. The model was converted for mobile applications with full integer quantization, from float32 to uint8, and compiled for Edge TPU support. The orientation strategy consists of two conditions: when two trunks are detected simultaneously, their bounding boxes define a linear function, represented by an imaginary line, which is updated at every new pair of simultaneous detections. From the slope of this line and the horizontal deviation of a single detected bounding box relative to it, the algorithm commands the robot to adjust its orientation or keep moving forward. The metric evaluation of the model shows precision and recall both at 94.4%. After quantization, these metrics drop to 92.3% and 66.7%, respectively. These simulation results indicate that, statistically, the model can perform the navigation task.
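The two-trunk orientation strategy described in the abstract can be sketched as follows. This is a minimal illustrative reconstruction, not the authors' code: the function names, the pixel-based tolerance, and the left/right command mapping are all assumptions.

```python
def fit_line(p1, p2):
    """Line through the centers of two simultaneously detected trunk boxes.

    Returns (slope, intercept); a vertical line is flagged with slope = inf
    and the x position stored in place of the intercept.
    """
    (x1, y1), (x2, y2) = p1, p2
    if x2 == x1:
        return float("inf"), x1
    slope = (y2 - y1) / (x2 - x1)
    intercept = y1 - slope * x1
    return slope, intercept


def horizontal_deviation(box_center, slope, intercept):
    """Horizontal offset of a single detection from the fitted row line."""
    x, y = box_center
    if slope == float("inf"):
        return x - intercept          # vertical line: compare x directly
    if slope == 0:
        return 0.0                    # horizontal line gives no lateral cue
    x_on_line = (y - intercept) / slope  # x of the line at this image row
    return x - x_on_line


def command(deviation, tolerance=20.0):
    """Map the deviation (pixels, hypothetical threshold) to a steering order."""
    if deviation > tolerance:
        return "turn_right"
    if deviation < -tolerance:
        return "turn_left"
    return "forward"
```

In this sketch the line is refitted each time two trunks appear in the same frame, and single detections between refits are compared against the last stored line, mirroring the update rule stated in the abstract.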
As a result of the increasing world population and the growing demand for food, there is a pressing need to raise agricultural productivity. The problem is aggravated by migration from rural areas to cities, which reduces the workforce available to the agricultural sector. In this regard, within the concept of agriculture 4.0, introducing autonomous robots into agricultural activities may address these problems, compensating for the growing shortage of labor and increasing agricultural productivity. This work presents the algorithm used for the autonomous navigation, based on the global positioning system (GPS), of the multitasking robotic rover for agricultural applications (R2A2), aimed at herbicide spraying. An augmented reality (AR) web application was also developed to assist in the supervision of autonomous vehicles. The autonomous movement of the robotic platform was implemented in C/C++ on an Arduino Mega2560 as the main microcontroller, and the position-based AR web application was developed in HTML using the AR.js library and the A-Frame framework. The application was tested in a peach orchard and achieved approximately 94% correct responses on average, demonstrating the accuracy of the developed technological solution. The results and conclusions of the autonomous movement algorithm and of the web application are also presented.
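A basic building block of GPS waypoint navigation like that described above is computing the bearing to the next waypoint and the signed heading error to correct. The sketch below shows the standard great-circle initial-bearing formula in Python; it is a generic illustration under that assumption, not the rover's actual C/C++ firmware, and the function names are hypothetical.

```python
import math


def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the current GPS fix to a waypoint.

    Inputs in decimal degrees; returns a bearing in [0, 360), where
    0 = north and 90 = east.
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0


def heading_error(target_bearing, current_heading):
    """Signed heading error in degrees, wrapped to [-180, 180).

    A positive value means the rover should turn right, negative left.
    """
    return (target_bearing - current_heading + 180.0) % 360.0 - 180.0
```

A navigation loop would then compare `heading_error` against a deadband and issue turn or forward commands to the motor driver, analogous to the orientation corrections performed on the rover.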