

Research unit
OFAG
Project number
10.20_9
Project title
Simultaneous Safety and Surveying for Collaborative Agricultural Vehicles (S3-CAV)
Project title (English)
Simultaneous Safety and Surveying for Collaborative Agricultural Vehicles (S3-CAV)

Texts for this project (available languages)

Category                         German   French   Italian   English
Keywords                         yes      yes      -         yes
Short description                -        -        -         yes
Project objectives               -        -        -         yes
Summary of results (Abstract)    -        -        -         yes

Texts entered


Category    Text
Keywords
(German)
Präzisionslandwirtschaft, landwirtschaftliche Fahrzeuge, Multisensorik, Drohnen
Keywords
(English)
precision agriculture, agricultural vehicles, multi-sensoring, drones
Keywords
(French)
agriculture de précision, véhicules agricoles, multi-senseurs, drones
Short description
(English)

Precision farming relies on the ability to accurately locate the crops or leaves with problems and to apply a local remedy precisely, without wasting resources or contaminating the environment. This project develops a unifying framework that allows many different types of sensor data to be incorporated; methods for creating 3D maps and maximising map accuracy, facilitating operations at a narrow scale with a smaller environmental footprint; methods for combining these data so that relevant information is easily visible to the farmer; and methods for incorporating real-time sensor data into historical data, both to increase precision during applications and to provide fast automated safety responses.

The framework is implemented to be compatible with AgriCircle’s farm management information system (FMIS), which enables map-based control of many application devices, and displayed via standard tablet PCs.
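The short-loop safety idea described above, fusing stored map data with live sensor readings to trigger an immediate response without a round trip to the cloud, can be sketched as follows. All names and values here (SafetyMonitor, stop_distance_m, the grid-cell obstacle map) are illustrative assumptions, not the project's actual implementation.

```python
# Hypothetical sketch of a short-loop safety response: a live range reading
# is checked first (fast automated stop), then the stored obstacle map is
# consulted (historical data). Names and thresholds are invented for the sketch.

from dataclasses import dataclass


@dataclass
class SafetyMonitor:
    stored_obstacles: set[tuple[int, int]]  # grid cells known to be blocked
    stop_distance_m: float = 1.5            # emergency-stop threshold

    def check(self, cell: tuple[int, int], live_range_m: float) -> str:
        """Fuse a real-time range reading with stored map knowledge."""
        if live_range_m < self.stop_distance_m:
            return "STOP"     # fast automated safety response
        if cell in self.stored_obstacles:
            return "SLOW"     # historical data: known hazard ahead
        return "PROCEED"


monitor = SafetyMonitor(stored_obstacles={(4, 7)})
print(monitor.check(cell=(4, 7), live_range_m=3.0))  # -> SLOW
print(monitor.check(cell=(0, 0), live_range_m=0.8))  # -> STOP
```

The point of the split is latency: the live-reading branch needs no map lookup at all, so the stop decision stays on the vehicle.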

Project objectives
(English)

S3-CAV focuses on making the ability to perceive the local environment in 3D accessible to farmers transnationally. We devise a sensor framework which we populate with various vision-based and proprioceptive sensors, and combine their real-time input with stored data to provide short-loop safety responses and data sufficient to precisely control an application device in 3D.

The detailed data from the sensors is also sent to a commercial cloud-based Precision Farming Management Information System, where it is combined with stored data from earlier passes and other sources to produce human-readable maps with semantic overlays showing crop health, crop maturity, field traversability, irrigation networks, etc., whatever is relevant and requested. The versatility of this general sensor framework makes it truly transnational -- sensors and overlays can be adapted to specific crops, climates and geographical conditions.

Factors that hinder widespread adoption of Precision Farming methods include the perceived cost and complexity of getting started, and the myth that PF is only feasible for large row-crop farms. We address complexity by proposing a general PF solution with a common interface for data and farm management, auto-guidance, data visualization, and action planning, integrated into an existing commercial FMIS. We address the row-crop preconception by building our initial test system to work in vineyards and olive groves.

Adoption of our system will be encouraged by its ability to take input data from any open-format map-based source, and by its output being compatible with most currently used agricultural devices -- everything that uses ISOXML. The maps are displayed on a standard Android tablet PC, and any map-based sensor data in any open format can be incorporated.

Summary of results (Abstract)
(English)

S3-CAV focuses on making the ability to perceive the local environment in 3D accessible to farmers through a farm management system.

A sensor suite with various vision-based and proprioceptive sensors was used, and its real-time input was combined with stored data for different purposes depending on the application case: for example, precisely controlling an application device in 3D, precisely guiding AGVs through complex topologies, or tracking crop management through semantic labelling using LiDARs, thermographic cameras and hyperspectral cameras.

The results derived from the sensor data can be sent to a commercial cloud-based Precision Farming Management Information System, where they are combined with stored data from earlier passes and other sources to produce human-readable maps with semantic overlays showing crop health, crop maturity, field traversability, irrigation networks, etc. -- whatever is relevant and requested.

During the project, work focused on a vineyard case scenario to identify crop health and diseases in different grape varieties. The different crop parts, such as leaves, grapes and trunks, were successfully labelled, followed by diseases such as peronospora. The volumes of the vines were also estimated with LiDAR to identify leaf volumes for adapted spraying in the future.
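A common way to turn a LiDAR point cloud into a leaf-volume estimate is voxel counting: bin the points into cubic cells and multiply the occupied-cell count by the cell volume. The sketch below is an assumed simplification for illustration, not the project's actual pipeline; the voxel size is an invented parameter.

```python
# Illustrative voxel-counting estimate of canopy (leaf) volume from LiDAR
# points. This is an assumed simplification, not the project's pipeline:
# each point is assigned to a cubic voxel, and the volume is the number of
# occupied voxels times the voxel volume.

def canopy_volume(points, voxel_size=0.25):
    """points: iterable of (x, y, z) in metres; returns volume in m^3."""
    occupied = {
        (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        for x, y, z in points
    }
    return len(occupied) * voxel_size ** 3


# Two points fall in the same 0.25 m voxel, one in another: 2 voxels occupied.
pts = [(0.1, 0.1, 0.1), (0.2, 0.2, 0.2), (1.0, 1.0, 1.0)]
print(canopy_volume(pts))  # -> 0.03125  (2 * 0.25**3)
```

Smaller voxels resolve the canopy more finely but become sensitive to point density; in practice the voxel size is tuned to the scanner's resolution.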

A survey of 100 farmers identified the required functionalities and how to incorporate them into the FMIS. GeoTIFF was chosen as the input format for semantically labelled data; the FMIS outputs pseudo-colored maps and ISOXML for machinery.
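The pseudo-colored output described above amounts to mapping each semantic class ID in the label raster to a display color. The sketch below illustrates that mapping on a tiny grid; the class IDs and RGB values are invented for the example, and a real GeoTIFF would be read with a GIS library such as rasterio or GDAL rather than nested lists.

```python
# Minimal sketch of pseudo-coloring a semantically labelled raster, as in
# the FMIS output described above. Class IDs and RGB colors are invented
# for illustration; real label rasters would come from a GeoTIFF.

PALETTE = {
    0: (0, 0, 0),        # background
    1: (0, 170, 0),      # healthy leaf
    2: (200, 40, 40),    # diseased leaf (e.g. peronospora)
    3: (120, 80, 40),    # trunk
}


def pseudo_color(labels):
    """Map a 2-D grid of class IDs to an RGB image (nested lists)."""
    return [[PALETTE[v] for v in row] for row in labels]


grid = [[0, 1], [2, 3]]
print(pseudo_color(grid)[1][0])  # -> (200, 40, 40)
```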

Proprioception-based terrain modelling and improved steering were demonstrated and evaluated in relevant agricultural domains. The strengths and weaknesses of drone- versus vehicle-mounted data, and of camera-based versus LiDAR-based methods, were characterized by application area. The team built a carry-on sensor suite that mounts easily on various carrier vehicles, and designed a method for improving hyperspectral imaging that led to promising results on crop yield and health. The acquisition sensor suite was also tested by non-experts over a period of three months with weekly scans.