Sensor Fusion for Precision Agriculture

21 June 2019

Precision agriculture has benefited considerably from the use of unmanned aerial vehicles, often combined with active ranging sensors such as Lidar to gather information from underneath the crop canopy. However, existing systems are relatively expensive for the farming sector.

In this project, researchers in Canada used an integrated sensor orientation method for the local referencing of Lidar measurements. Their approach is based on loosely coupled, image-aided inertial navigation in which camera pose estimates replace the GNSS measurements. The result is a low-cost solution suited to capturing topographic data of inaccessible areas and to operating in GNSS-denied environments.
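The loosely coupled idea can be illustrated with a minimal sketch. In a conventional loosely coupled INS, a Kalman filter fuses the inertial prediction with a GNSS position fix; here, a camera-derived pose (e.g. from photogrammetric processing of the images) plays that role instead. The function and all numbers below are hypothetical illustrations, not the researchers' actual filter design:

```python
import numpy as np

# Hypothetical sketch: a loosely coupled Kalman position update in which a
# camera-derived pose stands in for the usual GNSS fix. The state here is
# just 3D position; the INS mechanisation supplies the prior estimate.

def kalman_pose_update(x_prior, P_prior, z_cam, R_cam):
    """Fuse an INS-predicted position with a camera-derived position fix."""
    H = np.eye(3)                         # camera pose observes position directly
    S = H @ P_prior @ H.T + R_cam         # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_post = x_prior + K @ (z_cam - H @ x_prior)
    P_post = (np.eye(3) - K @ H) @ P_prior
    return x_post, P_post

# The INS prediction has drifted; the camera pose pulls it back.
x_prior = np.array([10.5, 20.2, 1.8])   # metres, drifted INS estimate
P_prior = np.eye(3) * 1.0               # loose prior covariance
z_cam = np.array([10.0, 20.0, 2.0])     # camera-derived position
R_cam = np.eye(3) * 0.01                # camera fix assumed far more certain

x_post, P_post = kalman_pose_update(x_prior, P_prior, z_cam, R_cam)
```

Because the camera fix is modelled as much more certain than the drifted inertial prior, the posterior lands close to the camera-derived position while the covariance shrinks accordingly.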

Integrating a low-cost 3D Lidar sensor with a DSLR camera and an industrial-grade INS makes it possible to create high-resolution, complete point clouds of agricultural plots and to automatically generate a terrain model for measuring soil micro-topography. Data quality is assured by an integrated calibration that solves for all system and Lidar parameters simultaneously; this calibration approach improves the accuracy of the Lidar observations by 37%.
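The key property of such an integrated calibration is that every parameter is estimated in one joint adjustment rather than one at a time. The following is a much-simplified, hypothetical sketch of that idea: a Lidar range-scale factor and a 3D lever-arm are recovered together in a single linear least-squares solve against surveyed reference coordinates. The model g = s·p + t is an illustrative stand-in, not the paper's actual system model:

```python
import numpy as np

# Hypothetical sketch of an integrated calibration: all unknowns (here a
# range-scale factor s and a lever-arm t) are solved simultaneously in one
# least-squares adjustment, using the illustrative model  g = s * p + t.

rng = np.random.default_rng(0)
p = rng.uniform(-10, 10, size=(50, 3))          # synthetic Lidar points (sensor frame)
s_true, t_true = 1.02, np.array([0.30, -0.10, 0.05])
g = s_true * p + t_true                          # corresponding reference coordinates

# Stack one linear equation per point per axis: [p_k | e_k] @ [s, t] = g_k
rows, rhs = [], []
for pi, gi in zip(p, g):
    for k in range(3):
        row = np.zeros(4)
        row[0] = pi[k]       # coefficient of the scale factor
        row[1 + k] = 1.0     # coefficient of one lever-arm component
        rows.append(row)
        rhs.append(gi[k])

A, b = np.array(rows), np.array(rhs)
x, *_ = np.linalg.lstsq(A, b, rcond=None)        # all four parameters in one solve
s_est, t_est = x[0], x[1:]
```

In the real system the unknowns would also include boresight angles and Lidar-internal parameters, and the adjustment would be nonlinear, but the principle of solving everything in one step is the same.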

A critical challenge with this system is georeferencing the Lidar point clouds with an accuracy comparable to the georeferencing accuracy of image-based point clouds.


Source: GIM International