At Mapsens® we continue to develop applications based on machine learning to automate detection tasks. In this post we show you how we are using deep neural networks to detect oil palms in orthophotos taken by drones while estimating their health status.

Example of neural network detection.

The feature presented here enables an automatic inventory of the plantation, which in turn supports production estimation. In addition, each detection is classified into one of three groups according to the plant's state of health: palms in an optimal state, palms showing possible signs of deterioration, and palms with clear signs of damage. The neural network was trained on labelled data and achieves high accuracy on previously unseen images. Its output is then post-processed and geolocated so that the information can be integrated into Mapsens®.
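The geolocation step can be sketched as follows. This is a minimal illustration under stated assumptions, not Mapsens® internals: it assumes a north-up orthophoto with a known affine transform, and every name, class, and coordinate value here is hypothetical.

```python
# Hypothetical sketch: turning pixel-space detections from the network
# into geolocated points carrying a health class. The transform values
# and health labels are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Detection:
    px: float       # pixel column of the detection centre
    py: float       # pixel row of the detection centre
    health: str     # "healthy", "declining" or "damaged"

def pixel_to_geo(px, py, transform):
    """Apply an orthophoto's affine transform (a, b, c, d, e, f):
    x = a*px + b*py + c ;  y = d*px + e*py + f."""
    a, b, c, d, e, f = transform
    return (a * px + b * py + c, d * px + e * py + f)

# Example: a north-up orthophoto at 0.05 m/pixel whose top-left corner
# sits at easting 500000, northing 4180000 (assumed UTM coordinates).
transform = (0.05, 0.0, 500000.0, 0.0, -0.05, 4180000.0)

detections = [Detection(120.0, 80.0, "healthy"),
              Detection(340.5, 210.0, "damaged")]

geolocated = [(pixel_to_geo(d.px, d.py, transform), d.health)
              for d in detections]
print(geolocated)
```

In practice the six transform coefficients would come from the orthophoto's georeferencing metadata rather than being hard-coded.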

This ability to detect the health of each plant, together with its geolocation, is very useful for crop maintenance. Combining this data with Mapsens® gives the farmer a picture of the risk areas in the crop, so they can go directly to those areas instead of having to inspect the entire plantation in detail.

A viewer with the detection results for a palm plantation is shown below. The color of each dot corresponds to the state of health of the detected palm: green for the healthiest, red for the most deteriorated, and yellow for those in an intermediate state.