
Cloud Detection and Phenotyping

#MATLAB  #ComputerVision  #MachineLearning


SUMMARY

Traditional methods for acquiring crop traits such as plant height, leaf color, chlorophyll content, and yield involve manual sampling, a process that is time-consuming and labor-intensive. As a result, Unmanned Aerial Vehicles (UAVs) equipped with various sensors have become a critical phenotyping tool in recent years. Crop researchers and agricultural producers regularly use the aerial images obtained from UAVs not only to monitor crops during the growing season but also to make quick and reliable judgments.

Unfortunately, the accuracy of aerial images can be affected by various factors, one of which is the presence of clouds. Clouds and their accompanying shadows are inevitable contaminants in aerial imagery. According to estimates from the International Satellite Cloud Climatology Project Flux Data (ISCCP-FD), the global annual mean cloud cover is approximately 66%. Clouds prevent UAVs and satellites from obtaining clear views of the land. Moreover, the shadows cast by clouds remove crucial spectral information required for calculating vegetation indices such as the normalized difference red-edge index (NDRE), which is used to prescribe nitrogen fertilizers.
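NDRE is conventionally computed from near-infrared (NIR) and red-edge reflectance as (NIR − RE) / (NIR + RE). A minimal sketch in Python (not the project's own code; the reflectance values below are hypothetical, chosen only to show how shadowed pixels shift the index):

```python
import numpy as np

def ndre(nir, red_edge):
    """Normalized difference red-edge index: (NIR - RE) / (NIR + RE)."""
    nir = np.asarray(nir, dtype=float)
    red_edge = np.asarray(red_edge, dtype=float)
    return (nir - red_edge) / (nir + red_edge)

# Hypothetical per-pixel reflectances: a cloud shadow suppresses the
# measured signal in both bands, distorting the resulting NDRE.
clear_pixel = ndre(0.45, 0.30)
shadowed_pixel = ndre(0.12, 0.09)
```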


Because NDRE is influenced by clouds and shadows, uninformed farmers may prescribe more nitrogen than they would on a clear day, wasting money and harming the environment. Conversely, they may prescribe too little nitrogen, which would hurt their crops in the long run. This project explored ways to detect clouds and shadows in aerial images, and the effects of clouds and shadows on NDRE measurements were also studied.

To accomplish this, three objectives were defined:

  1. Detect the full canopy stage: Approximate the date at which the crop reaches full canopy.

  2. Detect clouds and trace shadows: Determine the presence of clouds and shadows over fields in the images and trace their outlines.

  3. Compare data: Determine the extent to which clouds and shadows influence vegetation indices and the sufficiency index.
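The sufficiency index in objective 3 is commonly defined in nitrogen management as the ratio of the field's vegetation index to that of a well-fertilized reference area. A minimal sketch, assuming that definition (the NDRE values are hypothetical):

```python
def sufficiency_index(field_ndre, reference_ndre):
    """Ratio of the field's NDRE to a well-fertilized reference's NDRE.
    Values near 1.0 suggest adequate nitrogen; lower values suggest deficiency."""
    return field_ndre / reference_ndre

# Hypothetical readings: a cloud shadow that lowers the measured field NDRE
# would lower the sufficiency index and inflate the nitrogen prescription.
si = sufficiency_index(0.18, 0.24)
```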

 

To accomplish these tasks, the team used the method of Full Vegetation Coverage (FVC) to estimate reasonable dates at which the crop reaches full canopy. Next, the team developed an algorithm to detect clouds and shadows and trace them in aerial images. The developed algorithm successfully detected clouds 92% of the time and shadows 83% of the time. Boundaries of shadows and clouds were found and shapefiles were generated. Lastly, it was shown that clouds impact vegetation index measurements. The end goal of this study is to make input decision-makers aware of the significance of clouds and shadows in aerial images. This knowledge will enable agricultural producers to prescribe nitrogen to their crops more efficiently and accurately. As a result, farmers will save money and produce healthier crops. The project was completed as the final project for CSCE 473/873: Computer Vision at the University of Nebraska-Lincoln during the Fall 2019 semester. Other members of the team included Kevin Toney and Jackson Stansell.

The full report detailing the approach to each of the above tasks, with justifications for the decisions made at each phase, can be found below.

 


Example of aerial imagery for precision farming.

Figures retrieved online from: http://geovantage.com/applications/precision-agriculture/


Example of the end result for detecting clouds and shadows.
