Plants and Their Response to Light Energy
All objects respond to light by absorbing, transmitting, or reflecting the light energy that strikes them. The fraction of incident light energy that is reflected is called reflectance and is often expressed as a percentage.
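The definition above amounts to a single ratio. A minimal sketch in Python (the energy values are made up for illustration):

```python
def reflectance_percent(reflected: float, incident: float) -> float:
    """Reflectance: the fraction of incident light energy that is
    reflected, expressed as a percentage."""
    if incident <= 0:
        raise ValueError("incident energy must be positive")
    return 100.0 * reflected / incident

# A hypothetical surface reflecting 45 of 100 incident energy units:
print(reflectance_percent(45.0, 100.0))  # → 45.0
```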
Plants reflect light energy differently at different colors (or wavelengths) of the light spectrum, and their reflectance also changes as a function of stress level in the plant (see Figure 1). Leaves of a plant reflect visible light primarily in the Green region (which explains why plants appear green to our eyes). Beyond visible light (in the near-infrared, or NIR, region), plants have a significantly greater reflectance. Our eyes cannot see this energy, but multispectral cameras can.
Combinations of Bands Yield Vegetation Indices
The reflectance at various parts of the spectrum can be used to derive what are termed vegetation indices, which are often correlated with plant vigor and stress. For example, NDVI (Normalized Difference Vegetation Index), a commonly used vegetation index, is the normalized difference between the reflectance of the plant in the NIR region and in the visible region (typically the Red band): NDVI = (NIR − Red) / (NIR + Red).
Because of this normalization, NDVI values range from -1.0 to 1.0, though for most crops, NDVI lies between 0.2 and 0.9. The higher the NDVI value, the more vigorous the plant.
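The NDVI computation described above can be sketched in a few lines of Python (the sample reflectance values are hypothetical):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index from NIR and Red reflectance."""
    if nir + red == 0:
        return 0.0  # undefined for zero total reflectance; conventions vary
    return (nir - red) / (nir + red)

# Healthy vegetation: high NIR reflectance, low Red reflectance.
print(round(ndvi(0.50, 0.08), 2))  # → 0.72
# Stressed vegetation: lower NIR, higher Red.
print(round(ndvi(0.40, 0.20), 2))  # → 0.33
```

Note how the normalization keeps the result in the [-1.0, 1.0] range regardless of the absolute reflectance values.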
There are other vegetation indices beyond NDVI that make use of different combinations of bands. NDRE, for example, uses the NIR and Red Edge bands and is more highly correlated with chlorophyll content than NDVI.
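NDRE follows the same normalized-difference form, with the Red Edge band substituted for Red (the reflectance values below are illustrative):

```python
def ndre(nir: float, red_edge: float) -> float:
    """Normalized Difference Red Edge index from NIR and Red Edge reflectance."""
    if nir + red_edge == 0:
        return 0.0  # undefined for zero total reflectance
    return (nir - red_edge) / (nir + red_edge)

# Vegetation typically reflects more in Red Edge than in Red, so NDRE
# values tend to run lower than NDVI for the same plant.
print(round(ndre(0.50, 0.30), 2))  # → 0.25
```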
Multispectral Cameras Capture This Information
Multispectral cameras work by imaging different wavelengths of light, both in the visible and non-visible light spectrum. Professional multispectral cameras like MicaSense RedEdge and Parrot Sequoia have multiple imagers, each fitted with a special optical filter that allows only a precise set of light wavelengths to be captured by that imager. Figure 1 shows the five filters (or bands) of the MicaSense RedEdge camera: Blue, Green, Red, Red Edge, and NIR. The output of the camera for each capture is a set of images, one per band (Figure 2).
Capturing and Analyzing Data over a Field
Multispectral cameras such as RedEdge and Sequoia are typically used in conjunction with an aircraft (either manned or unmanned). The aircraft flies over a field in a “lawnmower” flight pattern while the camera captures images for each band at specific intervals (Figure 3).
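A "lawnmower" pattern is simply a series of parallel passes flown in alternating directions. A toy sketch of waypoint generation (the rectangular field and the spacing parameter are made-up assumptions, not any vendor's flight-planning API):

```python
def lawnmower_waypoints(width: float, height: float, spacing: float):
    """Generate (x, y) waypoints for parallel passes in alternating
    directions, covering a rectangular width x height field."""
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height:
        if left_to_right:
            waypoints += [(0.0, y), (width, y)]
        else:
            waypoints += [(width, y), (0.0, y)]
        left_to_right = not left_to_right
        y += spacing
    return waypoints

# A 100 m x 40 m field flown with 20 m between passes:
print(lawnmower_waypoints(100.0, 40.0, 20.0))
```

In practice the spacing is chosen from the camera's field of view and the desired image overlap, so adjacent passes produce images that can be stitched together.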
These sets of images are then processed and stitched together to create geographically accurate mosaics, with multiple layers, one for each band. From these multi-layer mosaics, crop health maps can be generated, such as the example shown below for a potato pivot.
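Once the per-band mosaic layers are aligned pixel-for-pixel, a crop health map is essentially a per-pixel index computation followed by classification. A toy sketch (the tiny 2x3 "layers", the class names, and the 0.6 threshold are illustrative assumptions; the 0.2 lower bound comes from the typical crop NDVI range discussed earlier):

```python
def health_map(nir_layer, red_layer):
    """Classify each pixel of aligned NIR and Red mosaic layers by NDVI."""
    rows = []
    for nir_row, red_row in zip(nir_layer, red_layer):
        row = []
        for nir, red in zip(nir_row, red_row):
            v = (nir - red) / (nir + red) if (nir + red) else 0.0
            if v < 0.2:
                row.append("bare/soil")   # below typical crop NDVI range
            elif v < 0.6:
                row.append("stressed")    # illustrative threshold
            else:
                row.append("vigorous")
        rows.append(row)
    return rows

# Hypothetical 2x3 reflectance layers for the NIR and Red bands:
nir = [[0.50, 0.45, 0.20],
       [0.48, 0.30, 0.15]]
red = [[0.08, 0.10, 0.18],
       [0.07, 0.15, 0.14]]
for row in health_map(nir, red):
    print(row)
```

Real processing pipelines work on georeferenced rasters with millions of pixels, but the per-pixel logic is the same.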