Laura J. Thompson, Ag Technologies Extension Educator, On-Farm Research Coordinator
Yeyin Shi, Extension Agricultural Information System Engineer
Richard B. Ferguson, Extension Soil Fertility Specialist
There is great interest in the use of drones in agriculture. While commonly referred to as drones, these systems are more technically referred to as unmanned aerial vehicles (UAVs) or unmanned aircraft systems (UAS). Agriculture is expected to be one of the largest markets for drones, with a projected economic impact of more than $32 billion globally (Mazur, Wiśniewski, & McMillan, 2016).
Use of drones in agriculture can vary widely. Many producers may be interested initially in use primarily for crop scouting with a system that has video feed to the ground control station. The ability to quickly view a field from above in real time can be an invaluable scouting resource to identify areas of concern. Such use can be accomplished with an inexpensive, off-the-shelf, consumer-grade drone with a standard RGB (red, green, and blue) camera. A standard RGB camera may also be called a natural-color or true-color camera and will produce images similar to a digital point-and-shoot camera or smartphone camera. Such systems are easy for producers or crop consultants to operate. Images from this type of use may or may not be archived after collection.
The other end of the spectrum for current drone applications in agriculture is collection of georeferenced, multispectral images. Such systems involve sensors beyond standard RGB cameras, along with image processing, to generate maps of crop condition or stress. Often such imagery is collected at regular intervals during the growing season to detect and help manage the onset of stress. Georeferenced images can be used in geographic information system (GIS) software to relate multispectral imagery to other geospatial information, such as yield maps. Such systems are more costly to purchase and operate, and processing and analysis of such imagery require skill and time. In many cases, growers may choose to contract for collection and processing of such data with a crop consultant or image service.
Regulatory changes by the Federal Aviation Administration (FAA), effective August 29, 2016, opened the door for widespread commercial drone flight. After passing an airman knowledge test administered at FAA-approved testing centers (https://www.faa.gov/training_testing/testing/media/test_centers.pdf), a person can be licensed as a remote pilot and authorized for commercial flights. Commercial flights include for-hire flights as well as flights in which the operator has a commercial/financial interest, such as commercial farming. A summary of these regulations can be found at https://www.faa.gov/uas/media/Part_107_Summary.pdf.

Table 1. Advantages and disadvantages of various sensor platforms.
| Platform | Advantages | Disadvantages |
|---|---|---|
| Satellite | Requires little effort to obtain. Captures large areas, so it is better suited to landscape-scale assessments. | Lower image resolution than drones. No images are available during cloudy conditions. Not on-demand; imagery may not be available as frequently as desired or at critical times. |
| Manned aircraft | Typically captures the entire field in one image. Can be ordered for on-demand imagery. | Typically lower image resolution than drones. Some services offer only routine flights rather than on-demand, so imagery at critical times may not be available. On-demand imagery is generally more expensive. |
| Drone | On-demand imagery. Operates at lower altitudes, so higher image resolution can be obtained. | Requires "stitching" of multiple images taken over the field into a composite map. Fewer acres can be covered than with satellite or airplane imagery. |
| Ground-based | Data can be collected while another field operation is occurring. | Limited to point data, which must be interpolated, rather than images of the entire field. Dependent on adequate soil conditions for entering the field. |
Drones generally fall into three distinct types. Each offers certain advantages.
Fixed-wing drones feature a rigid wing and are able to glide in flight. The ability to glide allows fixed-wing drones to fly for longer periods of time, an advantage when flying over large fields.
Rotary-wing drones have multiple rotors with rotating blades. Drones with four rotors (quadcopters) and six rotors (hexacopters) are most common. Rotary-wing drones allow for vertical takeoff, hovering, and closer crop inspection. They are easier to control manually than fixed-wing drones and are generally less expensive.
An evolving category is the hybrid drone, which takes off vertically like a rotary-wing drone and then transitions into gliding, fixed-wing flight.
Drones are most commonly used as a platform to carry sensors to record observations about growing crops or the bare soil. This mission is no different than that of other platforms such as satellites and airplanes, which have historically been used for this purpose. Each sensor platform has certain advantages and disadvantages (Table 1). Additionally, some sensors can be mounted on ground-based field equipment as an alternate way of collecting similar information.
Many types of sensors may be mounted on a drone. The sensor selection is based primarily on the end use goals.
An RGB camera (Figure 1a) is also called a natural-color or true-color camera. RGB cameras are so named because they detect reflected light in three basic color components: red (R), green (G), and blue (B) (Figure 2). Images taken with an RGB camera look very similar to what is seen by the human eye, so image interpretation is straightforward. Most stock cameras integrated with drones are RGB cameras. They are usually low cost and useful for field scouting.
A multispectral camera is another type of camera with applications in agriculture. A multispectral camera usually detects light in three to five spectral bands (Figure 2). For example, a 3-band multispectral camera may detect light in the green, red, and near-infrared spectral bands, while a 5-band multispectral camera may detect light in the blue, green, red, red edge, and near-infrared bands. The bands detected will vary based on the camera model and in some cases are customizable. The near-infrared band is in the spectral region beyond the red band. This region is not visible to our eyes but is useful in detecting plant health conditions. Healthy plants have much stronger reflectance in the near-infrared region than in the RGB region, while stressed plants have decreased reflectance in the near-infrared region (Figure 2). Another spectral region of interest is the red edge band, which lies between the red and near-infrared bands. Reflectance rises sharply from the red region to the near-infrared region, producing a steep slope through the red edge region (Figure 2). Reflectance in this band has also been demonstrated to be highly correlated with plant health condition.
Most of the multispectral cameras for drones on the market are passive sensors, which means they detect the sunlight reflected by the plant canopy rather than having their own active light sources. The amount of light reflected varies from day to day due to variations in atmospheric conditions. This makes it difficult to compare these images over time. Additionally, if the sunlight intensity changes during a flight, parts of a field appear darker or lighter than other parts. To compare the measured reflectance values from image to image, some passive multispectral camera systems include a downwelling light sensor (DLS). A DLS detects the amount of sunlight from the sky for each of the spectral bands of a certain camera (Figure 3). This allows the crop reflectance values to be compared with the sunlight intensity at the moment each image is taken. We recommend using a downwelling light sensor with your passive multispectral cameras.
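As a minimal sketch of the DLS concept (the band names, array sizes, and irradiance values below are hypothetical, and vendor pipelines apply additional calibration factors), normalizing each band by its downwelling irradiance compensates for sunlight that brightened or dimmed between captures:

```python
import numpy as np

# Hypothetical per-band camera measurements and the downwelling irradiance
# the DLS logged at the moment of capture.
measured = {
    "red": np.array([[0.052, 0.048], [0.050, 0.055]]),
    "nir": np.array([[0.41, 0.44], [0.46, 0.40]]),
}
dls_irradiance = {"red": 1.21, "nir": 0.98}  # relative units from the DLS log

# Dividing by the irradiance puts images taken under different sunlight on a
# comparable footing, which is what makes image-to-image comparison possible.
normalized = {band: measured[band] / dls_irradiance[band] for band in measured}
```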
Generally, reflectance values for individual wavebands are mathematically combined to generate vegetation indices (VIs). These VIs are correlated with specific properties of the crop, enabling more meaningful comparisons of the crop spatially within the field and at various times. One of the most commonly used VIs is the Normalized Difference Vegetation Index (NDVI). It is calculated as:

NDVI = (NIR - Red) / (NIR + Red)

where NIR and Red stand for the reflectance in the near-infrared and the red spectral bands.
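Because the arithmetic is simple, NDVI can be computed directly from a pair of reflectance arrays. The sketch below uses Python with NumPy; the example values are illustrative only.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    nir = nir.astype(float)
    red = red.astype(float)
    # Suppress warnings where NIR + Red is zero (e.g., no-data pixels).
    with np.errstate(divide="ignore", invalid="ignore"):
        return (nir - red) / (nir + red)

# A healthy canopy reflects strongly in near-infrared and weakly in red:
print(ndvi(np.array([0.50]), np.array([0.05])))  # ~0.82
```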
NDVI is most effective at portraying variation in chlorophyll content and canopy density during early and mid-growth stages but tends to saturate later in the season after canopy closure. Another very commonly used VI is the Normalized Difference Red Edge (NDRE), which is calculated as:

NDRE = (NIR - RedEdge) / (NIR + RedEdge)

where NIR and RedEdge stand for the reflectance in the near-infrared and the red edge spectral bands.
NDRE is a better indicator of chlorophyll content and total biomass than NDVI for mid- to late-season, high-biomass crops such as corn, which by canopy closure have accumulated high levels of chlorophyll. When canopy cover is greatest, during the mid to late growth stages, the amount of red light that can be absorbed by leaves reaches a peak regardless of the biomass accumulation inside the canopy. This results in saturation of NDVI values across the whole field, masking spatial variability, particularly later in the season. NDRE uses reflectance in the red edge band instead of the red band, resulting in a vegetation index that remains sensitive to changes in chlorophyll content even at high biomass. An example of NDVI and NDRE imagery over the same corn field post-tasseling is shown in Figure 4. Both maps show variation in the areas highlighted by yellow polygons, but more distinction can be observed in the NDRE map than in the NDVI map. There are many other VIs, such as the Soil Adjusted Vegetation Index (SAVI), Optimized Soil Adjusted Vegetation Index (OSAVI), and Green Normalized Difference Vegetation Index (GNDVI); each has applications for which it is best suited.
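The saturation behavior described above can be illustrated numerically. In this example (all reflectance values are made up for illustration), two dense canopies with different chlorophyll levels are nearly indistinguishable by NDVI but clearly separated by NDRE:

```python
import numpy as np

def ndvi(nir, red):
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def ndre(nir, red_edge):
    nir, red_edge = np.asarray(nir, float), np.asarray(red_edge, float)
    return (nir - red_edge) / (nir + red_edge)

# Columns: lower-chlorophyll canopy, higher-chlorophyll canopy. Red
# reflectance has bottomed out for both, so NDVI barely separates them,
# while the red edge band still responds to the added chlorophyll.
nir      = np.array([0.500, 0.550])
red      = np.array([0.040, 0.035])
red_edge = np.array([0.300, 0.250])
print(ndvi(nir, red))        # [0.852 0.880] -> nearly identical
print(ndre(nir, red_edge))   # [0.250 0.375] -> clear separation
```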
Besides RGB and multispectral cameras, thermal cameras are also used with drones for agricultural applications (Figure 1c). Thermal cameras detect radiation in the long-wavelength infrared region (8,000-14,000 nm). The higher the temperature of an object, the higher its emitted thermal radiation. When plants are under water stress, evapotranspiration is reduced, which results in a slight increase in canopy temperature. Because of this, thermal cameras can be used to detect plant stresses, especially water stress. Environmental conditions, including changes in wind speed, solar radiation, and air temperature during a flight, can interfere with thermal readings and need to be considered for thermal camera applications.
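A minimal sketch of the water-stress logic described above (the temperatures and the 2°C margin are arbitrary illustration values, not a validated threshold):

```python
import numpy as np

# Hypothetical canopy temperatures (deg C) from a thermal image after soil
# pixels have been masked out.
canopy_temp = np.array([[28.1, 28.4, 30.9],
                        [28.0, 31.2, 28.3]])

# Treat the coolest canopy as a well-watered baseline and flag pixels that
# run noticeably hotter as candidates for water stress.
baseline = np.percentile(canopy_temp, 10)
stress_mask = canopy_temp > baseline + 2.0
print(stress_mask)
```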
Miniature LiDAR sensors are available for aerial applications but are not commonly used with drones due to their weight and cost (Figure 1d). A LiDAR sensor measures the distance between itself and objects using time-of-flight technology. LiDAR is mainly used for terrain mapping and is more frequently deployed on manned aircraft.
The majority of current small drones use lithium polymer batteries for power and therefore have limited flight endurance. It is important to know the maximum payload of the drone as well as how the weight of the sensor system will alter flight time.
Immediate benefits can be realized simply by viewing the field from above. For example, patterns can be detected and portions of the field that are not visible from the ground can be seen. Generally, these snapshots are taken at oblique angles, although they can also be taken in the nadir (straight down) position.
Generating georeferenced imagery of the whole field requires more work but provides multiple advantages. Georeferenced imagery can be useful for identifying, quantifying, and locating issues. This is particularly valuable for crop scouting later in the growing season; as crops such as corn get taller, it is difficult to assess the field by scouting on foot. Georeferenced imagery allows scouts to assess the entire field, identify areas needing further investigation, and then navigate to those locations with a GPS-enabled device to make inspections.
To generate georeferenced maps, there are consistent steps that generally apply, regardless of the drone type, sensor type, and software being used. An illustration of the entire workflow is shown in Figure 5. The remainder of this paper discusses each of the steps in Figure 5 in greater detail.
Generally, missions or flight paths are planned first using flight control software (Figure 6). The flight control software is used to plan the flight beforehand and/or control the drone during flight. In planning, you can define the coverage area, flight altitude, flight speed, flight pattern (usually a serpentine pattern), forward and side-to-side overlap between images, and camera model or parameters (sensor size, focal length, shutter speed, ISO, etc.). A flight time is estimated based on these settings.
Because drone imagery has a smaller footprint than imagery from an airplane, stitching images after the flight is necessary to create a map of the entire field. Sufficient overlap between successive images and between passes is critical for good stitching: to stitch all images into one map, features on the ground must appear in multiple images. Overlap of up to 75 percent between pictures is often needed, both in the forward direction of flight and side to side between passes. Numerous flight control software options are available, many of them free apps that run on a smartphone or tablet. The preferred application will vary based on the drone and sensor specifications, but generally the user has several compatible options.
When planning the flight, several aspects should be addressed to obtain reliable imagery. The altitude flown determines how many images you will need to capture to adequately cover the area of interest. When flying at a higher altitude, more of the field is covered in each image, but the resolution of the image is lower. Resolution refers to the area on the field represented by one pixel. A resolution of 3.5 inches per pixel is a lower resolution than 1 inch per pixel. Resolution depends on both sensor capabilities and altitude.
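The altitude-resolution trade-off follows from basic camera geometry. This sketch uses illustrative camera numbers, not those of a specific model; actual resolution depends on the sensor:

```python
def resolution_in_per_px(altitude_ft, focal_length_mm, sensor_width_mm, image_width_px):
    """Ground resolution (inches per pixel) from basic camera geometry."""
    ground_width_ft = altitude_ft * sensor_width_mm / focal_length_mm
    return ground_width_ft * 12 / image_width_px

# Doubling the altitude doubles the ground covered per pixel:
for altitude in (100, 200, 400):
    print(altitude, "ft:", round(resolution_in_per_px(altitude, 5.5, 4.8, 1280), 2), "in/pixel")
```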
Because the altitude flown determines the total number of images needed to cover a given area, it also determines the amount of data that will be generated. Generally, it is best to fly as high as possible within legal constraints (currently 400 feet), since imagery obtained at 400 feet is adequate for most purposes and flying high minimizes both the collection time and the amount of data generated. For example, flying a 40-acre field with a 5-band multispectral camera at a 400-foot altitude with 70 percent overlap in both directions would take around 20 to 30 minutes. This flight generates about 2,900 individual images and 6.8 GB of data, with a resolution of about 3.5 inches per pixel. Resolution will vary based on sensor capabilities. For some applications a resolution higher than 3.5 inches per pixel may be desirable, but for most applications this resolution is adequate. Figure 7 demonstrates the features that are visible at a resolution of 3.5 inches per pixel.
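For planning purposes, the number of images can be estimated from the image footprint, the overlap setting, and the field size. The sketch below is a rough geometric minimum with illustrative footprint values; real flight planners add turn buffers and extra triggers, so actual counts (like the ~2,900 images in the example above) run higher:

```python
import math

def trigger_points(field_w_ft, field_l_ft, footprint_w_ft, footprint_l_ft, overlap):
    """Rough count of camera trigger points for a serpentine survey."""
    pass_spacing = footprint_w_ft * (1 - overlap)     # side-to-side
    trigger_spacing = footprint_l_ft * (1 - overlap)  # along each pass
    passes = math.ceil(field_w_ft / pass_spacing)
    per_pass = math.ceil(field_l_ft / trigger_spacing)
    return passes * per_pass

# A square 40-acre field is roughly 1,320 ft on a side; a 5-band camera
# writes 5 files per trigger point.
points = trigger_points(1320, 1320, 349, 262, 0.70)
print(points, "trigger points ->", points * 5, "individual images")
```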
After flight parameters are set up and proper safety checks are made, the flight plan can be executed. When executing the mission, several scenarios should be avoided to obtain useful imagery.
Flying early in the morning or late in the day increases shadows cast by buildings, trees, and in some cases the crop itself (Figure 8). These shadows can obscure features and produce unreliable maps, so it is best to avoid flying at these times. For the best comparison of multispectral imagery over multiple dates, collect images between 10 a.m. and 2 p.m.
Cloud cover is an important consideration when conducting flights. Days with clear skies or complete cloud cover are ideal. Partially cloudy days result in frequently changing light conditions, which often show up as cloudy spots or streaks in the final image (Figure 9). For best comparison over multiple dates, collect images in clear weather.
Hot, dry conditions cause leaves of some crops, such as corn, to roll up. Leaf rolling and dry conditions result in physiological changes to the plant, which can create unreliable assessments of the crop condition.
When the mission is complete, the data can be retrieved from the sensor memory. Data obtained are individual image footprints (Figure 10). Generally, overlap between images was planned so that the same area on the ground appears in multiple images. This ensures good coverage and allows for stitching of imagery. Care should be taken to adequately back up original imagery to prevent data loss.
The individual images can be stitched together into a georeferenced map using either software installed on local machines or cloud-based services. Each has advantages and disadvantages (Table 2). Hybrid models that seek to provide the best of both services are also being developed.
Table 2. Comparison of cloud-based and desktop-based software for stitching individual aerial images into a georeferenced map.
| | Local Desktop Software | Cloud-Based Services |
|---|---|---|
| Ease of use | Generally requires more training and experience to operate. | Easy to use. |
| Control of process | Greater control; individual processing options can be adjusted to improve the final result. | Individual processing parameters cannot be adjusted. |
| Cost | Can be a one-time investment or a monthly or yearly software lease fee. | Often a pay-per-use cost based on acres covered or images uploaded. |
| Computing requirements | High-end computing hardware required. | Fast and reliable internet upload speeds needed to upload imagery for processing. |
| Processing time | Quicker processing, but computer resource demand is high, so the computer may not be available for other tasks while processing runs. | Longer wait time for processing completion. |
| Ease of sharing | Varies; some desktop options also offer cloud upload of the processed map to a site where it can be shared via a link. | Web-based, so easy to share with others via a link. |
Several available vegetation indices were previously discussed in the sensor selection section. Indices may be calculated using specialized drone imagery software such as Pix4D, DroneDeploy, Agisoft, Dronifi, or PrecisionHawk Mapper; with general-use geospatial platforms such as ArcGIS or QGIS; or with agriculture-specific geospatial platforms such as Ag Leader SMS Advanced.
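As a sketch of the general-use route, the snippet below computes NDVI from a stitched multiband GeoTIFF using the open-source rasterio library; the file names and band order are assumptions that must be checked against your own export:

```python
import numpy as np
import rasterio  # open-source library for georeferenced raster I/O

RED_BAND, NIR_BAND = 3, 5  # hypothetical 1-based indices for a 5-band composite

with rasterio.open("field_composite.tif") as src:  # assumed file name
    red = src.read(RED_BAND).astype(float)
    nir = src.read(NIR_BAND).astype(float)
    profile = src.profile  # carries the georeferencing to the output

with np.errstate(divide="ignore", invalid="ignore"):
    ndvi = (nir - red) / (nir + red)

# Write the index back out as a single-band georeferenced raster.
profile.update(count=1, dtype="float32")
with rasterio.open("field_ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi.astype("float32"), 1)
```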
Stitched maps and raw data should be stored and backed up on a local hard drive or on a cloud-based database.
Comparisons and judgments can be made more easily when the map covers the entire area of interest. Features of interest can then be further investigated in the field to determine if action is needed. Various crop issues may be detected using georeferenced imagery. Because the resolution of drone imagery is often much higher than that of satellite imagery, smaller features and patterns may be detected. These features are in some cases not visible in yield maps of the field due to the lower resolution of yield data. (Yield data resolution is determined by the width of the combine head and the frequency of recorded observations, which is dictated by travel speed.) Additionally, grain flow delay and imprecise calibration of yield monitors may obscure patterns. Drone-based sensors provide a means of obtaining high-resolution imagery of fields; this imagery can be used for many practical applications.
There is great interest in the ability to quickly assess the early season crop stand so that replant and pest management decisions can be made. Various services and products are available to provide assessments of crop stand. Generally, these services require high-resolution images (less than 1 inch per pixel) and therefore necessitate flying the drone at a lower altitude. An RGB camera is generally adequate (multispectral sensors are not required). When using these services to assess stand quality and make replant decisions, ground-truthing of the information is critical.
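Commercial stand-count tools are proprietary, but the general idea can be sketched with open-source tools. This example uses OpenCV; the file name, the Excess Green threshold, and the one-blob-equals-one-plant assumption are all illustrative and would need ground-truthing:

```python
import cv2  # OpenCV
import numpy as np

img = cv2.imread("stand_section.jpg")  # hypothetical high-resolution RGB image

# Excess Green (2G - R - B) separates green plants from soil; OpenCV loads
# images in BGR channel order.
b, g, r = [c.astype(float) for c in cv2.split(img)]
mask = ((2 * g - r - b) > 30).astype(np.uint8)

# Counting each connected blob of green pixels as one plant is a crude
# assumption that breaks down once neighboring plants' leaves touch.
num_labels, _ = cv2.connectedComponents(mask)
print("approximate plant count:", num_labels - 1)  # label 0 is the background
```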
Due to the challenge of herbicide-resistant weed management, strategies to detect and map weeds are of interest. The image below (Figure 11) was captured with a 5-band multispectral sensor and has a resolution of approximately 3.5 inches per pixel. The bands recorded were arranged in a proprietary chlorophyll index (MicaSense Chlorophyll Index, MSCI) and assigned a color scheme for visualization. In this color scheme, green corresponds to bare soil between the rows of soybeans. The yellow and orange represent the soybean rows (in 30-inch row spacing), while the irregular dark red spots correspond to weeds. Identifying areas with greater weed density can allow for site-specific weed management.
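The MicaSense index itself is proprietary, but flagging candidate weed patches from any per-pixel vegetation index follows the same pattern. A minimal sketch (the file name and cutoff are assumptions):

```python
import numpy as np

index_map = np.load("chlorophyll_index.npy")  # assumed pre-computed index array

# Pixels above a field-specific cutoff are flagged as candidate weed patches;
# the cutoff must be tuned and verified by ground-truthing.
weed_mask = index_map > 0.8
print(f"{weed_mask.mean():.1%} of pixels flagged for site-specific control")
```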
Various causes of stress in plant growth and development can be detected using imagery. For example, the extent of wind and hail damage can be quantified. Figure 12 illustrates an early (1999) effort to use simple RGB imagery to quantify corn stalk breakage from a windstorm. The natural-color image was classified in GIS/image analysis software, which estimated that 2.9 of the block's 10 acres had broken stalks. The pattern of yield reduction in the accompanying yield map corresponded visually to the pattern of broken stalks in the classified aerial image.
Figure 13 illustrates a 2017 effort to use multispectral data to detect injury from herbicide drift from the south-southwest. The image below shows a map of NDVI for the field with corresponding images taken of plants up close in various locations. The lower NDVI values (red) correspond with areas of greater damage while higher NDVI values (blue) correspond with no visible herbicide injury symptoms. The darkest red areas are grassed waterways and an alfalfa field. The imagery was used to delineate the damage and direct ground-truth efforts.
Viewing the crop from above can help detect issues with irrigation. Figure 14 shows a furrow-irrigated field where water was not being uniformly distributed; water flowed from the top of the image to the bottom, hitting a dike at the end of the field and backing up into the field. Thus the top and bottom ends of the field received adequate water, while much of the center did not. While this image was taken from a manned aircraft, a drone could be used to collect similar imagery. Other irrigation issues, such as plugged nozzles on center pivot irrigation systems, could be similarly detected.
New products or practices may be tested in on-farm research studies. Imagery can be useful for assessing various treatments and their effect on crop performance, especially spatially across a field.
In 2015 and 2016, the Nebraska On-Farm Research Network conducted research studies with a new seed treatment, ILeVO®, which was developed to combat sudden death syndrome in soybeans. Sudden death syndrome is spotty in its distribution, occurring in "pockets" in the field and causing yield loss. Figure 15 shows NDVI imagery of an on-farm research study that compared soybeans with and without the ILeVO seed treatment. Areas of low NDVI (blue) show where sudden death syndrome symptoms were worse. Consequently, the effect of ILeVO can be compared between these areas of high sudden death syndrome and areas where sudden death syndrome was not as severe (shown by high NDVI values in red). Such imagery can aid in site-specific management of inputs such as ILeVO, whereby the product may be applied in the areas where it is needed and not applied in other areas of the field. While this image was taken from a manned aircraft, using drones to collect similar imagery for future on-farm research projects provides a lower-cost option with greater flexibility in the timing of image acquisition.
Aerial imagery can be used for other observations regarding on-farm research studies. A 2016 on-farm research study examined the economically optimal planting population for soybeans in southeast Nebraska. Four soybean planting rates were examined: 116,000, 130,000, 160,000, and 185,000 seeds/acre. Natural-color imagery from a manned aircraft (Figure 16) revealed that lodging was greater in the higher planting population treatments. Such imagery can aid in the interpretation of on-farm research results and provides additional valuable information for producers' decision-making. As in the previous example, drones could collect similar imagery at lower cost and with greater flexibility in timing.
Figure 17. Base nitrogen application rate in lb N/acre as anhydrous ammonia. Two small plots (in gray) received 225 lb N/acre and serve as N-sufficient reference plots.
Multispectral sensors mounted on drones may be used to assess the crop canopy to direct variable-rate, in-season nitrogen fertilizer applications in corn. This technique is promising for reducing excess nitrogen application by supplying the crop with nitrogen when it is needed, in the quantities needed, and in the locations within the field where it is needed. In Figure 17, nitrogen was applied as anhydrous ammonia at rates of 75, 100, and 160 lb/acre. These three rates were replicated four times across the field, and the order in which the rates occur in each replication is randomized. Randomization and replication allow us to perform statistical analyses to assess the impact of field variability on the yield results and give us greater confidence in our conclusions. To learn more about setting up an on-farm research experiment, visit http://cropwatch.unl.edu/farmresearch.
The field was monitored weekly with a multispectral camera on a drone (Figure 18). The NDRE values across the field were compared with the NDRE values of the N-sufficient reference plots in what is termed a sufficiency index. The sufficiency index allowed us to determine the supplemental nitrogen fertilizer need spatially across the field. To learn more about the sufficiency index concept, consult the NebGuide, Using a Chlorophyll Meter to Improve N Management found at http://extensionpublications.unl.edu/assets/pdf/g1632.pdf.
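A minimal sketch of the sufficiency index calculation (the NDRE values and the 0.95 trigger are illustrative, not the published algorithm):

```python
import numpy as np

# Hypothetical NDRE values: the field map and the mean NDRE of the
# N-sufficient (225 lb N/ac) reference plots from the same flight.
field_ndre = np.array([[0.31, 0.28],
                       [0.35, 0.33]])
reference_ndre = 0.36

sufficiency_index = field_ndre / reference_ndre

# Apply supplemental N only where the crop lags the reference by >5%.
needs_nitrogen = sufficiency_index < 0.95
print(np.round(sufficiency_index, 2))
print(needs_nitrogen)
```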
A variable-rate prescription for in-season nitrogen was then developed based on the imagery from June 24 at V12 growth stage and applied on June 29 at V15 (Figure 19).
Total nitrogen rates applied, yield, nitrogen use efficiency, and partial profit are shown in Table 3.
Table 3. Nitrogen rates applied, yield, nitrogen use efficiency, and partial profit for each treatment.
| Treatment | Base N Rate | Avg In-Season N Rate | Total N Rate | Yield (15.5% moisture) | Nitrogen Use Efficiency | Partial Profit† |
|---|---|---|---|---|---|---|
| | lb/ac | lb/ac | lb/ac | bu/ac | bu grain/lb N | $/ac |
| 75 lb/ac base rate + in-season variable rate | 75 | 102 | 177 | 257 A* | 1.45 A | $686.03 |
| 100 lb/ac base rate + in-season variable rate | 100 | 75 | 175 | 257 A | 1.45 A | $687.59 |
| Traditional farmer management | 160 | 40 | 200 | 257 A | 1.3 B | $684.23 |
* Within a column, values with the same letter are not statistically different at alpha = 0.1.
† Profit based on actual N fertilizer and application costs, which were $0.284/lb N as anhydrous ammonia, $0.355/lb N as stabilized urea, $14/ac anhydrous application, $12/ac flat-rate urea application, and $13.75/ac variable-rate urea application. Corn selling price used was $3/bu.
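To make the footnote's cost assumptions concrete, here is a sketch of the partial-profit arithmetic; small differences from Table 3 reflect rounding of the yields and rates shown:

```python
def partial_profit(yield_bu, base_n_lb, inseason_n_lb, variable_rate=True):
    """Partial profit ($/ac) using the cost assumptions in the footnote."""
    revenue = yield_bu * 3.00                     # corn at $3/bu
    anhydrous = base_n_lb * 0.284 + 14.00         # base N plus application
    urea_app = 13.75 if variable_rate else 12.00  # variable vs. flat rate
    urea = inseason_n_lb * 0.355 + urea_app       # in-season N plus application
    return revenue - anhydrous - urea

# First row of Table 3: 75 lb/ac base + 102 lb/ac in-season variable rate.
print(round(partial_profit(257, 75, 102), 2))  # ~686
```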
By using this method, the crop can be frequently assessed, so fertilizer can be applied when the crop begins to show nitrogen need but prior to yield-reducing stress. Weekly NDRE imagery can detect stress not yet visible to the naked eye, allowing earlier detection of nitrogen need and the creation of a prescription that matches varying nitrogen need across the field. In this example, nitrogen application was reduced by around 25 lb/acre compared with the traditional approach the farmer had been using, resulting in increased nitrogen use efficiency and a slight increase in profit.
While drone and sensor technology is rapidly evolving, benefits to crop management can already be realized. Drones may be used simply to view fields from above or to conduct systematic mapping missions. A variety of sensors can be attached to a drone to collect information beyond RGB imagery. Adoption of drone and sensor systems can help detect problems such as malfunctioning irrigation equipment and storm or herbicide damage. Additionally, these systems provide new opportunities for site-specific crop management, which can result in more precise and efficient use of resources.
1. Reference to commercial products or trade names is made with the understanding that no discrimination is intended and no endorsement by Nebraska Extension is implied. Use of commercial and trade names does not imply approval or constitute endorsement by Nebraska Extension. Nor does it imply discrimination against other similar products.
Mazur, M., Wiśniewski, A., & McMillan, J. (2016). Clarity from above: PwC global report on the commercial applications of drone technology. PwC Drone Powered Solutions. Retrieved from https://www.pwc.pl/pl/pdf/clarity-from-above-pwc.pdf
This publication has been peer reviewed.
Nebraska Extension publications are available online at http://extension.unl.edu/publications.
Extension is a Division of the Institute of Agriculture and Natural Resources at the University of Nebraska–Lincoln cooperating with the Counties and the United States Department of Agriculture.
University of Nebraska–Lincoln Extension educational programs abide with the nondiscrimination policies of the University of Nebraska–Lincoln and the United States Department of Agriculture.
© 2017, The Board of Regents of the University of Nebraska on behalf of the University of Nebraska–Lincoln Extension. All rights reserved.