Fusion of remotely sensed spectral reflectance data from airborne and ground-based sensors to enhance detection of cotton plants

Y. Lan1, H. Zhang2, C. Suh1, J. Westbrook1, W.C. Hoffmann1, C. Yang3, Y. Huang4
1USDA-ARS, 2771 F&B Road, College Station, TX, USA. 2USDA-ARS, 9611 S. Riverbend Ave., Parlier, CA, USA. 3USDA-ARS, 2413 E Hwy 83, Weslaco, TX 78596, USA. 4USDA-ARS, 141 Experiment Station Road, Stoneville, MS 38776, USA.

Abstract
The study investigated the use of aerial multispectral imagery and ground-based hyperspectral data for discriminating among crop types and for timely detection of cotton plants over large areas. Airborne multispectral imagery and ground-based spectral reflectance data were acquired at the same time over three large agricultural fields in Burleson Co., Texas during the 2010 growing season. The discrimination accuracy of the aerial and ground-based data was examined individually; a multisensor data fusion technique was then applied to both datasets to improve discrimination accuracy. The individual classification accuracies of the aerial and ground-based sensor data were 90% and 93.3%, respectively. In comparison, the fused data discriminated crop types with 100% accuracy in calibration and only 3.33% misclassification in cross-validation. These results suggest that data fusion techniques could greatly enhance our ability to detect volunteer cotton plants occurring in cultivated and non-cultivated habitats.

Keywords: Airborne multispectral imagery, remote sensing, data fusion

Introduction
Airborne remote sensing technologies have improved tremendously in recent years and are now being used in precision agriculture applications. Airborne multispectral techniques are much less expensive and less data-intensive than hyperspectral imaging systems and can rapidly provide continuous remotely sensed data over a large field or region. Ground-based hand-held remote sensing instruments are less influenced by environmental conditions, but their measurements are labor-intensive and time-consuming. To exploit the advantages of both airborne multispectral imagery and ground-based remote sensing data, multisensor data fusion techniques have recently been developed to combine data from multiple sensors or sources.

Objective
The objective of this work was to distinguish between different crop types using ground-based hyperspectral data, airborne multispectral imagery, and fused data from the ground-based and airborne spectral measurements.

Materials and Methods
Data collection
Ground-based remotely sensed data were collected with a FieldSpec spectroradiometer from agricultural fields in Burleson Co., TX on multiple dates throughout the 2010 growing season. A total of ten sampling locations were randomly marked within each field with color flags. Airborne multispectral imagery of the study area was acquired at the same time on each sampling date. The airborne imaging system was sensitive to four spectral bands: blue (430-470 nm), green (530-570 nm), red (630-670 nm), and near-infrared (810-850 nm).

Fig. 1. Aerial imagery.

Data association
A shapefile was created from the spatial coordinates of the ten ground sampling locations within each field of cotton, corn, and soybean. Subset images of the region of interest were exported into ArcGIS 9.3.1 (ESRI, Redlands, CA, USA), and the ground sampling points were overlaid on the image. The values of the four spectral bands for the image pixels geo-collocated with the ground sampling points were extracted for further analysis.

Data analysis
Principal component analysis was applied to reduce data dimensionality and produce two principal components for classification. Discriminant analyses were applied to several datasets: 1) reflectance values derived from the four bands of imagery; 2) broad-band reflectance values simulated from the FieldSpec spectroradiometer data; 3) principal components derived from reflectance values measured with the spectroradiometer at a 1.6-nm sampling interval; 4) datasets 1 and 2 combined; 5) principal components derived from dataset 4; and 6) principal components derived from the combination of dataset 1 and the FieldSpec reflectance data at 1.6-nm bandwidth.
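As a rough illustration of this analysis chain, the sketch below simulates the four broad bands from FieldSpec reflectance, fuses them with the airborne band values, reduces the fused data to two principal components, and classifies crop type with a linear discriminant. It is a minimal sketch only: the input array names are hypothetical, scikit-learn's PCA and LinearDiscriminantAnalysis stand in for the procedures used in the study (including DISCRIM), and leave-one-out cross-validation is an assumption, since the cross-validation scheme is not stated above.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Airborne band ranges listed in Data collection (nm).
BANDS_NM = {"blue": (430, 470), "green": (530, 570), "red": (630, 670), "nir": (810, 850)}

def simulate_broad_bands(wavelengths, fieldspec_refl):
    # Average the ~1.6-nm FieldSpec reflectance over each airborne band range (dataset 2).
    cols = []
    for lo, hi in BANDS_NM.values():
        mask = (wavelengths >= lo) & (wavelengths <= hi)
        cols.append(fieldspec_refl[:, mask].mean(axis=1))
    return np.column_stack(cols)

def classify(features, labels, n_components=2):
    # Reduce to two principal components, then fit a linear discriminant classifier.
    pcs = PCA(n_components=n_components).fit_transform(features)
    lda = LinearDiscriminantAnalysis().fit(pcs, labels)
    calibration_accuracy = lda.score(pcs, labels)
    # Leave-one-out cross-validation (assumed scheme, not specified in the poster).
    cv_accuracy = cross_val_score(LinearDiscriminantAnalysis(), pcs, labels, cv=LeaveOneOut()).mean()
    return calibration_accuracy, cv_accuracy

# Hypothetical inputs: image_bands (n_points x 4 band values extracted at the sampling points),
# wavelengths and fieldspec_refl (n_points x n_wavelengths), labels (crop type per point).
# Datasets 4/5 fuse the two sensors:
# fused = np.hstack([image_bands, simulate_broad_bands(wavelengths, fieldspec_refl)])
# print(classify(fused, labels))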
Results and Discussion
Fig. 2 shows the average reflectance spectra of cotton, corn, and soybean plants from the three fields, measured with the FieldSpec spectroradiometer.

Fig. 2. The mean reflectance spectra of the three crop types (cotton, corn, and soybean).

Principal component analysis was performed on three of the datasets to reduce their dimensionality (Table 1). Classification performance was evaluated by the misclassification rate in both the calibration and cross-validation steps of the DISCRIM procedure (Table 2).

Table 1. Summary of principal component analysis.
Table 2. Summary of misclassification matrices obtained from the DISCRIM procedure.

Conclusions
The fused dataset discriminated between crop types better than the datasets from either single sensor alone. The overall results indicate the potential of multisensor fusion of remotely sensed data from different sensors as an effective tool for detecting cotton plants.

Acknowledgment
The authors would like to thank Cotton Inc. for providing funding (Project no. 10-776).

Contact Information
Dr. Yubin Lan, USDA-ARS, 2771 F&B Road, College Station, TX 77845, USA. Phone: 979-260-3759. Email: yubin.lan@usda.ars.gov