
Dr. Anjin Chang
School of Engineering and Computing Sciences, Texas A&M University-Corpus Christi, 6300 Ocean Dr. Corpus Christi, TX 78414, USA


Research Keywords & Expertise

Civil Engineering
Remote Sensing
Remote Sensing Applications
UAS
Remote Sensing Data Processing

Fingerprints

UAS




Feed

Journal article
Published: 17 March 2021 in Remote Sensing

Drought significantly limits wheat productivity across the temporal and spatial domains. Unmanned Aerial Systems (UAS) have become an indispensable tool to collect imagery data at refined spatial and high temporal resolution. A 2-year field study was conducted in 2018 and 2019 to determine the temporal effects of drought on canopy growth of winter wheat. Weekly UAS data were collected using red, green, and blue (RGB) and multispectral (MS) sensors over a yield trial consisting of 22 winter wheat cultivars in both irrigated and dryland environments. Raw images were processed to compute canopy features such as canopy cover (CC) and canopy height (CH), and vegetation indices (VIs) such as the Normalized Difference Vegetation Index (NDVI), Excess Green Index (ExG), and Normalized Difference Red-edge Index (NDRE). The drought was more severe in 2018 than in 2019, and the effects of growth differences across years and irrigation levels were visible in the UAS measurements. CC, CH, and VIs measured during grain filling were positively correlated with grain yield (r = 0.4–0.7, p < 0.05) in the dryland environment in both years. Yield was positively correlated with VIs in 2018 (r = 0.45–0.55, p < 0.05) in the irrigated environment, but the correlations were non-significant in 2019 (r = 0.1 to −0.4), except for CH. The study shows that high-throughput UAS data can be used to monitor drought effects on wheat growth and productivity across the temporal and spatial domains.
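The three vegetation indices used in this study have standard band-ratio definitions: NDVI = (NIR − Red)/(NIR + Red), NDRE = (NIR − RedEdge)/(NIR + RedEdge), and ExG = 2g − r − b on chromatic coordinates. A minimal NumPy sketch (a small epsilon guards against division by zero; band inputs are assumed to be calibrated reflectances):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)

def ndre(nir, red_edge):
    """Normalized Difference Red-edge Index: (NIR - RE) / (NIR + RE)."""
    nir, red_edge = np.asarray(nir, dtype=float), np.asarray(red_edge, dtype=float)
    return (nir - red_edge) / (nir + red_edge + 1e-12)

def exg(red, green, blue):
    """Excess Green Index on chromatic coordinates: 2g - r - b."""
    red, green, blue = (np.asarray(c, dtype=float) for c in (red, green, blue))
    total = red + green + blue + 1e-12
    return 2.0 * green / total - red / total - blue / total
```

Computing ExG on chromatic coordinates (each band divided by the band sum) makes it robust to overall scene brightness, which matters for uncalibrated RGB sensors.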

ACS Style

Mahendra Bhandari; Shannon Baker; Jackie Rudd; Amir Ibrahim; Anjin Chang; Qingwu Xue; Jinha Jung; Juan Landivar; Brent Auvermann. Assessing the Effect of Drought on Winter Wheat Growth Using Unmanned Aerial System (UAS)-Based Phenotyping. Remote Sensing 2021, 13, 1144.

AMA Style

Mahendra Bhandari, Shannon Baker, Jackie Rudd, Amir Ibrahim, Anjin Chang, Qingwu Xue, Jinha Jung, Juan Landivar, Brent Auvermann. Assessing the Effect of Drought on Winter Wheat Growth Using Unmanned Aerial System (UAS)-Based Phenotyping. Remote Sensing. 2021; 13 (6):1144.

Chicago/Turabian Style

Mahendra Bhandari; Shannon Baker; Jackie Rudd; Amir Ibrahim; Anjin Chang; Qingwu Xue; Jinha Jung; Juan Landivar; Brent Auvermann. 2021. "Assessing the Effect of Drought on Winter Wheat Growth Using Unmanned Aerial System (UAS)-Based Phenotyping." Remote Sensing 13, no. 6: 1144.

Journal article
Published: 20 February 2021 in Agronomy

Salinity is one of the most common and critical environmental factors that limit plant growth and reduce crop yield. The aquifers, the primary sources of irrigation water, of south Florida are shallow and highly permeable, which makes agriculture vulnerable to projected sea level rise and saltwater intrusion. This study evaluated the growth responses of two ornamental nursery crops to the different salinity levels of irrigation water to help develop saltwater intrusion mitigation plans for the improved sustainability of the horticultural industry in south Florida. Two nursery crops, Hibiscus rosa-sinensis and Mandevilla splendens, were treated with irrigation water that had seven different salinity levels from 0.5 (control) to 10.0 dS/m in the experiment. Crop height was measured weekly, and growth was monitored daily using the normalized difference vegetation index (NDVI) values derived from multispectral images collected using affordable sensors. The results show that the growth of H. rosa-sinensis and M. splendens was significantly inhibited when the salinity concentrations of irrigation water increased to 7.0 and 4.0 dS/m, for each crop, respectively. No significant differences were found between the NDVI values and plant growth variables of both H. rosa-sinensis and M. splendens treated with the different irrigation water salinity levels less than 2.0 dS/m. This study identified the salinity levels that could reduce the growth of the two nursery crops and demonstrated that the current level of irrigation water salinity (0.5 dS/m) would not have significant adverse effects on the growth of these crops in south Florida.

ACS Style

Xinyang Yu; YoungGu Her; Anjin Chang; Jung-Hun Song; E. Campoverde; Bruce Schaffer. Assessing the Effects of Irrigation Water Salinity on Two Ornamental Crops by Remote Spectral Imaging. Agronomy 2021, 11, 375.

AMA Style

Xinyang Yu, YoungGu Her, Anjin Chang, Jung-Hun Song, E. Campoverde, Bruce Schaffer. Assessing the Effects of Irrigation Water Salinity on Two Ornamental Crops by Remote Spectral Imaging. Agronomy. 2021; 11 (2):375.

Chicago/Turabian Style

Xinyang Yu; YoungGu Her; Anjin Chang; Jung-Hun Song; E. Campoverde; Bruce Schaffer. 2021. "Assessing the Effects of Irrigation Water Salinity on Two Ornamental Crops by Remote Spectral Imaging." Agronomy 11, no. 2: 375.

Technical note
Published: 15 January 2021 in Remote Sensing

Sorghum is one of the most important crops worldwide. An accurate and efficient high-throughput phenotyping method for individual sorghum panicles is needed for assessing genetic diversity, variety selection, and yield estimation. High-resolution imagery acquired using an unmanned aerial vehicle (UAV) provides a high-density 3D point cloud with color information. In this study, we developed a detection and characterization method for individual sorghum panicles using a 3D point cloud derived from UAV images. The RGB color ratio was used to filter out non-panicle points and select potential panicle points. Individual sorghum panicles were detected using the concept of tree identification. Panicle length and width were determined from potential panicle points. We proposed cylinder fitting and disk stacking to estimate individual panicle volumes, which are directly related to yield. The results showed that the correlation coefficients of the average panicle length and width between the UAV-based and ground measurements were 0.61 and 0.83, respectively. The UAV-derived panicle length and diameter were more highly correlated with panicle weight than the ground measurements. Cylinder fitting and disk stacking yielded R2 values of 0.77 and 0.67 with the actual panicle weight, respectively. The experimental results showed that a 3D point cloud derived from UAV imagery can provide reliable and consistent individual sorghum panicle parameters, which were highly correlated with ground measurements of panicle weight.
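The two volume estimators named in the abstract reduce to simple geometry: cylinder fitting treats the panicle as one cylinder of the measured length and width, while disk stacking cuts the point cloud into thin horizontal slices and sums the disk volumes. A minimal sketch, assuming the per-slice widths have already been measured from the points:

```python
import numpy as np

def cylinder_volume(length, width):
    """Single-cylinder fit: radius = width / 2, height = panicle length."""
    return np.pi * (width / 2.0) ** 2 * length

def disk_stack_volume(slice_widths, dz):
    """Disk stacking: sum the volumes of horizontal disks of thickness dz,
    one per slice of the panicle point cloud."""
    radii = np.asarray(slice_widths, dtype=float) / 2.0
    return float(np.sum(np.pi * radii ** 2 * dz))
```

With uniform slice widths the two estimators agree exactly; disk stacking additionally captures the taper along the panicle that a single cylinder cannot.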

ACS Style

Anjin Chang; Jinha Jung; Junho Yeom; Juan Landivar. 3D Characterization of Sorghum Panicles Using a 3D Point Cloud Derived from UAV Imagery. Remote Sensing 2021, 13, 282.

AMA Style

Anjin Chang, Jinha Jung, Junho Yeom, Juan Landivar. 3D Characterization of Sorghum Panicles Using a 3D Point Cloud Derived from UAV Imagery. Remote Sensing. 2021; 13 (2):282.

Chicago/Turabian Style

Anjin Chang; Jinha Jung; Junho Yeom; Juan Landivar. 2021. "3D Characterization of Sorghum Panicles Using a 3D Point Cloud Derived from UAV Imagery." Remote Sensing 13, no. 2: 282.

Letter
Published: 17 December 2020 in Remote Sensing

Citrus greening is a severe disease that significantly affects citrus production in the United States because it is not curable with currently available technologies. For this reason, monitoring citrus disease in orchards is critical to eradicate and replace infected trees before the disease spreads. In this study, the canopy shape and vegetation indices of infected and healthy orange trees were compared using unmanned aerial vehicle (UAV)-based multispectral images to better understand their significant characteristics. Individual citrus trees were identified using thresholding and morphological filtering. The UAV-based phenotypes of each tree, such as tree height, crown diameter, and canopy volume, were calculated and evaluated against the corresponding ground measurements. The vegetation indices of infected and healthy trees were also compared to investigate their spectral differences. The results showed that the correlation coefficients of tree height and crown diameter between the UAV-based and ground measurements were 0.7 and 0.8, respectively. The UAV-based canopy volume was also highly correlated with the ground measurements (R2 > 0.9). Four vegetation indices, namely the normalized difference vegetation index (NDVI), normalized difference RedEdge index (NDRE), modified soil adjusted vegetation index (MSAVI), and chlorophyll index (CI), were significantly higher in healthy trees than in diseased trees. The RedEdge-related vegetation indices showed greater capability for citrus disease monitoring. Additionally, the experimental results showed that the UAV-based flush ratio and canopy volume can be valuable indicators for differentiating trees with citrus greening disease.

ACS Style

Anjin Chang; Junho Yeom; Jinha Jung; Juan Landivar. Comparison of Canopy Shape and Vegetation Indices of Citrus Trees Derived from UAV Multispectral Images for Characterization of Citrus Greening Disease. Remote Sensing 2020, 12, 4122.

AMA Style

Anjin Chang, Junho Yeom, Jinha Jung, Juan Landivar. Comparison of Canopy Shape and Vegetation Indices of Citrus Trees Derived from UAV Multispectral Images for Characterization of Citrus Greening Disease. Remote Sensing. 2020; 12 (24):4122.

Chicago/Turabian Style

Anjin Chang; Junho Yeom; Jinha Jung; Juan Landivar. 2020. "Comparison of Canopy Shape and Vegetation Indices of Citrus Trees Derived from UAV Multispectral Images for Characterization of Citrus Greening Disease." Remote Sensing 12, no. 24: 4122.

Journal article
Published: 14 September 2020 in Remote Sensing

Assessing the plant population of cotton is important for making replanting decisions in low-plant-density areas, which are prone to yield penalties. Since the measurement of plant population in the field is labor-intensive and subject to error, this study proposes a new image-based plant counting approach using unmanned aircraft systems (UAS; DJI Mavic 2 Pro, Shenzhen, China) data. Previously developed image-based techniques required a priori information on the geometry or statistical characteristics of plant canopy features, limiting their versatility under variable field conditions. In this regard, a deep learning-based plant counting algorithm was proposed to reduce the number of input variables and to remove the requirement for geometric or statistical information. The object detection model You Only Look Once version 3 (YOLOv3) and photogrammetry were utilized to separate, locate, and count cotton plants at the seedling stage. The proposed algorithm was tested with four different UAS datasets containing variability in plant size, overall illumination, and background brightness. Root mean square error (RMSE) and R2 values of the optimal plant count results ranged from 0.50 to 0.60 plants per linear meter of row (the number of plants within a 1 m distance along the planting row direction) and from 0.96 to 0.97, respectively. The object detection algorithm, trained with variable plant size, ground wetness, and lighting conditions, generally resulted in lower detection error unless an observable difference in the developmental stage of the cotton existed. The proposed plant counting algorithm performed well at 0–14 plants per linear meter of row, where cotton plants are generally separable at the seedling stage. This study is expected to provide an automated methodology for in situ evaluation of plant emergence using UAS data.
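The accuracy figures quoted above (RMSE in plants per linear meter, R2 against field counts) are standard regression metrics. A minimal sketch of how such metrics are computed; the per-meter counts below are hypothetical, not from the study:

```python
import numpy as np

def rmse(estimated, observed):
    """Root mean square error between estimated and observed counts."""
    e, o = np.asarray(estimated, dtype=float), np.asarray(observed, dtype=float)
    return float(np.sqrt(np.mean((e - o) ** 2)))

def r_squared(estimated, observed):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    e, o = np.asarray(estimated, dtype=float), np.asarray(observed, dtype=float)
    ss_res = np.sum((o - e) ** 2)
    ss_tot = np.sum((o - o.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Hypothetical counts per linear meter of row: field counts vs. detections
observed = [3, 7, 10, 0, 5, 12, 8, 6]
detected = [3, 6, 10, 1, 5, 11, 8, 6]
```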

ACS Style

Sungchan Oh; Anjin Chang; Akash Ashapure; Jinha Jung; Nothabo Dube; Murilo Maeda; Daniel Gonzalez; Juan Landivar. Plant Counting of Cotton from UAS Imagery Using Deep Learning-Based Object Detection Framework. Remote Sensing 2020, 12, 2981.

AMA Style

Sungchan Oh, Anjin Chang, Akash Ashapure, Jinha Jung, Nothabo Dube, Murilo Maeda, Daniel Gonzalez, Juan Landivar. Plant Counting of Cotton from UAS Imagery Using Deep Learning-Based Object Detection Framework. Remote Sensing. 2020; 12 (18):2981.

Chicago/Turabian Style

Sungchan Oh; Anjin Chang; Akash Ashapure; Jinha Jung; Nothabo Dube; Murilo Maeda; Daniel Gonzalez; Juan Landivar. 2020. "Plant Counting of Cotton from UAS Imagery Using Deep Learning-Based Object Detection Framework." Remote Sensing 12, no. 18: 2981.

Journal article
Published: 23 November 2019 in Remote Sensing

This paper presents a comparative study of multispectral and RGB (red, green, and blue) sensor-based cotton canopy cover modelling using multi-temporal unmanned aircraft systems (UAS) imagery. Additionally, a canopy cover model using an RGB sensor is proposed that combines an RGB-based vegetation index with morphological closing. The field experiment was established in 2017 and 2018, and the whole study area was divided into grids of approximately 1 × 1 m. Grid-wise percentage canopy cover was computed using both RGB and multispectral sensors over multiple flights during the growing season of the cotton crop. Initially, the normalized difference vegetation index (NDVI)-based canopy cover was estimated and used as a reference for comparison with the RGB-based canopy cover estimations. To test the maximum achievable performance of RGB-based canopy cover estimation, a pixel-wise classification method was implemented. Later, four RGB-based canopy cover estimation methods were implemented using RGB images, namely Canopeo, the excess green index, the modified red green vegetation index, and the red green blue vegetation index. The performance of RGB-based canopy cover estimation was evaluated against the NDVI-based canopy cover estimation. The multispectral sensor-based canopy cover model proved more stable and accurate, whereas the RGB-based canopy cover model was unstable and failed to identify the canopy when cotton leaves changed color after canopy maturation. Applying a morphological closing operation after thresholding significantly improved the RGB-based canopy cover modeling. The red green blue vegetation index turned out to be the most efficient vegetation index for extracting canopy cover, with a very low average root mean square error (2.94% for the 2017 dataset and 2.82% for the 2018 dataset) with respect to the multispectral sensor-based canopy cover estimation. The proposed canopy cover model provides an affordable alternative to multispectral sensors, which are more sensitive and expensive.
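The proposed RGB pipeline, vegetation-index thresholding followed by morphological closing, can be sketched with SciPy. The ExG threshold and structuring-element size below are illustrative defaults, not the paper's tuned values:

```python
import numpy as np
from scipy import ndimage as ndi

def canopy_cover_rgb(rgb, exg_threshold=0.1, closing_size=3):
    """Percent canopy cover from an RGB image: threshold the Excess Green
    index on chromatic coordinates, then apply morphological closing to
    fill small gaps inside the canopy mask."""
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    total = r + g + b + 1e-12
    exg = 2.0 * g / total - r / total - b / total
    mask = exg > exg_threshold
    if closing_size > 1:
        mask = ndi.binary_closing(mask, structure=np.ones((closing_size, closing_size)))
    return 100.0 * mask.mean()
```

The closing step (dilation followed by erosion) is what repairs the fragmented masks produced by simple thresholding late in the season.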

ACS Style

Akash Ashapure; Jinha Jung; Anjin Chang; Sungchan Oh; Murilo Maeda; Juan Landivar. A Comparative Study of RGB and Multispectral Sensor-Based Cotton Canopy Cover Modelling Using Multi-Temporal UAS Data. Remote Sensing 2019, 11, 2757.

AMA Style

Akash Ashapure, Jinha Jung, Anjin Chang, Sungchan Oh, Murilo Maeda, Juan Landivar. A Comparative Study of RGB and Multispectral Sensor-Based Cotton Canopy Cover Modelling Using Multi-Temporal UAS Data. Remote Sensing. 2019; 11 (23):2757.

Chicago/Turabian Style

Akash Ashapure; Jinha Jung; Anjin Chang; Sungchan Oh; Murilo Maeda; Juan Landivar. 2019. "A Comparative Study of RGB and Multispectral Sensor-Based Cotton Canopy Cover Modelling Using Multi-Temporal UAS Data." Remote Sensing 11, no. 23: 2757.

Preprint
Published: 19 November 2019

Unoccupied aerial systems (UAS) were used to phenotype growth trajectories of inbred maize populations under field conditions. Three recombinant inbred line populations were surveyed on a weekly basis, collecting RGB images across two irrigation regimens (irrigated and non-irrigated/rain-fed). Plant height, estimated by the 95th percentile (P95) height from UAS-generated 3D point clouds, exceeded 70% correlation with manual ground-truth measurements, and 51% of the experimental variance was explained by genetics. The Weibull sigmoidal function accurately modeled plant growth (R2 > 99%; RMSE < 4 cm) from P95 genetic means. The mean asymptote was strongly correlated (r2 = 0.66–0.77) with terminal plant height. Maximum absolute growth rates (mm d-1) were weakly correlated with height and flowering time. The average inflection point ranged from 57 to 60 days after sowing (DAS) and was correlated with flowering time (r2 = 0.45–0.68). Functional growth parameters (asymptote, inflection point, growth rate) alone identified 34 genetic loci, each explaining 3 to 15% of total genetic variation. Plant height was estimated at one-day intervals to 85 DAS, identifying 58 unique temporal quantitative trait loci (QTL) locations. Genomic hotspots on chromosomes 1 and 3 indicated chromosomal regions associated with functional growth trajectories influencing flowering time, growth rate, and terminal growth. Temporal QTL demonstrated unique dynamic expression patterns not observable previously; no QTL were significantly expressed throughout the entire growing season. UAS technologies improved phenotypic selection accuracy and permitted monitoring traits on a temporal scale previously infeasible with manual measurements, furthering understanding of crop development and biological trajectories.

Author summary: Unoccupied aerial systems (UAS) can now provide high-throughput phenotyping to functionally model plant growth and explore genetic loci underlying temporal expression of dynamic phenotypes, specifically plant height. Efficient integration of temporal phenotyping via UAS will improve the scientific understanding of dynamic, quantitative traits and developmental trajectories of important agronomic crops, leading to new understanding of plant biology. Here we present, for the first time, the dynamic nature of quantitative trait loci (QTL) over time under field conditions. To our knowledge, this is the first empirical study to expand beyond selective developmental time points, evaluating functional and temporal QTL expression in maize (Zea mays L.) throughout a growing season in a field-based environment.
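The functional modeling step can be sketched with SciPy: fit a Weibull sigmoid to a P95 height trajectory and read off the asymptote, scale, and shape parameters (the asymptote corresponds to terminal height; the inflection point and growth rate follow from the fitted parameters). The weekly heights below are simulated for illustration, not data from the study:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_growth(t, asymptote, scale, shape):
    """Weibull sigmoid: height rises from 0 toward `asymptote`."""
    return asymptote * (1.0 - np.exp(-(t / scale) ** shape))

# Simulated weekly P95 heights (cm) over days after sowing
days = np.arange(1, 15) * 7.0
rng = np.random.default_rng(42)
heights = weibull_growth(days, 250.0, 55.0, 4.0) + rng.normal(0.0, 2.0, days.size)

params, _ = curve_fit(weibull_growth, days, heights,
                      p0=[200.0, 50.0, 3.0], bounds=(1e-6, np.inf))
asymptote, scale, shape = params
```

In practice one fit is run per genotype (or per genetic mean), and the fitted parameters then serve as derived traits for QTL mapping.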

ACS Style

Steven Langlie Anderson; Seth C. Murray; Yuanyuan Chen; Lonesome Malambo; Anjin Chang; Sorin Popescu; Dale Cope; Jinha Jung. Unoccupied aerial system enabled functional modeling of maize (Zea mays L.) height reveals dynamic expression of loci associated to temporal growth. 2019, 848531.

AMA Style

Steven Langlie Anderson, Seth C. Murray, Yuanyuan Chen, Lonesome Malambo, Anjin Chang, Sorin Popescu, Dale Cope, Jinha Jung. Unoccupied aerial system enabled functional modeling of maize (Zea mays L.) height reveals dynamic expression of loci associated to temporal growth. 2019: 848531.

Chicago/Turabian Style

Steven Langlie Anderson; Seth C. Murray; Yuanyuan Chen; Lonesome Malambo; Anjin Chang; Sorin Popescu; Dale Cope; Jinha Jung. 2019. "Unoccupied aerial system enabled functional modeling of maize (Zea mays L.) height reveals dynamic expression of loci associated to temporal growth." Preprint, 848531.

Journal article
Published: 29 June 2019 in Remote Sensing

Unmanned aerial vehicle (UAV) platforms with sensors covering the red-edge and near-infrared (NIR) bands to measure vegetation indices (VIs) have recently been introduced in agricultural research. Consequently, VIs originally developed for traditional airborne and spaceborne sensors have become applicable to UAV systems. In this study, we investigated the difference between tillage treatments for cotton and sorghum using various RGB and NIR VIs. Minimized tillage is known to increase farm sustainability and potentially optimize productivity over time; however, repeated tillage remains the most commonly adopted management practice in agriculture. To this day, quantitative comparisons of plant growth patterns between conventional tillage (CT) and no tillage (NT) fields are often inconsistent. In this study, high-resolution, multi-temporal UAV data were used to analyze tillage effects on plant health, and the performance of various vegetation indices was investigated. Time series data over ten dates were acquired on a weekly basis by RGB and multispectral (MS) UAV platforms: a DJI Phantom 4 Pro and a DJI Matrice 100 with the SlantRange 3p sensor. Ground reflectance panels and an ambient illumination sensor were used for the radiometric calibration of the RGB and MS orthomosaic images, respectively. Various RGB- and NIR-based vegetation indices were then calculated for the comparison between CT and NT treatments. In addition, a one-tailed Z-test was conducted to check the significance of the difference in VIs between CT and NT treatments. The results showed distinct differences in VIs between tillage treatments throughout the growing season. NIR-based VIs showed better discrimination performance than RGB-based VIs. Out of 13 VIs, the modified soil adjusted vegetation index (MSAVI) and the optimized soil adjusted vegetation index (OSAVI) performed best in terms of quantitative difference measurements and the Z-test between tillage treatments. The modified green red vegetation index (MGRVI) and excess green index (ExG) showed reliable separability and can serve as cost-effective alternatives for RGB-based UAV applications.
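The one-tailed two-sample Z-test used to compare treatments reduces to a difference of sample means divided by its standard error, with the p-value taken from the upper tail of the standard normal. A minimal sketch; the grid-wise MSAVI values are simulated for illustration:

```python
import numpy as np
from scipy.stats import norm

def one_tailed_z(sample_a, sample_b):
    """One-tailed two-sample Z-test for H1: mean(a) > mean(b)."""
    a, b = np.asarray(sample_a, dtype=float), np.asarray(sample_b, dtype=float)
    se = np.sqrt(a.var(ddof=1) / a.size + b.var(ddof=1) / b.size)
    z = (a.mean() - b.mean()) / se
    return z, norm.sf(z)   # sf = survival function = upper-tail p-value

# Simulated grid-wise MSAVI values under no-tillage vs. conventional tillage
rng = np.random.default_rng(0)
nt = rng.normal(0.62, 0.05, 200)
ct = rng.normal(0.55, 0.05, 200)
z, p = one_tailed_z(nt, ct)
```

The Z-test is appropriate here because each treatment contributes hundreds of grid samples; for small samples a t-test would be the safer choice.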

ACS Style

Junho Yeom; Jinha Jung; Anjin Chang; Akash Ashapure; Murilo Maeda; Andrea Maeda; Juan Landivar. Comparison of Vegetation Indices Derived from UAV Data for Differentiation of Tillage Effects in Agriculture. Remote Sensing 2019, 11, 1548.

AMA Style

Junho Yeom, Jinha Jung, Anjin Chang, Akash Ashapure, Murilo Maeda, Andrea Maeda, Juan Landivar. Comparison of Vegetation Indices Derived from UAV Data for Differentiation of Tillage Effects in Agriculture. Remote Sensing. 2019; 11 (13):1548.

Chicago/Turabian Style

Junho Yeom; Jinha Jung; Anjin Chang; Akash Ashapure; Murilo Maeda; Andrea Maeda; Juan Landivar. 2019. "Comparison of Vegetation Indices Derived from UAV Data for Differentiation of Tillage Effects in Agriculture." Remote Sensing 11, no. 13: 1548.

Journal article
Published: 12 April 2019 in ISPRS Journal of Photogrammetry and Remote Sensing

Recent years have witnessed enormous interest in the application of Unmanned Aerial Systems (UAS) for precision agriculture. This study presents a novel approach that uses multi-temporal UAS data to compare two management practices in cotton: conventional tillage (CT) and no-tillage (NT). The plant parameters considered for the comparison are canopy height (CH), canopy cover (CC), canopy volume (CV), and the Normalized Difference Vegetation Index (NDVI). Initially, the whole study area was divided into grids of approximately one square meter. Measurements were extracted grid-wise using high-resolution UAS data captured ten times over the whole growing season of the cotton crop. A one-tailed Z-test revealed a significant difference between cotton growth under CT and NT for almost all the epochs. At the 95% confidence level, the crop grown under NT was found to have a taller canopy, higher canopy cover, greater biomass, and higher NDVI than that under the CT cropping system.

ACS Style

Akash Ashapure; Jinha Jung; Junho Yeom; Anjin Chang; Murilo Maeda; Andrea Maeda; Juan Landivar. A novel framework to detect conventional tillage and no-tillage cropping system effect on cotton growth and development using multi-temporal UAS data. ISPRS Journal of Photogrammetry and Remote Sensing 2019, 152, 49-64.

AMA Style

Akash Ashapure, Jinha Jung, Junho Yeom, Anjin Chang, Murilo Maeda, Andrea Maeda, Juan Landivar. A novel framework to detect conventional tillage and no-tillage cropping system effect on cotton growth and development using multi-temporal UAS data. ISPRS Journal of Photogrammetry and Remote Sensing. 2019; 152:49-64.

Chicago/Turabian Style

Akash Ashapure; Jinha Jung; Junho Yeom; Anjin Chang; Murilo Maeda; Andrea Maeda; Juan Landivar. 2019. "A novel framework to detect conventional tillage and no-tillage cropping system effect on cotton growth and development using multi-temporal UAS data." ISPRS Journal of Photogrammetry and Remote Sensing 152: 49-64.

Short communication
Published: 18 February 2019 in Computers and Electronics in Agriculture

Unmanned aerial vehicles (UAVs) have been recognized as excellent tools to provide real-time feedback on the temporal and spatial conditions found in agricultural fields throughout the growing season. UAVs have also helped accelerate breeding programs by screening varieties, selecting agronomic traits that confer resistance to biotic and abiotic stresses, and selecting the best management practices to optimize the management of soil and water resources. The main objectives of this study were to assess the potential use of UAVs to determine crop height, canopy cover, and NDVI during the growing season for three tomato varieties; to validate tomato height obtained with a UAV; and to evaluate the correlation between leaf area index and canopy cover determined with the UAV. The UAV was flown over a tomato trial planted with 90 plots containing eight different tomato varieties (three Roma and five round), replicated three times per row and planted in three rows. The plots of the tomato varieties TAM-HOT, Shourouq, and Mykonos were selected for validation with the UAV. Field measurements of plant height, leaf area index, and NDVI were collected weekly (from April 27 to June 22, 2017). All the tomato varieties were healthy and free of disease, and the NDVI values estimated with the UAV peaked between 90 and 110 days after transplanting (DAT). A coefficient of determination of 0.72 was observed between canopy cover estimated with the UAV and leaf area index measured with the ceptometer. The correlation coefficients between the estimated and measured crop heights were 0.9845, 0.9766, and 0.9949 for TAM-HOT, Shourouq, and Mykonos, respectively. In addition, a paired t-test showed no significant difference (P ≥ 0.05) between the UAV-estimated and manually measured crop heights. In the future, UAV crop growth and NDVI monitoring could be improved through temporally denser data acquisition, increasing the number of ground samples and their geometric coincidence with the grids in UAV images, removal of weather effects, and reduction of other systematic errors caused by image quality and grid size.
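The paired t-test comparing UAV-estimated and manually measured heights on the same plants can be reproduced with SciPy. The height values below are hypothetical, chosen so the two methods agree closely:

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical heights (cm): manual ruler vs. UAV estimate, same plants
manual = np.array([45.0, 52.1, 60.3, 71.8, 80.2, 88.5])
uav = manual + np.array([0.5, -0.8, 0.3, -0.4, 0.9, -0.2])

t_stat, p_value = ttest_rel(uav, manual)
significant = p_value < 0.05   # False here: the methods agree at the 5% level
```

Pairing matters: because each UAV estimate is matched to a manual measurement of the same plant, the test operates on the per-plant differences rather than treating the two sets as independent samples.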

ACS Style

Juan Enciso; Carlos A. Avila; Jinha Jung; Sheren Elsayed-Farag; Anjin Chang; Junho Yeom; Juan Landivar; Murilo Maeda; Jose C. Chavez. Validation of agronomic UAV and field measurements for tomato varieties. Computers and Electronics in Agriculture 2019, 158, 278-283.

AMA Style

Juan Enciso, Carlos A. Avila, Jinha Jung, Sheren Elsayed-Farag, Anjin Chang, Junho Yeom, Juan Landivar, Murilo Maeda, Jose C. Chavez. Validation of agronomic UAV and field measurements for tomato varieties. Computers and Electronics in Agriculture. 2019; 158:278-283.

Chicago/Turabian Style

Juan Enciso; Carlos A. Avila; Jinha Jung; Sheren Elsayed-Farag; Anjin Chang; Junho Yeom; Juan Landivar; Murilo Maeda; Jose C. Chavez. 2019. "Validation of agronomic UAV and field measurements for tomato varieties." Computers and Electronics in Agriculture 158: 278-283.

Journal article
Published: 27 November 2018 in Remote Sensing

Unmanned aerial vehicle (UAV) images have great potential for various agricultural applications. In particular, UAV systems facilitate timely and precise data collection in agricultural fields at high spatial and temporal resolutions. In this study, we propose an automatic open cotton boll detection algorithm using ultra-fine spatial resolution UAV images. Seed points for a region growing algorithm were generated hierarchically with a random base for computational efficiency. Cotton boll candidates were determined based on the spatial features of each region growing segment. Spectral threshold values that automatically separate cotton bolls from other non-target objects were derived from the input images for adaptive application. Finally, a binary cotton boll classification was performed using the derived threshold values and morphological filters to reduce noise in the results. The open cotton boll classification results were validated using reference data, and the results showed an accuracy higher than 88% across various evaluation measures. Moreover, the UAV-extracted cotton boll area and actual crop yield had a strong positive correlation (0.8). The proposed method leverages UAV characteristics such as high spatial resolution and accessibility by applying automatic, unsupervised procedures to images from a single date. Additionally, this study verified the extraction of target regions of interest from UAV images for direct yield estimation. Cotton yield estimation models had R2 values between 0.63 and 0.65 and RMSE values between 0.47 kg and 0.66 kg per plot grid.

ACS Style

Junho Yeom; Jinha Jung; Anjin Chang; Murilo Maeda; Juan Landivar. Automated Open Cotton Boll Detection for Yield Estimation Using Unmanned Aircraft Vehicle (UAV) Data. Remote Sensing 2018, 10, 1895.

AMA Style

Junho Yeom, Jinha Jung, Anjin Chang, Murilo Maeda, Juan Landivar. Automated Open Cotton Boll Detection for Yield Estimation Using Unmanned Aircraft Vehicle (UAV) Data. Remote Sensing. 2018; 10 (12):1895.

Chicago/Turabian Style

Junho Yeom; Jinha Jung; Anjin Chang; Murilo Maeda; Juan Landivar. 2018. "Automated Open Cotton Boll Detection for Yield Estimation Using Unmanned Aircraft Vehicle (UAV) Data." Remote Sensing 10, no. 12: 1895.

Journal article
Published: 22 November 2018 in Sensors

Continuing population growth will result in increasing global demand for food and fiber for the foreseeable future. During the growing season, variability in the height of crops provides important information on plant health, growth, and response to environmental effects. This paper demonstrates the feasibility of using structure from motion (SfM) on images collected from 120 m above ground level (AGL) with a fixed-wing unmanned aerial vehicle (UAV) to estimate sorghum plant height with reasonable accuracy on a relatively large farm field. Correlations between UAV-based estimates and ground truth were strong on all dates (R2 > 0.80) but clearly better on some dates than others. Furthermore, a new method for improving UAV-based plant height estimates with multi-level ground control points (GCPs) was found to lower the root mean square error (RMSE) by about 20%. These results indicate that GCP-based height calibration has potential for future applications where accuracy is particularly important. Lastly, image blur appeared to have a significant impact on the accuracy of plant height estimation. A strong correlation (R2 = 0.85) was observed between image quality and plant height RMSE, and the influence of wind was a challenge in obtaining high-quality plant height data. A strong relationship (R2 = 0.99) existed between wind speed and image blurriness.
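Plant height is typically extracted from an SfM point cloud as a high percentile of point elevations above the local ground level; a high percentile (the 95th here, a common choice, though not necessarily the one used in this paper) rather than the maximum suppresses outlier points. A minimal sketch:

```python
import numpy as np

def p95_height(point_z, ground_z):
    """95th-percentile canopy height: point elevations minus ground level."""
    return float(np.percentile(np.asarray(point_z, dtype=float) - ground_z, 95))
```

In practice `ground_z` comes from a bare-ground digital terrain model and the percentile is computed per plot or per grid cell.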

ACS Style

Xiongzhe Han; J. Alex Thomasson; G. Cody Bagnall; N. Ace Pugh; David W. Horne; William L. Rooney; Jinha Jung; Anjin Chang; Lonesome Malambo; Sorin C. Popescu; Ian T. Gates; Dale A. Cope. Measurement and Calibration of Plant-Height from Fixed-Wing UAV Images. Sensors 2018, 18, 4092.

AMA Style

Xiongzhe Han, J. Alex Thomasson, G. Cody Bagnall, N. Ace Pugh, David W. Horne, William L. Rooney, Jinha Jung, Anjin Chang, Lonesome Malambo, Sorin C. Popescu, Ian T. Gates, Dale A. Cope. Measurement and Calibration of Plant-Height from Fixed-Wing UAV Images. Sensors. 2018; 18 (12):4092.

Chicago/Turabian Style

Xiongzhe Han; J. Alex Thomasson; G. Cody Bagnall; N. Ace Pugh; David W. Horne; William L. Rooney; Jinha Jung; Anjin Chang; Lonesome Malambo; Sorin C. Popescu; Ian T. Gates; Dale A. Cope. 2018. "Measurement and Calibration of Plant-Height from Fixed-Wing UAV Images." Sensors 18, no. 12: 4092.

Journal article
Published: 07 September 2012 in IEEE Geoscience and Remote Sensing Letters

Most pansharpened images from existing algorithms present a tradeoff between spectral preservation and spatial enhancement. In this letter, we developed a hybrid pansharpening algorithm based on primary and secondary high-frequency information injection to efficiently improve the spatial quality of the pansharpened image. The injected high-frequency information in our algorithm is composed of two components: the difference between the panchromatic and intensity images, and the Laplacian-filtered image of that high-frequency information. The extracted high frequencies are injected into the multispectral image using a local adaptive fusion parameter and postprocessing of the fusion parameter. In experiments using various satellite images, our results show better spatial quality than those of other fusion algorithms while maintaining as much spectral information as possible.
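The injection scheme can be sketched as follows, with the letter's local adaptive fusion parameter and postprocessing replaced by a single global gain for brevity, and a simple band mean standing in for the intensity component:

```python
import numpy as np
from scipy import ndimage as ndi

def hybrid_pansharpen(ms, pan, gain=1.0):
    """Inject primary (PAN - intensity) and secondary (Laplacian of the
    primary) high-frequency detail into each multispectral band.

    ms  : (H, W, B) multispectral image upsampled to the PAN grid
    pan : (H, W) panchromatic image
    """
    ms = ms.astype(float)
    intensity = ms.mean(axis=2)              # simplified intensity component
    primary = pan.astype(float) - intensity  # primary high-frequency detail
    secondary = ndi.laplace(primary)         # secondary: Laplacian of primary
    detail = primary + secondary
    # global injection gain; the letter uses a local adaptive parameter
    return ms + gain * detail[..., None]
```

When the panchromatic image carries no detail beyond the multispectral intensity, both detail terms vanish and the output equals the input, which is the expected behavior of an injection-based fusion.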

ACS Style

JaeWan Choi; Junho Yeom; Anjin Chang; Younggi Byun; Yongil Kim. Hybrid Pansharpening Algorithm for High Spatial Resolution Satellite Imagery to Improve Spatial Quality. IEEE Geoscience and Remote Sensing Letters 2012, 10, 490-494.

AMA Style

JaeWan Choi, Junho Yeom, Anjin Chang, Younggi Byun, Yongil Kim. Hybrid Pansharpening Algorithm for High Spatial Resolution Satellite Imagery to Improve Spatial Quality. IEEE Geoscience and Remote Sensing Letters. 2012; 10 (3):490-494.

Chicago/Turabian Style

JaeWan Choi; Junho Yeom; Anjin Chang; Younggi Byun; Yongil Kim. 2012. "Hybrid Pansharpening Algorithm for High Spatial Resolution Satellite Imagery to Improve Spatial Quality." IEEE Geoscience and Remote Sensing Letters 10, no. 3: 490-494.

Journal article
Published: 30 April 2012 in Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
ACS Style

Jun-Ho Yeom; An-Jin Chang; Yong-Il Kim. Shadow Extraction of Urban Area using Building Edge Buffer in Quickbird Image. Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography 2012, 30, 163-171.

AMA Style

Jun-Ho Yeom, An-Jin Chang, Yong-Il Kim. Shadow Extraction of Urban Area using Building Edge Buffer in Quickbird Image. Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography. 2012; 30 (2):163-171.

Chicago/Turabian Style

Jun-Ho Yeom; An-Jin Chang; Yong-Il Kim. 2012. "Shadow Extraction of Urban Area using Building Edge Buffer in Quickbird Image." Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography 30, no. 2: 163-171.