
Mr. Yu-Chun Hsu
Department of Civil Engineering, and Innovation and Development Center of Sustainable Agriculture, National Chung Hsing University, Taichung 40227, Taiwan

Research Keywords & Expertise

Remote Sensing
Artificial Intelligence
Deep and Machine Learning
Crop 3D Modelling and Mapping
Image Processing



Feed

Journal article
Published: 31 August 2021 in Sensors

Grain moisture content (GMC) is a key indicator of the appropriate harvest period of rice. Conventional testing is time-consuming and laborious, so it cannot be implemented over vast areas, nor can it estimate future changes to reveal the optimal harvest time. Images of single panicles were captured with smartphones and corrected using a spectral–geometric correction board. In total, 86 panicle samples were obtained at each survey and then dried at 80 °C for 7 days to acquire the wet-basis GMC. Of the 517 valid samples obtained, 80% were randomly used for training and 20% for testing to construct the image-based GMC assessment model. Seventeen GMC surveys of 201 samples in total were also performed over an area of 1 m² to represent on-site GMC, which enabled multi-day GMC prediction. Eight color indices were selected using principal component analysis to build four machine learning models: random forest, multilayer perceptron, support vector regression (SVR), and multivariate linear regression. The SVR model, with an MAE of 1.23%, was the most suitable for GMC below 40%. This study provides a real-time, cost-effective, non-destructive GMC measurement using smartphones that enables on-farm prediction of harvest dates and facilitates the scheduling of agricultural machinery for harvesting.
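The two quantities the abstract reports, wet-basis GMC from oven drying and the model's mean absolute error, can be sketched as follows. The formulas are the standard definitions; the sample weights and error values in the example are illustrative, not from the paper.

```python
# Wet-basis grain moisture content (GMC): panicles are weighed fresh,
# oven-dried (80 degC for 7 days in the study), then reweighed.

def wet_basis_gmc(fresh_weight_g: float, dry_weight_g: float) -> float:
    """Return wet-basis moisture content in percent."""
    return (fresh_weight_g - dry_weight_g) / fresh_weight_g * 100.0

def mean_absolute_error(y_true, y_pred):
    """MAE in the same units as the inputs (percent GMC here)."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Example: a 5.0 g fresh panicle drying to 3.8 g gives 24% wet-basis GMC.
print(round(wet_basis_gmc(5.0, 3.8), 2))  # 24.0
```

An MAE of 1.23%, as reported for the SVR model, means the image-based estimate deviates from the oven-dried ground truth by about 1.2 percentage points of moisture on average.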

ACS Style

Ming-Der Yang; Yu-Chun Hsu; Wei-Cheng Tseng; Chian-Yu Lu; Chin-Ying Yang; Ming-Hsin Lai; Dong-Hong Wu. Assessment of Grain Harvest Moisture Content Using Machine Learning on Smartphone Images for Optimal Harvest Timing. Sensors 2021, 21, 5875.

AMA Style

Ming-Der Yang, Yu-Chun Hsu, Wei-Cheng Tseng, Chian-Yu Lu, Chin-Ying Yang, Ming-Hsin Lai, Dong-Hong Wu. Assessment of Grain Harvest Moisture Content Using Machine Learning on Smartphone Images for Optimal Harvest Timing. Sensors. 2021; 21 (17):5875.

Chicago/Turabian Style

Ming-Der Yang; Yu-Chun Hsu; Wei-Cheng Tseng; Chian-Yu Lu; Chin-Ying Yang; Ming-Hsin Lai; Dong-Hong Wu. 2021. "Assessment of Grain Harvest Moisture Content Using Machine Learning on Smartphone Images for Optimal Harvest Timing." Sensors 21, no. 17: 5875.

Data descriptor
Published: 01 April 2021 in Remote Sensing

Recently, unmanned aerial vehicles (UAVs) have been broadly applied in the remote sensing field. The abundance of UAV images has reinvigorated deep learning, which has produced many results in agricultural applications. Popular image datasets for deep learning model training are generated for general-purpose use, with objects, views, and applications suited to ordinary scenarios; UAV images, however, follow a different pattern, being captured mostly from a look-down perspective. This paper provides a verified annotated dataset of UAV images, describing the data acquisition, the data preprocessing, and a showcase CNN classification. The dataset was collected with a multi-rotor UAV platform flying a planned scouting routine over rice paddies. The paper introduces a semi-automatic annotation method based on the ExGR index to generate training data of rice seedlings. For demonstration, this study modified a classical CNN architecture, VGG-16, to run patch-based rice seedling detection. K-fold cross-validation was employed with an 80/20 training/test split. The accuracy of the network increases with the number of epochs, and all divisions of the cross-validation dataset achieve an accuracy of 0.99. The rice seedling dataset provides the training-validation data, patch-based detection samples, and the ortho-mosaic image of the field.
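The ExGR index used for the semi-automatic annotation is the standard Excess Green minus Excess Red combination (Woebbecke's ExG and Meyer's ExR on normalized chromatic coordinates). A minimal per-pixel sketch; the sample RGB values and the vegetation threshold of zero are illustrative conventions, not taken from the paper.

```python
# Excess Green minus Excess Red (ExGR) vegetation index, commonly used to
# separate green vegetation (e.g., rice seedlings) from soil background.

def exgr(r: float, g: float, b: float) -> float:
    """ExGR from RGB values; pixels with ExGR > 0 are typically vegetation."""
    total = r + g + b
    if total == 0:
        return 0.0
    # Normalized chromatic coordinates
    rn, gn, bn = r / total, g / total, b / total
    exg = 2.0 * gn - rn - bn   # Excess Green
    exr = 1.4 * rn - gn        # Excess Red
    return exg - exr

# A green (seedling-like) pixel scores positive, a soil-brown pixel negative.
print(exgr(60, 160, 50) > 0)   # True
print(exgr(140, 100, 70) > 0)  # False
```

Thresholding ExGR at zero over a whole orthomosaic yields a rough seedling mask that can then be refined by hand, which is the essence of a semi-automatic annotation workflow.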

ACS Style

Ming-Der Yang; Hsin-Hung Tseng; Yu-Chun Hsu; Chin-Ying Yang; Ming-Hsin Lai; Dong-Hong Wu. A UAV Open Dataset of Rice Paddies for Deep Learning Practice. Remote Sensing 2021, 13, 1358.

AMA Style

Ming-Der Yang, Hsin-Hung Tseng, Yu-Chun Hsu, Chin-Ying Yang, Ming-Hsin Lai, Dong-Hong Wu. A UAV Open Dataset of Rice Paddies for Deep Learning Practice. Remote Sensing. 2021; 13 (7):1358.

Chicago/Turabian Style

Ming-Der Yang; Hsin-Hung Tseng; Yu-Chun Hsu; Chin-Ying Yang; Ming-Hsin Lai; Dong-Hong Wu. 2021. "A UAV Open Dataset of Rice Paddies for Deep Learning Practice." Remote Sensing 13, no. 7: 1358.

Journal article
Published: 12 October 2020 in Computers and Electronics in Agriculture

Rice is a globally important crop that will continue to play an essential role in feeding our world as we grapple with climate change and population growth. Lodging is a primary threat to rice production, decreasing rice yield and quality. Lodging assessment is a tedious task requiring heavy labor over a long duration because of the vast land areas involved. Newly developed autonomous crop scouting techniques have shown promise in mapping crop fields without human interaction. By combining autonomous scouting and lodged-rice detection with edge computing, rice lodging can be estimated faster and at a much lower cost than with previous methods. This study presents an adaptive crop scouting mechanism for autonomous unmanned aerial vehicles (UAVs). We simulate UAV crop scouting of rice fields at multiple levels using deep neural networks and real UAV energy profiles, focusing on areas with high lodging. Using the proposed method, we can scout rice fields 36% faster than conventional scouting methods at 99.25% accuracy.
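The core idea of adaptive scouting under an energy budget can be sketched as: rank field grid cells by a model's predicted lodging probability and spend limited flight energy on the riskiest cells first. This is a simplified illustration of the concept only; the cell identifiers, probabilities, and energy costs are invented, and the paper's actual mechanism (EDANet detection plus real UAV energy profiles) is more involved.

```python
# Minimal sketch of energy-budgeted adaptive scouting: visit the grid cells
# with the highest predicted lodging probability until the budget runs out.

def plan_scouting(cells, energy_budget):
    """cells: list of (cell_id, lodging_prob, energy_cost). Returns visit order."""
    plan = []
    remaining = energy_budget
    # Highest predicted lodging first, so limited flight energy covers risky areas.
    for cell_id, prob, cost in sorted(cells, key=lambda c: c[1], reverse=True):
        if cost <= remaining:
            plan.append(cell_id)
            remaining -= cost
    return plan

cells = [("A1", 0.10, 3), ("A2", 0.85, 4), ("B1", 0.60, 4), ("B2", 0.05, 3)]
print(plan_scouting(cells, energy_budget=8))  # ['A2', 'B1']
```

Prioritizing high-lodging areas is what lets an adaptive plan finish faster than exhaustively sweeping the whole field while keeping detection accuracy high.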

ACS Style

Ming-Der Yang; Jayson G. Boubin; Hui Ping Tsai; Hsin-Hung Tseng; Yu-Chun Hsu; Christopher C. Stewart. Adaptive autonomous UAV scouting for rice lodging assessment using edge computing with deep learning EDANet. Computers and Electronics in Agriculture 2020, 179, 105817.

AMA Style

Ming-Der Yang, Jayson G. Boubin, Hui Ping Tsai, Hsin-Hung Tseng, Yu-Chun Hsu, Christopher C. Stewart. Adaptive autonomous UAV scouting for rice lodging assessment using edge computing with deep learning EDANet. Computers and Electronics in Agriculture. 2020; 179:105817.

Chicago/Turabian Style

Ming-Der Yang; Jayson G. Boubin; Hui Ping Tsai; Hsin-Hung Tseng; Yu-Chun Hsu; Christopher C. Stewart. 2020. "Adaptive autonomous UAV scouting for rice lodging assessment using edge computing with deep learning EDANet." Computers and Electronics in Agriculture 179: 105817.

Journal article
Published: 18 September 2020 in Sensors

Rice is one of the three major crops in the world and the major crop in Asia. Climate change and water resource shortages may decrease rice yields and cause possible food shortage crises. In this study, water-saving farming management was tested, and IoT field water-level monitoring was used to regulate water inflow automatically. Plant height (PH) is an important phenotype for determining differences in rice growth periods and yields under water-saving irrigation. An unmanned aerial vehicle (UAV) with an RGB camera captured sequential images of rice fields; the UAV-estimated PH was compared with PH measured on site to estimate rice growth stages. The results for the two crop harvests of 2019 revealed that, with adequate image calibration, the correlation coefficient between UAV-PH and field-PH exceeded 0.98, indicating that UAV images can accurately determine rice PH and growth phase in the field. The study demonstrated that water-saving farming is effective, decreasing water usage for the first and second crops of 2019 by 53.5% and 21.7%, respectively, without influencing the growth period or final yield. Coupled with an automated irrigation system, rice farming can adapt to water shortage situations.
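The reported correlation above 0.98 between UAV-PH and field-PH is a plain Pearson correlation coefficient, which can be computed with the standard library alone. The height samples below are hypothetical, inserted only to make the sketch runnable.

```python
# Pearson correlation coefficient r between UAV-estimated plant height
# and field-measured plant height.

import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

uav_ph   = [42.0, 58.5, 71.2, 88.0, 101.3]  # cm, hypothetical UAV estimates
field_ph = [44.1, 60.0, 70.5, 90.2, 103.0]  # cm, hypothetical field measurements
print(round(pearson_r(uav_ph, field_ph), 3))
```

An r above 0.98 means UAV height estimates track ground measurements almost linearly, so the growth stage can be read from imagery instead of manual surveys.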

ACS Style

Chin-Ying Yang; Ming-Der Yang; Wei-Cheng Tseng; Yu-Chun Hsu; Guan-Sin Li; Ming-Hsin Lai; Dong-Hong Wu; Hsiu-Ying Lu. Assessment of Rice Developmental Stage Using Time Series UAV Imagery for Variable Irrigation Management. Sensors 2020, 20, 5354.

AMA Style

Chin-Ying Yang, Ming-Der Yang, Wei-Cheng Tseng, Yu-Chun Hsu, Guan-Sin Li, Ming-Hsin Lai, Dong-Hong Wu, Hsiu-Ying Lu. Assessment of Rice Developmental Stage Using Time Series UAV Imagery for Variable Irrigation Management. Sensors. 2020; 20 (18):5354.

Chicago/Turabian Style

Chin-Ying Yang; Ming-Der Yang; Wei-Cheng Tseng; Yu-Chun Hsu; Guan-Sin Li; Ming-Hsin Lai; Dong-Hong Wu; Hsiu-Ying Lu. 2020. "Assessment of Rice Developmental Stage Using Time Series UAV Imagery for Variable Irrigation Management." Sensors 20, no. 18: 5354.

Journal article
Published: 14 February 2020 in Remote Sensing

A rapid and precise large-scale agricultural disaster survey is the basis for agricultural disaster relief and insurance but is labor-intensive and time-consuming. This study applies unmanned aerial vehicle (UAV) images with deep-learning image processing to estimate rice lodging in paddies over a large area. It establishes an image semantic segmentation model employing two neural network architectures, FCN-AlexNet and SegNet, and explores their behavior on various object sizes and their computational efficiency. High-resolution visible images of rice paddies captured by commercial UAVs are used to calculate three vegetation indices that improve the applicability of visible imagery. The proposed model was trained and tested on a set of UAV images from 2017 and validated on a set of UAV images from 2019. For the identification of rice lodging in the 2017 UAV images, the F1-score reaches 0.80 and 0.79 for FCN-AlexNet and SegNet, respectively. The F1-score of FCN-AlexNet with the RGB + ExGR combination also reaches 0.78 on the 2019 validation images. The proposed model adopting semantic segmentation networks proves more efficient, approximately 10 to 15 times faster, with a lower misinterpretation rate than the maximum likelihood method.
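The F1-scores quoted (0.78 to 0.80) are the harmonic mean of precision and recall over the lodged-rice class. A minimal pixelwise computation follows; the tiny flattened masks are toy data for illustration.

```python
# F1-score for a binary segmentation class (1 = lodged rice, 0 = background).

def f1_score(y_true, y_pred, positive=1):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)   # fraction of predicted lodging that is real
    recall = tp / (tp + fn)      # fraction of real lodging that was found
    return 2 * precision * recall / (precision + recall)

# Toy flattened masks with one missed pixel and one false alarm.
truth = [1, 1, 1, 0, 0, 1, 0, 1]
pred  = [1, 1, 0, 0, 1, 1, 0, 1]
print(round(f1_score(truth, pred), 3))  # 0.8
```

Because F1 balances false alarms against misses, it is a fairer score than raw accuracy when lodged pixels are a small fraction of the field.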

ACS Style

Ming-Der Yang; Hsin-Hung Tseng; Yu-Chun Hsu; Hui Ping Tsai. Semantic Segmentation Using Deep Learning with Vegetation Indices for Rice Lodging Identification in Multi-date UAV Visible Images. Remote Sensing 2020, 12, 633.

AMA Style

Ming-Der Yang, Hsin-Hung Tseng, Yu-Chun Hsu, Hui Ping Tsai. Semantic Segmentation Using Deep Learning with Vegetation Indices for Rice Lodging Identification in Multi-date UAV Visible Images. Remote Sensing. 2020; 12 (4):633.

Chicago/Turabian Style

Ming-Der Yang; Hsin-Hung Tseng; Yu-Chun Hsu; Hui Ping Tsai. 2020. "Semantic Segmentation Using Deep Learning with Vegetation Indices for Rice Lodging Identification in Multi-date UAV Visible Images." Remote Sensing 12, no. 4: 633.