ORCID Profile
0000-0002-5382-5351
Current Organisations
Nanyang Technological University, Griffith University
Publisher: Springer International Publishing
Date: 2019
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 12-2017
Publisher: IEEE
Date: 10-01-2020
Publisher: IEEE
Date: 12-2019
Publisher: IEEE
Date: 08-2010
Publisher: IEEE
Date: 08-2014
Publisher: IGI Global
Date: 2018
DOI: 10.4018/978-1-5225-7033-2.CH073
Abstract: Biochar soil amendment is globally recognized as an emerging approach to mitigating CO2 emissions and increasing crop yield. Because the durability and changes of biochar may affect its long-term functions, it is important to quantify biochar in soil after application. In this chapter, an automatic soil biochar estimation method is proposed based on the analysis of hyperspectral images captured by cameras covering both visible and infrared wavelengths. The soil image is treated as a mixture of soil and biochar signals, and hyperspectral unmixing methods are applied to estimate the biochar proportion at each pixel. The final biochar percentage is obtained by averaging the estimated proportions over all hyperspectral pixels. Three different unmixing models are described in this chapter. Their experimental results are evaluated by polynomial regression and root mean square error against ground-truth data collected in environmental laboratories. The results show that hyperspectral unmixing is a promising method for measuring the percentage of biochar in soil.
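As an illustration of the unmixing idea in this abstract, the sketch below estimates a mean biochar fraction by per-pixel non-negative least squares against two known endmember spectra. It is a minimal linear-mixing sketch, not a reproduction of the chapter's three models; the names (biochar_fraction, soil_spec, biochar_spec) are illustrative assumptions.

```python
# Minimal linear-unmixing sketch, assuming known soil and biochar
# endmember spectra; not the chapter's exact models.
import numpy as np
from scipy.optimize import nnls

def biochar_fraction(cube, soil_spec, biochar_spec):
    """Estimate the mean biochar fraction of a hyperspectral cube.

    cube:         (H, W, B) reflectance image with B spectral bands
    soil_spec:    (B,) pure-soil endmember spectrum
    biochar_spec: (B,) pure-biochar endmember spectrum
    """
    E = np.column_stack([soil_spec, biochar_spec])  # (B, 2) endmember matrix
    pixels = cube.reshape(-1, cube.shape[-1])       # flatten to (H*W, B)
    fractions = np.empty(len(pixels))
    for i, spectrum in enumerate(pixels):
        # Non-negative least squares: spectrum ~ E @ [a_soil, a_biochar]
        abundances, _ = nnls(E, spectrum)
        total = abundances.sum()
        fractions[i] = abundances[1] / total if total > 0 else 0.0
    # Final percentage = mean biochar proportion over all pixels
    return 100.0 * fractions.mean()
```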
Publisher: Elsevier BV
Date: 08-2021
Publisher: IEEE
Date: 09-2019
Publisher: Elsevier BV
Date: 09-2019
Publisher: IEEE
Date: 12-2019
Publisher: IEEE
Date: 11-2013
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 04-2023
Publisher: IEEE
Date: 11-2014
Publisher: IEEE
Date: 2015
DOI: 10.1109/WACV.2015.49
Publisher: IEEE
Date: 06-2017
Publisher: IEEE
Date: 11-2009
Publisher: World Scientific
Date: 09-2011
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 12-2016
Publisher: IEEE
Date: 11-2013
Publisher: IEEE
Date: 05-2015
Publisher: IEEE
Date: 2005
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2020
Publisher: Elsevier BV
Date: 11-2020
Publisher: Elsevier BV
Date: 08-2021
Publisher: Elsevier BV
Date: 12-2017
Publisher: Elsevier BV
Date: 05-2015
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 04-2012
Publisher: Springer International Publishing
Date: 31-12-2016
Publisher: IEEE
Date: 12-2018
Publisher: IEEE
Date: 2006
Publisher: IEEE
Date: 10-2018
Publisher: IEEE
Date: 08-2010
Publisher: IEEE
Date: 10-2018
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 04-2017
Publisher: Elsevier BV
Date: 08-2020
Publisher: IEEE
Date: 2006
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2020
Publisher: IEEE
Date: 11-2016
Publisher: IEEE
Date: 09-2015
Publisher: Elsevier BV
Date: 06-2020
Publisher: IEEE
Date: 09-2016
Publisher: Springer Berlin Heidelberg
Date: 2006
Publisher: IEEE
Date: 11-2013
Publisher: Springer International Publishing
Date: 2020
Publisher: IEEE
Date: 12-2011
Publisher: Elsevier BV
Date: 2023
Publisher: IEEE
Date: 06-2017
Publisher: IEEE
Date: 11-2009
Publisher: IEEE
Date: 2006
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2023
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 03-2019
Publisher: IEEE
Date: 12-2016
Publisher: IEEE
Date: 11-2016
Publisher: Elsevier BV
Date: 07-2009
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 07-2010
Publisher: IEEE
Date: 11-2017
Publisher: Elsevier BV
Date: 2011
Publisher: Institute of Electronics, Information and Communications Engineers (IEICE)
Date: 2011
Publisher: Elsevier BV
Date: 11-2021
Publisher: World Scientific
Date: 08-2002
DOI: 10.1142/S021800140200185X
Abstract: The profile view of a face provides complementary structure that is not visible in the frontal view. A classification system combining both frontal and profile views of faces can improve classification accuracy; it is also harder to spoof, because profile-based face identification is difficult to fool with a mask. This paper proposes a new face recognition approach, applicable to both frontal and profile faces, to build a robust combined multiple-view face identification system. The recognition employs a novel facial corner coding and matching method and integrates the outline and interior facial parts in the profile matching. The proposed multiview modified Hausdorff distance fuses multiple views of faces to achieve improved system performance.
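For context, the modified Hausdorff distance that this multiview matching builds on (the Dubuisson and Jain variant) can be sketched as follows. This is a minimal symmetric version on hypothetical 2-D corner sets; the paper's facial corner coding and multiview fusion are not reproduced here.

```python
# Minimal sketch of the (symmetric) modified Hausdorff distance
# between two 2-D corner-point sets.
import numpy as np

def modified_hausdorff(A, B):
    """MHD between point sets A (m, 2) and B (n, 2)."""
    # Pairwise Euclidean distances, shape (m, n)
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    # Mean nearest-neighbour distance in each direction
    d_ab = d.min(axis=1).mean()   # average over A of distance to closest B
    d_ba = d.min(axis=0).mean()   # average over B of distance to closest A
    return max(d_ab, d_ba)

# Usage: lower MHD between probe and gallery corner sets => better match
probe = np.array([[10.0, 12.0], [34.0, 40.0], [55.0, 21.0]])
gallery = np.array([[11.0, 13.0], [33.0, 41.0], [54.0, 20.0]])
print(modified_hausdorff(probe, gallery))
```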
Publisher: IEEE
Date: 05-2017
DOI: 10.1109/FG.2017.78
Publisher: IEEE
Date: 09-2019
Publisher: Elsevier BV
Date: 2022
Publisher: Elsevier BV
Date: 11-2009
Publisher: Springer International Publishing
Date: 2020
Publisher: IEEE
Date: 11-2017
Publisher: IEEE
Date: 12-2019
Publisher: IEEE
Date: 11-2016
Publisher: IEEE
Date: 07-2017
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 10-2018
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2022
Publisher: Elsevier BV
Date: 07-2019
Publisher: IGI Global
Date: 2017
DOI: 10.4018/978-1-5225-1837-2.CH047
Abstract: 3D modeling plays an important role in computer vision and image processing, and provides a convenient tool set for many environmental informatics tasks, such as taxonomy and species identification. This chapter discusses a novel way of building 3D models of objects from their varying 2D views. The appearance of a 3D object depends on both the viewing direction and the illumination conditions. What is the set of images of an object under all viewing directions? In this chapter, a novel image representation is proposed, which transforms any n-pixel image of a 3D object into a vector in a 2n-dimensional pose space. In this pose space, it is proven that the transformed images of a 3D object under all viewing directions form a parametric manifold in a 6-dimensional linear subspace. For in-depth rotation about a single axis in particular, this manifold is an ellipse. Furthermore, it is shown that this parametric pose manifold of a convex object can be estimated from a few images in different poses and used to predict the object's appearance under unseen viewing directions. These results immediately suggest a number of approaches to object recognition, scene detection, and 3D modeling, applicable to environmental informatics. Experiments on both synthetic data and real images are reported, demonstrating the validity of the proposed representation.
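As a generic sketch of the subspace idea in this abstract, the code below estimates a 6-dimensional linear subspace from a few vectorised pose images via SVD and measures how well an unseen pose fits it. It assumes plain image vectors and synthetic data; it is not the chapter's specific 2n-dimensional pose-space mapping.

```python
# Generic sketch: fit a low-dimensional linear subspace from a few
# pose images via SVD, then check how well an unseen pose fits it.
import numpy as np

def fit_pose_subspace(images, dim=6):
    """images: (k, n) matrix of k vectorised n-pixel pose images."""
    mean = images.mean(axis=0)
    _, _, Vt = np.linalg.svd(images - mean, full_matrices=False)
    return mean, Vt[:dim]          # basis of the dim-dimensional subspace

def reconstruction_error(image, mean, basis):
    """Residual of projecting an unseen pose onto the fitted subspace."""
    coeffs = basis @ (image - mean)
    recon = mean + basis.T @ coeffs
    return np.linalg.norm(image - recon)

# Usage with synthetic data: 8 known poses of a 1024-pixel object
rng = np.random.default_rng(0)
poses = rng.random((8, 1024))
mean, basis = fit_pose_subspace(poses, dim=6)
print(reconstruction_error(rng.random(1024), mean, basis))
```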
Publisher: IEEE
Date: 11-2016
Publisher: IEEE
Date: 2005
DOI: 10.1109/MMMC.2005.55
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2022
Publisher: Elsevier BV
Date: 03-2021
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2021
Publisher: Elsevier BV
Date: 12-2008
Publisher: Elsevier BV
Date: 04-2003
Publisher: IEEE
Date: 12-2010
Publisher: Elsevier BV
Date: 09-2014
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 11-2016
Publisher: Elsevier BV
Date: 03-2021
Publisher: Elsevier BV
Date: 09-2020
Publisher: IGI Global
Date: 2015
DOI: 10.4018/978-1-4666-9435-4.CH011
Abstract: Biochar soil amendment is globally recognized as an emerging approach to mitigating CO2 emissions and increasing crop yield. Because the durability and changes of biochar may affect its long-term functions, it is important to quantify biochar in soil after application. In this chapter, an automatic soil biochar estimation method is proposed based on the analysis of hyperspectral images captured by cameras covering both visible and infrared wavelengths. The soil image is treated as a mixture of soil and biochar signals, and hyperspectral unmixing methods are applied to estimate the biochar proportion at each pixel. The final biochar percentage is obtained by averaging the estimated proportions over all hyperspectral pixels. Three different unmixing models are described in this chapter. Their experimental results are evaluated by polynomial regression and root mean square error against ground-truth data collected in environmental laboratories. The results show that hyperspectral unmixing is a promising method for measuring the percentage of biochar in soil.
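The evaluation protocol mentioned here (polynomial regression and root mean square error against lab ground truth) can be sketched as below; the percentage values are hypothetical stand-ins for measured data.

```python
# Minimal evaluation sketch, assuming paired arrays of estimated and
# lab-measured biochar percentages; the values here are hypothetical.
import numpy as np

estimated    = np.array([2.1, 4.8, 9.7, 15.2, 19.5])   # from unmixing (%)
ground_truth = np.array([2.0, 5.0, 10.0, 15.0, 20.0])  # lab measurements (%)

# Root mean square error of the raw estimates
rmse = np.sqrt(np.mean((estimated - ground_truth) ** 2))

# Least-squares polynomial fit (degree 2) of estimates against ground
# truth, as one way to assess the estimator's response curve
coeffs = np.polyfit(ground_truth, estimated, deg=2)
fitted = np.polyval(coeffs, ground_truth)
print(f"RMSE = {rmse:.3f}, fit coefficients = {coeffs}")
```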
Publisher: Elsevier BV
Date: 02-2017
Publisher: Elsevier BV
Date: 12-2012
Publisher: Nanyang Technological University
Date: 2002
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 02-2010
Publisher: Elsevier BV
Date: 02-2002
Publisher: IEEE
Date: 2008
Publisher: IEEE
Date: 11-2016
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2011
Publisher: IEEE
Date: 11-2015
Publisher: IEEE
Date: 12-2019
Publisher: IEEE
Date: 10-01-2021
Publisher: Elsevier BV
Date: 11-2021
Publisher: IEEE
Date: 2006
Publisher: IEEE
Date: 12-2019
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2023
Publisher: Academy Publisher
Date: 07-2006
DOI: 10.4304/JMM.1.4.1-10
Publisher: IEEE
Date: 12-2010
Publisher: IEEE
Date: 2006
Publisher: Elsevier BV
Date: 02-2023
Publisher: Elsevier BV
Date: 03-2012
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 08-2023
Publisher: Springer International Publishing
Date: 2018
Publisher: IEEE
Date: 09-2016
Publisher: IEEE
Date: 2009
Publisher: Elsevier BV
Date: 03-2024
Publisher: IEEE
Date: 11-2016
Publisher: Elsevier BV
Date: 05-2017
Publisher: IEEE
Date: 09-2019
Publisher: Springer New York
Date: 03-08-2013
Publisher: IEEE
Date: 09-2013
Publisher: Springer International Publishing
Date: 2017
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 02-2023
Publisher: Inderscience Publishers
Date: 2012
Publisher: Springer Science and Business Media LLC
Date: 19-08-2021
Publisher: IGI Global
Date: 2015
DOI: 10.4018/978-1-4666-9435-4.CH010
Abstract: 3D modeling plays an important role in computer vision and image processing, and provides a convenient tool set for many environmental informatics tasks, such as taxonomy and species identification. This chapter discusses a novel way of building 3D models of objects from their varying 2D views. The appearance of a 3D object depends on both the viewing direction and the illumination conditions. What is the set of images of an object under all viewing directions? In this chapter, a novel image representation is proposed, which transforms any n-pixel image of a 3D object into a vector in a 2n-dimensional pose space. In this pose space, it is proven that the transformed images of a 3D object under all viewing directions form a parametric manifold in a 6-dimensional linear subspace. For in-depth rotation about a single axis in particular, this manifold is an ellipse. Furthermore, it is shown that this parametric pose manifold of a convex object can be estimated from a few images in different poses and used to predict the object's appearance under unseen viewing directions. These results immediately suggest a number of approaches to object recognition, scene detection, and 3D modeling, applicable to environmental informatics. Experiments on both synthetic data and real images are reported, demonstrating the validity of the proposed representation.
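The single-axis ellipse property stated in this abstract can be demonstrated on synthetic data: under rotation about one axis, a trajectory of the form c + a*cos(theta) + b*sin(theta) traces an ellipse, and a few sampled poses suffice to fit it and predict unseen angles. The sketch below uses a generic 3-component vector rather than the chapter's pose-space representation.

```python
# Synthetic demonstration of the single-axis ellipse claim: three or
# more sampled poses suffice to recover the trajectory exactly.
import numpy as np

rng = np.random.default_rng(1)
a, b, c = rng.random(3), rng.random(3), rng.random(3)  # hidden parameters

def pose_vector(theta):
    """Trajectory under rotation about one axis: an ellipse in 3-space."""
    return c + a * np.cos(theta) + b * np.sin(theta)

# Fit the ellipse coefficients from a few sampled poses by least squares
thetas = np.array([0.0, 1.0, 2.0, 3.0])
X = np.column_stack([np.ones_like(thetas), np.cos(thetas), np.sin(thetas)])
Y = np.stack([pose_vector(t) for t in thetas])          # (4, 3) samples
coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)          # rows: c, a, b

# Predict an unseen viewing angle and compare with the true vector
t_new = 2.5
pred = np.array([1.0, np.cos(t_new), np.sin(t_new)]) @ coeffs
print(np.allclose(pred, pose_vector(t_new)))            # True
```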
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 12-2008
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 03-2023
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2021
Publisher: IEEE
Date: 08-2018
Publisher: IEEE
Date: 2005
DOI: 10.1109/MMMC.2005.38
Publisher: Elsevier BV
Date: 08-2004
Publisher: IEEE
Date: 09-2013
Publisher: Springer Berlin Heidelberg
Date: 2013
Publisher: IEEE
Date: 09-2019
Publisher: IEEE
Date: 2008
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 03-2017
Publisher: IEEE
Date: 2008
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 09-2014
Publisher: Wiley
Date: 31-03-2021
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2020
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 06-2002
Publisher: IEEE
Date: 10-2011
Publisher: IEEE
Date: 08-2010
Publisher: IEEE
Date: 09-2018
Publisher: Elsevier BV
Date: 07-2005
Publisher: IEEE
Date: 08-2018
Publisher: Elsevier BV
Date: 08-2022
Publisher: IEEE
Date: 12-2019
Publisher: IEEE
Date: 12-2019
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 12-2013
Publisher: Springer International Publishing
Date: 2020
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 02-2017
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 12-2019
Publisher: Elsevier BV
Date: 12-2021
Publisher: Springer International Publishing
Date: 2022
Publisher: Elsevier BV
Date: 02-2006
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 05-2003
Publisher: Springer Berlin Heidelberg
Date: 2010
Publisher: Springer International Publishing
Date: 2018
Publisher: Institution of Engineering and Technology (IET)
Date: 2001
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2003
Publisher: Elsevier BV
Date: 11-2016
Publisher: IEEE
Date: 11-2015
Publisher: Elsevier BV
Date: 02-2002
Publisher: IEEE
Date: 12-2010
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2019
Publisher: Institution of Engineering and Technology (IET)
Date: 2003
Publisher: IEEE
Date: 12-2010
Publisher: IEEE
Date: 12-2011
Publisher: IEEE
Date: 2005
Publisher: Elsevier BV
Date: 04-2020
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 09-2021
Publisher: ACM
Date: 15-10-2018
Publisher: Elsevier BV
Date: 03-2023
Publisher: American Chemical Society (ACS)
Date: 05-05-2020
Publisher: IEEE
Date: 09-2016
No related grants have been discovered for Yongsheng Gao.