ORCID Profile
0000-0003-1319-5073
Current Organisation
LYRO Robotics
Publisher: IEEE
Date: 10-2023
Publisher: SAGE Publications
Date: 19-08-2019
Abstract: Various approaches have been proposed to learn visuo-motor policies for real-world robotic applications. One solution is first learning in simulation then transferring to the real world. In the transfer, most existing approaches need real-world images with labels. However, the labeling process is often expensive or even impractical in many robotic applications. In this article, we introduce an adversarial discriminative sim-to-real transfer approach to reduce the amount of labeled real data required. The effectiveness of the approach is demonstrated with modular networks in a table-top object-reaching task where a seven-degree-of-freedom arm is controlled in velocity mode to reach a blue cuboid in clutter through visual observations from a monocular RGB camera. The adversarial transfer approach reduced the labeled real data requirement by 50%. Policies can be transferred to real environments with only 93 labeled and 186 unlabeled real images. The transferred visuo-motor policies are robust to novel (not seen in training) objects in clutter and even a moving target, achieving a 97.8% success rate and 1.8 cm control accuracy. Datasets and code are openly available.
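The core of the transfer step can be sketched as a standard adversarial feature-alignment loop. Below is a minimal PyTorch sketch, not the paper's exact architecture: the encoder sizes, the optimiser settings, and the `adversarial_step` helper are illustrative assumptions. A discriminator learns to separate simulation features from real features, while the real-image encoder is updated so its features become indistinguishable from the simulation ones; note that this alignment step needs only unlabeled real images.

```python
# Minimal sketch of adversarial discriminative feature alignment for
# sim-to-real transfer. Network sizes, names, and optimiser settings are
# illustrative assumptions, not the paper's exact setup.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps an RGB observation to a compact feature vector."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
    def forward(self, x):
        return self.net(x)

# Frozen encoder pre-trained in simulation; trainable encoder for real images.
sim_enc = Encoder().eval()
for p in sim_enc.parameters():
    p.requires_grad_(False)
real_enc = Encoder()

# Discriminator tries to tell sim features (label 1) from real features (label 0).
disc = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 1))

d_opt = torch.optim.Adam(disc.parameters(), lr=1e-4)
g_opt = torch.optim.Adam(real_enc.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

def adversarial_step(sim_imgs, real_imgs):
    # 1) Update the discriminator on features from both domains.
    with torch.no_grad():
        f_sim = sim_enc(sim_imgs)
        f_real = real_enc(real_imgs)
    d_loss = bce(disc(f_sim), torch.ones(f_sim.size(0), 1)) + \
             bce(disc(f_real), torch.zeros(f_real.size(0), 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Update the real-image encoder so its features look "sim" to the
    #    discriminator; unlabeled real images suffice for this step.
    g_loss = bce(disc(real_enc(real_imgs)), torch.ones(real_imgs.size(0), 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
    return d_loss.item(), g_loss.item()

# Example: one step on random stand-in batches.
d, g = adversarial_step(torch.randn(8, 3, 64, 64), torch.randn(8, 3, 64, 64))
```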
Publisher: IEEE
Date: 10-2012
Publisher: SAGE Publications
Date: 2012
DOI: 10.5772/54657
Abstract: We present a combined machine learning and computer vision approach for robots to localize objects. It allows our iCub humanoid to quickly learn to provide accurate 3D position estimates (in the centimetre range) of objects seen. Biologically inspired approaches, such as Artificial Neural Networks (ANN) and Genetic Programming (GP), are trained to provide these position estimates using the two cameras and the joint encoder readings. No camera calibration or explicit knowledge of the robot's kinematic model is needed. We find that ANN and GP are not only faster and less complex than traditional techniques, but also learn without the need for extensive calibration procedures. In addition, the approach localizes objects robustly when they are placed at arbitrary positions in the robot's workspace, even while the robot is moving its torso, head and eyes.
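Since the method regresses 3D positions directly from image coordinates and joint encoder readings, the ANN variant can be sketched with an off-the-shelf regressor. The following minimal scikit-learn sketch assumes a feature layout of the object's pixel coordinates in both cameras plus encoder angles; that layout, the network size, and the stand-in random data are illustrative, not the paper's setup.

```python
# Minimal sketch of learned localization: regress an object's 3D position
# from its pixel coordinates in both cameras plus joint encoder readings,
# with no camera calibration or kinematic model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Stand-in training data: each sample is (u_left, v_left, u_right, v_right,
# head/eye/torso encoder angles...) -> (x, y, z) in the robot frame.
n_samples, n_encoders = 500, 6
X = rng.uniform(-1.0, 1.0, size=(n_samples, 4 + n_encoders))
y = rng.uniform(-0.5, 0.5, size=(n_samples, 3))  # replace with measured positions

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X, y)

# Predict the 3D position for a new observation.
obs = rng.uniform(-1.0, 1.0, size=(1, 4 + n_encoders))
print(model.predict(obs))  # -> [[x, y, z]]
```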
Publisher: IEEE
Date: 07-2014
Publisher: IEEE
Date: 05-2018
Publisher: Springer New York
Date: 2013
Publisher: IEEE
Date: 11-2013
Publisher: Springer Science and Business Media LLC
Date: 28-03-2020
Publisher: IEEE
Date: 07-2009
Publisher: Frontiers Media SA
Date: 2014
Publisher: IEEE
Date: 05-2016
Publisher: IEEE
Date: 06-2013
Publisher: IEEE
Date: 10-2018
Publisher: IEEE
Date: 10-2022
Publisher: IEEE
Date: 2020
Publisher: ACM
Date: 07-07-2012
Publisher: IEEE
Date: 07-2017
Publisher: SAGE Publications
Date: 04-2018
Abstract: The application of deep learning in robotics leads to very specific problems and research questions that are typically not addressed by the computer vision and machine learning communities. In this paper we discuss a number of robotics-specific learning, reasoning, and embodiment challenges for deep learning. We explain the need for better evaluation metrics, highlight the importance and unique challenges of deep robotic learning in simulation, and explore the spectrum between purely data-driven and model-driven approaches. We hope this paper provides a motivating overview of important research directions to overcome the current limitations, and helps to fulfill the promising potential of deep learning in robotics.
Publisher: Springer Science and Business Media LLC
Date: 11-03-2019
Publisher: Springer Science and Business Media LLC
Date: 09-08-2019
Publisher: SciTePress - Science and Technology Publications
Date: 2012
Publisher: Frontiers Media SA
Date: 25-05-2016
Publisher: Robotics: Science and Systems Foundation
Date: 26-06-2018
Publisher: SAGE Publications
Date: 26-06-2019
Abstract: We present a novel approach to perform object-independent grasp synthesis from depth images via deep neural networks. Our generative grasping convolutional neural network (GG-CNN) predicts a pixel-wise grasp quality that can be deployed in closed-loop grasping scenarios. GG-CNN overcomes shortcomings in existing techniques, namely discrete sampling of grasp candidates and long computation times. The network is orders of magnitude smaller than other state-of-the-art approaches while achieving better performance, particularly in clutter. We run a suite of real-world tests, during which we achieve an 84% grasp success rate on a set of previously unseen objects with adversarial geometry and 94% on household items. The lightweight nature enables closed-loop control at up to 50 Hz, with which we observed 88% grasp success on a set of household objects that are moved during the grasp attempt. We further propose a method combining our GG-CNN with a multi-view approach, which improves overall grasp success rate in clutter by 10%. Code is provided at https://github.com/dougsm/ggcnn
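The pixel-wise formulation is what makes the closed-loop speed possible: a single forward pass yields a grasp estimate at every pixel, so no candidate-sampling or ranking loop is needed. A minimal PyTorch sketch of that idea follows; the layer sizes and the `TinyGGCNN` name are illustrative assumptions, not the released GG-CNN model.

```python
# Minimal sketch of the GG-CNN idea: a small fully-convolutional network
# maps a depth image to pixel-wise grasp quality, angle, and width maps,
# so the best grasp is simply the argmax of the quality map.
import torch
import torch.nn as nn

class TinyGGCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 9, padding=4), nn.ReLU(),
            nn.Conv2d(16, 16, 5, padding=2), nn.ReLU(),
        )
        # One output map per grasp parameter, same resolution as the input.
        self.quality = nn.Conv2d(16, 1, 3, padding=1)
        # Angle is predicted as (sin 2θ, cos 2θ) to handle its π-symmetry.
        self.sin2 = nn.Conv2d(16, 1, 3, padding=1)
        self.cos2 = nn.Conv2d(16, 1, 3, padding=1)
        self.width = nn.Conv2d(16, 1, 3, padding=1)

    def forward(self, depth):
        f = self.features(depth)
        return self.quality(f), self.sin2(f), self.cos2(f), self.width(f)

net = TinyGGCNN()
depth = torch.randn(1, 1, 300, 300)  # stand-in depth image
q, s, c, w = net(depth)

# Pick the pixel with the highest predicted grasp quality and decode
# the grasp angle and gripper width at that pixel.
idx = torch.argmax(q.view(-1)).item()
row, col = divmod(idx, q.shape[-1])
angle = 0.5 * torch.atan2(s.view(-1)[idx], c.view(-1)[idx])
print(row, col, angle.item(), w.view(-1)[idx].item())
```

The (sin 2θ, cos 2θ) encoding of the grasp angle mirrors how the paper handles the angle's symmetry under rotation by π.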
Publisher: IEEE
Date: 05-2019
Publisher: Springer Berlin Heidelberg
Date: 2013
Publisher: IEEE
Date: 05-2018
Publisher: Elsevier BV
Date: 07-2013
Publisher: IEEE
Date: 05-2017
Publisher: IEEE
Date: 08-2013
Publisher: IEEE
Date: 05-2019
Publisher: IEEE
Date: 05-2019
Publisher: IEEE
Date: 10-2018
Publisher: IEEE
Date: 05-2016
Publisher: CRC Press
Date: 17-09-2015
DOI: 10.1201/B19171-18
Publisher: IEEE
Date: 07-2017
Publisher: IEEE
Date: 11-2012
Publisher: Wiley
Date: 04-2018
DOI: 10.1111/EXSY.12258
Publisher: IEEE
Date: 07-2020
Publisher: IEEE
Date: 03-2016
Publisher: Springer Berlin Heidelberg
Date: 2009
Publisher: SciTePress - Science and Technology Publications
Date: 2014
Publisher: SAGE Publications
Date: 04-2018
Publisher: IEEE
Date: 05-2018
Location: Portugal
No related grants have been discovered for Jürgen Leitner.