The encoding of friction by tactile mechanoreceptors - the key to fingertip force control during dexterous object manipulation by humans. The unmatched human ability to control the hand so that brittle objects are gently held without slipping or being crushed by excessive force relies on a sophisticated tactile sense in the fingertips. This project will record and analyse the signals that human nerves send from fingertip receptors to the brain centres controlling hand actions.
Sensory mechanisms underlying human dexterity in object manipulation. This project aims to understand the sensory mechanisms and skin biomechanics underlying tactile encoding. Tactile sensory information is crucial for controlling grip forces so that delicate objects are held without slipping or being crushed by excessive force. This project will record signals from single human tactile receptors using microneurography. By modelling the neural data together with skin biomechanical events, the project aims to reveal the sensory mechanisms underlying the human ability to manipulate objects and use tools. This research could lead to next-generation sensory-controlled prosthetics and robotic manipulators.
Real-time friction sensing, feedback and control for dexterous prosthetic and robotic manipulation. Prosthetic and robotic hands demonstrate poor dexterity during object manipulation, often dropping objects. Humans rarely allow objects to slip because we can sense when an object is slippery and adjust our grip accordingly. Exceptionally little research has been directed at replicating this ability to sense friction. This project aims to enable artificial hands to estimate frictional properties while grasping an object. Non-invasive methods to feed this frictional information back to an amputee will also be investigated. Finally, the friction-sensing system will be used to improve robotic gripper control. The outcomes of this research will significantly advance the fields of prosthetics, telesurgery, and service and manufacturing robotics.
Discovery Early Career Researcher Award - Grant ID: DE150100548
Funder
Australian Research Council
Funding Amount
$359,000.00
Summary
Neural and robotic correlates of predictive coding and selective attention. Whether a human catching a ball, a dog leaping at a frisbee or a dragonfly hunting prey amidst a swarm, brains both large and small have evolved the ability to focus attention on one moving target, even in the presence of distracters. This project aims to investigate how brains solve this challenging problem by recording the activity of dragonfly neurons that selectively attend to one target whilst ignoring others. The project will examine how expectation and attention are encoded in the brain and will build an autonomous robot using computational models inspired by this neuronal processing. Robots capable of visually perceiving and interacting with targets in natural environments have applications in health, surveillance and defence.