ORCID Profile
0000-0002-9128-5958
Current Organisations
University of Queensland, University of Oxford, University College London
Publisher: Elsevier BV
Date: 03-2022
DOI: 10.1016/J.CUB.2022.01.047
Abstract: During active movement, there is normally a tight relation between motor command and sensory representation about the resulting spatial displacement of the body. Indeed, some theories of space perception emphasize the topographic layout of sensory receptor surfaces, while others emphasize implicit spatial information provided by the intensity of motor command signals. To identify which has the primary role in spatial perception, we developed experiments based on everyday self-touch, in which the right hand strokes the left arm. We used a robot-mediated form of self-touch to decouple the spatial extent of active or passive right hand movements from their tactile consequences. Participants made active movements of the right hand between unpredictable, haptically defined start and stop positions, or the hand was passively moved between the same positions. These movements caused a stroking tactile motion by a brush along the left forearm, with minimal delay, but with an unpredictable spatial gain factor. Participants judged the spatial extent of either the right hand's movement, or of the resulting tactile stimulation to their left forearm. Across five experiments, we found that movement extent strongly interfered with tactile extent perception, and vice versa. Crucially, interference in both directions was stronger during active than passive movements. Thus, voluntary motor commands produced stronger integration of multiple sensorimotor signals underpinning the perception of personal space. Our results prompt a reappraisal of classical theories that reduce space perception to motor command information.
Publisher: American Association for the Advancement of Science (AAAS)
Date: 22-04-2022
Abstract: Electrophysiological studies in monkeys show that finger amputation triggers local remapping within the deprived primary somatosensory cortex (S1). Human neuroimaging research, however, shows persistent S1 representation of the missing hand’s fingers, even decades after amputation. Here, we explore whether this apparent contradiction stems from underestimating the distributed peripheral and central representation of fingers in the hand map. Using pharmacological single-finger nerve block and 7-tesla neuroimaging, we first replicated previous accounts (electrophysiological and other) of local S1 remapping. Local blocking also triggered activity changes to nonblocked fingers across the entire hand area. Using methods exploiting interfinger representational overlap, however, we also show that the blocked finger representation remained persistent despite input loss. Computational modeling suggests that both local stability and global reorganization are driven by distributed processing underlying the topographic map, combined with homeostatic mechanisms. Our findings reveal complex interfinger representational features that play a key role in brain (re)organization, beyond (re)mapping.
Publisher: IOP Publishing
Date: 24-01-2022
Abstract: Objective. Considerable resources are being invested to enhance the control and usability of artificial limbs through the delivery of unnatural forms of somatosensory feedback. Here, we investigated whether intrinsic somatosensory information from the body part(s) remotely controlling an artificial limb can be leveraged by the motor system to support control and skill learning. Approach. We used local anaesthetic to attenuate somatosensory inputs to the big toes while participants learned to operate through pressure sensors a toe-controlled and hand-worn robotic extra finger. Motor learning outcomes were compared against a control group who received sham anaesthetic and quantified in three different task scenarios: while operating in isolation from, in synchronous coordination, and collaboration with, the biological fingers. Main results. Both groups were able to learn to operate the robotic extra finger, presumably due to abundance of visual feedback and other relevant sensory cues. Importantly, the availability of displaced somatosensory cues from the distal bodily controllers facilitated the acquisition of isolated robotic finger movements, the retention and transfer of synchronous hand-robot coordination skills, and performance under cognitive load. Motor performance was not impaired by toes anaesthesia when tasks involved close collaboration with the biological fingers, indicating that the motor system can close the sensory feedback gap by dynamically integrating task-intrinsic somatosensory signals from multiple, and even distal, body-parts. Significance. Together, our findings demonstrate that there are multiple natural avenues to provide intrinsic surrogate somatosensory information to support motor control of an artificial body part, beyond artificial stimulation.
Publisher: American Chemical Society (ACS)
Date: 25-01-2023
Publisher: American Psychological Association (APA)
Date: 2017
DOI: 10.1037/XHP0000338
Abstract: Previous research suggests integration of visual and somatosensory inputs is enhanced within reaching (peripersonal) space. In such experiments, somatosensory inputs are presented on the body while visual inputs are moved relatively closer to, or further from, the body. It is unclear, therefore, whether enhanced integration in "peripersonal space" is truly due to proximity of visual inputs to the body space, or simply to the distance between the inputs (which also affects integration). Using a modified induction of the rubber hand illusion, here we measured proprioceptive drift as an index of visuosomatosensory integration when the distance between the two inputs was constrained and absolute distance from the body was varied. Further, we investigated whether integration varies with proximity of inputs to the habitual action space of the arm, rather than the actual arm itself. In Experiment 1, integration was enhanced with inputs proximal to habitual action space, and reduced with lateral distance from this space. This was not attributable to an attentional or perceptual bias of external space, because the pattern of proprioceptive drift was opposite for left and right hand illusions, that is, consistently maximal at the shoulder of origin (Experiment 2). We conclude that habitual patterns of action modulate visuosomatosensory integration. It appears multisensory integration is modulated in locations of space that are functionally relevant for behavior, whether an actual body part resides within that space or not.
Publisher: Elsevier BV
Date: 2022
Publisher: Elsevier BV
Date: 05-2014
DOI: 10.1016/J.CONCOG.2014.02.005
Abstract: In the current study we look at whether subjective and proprioceptive aspects of self-representation are separable components subserved by distinct systems of multisensory integration. We used the rubber hand illusion (RHI) to draw the location of the 'self' away from the body, towards extracorporeal space (Out Condition), thereby violating top-down information about the body location. This was compared with the traditional RHI, which drew the position of the 'self' towards the body (In Condition). We were able to draw the proprioceptive position of the limbs both in towards and out from the body, suggesting body perception is a purely bottom-up process, resistant to top-down effects. Conversely, we found subjective self-representation was altered by the violation of top-down body information, as the strong association of subjective and proprioceptive factors found in the In Condition became non-significant in the Out Condition. Interestingly, we also found evidence that subjective embodiment can modulate tactile perception.
Publisher: IEEE
Date: 08-2019
Publisher: Frontiers Media SA
Date: 2012
Publisher: Springer Science and Business Media LLC
Date: 08-11-2019
Publisher: American Psychological Association (APA)
Date: 04-2019
DOI: 10.1037/XGE0000514
Publisher: American Chemical Society (ACS)
Date: 08-05-2023
Publisher: SAGE Publications
Date: 11-02-2021
Abstract: The optimisation of learning has long been a focus of scientific research, particularly in relation to improving psychological treatment and recovery of brain function. Previously, partial N-methyl-D-aspartate agonists have been shown to augment reward learning, procedural learning and psychological therapy, but many studies also report no impact of these compounds on the same processes. Here we investigate whether administration of an N-methyl-D-aspartate partial agonist (D-cycloserine) modulates a previously unexplored process – tactile perceptual learning. Further, we use a longitudinal design to investigate whether N-methyl-D-aspartate-related learning effects vary with time, thereby providing a potentially simple explanation for apparent mixed effects in previous research. Thirty-four volunteers were randomised to receive one dose of 250 mg D-cycloserine or placebo 2 h before tactile sensitivity training. Tactile perception was measured using psychophysical methods before and after training, and 24/48 h later. The placebo group showed immediate within-day tactile perception gains, but no further improvements between-days. In contrast, tactile perception remained at baseline on day one in the D-cycloserine group (no within-day learning), but showed significant overnight gains on day two. Both groups were equivalent in tactile perception by the final testing – indicating N-methyl-D-aspartate effects changed the timing, but not the overall amount of tactile learning. In sum, we provide first evidence for modulation of perceptual learning by administration of a partial N-methyl-D-aspartate agonist. Resolving how the effects of such compounds become apparent over time will assist the optimisation of testing schedules, and may help resolve discrepancies across the learning and cognition domains.
Publisher: Association for Research in Vision and Ophthalmology (ARVO)
Date: 25-07-2013
DOI: 10.1167/13.9.877
Publisher: American Physiological Society
Date: 03-2016
Abstract: Tactile learning transfers from trained to untrained fingers in a pattern that reflects overlap between the representations of fingers in the somatosensory system (e.g., neurons with multifinger receptive fields). While physical proximity on the body is known to determine the topography of somatosensory representations, tactile coactivation is also an established organizing principle of somatosensory topography. In this study we investigated whether tactile coactivation, induced by habitual inter-finger cooperative use (use pattern), shapes inter-finger overlap. To this end, we used psychophysics to compare the transfer of tactile learning from the middle finger to its adjacent fingers. This allowed us to compare transfer to two fingers that are both physically and cortically adjacent to the middle finger but have differing use patterns. Specifically, the middle finger is used more frequently with the ring than with the index finger. We predicted this should lead to greater representational overlap between the former than the latter pair. Furthermore, this difference in overlap should be reflected in differential learning transfer from the middle to index vs. ring fingers. Subsequently, we predicted temporary learning-related changes in the middle finger's representation (e.g., cortical magnification) would cause transient interference in perceptual thresholds of the ring, but not the index, finger. Supporting this, longitudinal analysis revealed a divergence where learning transfer was fast to the index finger but relatively delayed to the ring finger. Our results support the theory that tactile coactivation patterns between digits affect their topographic relationships. Our findings emphasize how action shapes perception and somatosensory organization.
Publisher: Elsevier BV
Date: 09-2019
Publisher: Wiley
Date: 05-05-2023
DOI: 10.1002/HBM.26298
Abstract: Scientists traditionally use passive stimulation to examine the organisation of primary somatosensory cortex (SI). However, given the close, bidirectional relationship between the somatosensory and motor systems, active paradigms involving free movement may uncover alternative SI representational motifs. Here, we used 7 Tesla functional magnetic resonance imaging to compare hallmark features of SI digit representation between active and passive tasks which were unmatched on task or stimulus properties. The spatial location of digit maps, somatotopic organisation, and inter‐digit representational structure were largely consistent between tasks, indicating representational consistency. We also observed some task differences. The active task produced higher univariate activity and multivariate representational information content (inter‐digit distances). The passive task showed a trend towards greater selectivity for digits versus their neighbours. Our findings highlight that, while the gross features of SI functional organisation are task invariant, it is important to also consider motor contributions to digit representation.
Publisher: Frontiers Media SA
Date: 2015
Location: United Kingdom of Great Britain and Northern Ireland
No related grants have been discovered for Olga Rakhmetova.