ORCID Profile
0000-0002-3972-0190
Current Organisation
Justus Liebig Universität Giessen
Publisher: Association for Research in Vision and Ophthalmology (ARVO)
Date: 09-2018
DOI: 10.1167/18.10.881
Publisher: University of Queensland Library
Date: 2019
DOI: 10.14264/UQL.2020.50
Publisher: Association for Research in Vision and Ophthalmology (ARVO)
Date: 09-2016
DOI: 10.1167/16.12.190
Publisher: American Psychological Association (APA)
Date: 12-2018
DOI: 10.1037/xhp0000574
Abstract: Many everyday tasks require selecting relevant objects in the visual field while ignoring irrelevant information. A widely held belief is that attention is tuned to the exact feature value(s) of a sought-after target object (e.g., its color or shape). In contrast, subsequent studies have shown that attentional orienting (capture) is often determined by the features the target has relative to the irrelevant items surrounding it (e.g., redder, larger). However, it is unknown whether conscious awareness is also determined by relative features. Alternatively, awareness could be more strongly determined by exact feature values, which seem to determine dwelling on objects. The present study examined eye movements in a color search task with different types of irrelevant distractors to test (a) whether dwelling is more strongly influenced by exact feature matches than by relative matches, and (b) which process (capture vs. dwelling) is more important for conscious awareness of the distractor. A second experiment used an electrophysiological marker of attention (the N2pc component of the participants' electroencephalogram) to test whether the results generalize to covert attention shifts. As expected, the results revealed that the initial capture of attention was strongest for distractors matching the relative color of the target, whereas similarity to the target was the most important determinant of dwelling. Awareness was more strongly determined by the initial capture of attention than by dwelling. These results provide important insights into the interplay of attention and awareness and highlight the importance of considering relative, context-dependent features in theories of awareness.
Publisher: Elsevier BV
Date: 11-2021
DOI: 10.1016/j.cortex.2021.08.013
Abstract: Visual short-term memory (VSTM) is an important resource that allows visual information to be stored temporarily. Current theories posit that elementary features (e.g., red, green) are encoded and stored independently of each other in VSTM. However, they have difficulty explaining the similarity effect: that similar items can be remembered better than dissimilar items. In Experiment 1, we tested (N = 20) whether the similarity effect may be due to items being stored in a context-dependent manner in VSTM (e.g., as the reddest/yellowest item). In line with a relational account of VSTM, we found that the similarity effect is not due to feature similarity, but to an enhanced sensitivity for detecting changes when the relative colour of a to-be-memorised item changes (e.g., from reddest to not-reddest) compared with when an item undergoes the same change but retains its relative colour (e.g., still reddest). Experiment 2 (N = 20) showed that VSTM load, as indexed by the CDA amplitude in the EEG, was smaller when the colours were ordered so that they all had the same relationship than when the same colours were out of order, requiring the encoding of different relative colours. With this, we report two new effects in VSTM: a relational detection advantage, which describes an enhanced sensitivity to relative changes in change detection, and a relational CDA effect, which reflects that VSTM load, as indexed by the CDA, scales with the number of (different) relative features between the memory items. These findings support a relational account of VSTM and question the view that VSTM stores features such as colours independently of each other.
Publisher: Elsevier BV
Date: 2021
Publisher: Elsevier BV
Date: 12-2022
Publisher: Informa UK Limited
Date: 14-09-2019
Publisher: Association for Research in Vision and Ophthalmology (ARVO)
Date: 06-09-2019
DOI: 10.1167/19.10.132b
No related grants have been discovered for Aimee Martin.