ORCID Profile
0000-0001-7271-1459
Current Organisations
Observatoire Royal de Belgique
Bond University
In Research Link Australia (RLA), "Research Topics" refer to ANZSRC FOR and SEO codes. These topics are either sourced from ANZSRC FOR and SEO codes listed in researchers' related grants or generated by a large language model (LLM) based on their publications.
Biological Psychology (Neuropsychology, Psychopharmacology, Physiological Psychology) | Psychology | Social and Community Psychology
Publisher: Wiley
Date: 24-01-2019
DOI: 10.1111/BJOP.12376
Abstract: Young adults recognize other young adult faces more accurately than older adult faces, an effect termed the own-age bias (OAB). The categorization-individuation model (CIM) proposes that recognition memory biases like the OAB occur as unfamiliar faces are initially quickly categorized. In-group faces are seen as socially relevant, which motivates the processing of individuating facial features. Out-group faces are processed more superficially, with attention to category-specific information, which hinders subsequent recognition. To examine the roles of categorization and individuation in the context of the OAB, participants completed a face recognition task and a speeded age categorization task including young and older adult faces. In the recognition task, half of the participants were given instructions aimed to encourage individuation of other-age faces. An OAB emerged that was not influenced by individuation instructions, but the magnitude of the OAB was correlated with performance in the categorization task. The larger the categorization advantage for older adult over young adult faces, the larger the OAB. These results support the premise that social categorization processes can affect the subsequent recognition of own- and other-age faces, but do not provide evidence for the effectiveness of individuation instructions in reducing the OAB.
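The own-age bias described above is typically quantified as a difference in recognition sensitivity (d′) between own-age and other-age faces. As a minimal illustrative sketch of that measure (the hit and false-alarm rates below are hypothetical, not taken from the study):

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    # signal-detection sensitivity: z(hit rate) - z(false-alarm rate)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

def own_age_bias(own_age, other_age):
    # OAB magnitude: the sensitivity advantage for own-age faces;
    # each argument is a (hit_rate, false_alarm_rate) pair
    return d_prime(*own_age) - d_prime(*other_age)

# hypothetical rates for illustration only
oab = own_age_bias(own_age=(0.80, 0.20), other_age=(0.65, 0.30))
print(round(oab, 3))  # prints 0.774
```

A positive value indicates better discrimination of own-age than other-age faces; the correlation reported in the abstract relates this quantity to the speed advantage for categorizing other-age faces.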
Publisher: American Psychological Association (APA)
Date: 2012
DOI: 10.1037/A0028622
Abstract: The question as to whether poser race affects the happy categorization advantage, the faster categorization of happy than of negative emotional expressions, has been answered inconsistently. Hugenberg (2005) found the happy categorization advantage only for own race faces whereas faster categorization of angry expressions was evident for other race faces. Kubota and Ito (2007) found a happy categorization advantage for both own race and other race faces. These results have vastly different implications for understanding the influence of race cues on the processing of emotional expressions. The current study replicates the results of both prior studies and indicates that face type (computer-generated vs. photographic), presentation duration, and especially stimulus set size influence the happy categorization advantage as well as the moderating effect of poser race.
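The happy categorization advantage discussed in this and several later abstracts is a reaction-time difference. A minimal sketch of how it is commonly computed from mean correct-response times (the RT values are hypothetical):

```python
def happy_advantage(rt_happy_ms, rt_negative_ms):
    # positive value = happy expressions categorised faster than
    # negative (e.g. angry) expressions
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_negative_ms) - mean(rt_happy_ms)

# hypothetical per-trial correct-response RTs in milliseconds
advantage = happy_advantage([520, 540, 510], [580, 600, 590])
print(round(advantage, 1))  # prints 66.7
```

Comparing this quantity across conditions (e.g. own-race vs. other-race posers) is how the moderating effects described in these studies are indexed.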
Publisher: Wiley
Date: 13-03-2018
DOI: 10.1111/BJOP.12297
Abstract: Young adult participants are faster to detect young adult faces in crowds of infant and child faces than vice versa. These findings have been interpreted as evidence for more efficient attentional capture by own-age than other-age faces, but could alternatively reflect faster rejection of other-age than own-age distractors, consistent with the previously reported other-age categorization advantage: faster categorization of other-age than own-age faces. Participants searched for own-age faces in other-age backgrounds or vice versa. Extending the finding to different other-age groups, young adult participants were faster to detect young adult faces in both early adolescent (Experiment 1) and older adult backgrounds (Experiment 2). To investigate whether the own-age detection advantage could be explained by faster categorization and rejection of other-age background faces, participants in Experiments 3 and 4 also completed an age categorization task. Relatively faster categorization of other-age faces was related to relatively faster search through other-age backgrounds on target absent trials but not target present trials. These results confirm that other-age faces are more quickly categorized and searched through, and that categorization and search processes are related; however, this correlational approach could not confirm or reject the contribution of background face processing to the own-age detection advantage.
Publisher: Springer Science and Business Media LLC
Date: 22-06-2021
Publisher: Springer Science and Business Media LLC
Date: 05-07-2017
DOI: 10.3758/S13414-017-1364-Z
Abstract: The magnitude of the happy categorisation advantage, the faster recognition of happiness than negative expressions, is influenced by facial race and sex cues. Previous studies have investigated these relationships using racial outgroups stereotypically associated with physical threat in predominantly Caucasian samples. To determine whether these influences generalise to stimuli representing other ethnic groups and to participants of different ethnicities, Caucasian Australian (Experiments 1 and 2) and Chinese participants (Experiment 2) categorised happy and angry expressions displayed on own-race male faces presented with emotional other-race male, own-race female, and other-race female faces in separate tasks. The influence of social category cues on the happy categorisation advantage was similar in the Australian and Chinese samples. In both samples, the happy categorisation advantage was present for own-race male faces when they were encountered with other-race male faces but reduced when own-race male faces were categorised along with female faces. The happy categorisation advantage was present for own-race and other-race female faces when they were encountered with own-race male faces in both samples. Results suggest similarity in the influence of social category cues on emotion categorisation.
Publisher: American Psychological Association (APA)
Date: 08-2013
DOI: 10.1037/A0031970
Abstract: Previous research has provided inconsistent results regarding visual search for emotional faces, yielding evidence for either anger superiority (i.e., more efficient search for angry faces) or happiness superiority effects (i.e., more efficient search for happy faces), suggesting that these results do not reflect on emotional expression, but on emotion (un-)related low-level perceptual features. The present study investigated possible factors mediating anger/happiness superiority effects: search strategy (fixed vs. variable target search; Experiment 1), stimulus choice (NimStim database vs. Ekman & Friesen database; Experiments 1 and 2), and emotional intensity (Experiments 3 and 3a). Angry faces were found faster than happy faces regardless of search strategy using faces from the NimStim database (Experiment 1). By contrast, a happiness superiority effect was evident in Experiment 2 when using faces from the Ekman and Friesen database. Experiment 3 employed angry, happy, and exuberant expressions (NimStim database) and yielded anger and happiness superiority effects, respectively, highlighting the importance of the choice of stimulus materials. Ratings of the stimulus materials collected in Experiment 3a indicate that differences in perceived emotional intensity, pleasantness, or arousal do not account for differences in search efficiency. Across three studies, the current investigation indicates that prior reports of anger or happiness superiority effects in visual search are likely to reflect on low-level visual features associated with the stimulus materials used, rather than on emotion.
Publisher: Informa UK Limited
Date: 08-08-2016
DOI: 10.1080/02699931.2016.1215293
Abstract: Facial race and sex cues can influence the magnitude of the happy categorisation advantage. It has been proposed that implicit race- or sex-based evaluations drive this influence. Within this account, a uniform influence of social category cues on the happy categorisation advantage should be observed for all negative expressions. Support has been shown with angry and sad expressions, but evidence to the contrary has been found for fearful expressions. To determine the generality of the evaluative congruence account, participants categorised happiness with either sadness, fear, or surprise displayed on White male as well as White female, Black male, or Black female faces across three experiments. Faster categorisation of happy than negative expressions was observed for female faces when presented among White male faces, and for White male faces when presented among Black male faces. These results support the evaluative congruence account when both positive and negative expressions are presented.
Publisher: Informa UK Limited
Date: 03-04-2017
DOI: 10.1080/02699931.2017.1310087
Abstract: Facial attributes such as race, sex, and age can interact with emotional expressions; however, only a couple of studies have investigated the nature of the interaction between facial age cues and emotional expressions, and these have produced inconsistent results. Additionally, these studies have not addressed the mechanism/s driving the influence of facial age cues on emotional expression or vice versa. In the current study, participants categorised young and older adult faces expressing happiness and anger (Experiment 1) or sadness (Experiment 2) by their age and their emotional expression. Age cues moderated categorisation of happiness vs. anger and sadness in the absence of an influence of emotional expression on age categorisation times. This asymmetrical interaction suggests that facial age cues are obligatorily processed prior to emotional expressions. Finding a categorisation advantage for happiness expressed on young faces relative to both anger and sadness, which are negative in valence but differ in their congruence with old-age stereotypes and structural overlap with age cues, suggests that the observed influence of facial age cues on emotion perception is due to the congruence between relatively positive evaluations of young faces and happy expressions.
Publisher: American Psychological Association (APA)
Date: 2017
DOI: 10.1037/EMO0000208
Abstract: The speed of recognizing facial expressions of emotion is influenced by a range of factors including other concurrently present facial attributes, like a person's sex. Typically, when participants categorize happy and angry expressions on male and female faces, they are faster to categorize happy than angry expressions displayed by females, but not displayed by males. Using the same emotional faces across tasks, we demonstrate that this influence of sex cues on emotion categorization is dependent on the other faces recently encountered in an experiment. Altering the salience of gender by presenting male and female faces in separate emotion categorization tasks rather than together in a single task changed the influence of sex cues on emotion categorization, whereas changing the evaluative dimension by presenting happy and angry expressions in separate tasks alongside neutral faces rather than together within a single task did not. These results suggest that the way facial attributes influence emotion categorization depends on the situation in which the faces are encountered and specifically on what information is made salient within or across tasks by other recently encountered faces.
Publisher: American Psychological Association (APA)
Date: 2014
DOI: 10.1037/A0036043
Abstract: Recently, D.V. Becker, Anderson, Mortensen, Neufeld, and Neel (2011) proposed recommendations to avoid methodological confounds in visual search studies using emotional photographic faces. These confounds were argued to cause the frequently observed Anger Superiority Effect (ASE), the faster detection of angry than happy expressions, and conceal a true Happiness Superiority Effect (HSE). In Experiment 1, we applied these recommendations (for the first time) to visual search among schematic faces that previously had consistently yielded a robust ASE. Contrary to the prevailing literature, but consistent with D.V. Becker et al. (2011), we observed a HSE with schematic faces. The HSE with schematic faces was replicated in Experiments 2 and 3 using a similar method in discrimination tasks rather than fixed target searches. Experiment 4 isolated background heterogeneity as the key determinant leading to the HSE.
Publisher: Informa UK Limited
Date: 24-02-2021
Publisher: Springer Science and Business Media LLC
Date: 03-2023
DOI: 10.1007/S10393-023-01630-1
Abstract: Climate change and its effects present notable challenges for mental health, particularly for vulnerable populations, including young people. Immediately following the unprecedented Black Summer bushfire season of 2019/2020, 746 Australians (aged 16–25 years) completed measures of mental health and perceptions of climate change. Results indicated greater presentations of depression, anxiety, stress, adjustment disorder symptoms, substance abuse, and climate change distress and concern, as well as lower psychological resilience and perceived distance to climate change, in participants with direct exposure to these bushfires. Findings highlight significant vulnerabilities of concern for youth mental health as climate change advances.
Publisher: Springer Science and Business Media LLC
Date: 08-04-2022
DOI: 10.1038/S41598-022-09397-1
Abstract: Human visual systems have evolved to extract ecologically relevant information from complex scenery. In some cases, the face in the crowd visual search task demonstrates an anger superiority effect, where anger is allocated preferential attention. Across three studies (N = 419), we tested whether facial hair guides attention in visual search and influences the speed of detecting angry and happy facial expressions in large arrays of faces. In Study 1, participants were faster to search through clean-shaven crowds and detect bearded targets than to search through bearded crowds and detect clean-shaven targets. In Study 2, targets were angry and happy faces presented in neutral backgrounds. Facial hair of the target faces was also manipulated. An anger superiority effect emerged that was augmented by the presence of facial hair, which was due to the slower detection of happiness on bearded faces. In Study 3, targets were happy and angry faces presented in either bearded or clean-shaven backgrounds. Facial hair of the background faces was also systematically manipulated. A significant anger superiority effect was revealed, although this was not moderated by the target's facial hair. Rather, the anger superiority effect was larger in clean-shaven than bearded face backgrounds. Together, results suggest that facial hair does influence detection of emotional expressions in visual search; however, rather than facilitating an anger superiority effect as a potential threat detection system, facial hair may reduce detection of happy faces within the face in the crowd paradigm.
Publisher: Wiley
Date: 28-11-2019
DOI: 10.1111/BJOP.12435
Abstract: The own-age bias (OAB) is suggested to be caused by perceptual-expertise and/or social-cognitive mechanisms. Bryce and Dodson (2013, Psychology and Aging, 28, 87, Exp 2) provided support for the social-cognitive account, demonstrating an OAB for participants who encountered a mixed-list of own- and other-age faces, but not for participants who encountered a pure-list of only own- or other-age faces. They proposed that own-age/other-age categorization, and the resulting OAB, only emerge when age is made salient in the mixed-list condition. Our study aimed to replicate this finding using methods typically used to investigate the OAB to examine their robustness and contribution to our understanding of how the OAB forms. Across three experiments that removed theoretically unimportant components of the original paradigm, varied face sex, and included background scenes, the OAB emerged under both mixed-list and pure-list conditions. These results are more consistent with a perceptual-expertise than social-cognitive account of the OAB, but may suggest that manipulating age salience using mixed-list and pure-list presentations is not sufficient to alter categorization processes.
Publisher: American Psychological Association (APA)
Date: 2014
DOI: 10.1037/A0037270
Abstract: Both facial cues of group membership (race, age, and sex) and emotional expressions can elicit implicit evaluations to guide subsequent social behavior. There is, however, little research addressing whether group membership cues or emotional expressions are more influential in the formation of implicit evaluations of faces when both cues are simultaneously present. The current study aimed to determine this. Emotional expressions but not race or age cues elicited implicit evaluations in a series of affective priming tasks with emotional Caucasian and African faces (Experiments 1 and 2) and young and old faces (Experiment 3). Spontaneous evaluations of group membership cues of race and age only occurred when those cues were task relevant, suggesting the preferential influence of emotional expressions in the formation of implicit evaluations of others when cues of race or age are not salient. Implications for implicit prejudice, face perception, and person construal are discussed.
Publisher: Elsevier BV
Date: 07-2022
Publisher: Informa UK Limited
Date: 30-03-2022
DOI: 10.1080/02699931.2022.2057442
Abstract: Past research demonstrates that emotion recognition is influenced by social category cues present on faces. However, little research has investigated whether holistic processing is required to observe these influences of social category information on emotion perception, and no studies have investigated whether different visual sampling strategies (i.e. differences in the allocation of attention to different regions of the face) contribute to the interaction between social cues and emotional expressions. The current study aimed to address this. Participants categorised happy and angry expressions on own- and other-race faces, and male and female faces. In Experiments 1 and 2, holistic processing was disrupted by presenting inverted faces (Experiment 1) or part faces (Experiment 2). In Experiments 3 and 4, participants' eye-gaze to eye and mouth regions was also tracked. Disrupting holistic processing did not alter the moderating influence of sex and race cues on emotion recognition (Experiments 1, 2, 4). Gaze patterns differed as a function of emotional expression and social category cues; however, eye-gaze patterns did not reflect response time patterns (Experiments 3 and 4). Results indicate that the interaction between social category cues and emotion does not require holistic processing and is not driven by differences in visual sampling.
Publisher: Informa UK Limited
Date: 18-12-2013
DOI: 10.1080/02699931.2013.867831
Abstract: Facial cues of threat such as anger and other race membership are detected preferentially in visual search tasks. However, it remains unclear whether these facial cues interact in visual search. If both cues equally facilitate search, a symmetrical interaction would be predicted: anger cues should facilitate detection of other race faces, and cues of other race membership should facilitate detection of anger. Past research investigating this race by emotional expression interaction in categorisation tasks revealed an asymmetrical interaction. This suggests that cues of other race membership may facilitate the detection of angry faces but not vice versa. Utilising the same stimuli and procedures across two search tasks, participants were asked to search for targets defined by either race or emotional expression. Contrary to the results revealed in the categorisation paradigm, cues of anger facilitated detection of other race faces whereas differences in race did not differentially influence detection of emotion targets.
Publisher: Elsevier BV
Date: 09-2018
DOI: 10.1016/J.BRAT.2018.06.004
Abstract: In associative learning, if stimulus A is presented in the same temporal context as the conditional stimulus (CS) - outcome association (but not in a way that allows an A-CS association to form), it becomes a temporal context cue, acquiring the ability to activate this context and retrieve the CS-outcome association. We examined whether a CS- presented during acquisition or extinction that predicted the absence of the unconditional stimulus (US) could act as a temporal context cue, reducing or enhancing responding, in differential fear conditioning. Two groups received acquisition (CSx-US, CSa-noUS) in phase 1 and extinction (CSx-noUS, CSe-noUS) in phase 2 (AE groups), and two groups received extinction in phase 1 and acquisition in phase 2 (EA groups). After a delay, participants were presented with either CSa (AEa and EAa groups) or CSe (AEe and EAe groups). Responding to CSx was enhanced after presentation of CSa but reduced after presentation of CSe, suggesting that training was segmented into two learning episodes and that the unreinforced CS present during an episode retrieved the CSx-US or CSx-noUS association. These findings suggest that temporal context cues may enhance or reduce fear responding, providing an exciting new avenue for relapse prevention research.
Publisher: Wiley
Date: 19-11-2020
DOI: 10.1111/BJOP.12481
Abstract: The own‐age bias (OAB) has been proposed to be caused by perceptual expertise and/or social‐cognitive mechanisms. Investigations into the role of social cognition have, however, yielded mixed results. One reason for this might be the tendency for research to focus on the OAB in young adults, between young and older adult faces, where other‐age individuation experience is low. To explore whether social‐cognitive manipulations may be successful when observers have sufficient other‐age individuation experience, we examined biases involving middle‐aged other‐age faces and the influence of a context manipulation. Across four experiments, young adult participants were presented with middle‐aged faces alongside young or older adult faces to remember. We predicted that in contexts where middle‐aged faces were positioned as other‐age faces (alongside young adult faces), recognition performance would be worse than when they were positioned as relative own‐age faces (alongside older adult faces). However, the context manipulations did not moderate middle‐aged face recognition. This suggests that past findings that context does not change other‐age face recognition hold for other‐age faces for which observers have higher individuation experience. These findings are consistent with a perceptual expertise account of the OAB, but more investigation of the generality of these results is required.
Publisher: SAGE Publications
Date: 04-08-2014
Abstract: Happy faces are categorized faster as “happy” than angry faces as “angry,” the happy face advantage. Here, we show across three experiments that the size of the happy face advantage for male Caucasian faces varies as a function of the other faces they are presented with. A happy face advantage was present if the male Caucasian faces were presented among male African American faces, but absent if the same faces were presented among female faces, Caucasian or African American. The modulation of the happy face advantage for male Caucasian faces was observed even if the female Caucasian/male African American faces had neutral expressions. This difference in the happy face advantage for a constant set of faces as a function of the other faces presented indicates that it does not reflect on a stimulus-dependent bottom-up process but on the evaluation of the expressive faces within a specific context.
Publisher: The Open Journal
Date: 14-02-2020
DOI: 10.21105/JOSS.01832
Publisher: Association for Research in Vision and Ophthalmology (ARVO)
Date: 06-09-2019
DOI: 10.1167/19.10.152C
Publisher: American Psychological Association (APA)
Date: 10-2019
DOI: 10.1037/EMO0000517
Abstract: We are better at recognizing faces of our own age group compared to faces of other age groups. It has been suggested that this own-age bias (OAB) might occur because of perceptual-expertise and/or social-cognitive mechanisms. Although there is evidence to suggest effects of perceptual-expertise, little research has explored the role of social-cognitive factors. To do so, we looked at how the presence of an emotional expression on the face changes the magnitude of the OAB. Across 3 experiments, young adult participants were presented with young and older adult faces to remember. Neutral faces were first presented alone (Experiment 1) to validate the proposed paradigm and then presented along with angry (Experiment 2) and sad or happy faces (Experiment 3). The presence of an emotional expression improved the recognition of older adult faces, reducing the OAB which was evident for neutral faces. These results support the involvement of social-cognitive factors in the OAB, suggesting that a perceptual-expertise account cannot fully explain this face recognition bias. (PsycINFO Database Record (c) 2019 APA, all rights reserved).
Publisher: Springer Science and Business Media LLC
Date: 05-03-2015
DOI: 10.3758/S13414-015-0849-X
Abstract: Previous research has shown that invariant facial features (for example, sex) and variant facial features (for example, emotional expressions) interact during face categorization. The nature of this interaction is a matter of dispute, however, and has been reported as either asymmetrical, such that sex cues influence emotion perception but emotional expressions do not affect the perception of sex, or symmetrical, such that sex and emotion cues each reciprocally influence the categorization of the other. In the present research, we identified stimulus set size as the critical factor leading to this disparity. Using faces drawn from different databases, in two separate experiments we replicated the finding of a symmetrical interaction between face sex and emotional expression when larger sets of posers were used. Using a subset of four posers, in the same setups, however, did not provide evidence for a symmetrical interaction, which is also consistent with prior research. This pattern of results suggests that different strategies may be used to categorize aspects of faces that are encountered repeatedly.
Publisher: Frontiers Media SA
Date: 03-11-2022
DOI: 10.3389/FPUBH.2022.1049932
Abstract: A Code Red has been declared for the planet and human health. Climate change (e.g., increasing temperatures, adverse weather events, rising sea levels) threatens the planet's already declining ecosystems. Without urgent action, all of Earth's inhabitants face an existential threat. Health professions education should therefore prepare learners to not only practice in a changing world, but authentic educational activities should also develop competencies for global and planetary citizenship. Planetary health has been integrated across the five-year Bond University (Australia) medical curriculum. It begins in the second week of Year 1 and ends with a session on Environmentally Sustainable Healthcare in the General Practice rotation in the final year. The purpose of this article is to describe the outcomes of the first 5 years (2018–2022) of a learner-centered planetary health assignment, underpinned by the 2030 United Nations (UN) Sustainable Development Goals (SDGs), in the second year of a five-year medical program. Using systems and/or design thinking with a focus on SDG13 (Climate Action) plus a second SDG of choice, self-selected teams of 4–6 students submit a protocol (with feedback) to develop a deliverable "product" for an intended audience. Data analysis of the first 5 years of implementation found that the most frequently selected SDGs in addition to SDG13 were: SDG12 Sustainable Production and Consumption (41% of teams), mostly relating to healthcare emissions and waste; SDG3 Health and Well-being (22%), generally involving the impact of air pollution; and SDG6 Clean Water and Sanitation (15%). A survey at the concluding conference garnered student feedback across various criteria. The planetary health assignment is authentic in that teams provide solutions to address climate change.
Where appropriate, final "products" are sent to local or federal ministers for consideration (e.g., policy proposals) or integrated into the curriculum (e.g., learning modules). We believe that the competencies, attitudes, and values fostered through engagement with planetary health throughout the medical program, as evidenced by student evaluations, stand students in good stead to be change agents, not only in clinical practice but in society. An awareness has been created about the need for planetary citizenship in addition to global citizenship.
Publisher: American Psychological Association (APA)
Date: 09-2019
DOI: 10.1037/EMO0000513
Abstract: A happy face advantage has consistently been shown in emotion categorization tasks: happy faces are categorized as happy faster than angry faces as angry. Furthermore, social category cues, such as facial sex and race, moderate the happy face advantage in evaluatively congruent ways, with a larger happy face advantage for more positively evaluated faces. We investigated whether attractiveness, a facial attribute unrelated to more defined social categories, would moderate the happy face advantage consistent with the evaluative congruence account. A larger happy face advantage for the more positively evaluated attractive faces than for unattractive faces was predicted. Across 4 experiments, participants categorized attractive and unattractive faces as happy or angry as quickly and accurately as possible. As predicted, when female faces were categorized separately, a happy face advantage emerged for the attractive females but not for the unattractive females. Corresponding results were only found in the error rates for male faces. This pattern was confirmed when female and male faces were categorized together, indicating that attractiveness may have a stronger influence on emotion perception for female faces. Attractiveness is shown to moderate emotion perception in line with the evaluative congruence account and is suggested to have a stronger influence on emotion perception than facial sex cues in contexts where attractiveness is a salient evaluative dimension. (PsycINFO Database Record (c) 2019 APA, all rights reserved).
Publisher: SAGE Publications
Date: 25-03-2019
Abstract: The beard is arguably one of the most obvious signals of masculinity in humans. Almost 150 years ago, Darwin suggested that beards evolved to communicate formidability to other males, but no studies have investigated whether beards enhance recognition of threatening expressions, such as anger. We found that the presence of a beard increased the speed and accuracy with which participants recognized displays of anger but not happiness (Experiment 1, N = 219). This effect was not due to negative evaluations shared by beardedness and anger or to negative stereotypes associated with beardedness, as beards did not facilitate recognition of another negative expression, sadness (Experiment 2, N = 90), and beards increased the rated prosociality of happy faces in addition to the rated masculinity and aggressiveness of angry faces (Experiment 3, N = 445). A computer-based emotion classifier reproduced the influence of beards on emotion recognition (Experiment 4). The results suggest that beards may alter perceived facial structure, facilitating rapid judgments of anger in ways that conform to evolutionary theory.
Publisher: American Psychological Association (APA)
Date: 12-2019
DOI: 10.1037/EMO0000530
Abstract: Previous research has demonstrated that facial social category cues influence emotion perception such that happy expressions are categorized faster than negative expressions on faces belonging to positively evaluated social groups. We examined whether character information that is experimentally manipulated can also influence emotion perception. Across two experiments, participants learned to associate individuals posing neutral expressions with positive or negative acts. In a subsequent task, participants categorized happy and angry expressions of these same individuals as quickly and accurately as possible. As predicted, a larger happy face advantage emerged for individuals associated with positive character information than for individuals associated with negative character information. These results demonstrate that experimentally manipulated evaluations of an individual's character are available quickly and affect early stages of face processing. Emotion perception is not only influenced by preexisting attitudes based on facial attributes, but also by information about a person that has been recently acquired. (PsycINFO Database Record (c) 2019 APA, all rights reserved).
Publisher: American Psychological Association (APA)
Date: 05-2023
DOI: 10.1037/EMO0001243
Publisher: Springer Science and Business Media LLC
Date: 06-2020
Publisher: Elsevier BV
Date: 03-2018
Start Date: 01-2023
End Date: 01-2026
Amount: $404,680.00
Funder: Australian Research Council
View Funded Activity