
…as well as age and emotion identification of happy, sad, angry, fearful, disgusted, and neutral faces (Gunning-Dixon et al.; see also Williams et al.; Keightley et al.). This age-related shift toward prefrontal-based and away from amygdala-based facial emotion processing has been interpreted as reflecting more deliberative, controlled processing of emotional information in older than young adults (Satpute and Lieberman; Williams et al.; see Mather et al.; St Jacques et al., for comparable evidence with scenes and objects) and may reflect age-related improvements in emotion regulation strategies mediated by frontal brain regions (see St Jacques et al., for an overview and discussion).

In particular, using an emotional face viewing task (followed by a facial expression identification task outside the scanner) with blocks of happy and fearful faces in an fMRI study, Williams et al. found a linear decrease in dmPFC (MNI: x, y, z) activity to happy faces along with a linear increase in dmPFC (MNI: x, y, z) activity to fearful faces with increasing age. This finding was interpreted as further support for greater effort and increased controlled processing of negative compared to positive faces with advancing age. Importantly, this shift in mPFC activity for processing positive vs. negative faces was associated with emotional stability: less dmPFC response to happy faces and more dmPFC response to fearful faces during the face viewing task predicted greater self-reported emotional stability (i.e., lower levels of self-reported neuroticism).

Williams et al.'s findings are in line with another study that examined differences between young and older adults' brain activity in the context of a facial expression identification task and that explicitly differentiated happy from various negative expressions. Keightley and colleagues (Keightley et al.) conducted an event-related fMRI study with faces depicting anger, disgust, fear, happiness, sadness, and surprise. To avoid verbal responses and the high memory load of a multiple-alternative forced-choice response format, participants overtly labeled the faces before entering the scanner. They then saw each face again during the scanner task and were asked to silently (re)label each of them. Largely in line with the literature (Isaacowitz et al.; Ruffman et al.; Ebner and Johnson), young and older adults performed equally well in identifying happy faces, with ceiling performance in both groups. In addition, young adults outperformed older adults in identifying sadness, anger, and disgust, but there were no differences in identifying surprise, fear, or neutral faces. With respect to the fMRI data, Keightley et al. reported several findings. One pattern that distinguished happy from other expressions, largely driven by young adults, was characterized by greater activity in vmPFC, among other regions (i.e., anterior and posterior cingulate gyrus, left postcentral gyrus, bilateral middle frontal gyri, bilateral cuneus, precuneus, inferior parietal lobe, and superior temporal gyrus). This was accompanied by decreased activity in left dorsal anterior cingulate gyrus for happy compared with other facial expressions. In addition, at a lower threshold, for young (but not older) adults, there was greater activity in small regions of bilateral amygdala and greater activity in left hippocampus for happy compared to other expressions.
A second pattern distinguishing happy from other expressions was largely driven by older adults and was characterized by greater activity in vmPFC, among other regions.