In Fall 2021, students returned to campus for in-person classes. Students were eager to explore their experiences during the pandemic and designed an experiment in which they viewed images and videos depicting various events that occurred during the pandemic. The image task included 155 images sampled from categories including Black Lives Matter protests, the 2020 Presidential Election, Dartmouth-specific images, COVID-19 cases, healthcare during COVID-19, COVID-19 vaccines, pop culture, voting rights, memorials, and political events. After being scanned, participants (N=7) completed a survey in which they rated each image on a variety of dimensions (e.g., "When did this event occur?", "How much empathy does this image evoke?"). Participants also watched a short video of the insurrection at the Capitol containing footage from the rioters as they breached the building. Data were preprocessed using fMRIPrep and analyzed using nltools in Python.
Numb to the world: Desensitization to emotional stimuli over time
Lucy Shao, Maeve Brown, Helen Horan
Paying attention to evocative visual stimuli over an extended period of time can be an emotionally draining task. Emotion regulation skills may help manage compassion fatigue when repeatedly encountering such stimuli, leading to an individual’s eventual habituation to them. This study sought to explore the phenomenon of habituation to emotional stimuli using COVID-related images. The results suggested that regions traditionally associated with emotion (i.e., the amygdala) become more active over time, while those associated with executive control become less active. Our results further elucidate the process of habituation over a short time period: people become less attuned to visual stimuli and exert less executive regulation over time. These findings hold particular importance in the context of the COVID-19 pandemic, given the tendency for individuals to become emotionally desensitized to the news and to show blunted emotional responses over time despite rising death tolls.
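A change in responsiveness over time like the one described above can be quantified as the slope of trial-by-trial ROI responses against trial order. A minimal sketch with simulated numbers (the ROI names and response values here are toy stand-ins, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated trial-by-trial responses for two hypothetical ROIs across the
# 155 image trials. A positive slope over trials indicates increasing
# responses over time; a negative slope indicates habituation.
n_trials = 155
trials = np.arange(n_trials)
roi_responses = {
    "amygdala": 0.002 * trials + rng.normal(0, 0.1, n_trials),  # drifts up
    "dlPFC": -0.003 * trials + rng.normal(0, 0.1, n_trials),    # drifts down
}

# Least-squares slope of response against trial index for each ROI
slopes = {roi: np.polyfit(trials, y, 1)[0] for roi, y in roi_responses.items()}
print(slopes)
```

At the group level, each subject's slopes could then be tested against zero, for example with a one-sample t-test per ROI.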
Synchronization of brain activity in the vmPFC across participants during naturalistic stimulation using highly affective stimuli
One of the central abilities that humans share is the capacity to create a subjective experience from emotions derived from sensory perception, context, and prior memory. Much of our daily interaction and experience is composed of naturalistic stimuli, such as video clips, which can evoke multimodal processing and strong emotions as technology pervades every facet of life. The ability to share experiences is a key contributor to how human beings interact socially. Activity in the ventromedial prefrontal cortex (vmPFC) has been shown to be central to emotion, decision making, and social cognition, and as such it is a key component in the representation of shared affective experiences. To investigate the synchronization of brain activity in the vmPFC during affective naturalistic stimulation, I leveraged an analytical framework for investigating functional brain activity and connectivity across participants while they viewed a four-and-a-half-minute first-person video with high affective valence. I made use of inter-subject correlations of hemodynamic data acquired with functional magnetic resonance imaging and analyzed inter-subject phase synchrony over time. The data showed that participants had increased alignment of vmPFC responses at multiple timepoints while viewing the highly affective video. Average synchronization in the vmPFC was higher than in primary visual and auditory cortex for the shared experience. My findings suggest that the vmPFC plays a key role in affective and conceptual processing. The data suggest a higher-level similarity in the response of the vmPFC to highly affective stimuli, pointing to a shared underlying neural mechanism for understanding and interpreting such experiences.
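The two measures named above, inter-subject correlation (ISC) and inter-subject phase synchrony, can be sketched in a few lines. This is a toy illustration on a simulated shared signal, not the actual vmPFC data; leave-one-out ISC and the Hilbert-phase resultant-vector length are one common way to compute each:

```python
import numpy as np
from scipy.signal import hilbert
from scipy.stats import zscore

rng = np.random.default_rng(42)

# Simulated "vmPFC" time series: subjects x timepoints, sharing a common
# stimulus-driven component plus subject-specific noise.
n_subjects, n_tp = 7, 200
shared = np.sin(2 * np.pi * np.arange(n_tp) / 40)
data = shared + 0.5 * rng.normal(size=(n_subjects, n_tp))

# Leave-one-out inter-subject correlation: correlate each subject's time
# series with the average of all other subjects.
isc = np.array([
    np.corrcoef(data[s], data[np.arange(n_subjects) != s].mean(axis=0))[0, 1]
    for s in range(n_subjects)
])

# Inter-subject phase synchrony: extract each subject's instantaneous phase
# with the Hilbert transform, then take the resultant-vector length across
# subjects at every timepoint (1 = perfectly aligned, near 0 = scattered).
phases = np.angle(hilbert(zscore(data, axis=1), axis=1))
synchrony = np.abs(np.exp(1j * phases).mean(axis=0))

print(f"mean ISC = {isc.mean():.2f}, mean phase synchrony = {synchrony.mean():.2f}")
```

Because synchrony is computed per timepoint, it supports exactly the kind of over-time alignment analysis described above, whereas ISC summarizes the whole viewing period in one number per subject.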
Empathy, emotion, and brain regions associated with viewing real images compared to data
Brianna Aubrey & Halla Hafermann
This study examined differences in empathy and emotion after exposure to information in the form of data versus real images. We hypothesized that subjects would experience higher levels of empathy and emotion for real images than for data, and that we would see brain activation in areas such as the parietal lobule, temporoparietal junction (TPJ), anterior cingulate cortex (ACC), and posterior superior temporal sulcus. To test our hypotheses, we presented subjects with stimuli in the form of real images and data in an fMRI scanner and later asked them to rate their emotional and empathetic responses to the same stimuli in a behavioral survey. We found significantly higher ratings of empathy and of feeling emotional/moved for real images, and the brain data revealed activation in regions that process empathy and emotion, including the ventromedial prefrontal cortex (vmPFC), TPJ, and ACC. Finally, we were able to use a classification model to predict whether subjects were viewing real images or data with just over 70% accuracy.
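A cross-validated classifier of the kind mentioned in the last sentence can be illustrated with a nearest-centroid model on simulated voxel patterns. This is a deliberately simple stand-in; the feature values, class separation, and trial counts are invented for illustration, not the study's data or pipeline:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated multivoxel patterns: "real image" trials vs. "data" (chart)
# trials, drawn from two slightly separated Gaussian clouds.
n_per_class, n_voxels = 60, 50
real = rng.normal(0.15, 1.0, size=(n_per_class, n_voxels))
charts = rng.normal(-0.15, 1.0, size=(n_per_class, n_voxels))
X = np.vstack([real, charts])
y = np.array([1] * n_per_class + [0] * n_per_class)

# Leave-one-out cross-validation with a nearest-centroid rule: hold out one
# trial, build class centroids from the rest, classify the held-out trial.
correct = 0
for i in range(len(y)):
    train = np.arange(len(y)) != i
    mu_real = X[train & (y == 1)].mean(axis=0)
    mu_data = X[train & (y == 0)].mean(axis=0)
    pred = int(np.linalg.norm(X[i] - mu_real) < np.linalg.norm(X[i] - mu_data))
    correct += pred == y[i]

accuracy = correct / len(y)
print(f"leave-one-out accuracy: {accuracy:.2f}")
```

In practice an fMRI decoding pipeline would more likely use a regularized linear model (e.g., logistic regression or an SVM) with cross-validation over runs or subjects, but the held-out evaluation logic is the same.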
The effect of time on brain responses and empathy levels
Hannah LeBaron, Olivia Marquis, Ashley Post, Brandon Zhou
Our PSYC 60 group set out to test the effect of perceived recency on brain responses and empathy levels. We predicted that more recent images might evoke stronger brain activity and higher empathy levels. To complete the study, we collected fMRI data from participants in the PSYC 60 class. Participants in the scanner were shown 155 images representing various aspects and events since the beginning of the COVID-19 pandemic. Images were related to major events that occurred over the past year and a half, ranging from COVID-19 vaccines, to Black Lives Matter protests, to pop culture. We measured brain activation while each image was shown. We also asked participants to complete a behavioral survey, collecting ratings for each image of how long ago they perceived the event to be and how much empathy it evoked. After collecting our imaging and behavioral data, we completed four analyses to explore our question. The first was a within-subject time vs. brain region representational similarity analysis (RSA). The second was a group-average time vs. brain region RSA. The third was a first-level time vs. empathy correlation. Finally, we completed a group-average time vs. empathy RSA to measure this correlation at the second level. In our within-subject time vs. activation RSA, we found that the brain regions with the highest correlation were in the posterior regions of the brain, specifically near the occipital lobe and posterior hippocampus. Our group-average time vs. brain region RSA used a group-level t-test and confirmed the strongest effects in the occipital lobe and posterior hippocampus. Our third analysis, the first-level time vs. empathy correlation, produced r = 0.001 and p = 0.992, showing no meaningful relationship. Finally, the group-average time vs. empathy RSA resulted in r = -0.008 and p = 0.665, also a non-significant correlation. The conclusions and significance of our results are limited, likely due to our small sample size. However, the posterior hippocampus activation, in a region frequently associated with memory and spatial navigation, may highlight a relationship with the temporal judgment processing that we are investigating.
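The RSA logic used throughout the analyses above can be sketched as follows: build one representational dissimilarity matrix (RDM) from the behavioral time judgments and one from ROI activity patterns, then rank-correlate their entries. The numbers below are simulated for illustration, not the study's data:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(1)

# Simulated behavior and brain data for 30 images: a perceived-time rating
# per image, and an ROI activity pattern weakly driven by that rating.
n_images = 30
perceived_time = rng.uniform(0, 18, n_images)  # "months ago", hypothetical
roi_patterns = 0.1 * perceived_time[:, None] + rng.normal(0, 1.0, (n_images, 20))

# Condensed RDMs: pairwise dissimilarity of the time judgments and of the
# activity patterns (pdist returns the upper triangle as a flat vector).
time_rdm = pdist(perceived_time[:, None], metric="euclidean")
neural_rdm = pdist(roi_patterns, metric="euclidean")

# The Spearman rank correlation between the two RDMs is the RSA statistic.
rho, p = spearmanr(time_rdm, neural_rdm)
print(f"RSA: rho = {rho:.3f}, p = {p:.3g}")
```

Repeating this per subject gives the first-level statistics; the subject-wise rho values can then be tested against zero at the group level, as in the group-level t-test described above.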