In general, sad music is thought to make us experience sadness, which is considered an unpleasant emotion. The question therefore arises as to why we listen to sad music if it evokes sadness. One possible answer is that we may actually feel positive emotions when we listen to sad music. This suggestion may appear counterintuitive; however, in this study, by dividing musical emotion into perceived emotion and felt emotion, we investigated this potential emotional response to music. We hypothesized that felt and perceived emotion may not actually coincide in this respect: sad music would be perceived as sad, but the experience of listening to it would evoke positive emotions. A total of 44 participants listened to musical excerpts and provided data on perceived and felt emotions by rating 62 emotion-related words or phrases on a scale ranging from 0 (not at all) to 4 (very much). The results revealed that the sad music was perceived as more tragic, whereas actually listening to it led participants to feel more romantic, more blithe, and less tragic emotion than they perceived in the same music. Thus, the participants experienced ambivalent emotions when they listened to the sad music. After considering possible reasons why the sad music induced this emotional ambivalence, we concluded that a new model is essential for examining the emotions induced by music, and that this model must entertain the possibility that what we experience when listening to music is vicarious emotion.
Guilt is an important social and moral emotion. In addition to feeling unpleasant, guilt is metaphorically described as a “weight on one’s conscience.” Evidence from the field of embodied cognition suggests that abstract metaphors may be grounded in bodily experiences, but no prior research has examined the embodiment of guilt. Across four studies we examine whether i) unethical acts increase subjective experiences of weight, ii) feelings of guilt explain this effect, and iii) the weight of guilt has further consequences. Studies 1-3 demonstrated that unethical acts led to greater subjective body weight compared to control conditions. Studies 2 and 3 indicated that heightened feelings of guilt mediated the effect, whereas other negative emotions did not. Study 4 demonstrated a perceptual consequence: an induction of guilt affected the perceived effort necessary to complete physically demanding tasks, relative to minimally physical tasks.
Reactions to memorable experiences of sad music were studied by means of a survey administered to convenience (N = 1577), representative (N = 445), and quota (N = 414) samples. The survey explored the reasons, mechanisms, and emotions underlying such experiences. Memorable experiences linked with sad music typically involved extremely familiar music and produced intense, pleasurable experiences, which were accompanied by physiological reactions and positive mood changes in about a third of the participants. A consistent structure of reasons and emotions for these experiences was identified through exploratory and confirmatory factor analyses across the samples. Three types of sadness experiences were established: one that was genuinely negative (Grief-Stricken Sorrow) and two that were positive (Comforting Sorrow and Sweet Sorrow). Each type of emotion exhibited certain individual differences and had a distinct profile in terms of the underlying reasons, mechanisms, and elicited reactions. The prevalence of these broad types of emotional experiences indicated that positive experiences were the most frequent, although negative experiences were not uncommon in any of the samples. The findings have implications for measuring emotions induced by music and fiction in general, and call attention to the non-pleasurable aspects of these experiences.
Pattern classification of human brain activity provides unique insight into the neural underpinnings of diverse mental states. These multivariate tools have recently been used within the field of affective neuroscience to classify distributed patterns of brain activation evoked during emotion induction procedures. Here we assess whether neural models developed to discriminate among distinct emotion categories exhibit predictive validity in the absence of exteroceptive emotional stimulation. In two experiments, we show that spontaneous fluctuations in human resting-state brain activity can be decoded into categories of experience delineating unique emotional states that exhibit spatiotemporal coherence, covary with individual differences in mood and personality traits, and predict on-line, self-reported feelings. These findings validate objective, brain-based models of emotion and show how emotional states dynamically emerge from the activity of separable neural systems.
The sense of body ownership represents a fundamental aspect of our self-consciousness. Influential experimental paradigms, such as the rubber hand illusion (RHI), in which a seen rubber hand is experienced as part of one’s body when one’s own unseen hand receives congruent tactile stimulation, have extensively examined the role of exteroceptive, multisensory integration in body ownership. However, remarkably, despite the more general current interest in the nature and role of interoception in emotion and consciousness, no study has investigated how the illusion may be affected by interoceptive bodily signals, such as affective touch. Here, we recruited 52 healthy adult participants and investigated, for the first time, whether applying slow-velocity, light tactile stimuli, known to elicit interoceptive feelings of pleasantness, would influence the illusion more than faster, emotionally neutral tactile stimuli. We also examined whether seeing another person’s hand vs. a rubber hand would reduce the illusion in slow vs. fast stroking conditions, as interoceptive signals are used to represent one’s own body from within and it is unclear how they would be integrated with visual signals from another person’s hand. We found that slow-velocity touch was perceived as more pleasant and produced higher levels of subjective embodiment during the RHI compared with fast touch. Moreover, this effect applied irrespective of whether the seen hand was a rubber hand or a confederate’s hand. These findings provide support for the idea that affective touch, and more generally interoception, may make a unique contribution to the sense of body ownership, and by implication to our embodied psychological “self.”
Recent progress in Affective Computing (AC) has enabled the integration of physiological cues and spontaneous expressions to reveal a subject’s emotional state. Owing to the lack of an effective technique for evaluating multimodal correlations, experience and intuition play a major role in present AC studies when fusing affective cues or modalities, sometimes resulting in unexpected outcomes. This study seeks to demonstrate a dynamic correlation between two such affective cues, physiological changes and spontaneous expressions, which were obtained by combining stereo-vision-based tracking and imaging photoplethysmography (iPPG), using a designed protocol involving 20 healthy subjects. The two cues were sampled into a Statistical Association Space (SAS) to evaluate their dynamic correlation. We found that the probability densities in the SAS increase as the peaks in the two cues are approached. In addition, the complex form of the high-probability-density region in the SAS suggests a nonlinear correlation between the two cues. Finally, the cumulative distribution on the zero time-difference surface was found to be small (<0.047), demonstrating a lack of simultaneity. These results show that the two cues have a close interrelation that is both asynchronous and nonlinear, in which a peak in one cue heralds a peak in the other.
Facial expressions convey key cues of human emotions and may also be important for interspecies interactions. The universality hypothesis suggests that six basic emotions (anger, disgust, fear, happiness, sadness, and surprise) should be expressed by similar facial expressions in phylogenetically close species such as humans and nonhuman primates. However, some facial expressions have been shown to differ in meaning between humans and nonhuman primates such as macaques. This ambiguity in signalling emotion can lead to an increased risk of aggression and injury for both humans and animals. This raises serious concerns for activities such as wildlife tourism, in which humans closely interact with wild animals. Understanding which factors (i.e., experience and type of emotion) affect the ability to recognise the emotional states of nonhuman primates from their facial expressions can enable us to test the validity of the universality hypothesis, as well as to reduce the risk of aggression and potential injury in wildlife tourism.
Covert digital manipulation of vocal emotion alter speakers' emotional states in a congruent direction
- Proceedings of the National Academy of Sciences of the United States of America
- Published almost 5 years ago
Research has shown that people often exert control over their emotions. By modulating expressions, reappraising feelings, and redirecting attention, they can regulate their emotional experience. These findings have contributed to a blurring of the traditional boundaries between cognitive and emotional processes, and it has been suggested that emotional signals are produced in a goal-directed way and monitored for errors like other intentional actions. However, this interesting possibility has never been experimentally tested. To this end, we created a digital audio platform to covertly modify the emotional tone of participants' voices in the direction of happiness, sadness, or fear while they talked. The results showed that the audio transformations were perceived as natural examples of the intended emotions, yet the great majority of the participants remained unaware that their own voices were being manipulated. This finding indicates that people do not continuously monitor their own voice to make sure that it meets a predetermined emotional target. Instead, as a consequence of listening to their altered voices, the emotional state of the participants changed in congruence with the emotion portrayed, as measured by both self-report and skin conductance level. This change constitutes the first evidence, to our knowledge, of peripheral feedback effects on emotional experience in the auditory domain. As such, our result reinforces the wider framework of self-perception theory: that we often use the same inferential strategies to understand ourselves as those that we use to understand others.
Takotsubo syndrome (TTS) is typically provoked by negative stressors such as grief, anger, or fear leading to the popular term ‘broken heart syndrome’. However, the role of positive emotions triggering TTS remains unclear. The aim of the present study was to analyse the prevalence and characteristics of patients with TTS following pleasant events, which are distinct from the stressful or undesirable episodes commonly triggering TTS.
Individuals in close relationships help each other in many ways, from listening to each other’s problems, to making each other feel understood, to providing practical support. However, it is unclear if these supportive behaviors track each other across days and as stable tendencies in close relationships. Further, although past work suggests that giving support improves providers' well-being, the specific features of support provision that improve providers' psychological lives remain unclear. We addressed these gaps in knowledge through a daily diary study that comprehensively assessed support provision and its effects on well-being. We found that providers' emotional support (e.g., empathy) and instrumental support represent distinct dimensions of support provision, replicating prior work. Crucially, emotional support, but not instrumental support, consistently predicted provider well-being. These 2 dimensions also interacted, such that instrumental support enhanced well-being of both providers and recipients, but only when providers were emotionally engaged while providing support. These findings illuminate the nature of support provision and suggest targets for interventions to enhance well-being.