Laughter is a universal social signal that plays a key role in human communication and interaction. It strengthens the bonds among people and provides positive feedback to the interlocutor during a conversation.
Given the relevance of laughter to human well-being, a system able to automatically recognize laughter episodes would have several applications.
Most of the existing datasets containing laughter episodes only provide video and audio information. However, studies have shown that laughter is also characterized by body movements and physiological reactions. These cues might, however, be confounded with those elicited by similar stimuli, such as cognitive load tasks or mildly intense movements like hand clapping.
To investigate the possibility of recognizing laughter episodes using physiological and movement data, we created the USI_Laughs dataset.
The USI_Laughs dataset contains laughter episodes from 34 participants (28 males and 6 females) aged between 22 and 37 (Mean = 26.70, SD = 4.04). The participants could decide whether to take part in the experiment alone or with a friend. This allowed us to collect data from 16 pairs and two individuals.
To gather physiological data we used the Empatica E4 wristband (https://www.empatica.com/en-eu/research/e4/). The E4 is a lightweight device equipped with four high-quality sensors that measure electrodermal activity (EDA), blood volume pulse (BVP), skin temperature (ST), and 3-axis acceleration (ACC). Each participant wore a device on each wrist.
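Assuming the standard Empatica E4 CSV export layout (first row: Unix start timestamp; second row: sampling rate in Hz; remaining rows: samples), single-channel signals such as EDA can be parsed as in the following sketch. The exact file layout used in USI_Laughs may differ; the `load_e4_signal` helper and the synthetic file below are illustrative only.

```python
import io
import pandas as pd

def load_e4_signal(path_or_buffer):
    """Parse a single-channel CSV in the standard Empatica E4 export layout.

    Assumption: row 1 holds the Unix start timestamp, row 2 the sampling
    rate in Hz, and the remaining rows the samples. Multi-channel files
    (e.g. the 3-axis ACC export) would need a per-column variant.
    """
    raw = pd.read_csv(path_or_buffer, header=None)
    start_time = float(raw.iloc[0, 0])   # session start (Unix seconds)
    sample_rate = float(raw.iloc[1, 0])  # samples per second
    samples = raw.iloc[2:, 0].astype(float).reset_index(drop=True)
    # Derive a timestamp for every sample from the start time and rate.
    timestamps = start_time + samples.index / sample_rate
    return pd.DataFrame({"timestamp": timestamps, "value": samples})

# Example with a synthetic EDA file (4 Hz, four samples):
fake_eda = io.StringIO("1556000000.0\n4.0\n0.11\n0.12\n0.13\n0.14\n")
eda = load_e4_signal(fake_eda)
```

Deriving per-sample timestamps this way makes it straightforward to align signals recorded at different sampling rates from the two wristbands.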
The total duration of the experiment is 16 minutes. Participants were first asked to relax for 60 seconds, then watched a set of funny videos (total duration: 10 minutes), and then relaxed again. Next, we asked them to perform an acted laughter, to analyze the difference between spontaneous and acted laughter. After relaxing for another 30 seconds, participants clapped their hands three times, to analyze the difference between the body movements generated by laughter episodes and those produced by mildly intense motions. After an additional 30 seconds of relaxation, they performed the Stroop test, to compare the activation of the sympathetic nervous system (SNS) caused by laughter episodes with that caused by cognitive load tasks. The experiment ended with a final 60 seconds of relaxation.
Participants were video recorded, and the laughter episodes were rated by three experts. The dataset contains the file “laughter_episodes_annotations.xlsx” with the final labels. More details can be found in the README file included in the dataset, as well as in the paper listed below.
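Once the annotations are loaded, signal segments corresponding to labeled episodes can be extracted by timestamp. The column names ("start", "end", "label") in this sketch are hypothetical; the actual schema of “laughter_episodes_annotations.xlsx” is described in the dataset's README.

```python
import pandas as pd

# A toy signal with timestamps in seconds; real signals would come from
# the E4 recordings.
signal = pd.DataFrame({
    "timestamp": [11.0, 12.5, 13.0, 20.0],
    "value":     [0.10, 0.20, 0.30, 0.40],
})

# Hypothetical annotation table; the real one would be read from the
# "laughter_episodes_annotations.xlsx" file shipped with the dataset.
episodes = pd.DataFrame({
    "start": [12.0],
    "end":   [14.5],
    "label": ["laughter"],
})

def extract_episode(signal, start, end):
    """Return the samples whose timestamps fall within [start, end]."""
    mask = (signal["timestamp"] >= start) & (signal["timestamp"] <= end)
    return signal.loc[mask]

# Slice the signal for the first annotated laughter episode.
first = episodes.iloc[0]
laugh = extract_episode(signal, first["start"], first["end"])
```

The same slicing applies to any of the E4 channels, so features for laughter, clapping, and Stroop-test segments can be computed with one code path.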
Request the USI_Laughs dataset
To request the USI_Laughs dataset please contact Elena Di Lascio at email@example.com
Relevant papers
Elena Di Lascio, Shkurta Gashi, and Silvia Santini. Laughter Recognition Using Noninvasive Wearable Devices. In: Proceedings of the 13th EAI International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth), May 2019.
If you use this dataset for research purposes, please acknowledge the publication listed above.
Contact: Elena Di Lascio firstname.lastname@example.org