In this work we present ASCERTAIN, a multimodal databASe for impliCit pERsonaliTy and Affect recognitIoN using commercial physiological sensors. To our knowledge, ASCERTAIN is the first database to connect personality traits and emotional states via physiological responses. ASCERTAIN contains big-five personality scales and emotional self-ratings of 58 users, along with synchronously recorded Electroencephalogram (EEG), Electrocardiogram (ECG), Galvanic Skin Response (GSR) and facial activity data, captured using off-the-shelf sensors while users viewed affective movie clips. We first examine relationships between users’ affective ratings and personality scales in the context of prior observations, and then study linear and non-linear physiological correlates of emotion and personality. Our analysis suggests that the emotion–personality relationship is better captured by non-linear rather than linear statistics. We finally attempt binary emotion and personality trait recognition using physiological features. Experimental results cumulatively confirm that personality differences are better revealed when comparing user responses to emotionally homogeneous videos, and that above-chance recognition is achieved for both affective and personality dimensions.
You can access the dataset here.
The extracted features described in the paper and used for the experimental analysis are shared here. The raw data for each clip and each modality can also be accessed here. We used MATLAB for signal analysis; the shared files have the ".mat" extension and can be read with MATLAB or GNU Octave.
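For readers who prefer Python over MATLAB/Octave, the ".mat" files can also be loaded with SciPy. The sketch below is illustrative only; the file and variable names in it are hypothetical, not the actual names used in the shared ASCERTAIN files.

```python
# Minimal sketch of reading a MATLAB .mat file in Python via scipy.io.loadmat.
# "example_features.mat" and the variable "features" are placeholder names.
import numpy as np
from scipy.io import loadmat, savemat

# Write a small example .mat file so this snippet is self-contained;
# with the real dataset you would skip this step and load a shared file.
savemat("example_features.mat", {"features": np.arange(6.0).reshape(2, 3)})

data = loadmat("example_features.mat")   # dict mapping variable names to arrays
features = data["features"]              # MATLAB matrices arrive as NumPy arrays
print(features.shape)                    # prints (2, 3)
```

Note that `loadmat` handles the classic MATLAB v5/v7 format; files saved with MATLAB's `-v7.3` flag are HDF5-based and would need an HDF5 reader instead.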
If you have any further questions about the dataset, please contact:
ascertain.mhug [at] gmail [dot] com