Towards in situ Affect Detection in Mobile Devices: A Multimodal Approach
Document Type
Conference Proceeding
Language
eng
Format of Original
7 p.
Publication Date
2013
Publisher
Association for Computing Machinery
Source Publication
Proceedings of the 2013 Research in Adaptive and Convergent Systems (RACS 2013)
Source ISBN
978-1-4503-2348-2
Original Item ID
doi: 10.1145/2513228.2513290
Abstract
Most research in multimodal affect detection has been done in laboratory environments; little work has addressed in situ affect detection. In this paper, we investigate affect detection in natural environments using sensors available on smartphones. We classify a person's affective state from facial expression and energy expenditure, continuously capturing fine-grained accelerometer data for energy and camera images for facial expression, and we measure the performance of the system. We have deployed our system in a natural environment and have paid special attention to annotating the training data to validate the 'ground truth'. We have found an important correlation between facial image and energy, which supports Russell's two-dimensional theory of emotion over the arousal-valence space. In this paper, we present initial findings in multimodal affect detection.
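The abstract does not specify how energy expenditure is derived from the accelerometer stream. As a minimal illustrative sketch (not the authors' published method), one common proxy is the mean squared deviation of the acceleration magnitude from gravity over a sampling window; the function name and window size below are assumptions for illustration only.

    import math

    def movement_energy(samples, gravity=9.81):
        """Estimate movement energy from a window of (x, y, z) accelerometer
        readings in m/s^2. Returns the mean squared deviation of the
        acceleration magnitude from gravity, a simple activity proxy.
        This is a hypothetical helper, not the paper's implementation."""
        if not samples:
            return 0.0
        deviations = [
            (math.sqrt(x * x + y * y + z * z) - gravity) ** 2
            for (x, y, z) in samples
        ]
        return sum(deviations) / len(deviations)

    # Example: a short window of readings; a larger value suggests more
    # vigorous movement, which the paper relates to the arousal dimension.
    window = [(0.1, 0.2, 9.8), (1.5, -0.7, 10.9), (-2.0, 0.3, 8.1)]
    print(movement_energy(window))

In Russell's two-dimensional model, such an energy estimate would map most naturally onto the arousal axis, with facial expression contributing to valence.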
Recommended Citation
Adibuzzaman, Mohammad; Jain, Niharika; Steinhafel, Nicholas; Haque, Munirul; Ahamed, Ferdaus; Ahamed, Sheikh Iqbal; and Love, Richard, "Towards in situ Affect Detection in Mobile Devices: A Multimodal Approach" (2013). Mathematics, Statistics and Computer Science Faculty Research and Publications. 293.
https://epublications.marquette.edu/mscs_fac/293
Comments
Published as part of the proceedings of the 2013 Research in Adaptive and Convergent Systems (RACS 2013), 2013: 454-460. DOI: 10.1145/2513228.2513290.