1-06 - Towards Multimodal MIR: Predicting Individual Differences from Music-induced Movement
Yudhik Agrawal, Samyak Jain, Emily Carlson, Petri Toiviainen, Vinoo Alluri
Keywords: Human-centered MIR, User behavior analysis and mining, User modeling, Domain knowledge, Machine learning/Artificial intelligence for music, Human-computer interaction and interfaces, Personalization, MIR fundamentals and methodology, Multimodality
Abstract:
As the field of Music Information Retrieval grows, it is important to consider the multimodality of music and how aspects of musical engagement such as movement and gesture might be taken into account. Bodily movement is universally associated with music and reflects important individual characteristics related to music preference, such as personality, mood, and empathy. Future multimodal MIR systems may benefit from taking these aspects into account. The current study addresses this by identifying individual differences, specifically Big Five personality traits and scores on the Empathy and Systemizing Quotients (EQ/SQ), from participants’ free dance movements. Our model successfully generalized to unseen data for personality as well as EQ and SQ; the latter two have not previously been predicted from movement. R² scores for personality, EQ, and SQ were 76.3%, 77.1%, and 86.7%, respectively. As a follow-up, we investigated which bodily joints were most important in defining these traits. We discuss how further research may explore the mapping of these traits to movement patterns in order to build more personalized, multimodal recommendation systems, as well as potential therapeutic applications.
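For context, the coefficient of determination reported above is, in its standard form (the exact computation used in the paper, e.g., any per-trait averaging, is not specified in this abstract):

\[
R^2 = 1 - \frac{\sum_{i}\left(y_i - \hat{y}_i\right)^2}{\sum_{i}\left(y_i - \bar{y}\right)^2}
\]

where $y_i$ are the observed trait scores, $\hat{y}_i$ the model’s predictions, and $\bar{y}$ the mean observed score.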