
EmoteControl: An interactive system for real-time control of emotional expression in music

Micallef Grimaud, A.; Eerola, T.

Abstract

Several computer systems have been designed for music emotion research with the aim of identifying how different structural or expressive cues of music influence the emotions the music conveys. However, most systems either operate offline, pre-rendering different variations of the music, or operate in real time but focus mainly on structural cues. We present a new interactive system called EmoteControl, which allows users to change both structural and expressive cues (tempo, pitch, dynamics, articulation, brightness, and mode) of music in real time. Its purpose is to let researchers probe a variety of cues of emotional expression with non-expert participants who are unable to articulate or perform their emotional interpretation of music in other ways. An interactive system is particularly valuable in this area because the parameter space of emotion cues and cue levels is vast and challenging to explore exhaustively without a dynamic system. A brief overview of previous work is given, followed by a detailed explanation of EmoteControl's interface design and structure. A portable version of the system is also described, and specifications for the music input to the system are outlined. Several use cases of the interface are discussed, and a formal interface evaluation study is reported. Results suggested that the elements controlling the cues were easy to use and understood by the users. The majority of users were satisfied with the way the system allowed them to express different emotions in music and found it a useful tool for research.

Citation

Micallef Grimaud, A., & Eerola, T. (2021). EmoteControl: An interactive system for real-time control of emotional expression in music. Personal and Ubiquitous Computing, 25(4), 677-689. https://doi.org/10.1007/s00779-020-01390-7

Journal Article Type Article
Acceptance Date Mar 4, 2020
Online Publication Date Apr 2, 2020
Publication Date 2021-08
Deposit Date Mar 9, 2020
Publicly Available Date Mar 28, 2024
Journal Personal and Ubiquitous Computing
Print ISSN 1617-4909
Electronic ISSN 1617-4917
Publisher Springer
Peer Reviewed Yes
Volume 25
Issue 4
Pages 677-689
DOI https://doi.org/10.1007/s00779-020-01390-7

Files

Published Journal Article (Advance online version) (554 KB)
PDF

Publisher Licence URL
http://creativecommons.org/licenses/by/4.0/

Copyright Statement
Advance online version: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.



