Brain-Computer Music Interface for Monitoring and Inducing Affective States
The BCMI-MIdAS (Brain-Computer Music Interface for Monitoring and Inducing Affective States) is a collaborative project between the Universities of Plymouth and Reading. The work is funded by two 54-month EPSRC grants, with additional support from the host institutions. The project aims to use coupled EEG-fMRI to inform a Brain-Computer Interface for music. Principal investigators are, jointly, Professor Slawomir Nasuto and Professor Eduardo Miranda.
The central purpose of the project is to develop intelligent systems that can monitor our affective state and induce specific affective states through music, automatically and adaptively. This is a highly interdisciplinary project, which will address several technical challenges at the interface between science, technology and performing arts/music (incorporating computer-generated music and machine learning).
Research questions the project will investigate include:
- How can music change affective states and what are the specific musical traits (i.e., the parameters of a piece of music) that elicit such states?
- How can we control such traits in a piece of music in order to induce specific affective states in a participant?
- How can we effectively detect information about affective states induced by music in the EEG signal, going beyond EEG asymmetry and characterising information contained in synchronisation patterns?
- How can we use the EEG to monitor the affective state induced by music on-line (i.e., in “real-time”)?
- How can we produce a generative music system capable of generating music embodying musical traits aimed at inducing specific affective states, observable in the EEG of the participant?
- How can we build an intelligent adaptive system for monitoring and inducing affective states through music on-line?
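As context for the third question, the EEG asymmetry baseline it refers to is commonly operationalised as the frontal alpha asymmetry index: the difference in log alpha-band power between a right and a left frontal electrode. The sketch below, which is purely illustrative and not the project's method (the function names, the 8–13 Hz band choice, and the synthetic signals are all assumptions), shows one simple way such an index could be computed from two EEG channels:

```python
import numpy as np

def bandpower(signal, fs, band):
    """Approximate power of `signal` within `band` (Hz) via a raw periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum()

def frontal_alpha_asymmetry(left, right, fs, alpha_band=(8.0, 13.0)):
    """Classic asymmetry index: ln(right alpha power) - ln(left alpha power).

    Positive values indicate relatively greater right-hemisphere alpha power,
    conventionally interpreted as relatively greater left-hemisphere activation.
    Electrode pairing (e.g. F3/F4) and band limits are illustrative assumptions.
    """
    return (np.log(bandpower(right, fs, alpha_band))
            - np.log(bandpower(left, fs, alpha_band)))

# Synthetic demo: the right channel carries a stronger 10 Hz alpha component,
# so the index should come out positive.
fs = 256
t = np.arange(fs * 4) / fs
rng = np.random.default_rng(0)
left = 1.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
right = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
print(frontal_alpha_asymmetry(left, right, fs))
```

Going "beyond asymmetry", as the project proposes, would mean replacing or augmenting a scalar index like this with measures of synchronisation between channels, computed on-line as the music plays.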