Brain-Computer Interface for Measuring Musical Affect
Music has long been known to affect a person’s emotional state. The lab is using a brain-computer interface (BCI) and machine learning to investigate whether the affect elicited by music has a measurable neurological basis.
Brain-Controlled Drum Interface
We developed a proof-of-concept system that lets a person play an electronic drum using only facial movements, such as blinking, winking, raising the eyebrows, and smiling, as detected by the BCI.
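One way such a mapping could work is to translate each detected facial event into a percussion trigger, for example a General MIDI drum note. The event names and the mapping below are illustrative assumptions, not the lab's actual implementation.

```python
# Hypothetical sketch: mapping BCI-detected facial events to drum hits.
# The event labels and note assignments are assumptions for illustration.

# General MIDI percussion note numbers (played on MIDI channel 10)
DRUM_MAP = {
    "blink": 36,        # bass drum
    "wink_left": 38,    # acoustic snare
    "wink_right": 42,   # closed hi-hat
    "raise_brow": 49,   # crash cymbal
    "smile": 45,        # low tom
}

def facial_event_to_drum(event: str):
    """Return the MIDI percussion note for a detected facial event, or None."""
    return DRUM_MAP.get(event)
```

A real system would feed these notes to a MIDI synthesizer as the classifier emits events; keeping the mapping in a plain dictionary makes it easy to reassign gestures to different drum sounds.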
Mindtrack: Using Brain-Computer Interface to Translate Emotions into Music
In this exploratory project, a user wore an EMOTIV Insight electroencephalogram (EEG) headset. The raw EEG data was decomposed into brain-wave components, from which high-level EEG characteristics were derived. Tempo and key signature were calculated from the emotion detected in the EEG, while other musical parameters, such as harmony, rhythm, and melody, were specified by the user. In Mindtrack, the brain is used as the sole instrument for translating emotions to music.
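A common way to realize this kind of emotion-to-music mapping is to express the detected emotion as valence and arousal values and map arousal to tempo and valence to mode. The function below is a minimal sketch under that assumption; the ranges and the specific mapping are illustrative, not Mindtrack's published design.

```python
def emotion_to_music(valence: float, arousal: float):
    """Map an emotion estimate to (tempo in BPM, key signature).

    Hypothetical mapping: valence and arousal are assumed to lie in [-1, 1].
    Higher arousal yields a faster tempo (60-180 BPM); positive valence
    selects a major key, negative valence a minor key.
    """
    tempo = int(60 + (arousal + 1) / 2 * 120)  # linear scale: -1 -> 60, +1 -> 180
    key = "C major" if valence >= 0 else "C minor"
    return tempo, key
```

For example, a calm, pleasant state (low arousal, positive valence) would yield a slow tempo in a major key, while an agitated, negative state would yield a fast tempo in a minor key.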