MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music

Abstract: We present MAD-EEG, a new, freely available dataset for studying EEG-based auditory attention decoding in the challenging case of subjects attending to a target instrument in polyphonic music. The dataset is the first music-related EEG dataset of its kind, enabling, in particular, studies on single-trial EEG-based attention decoding, while also opening the path for research on other EEG-based music analysis tasks. MAD-EEG consists of 20-channel EEG signals recorded from 8 subjects listening to solo, duo, and trio music excerpts and attending to one pre-specified instrument. The proposed experimental setting differs from those previously considered in that the stimuli are polyphonic and are played to the subject through speakers rather than headphones. The stimuli were designed with variations in the number and type of instruments in the mixture, the spatial rendering, the music genre, and the melody played. Preliminary results obtained with a state-of-the-art stimulus reconstruction algorithm, commonly used for speech stimuli, show that the audio representation reconstructed from the EEG response correlates more strongly with that of the attended source than with that of the unattended source, demonstrating that the dataset is suitable for such studies.
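The stimulus reconstruction approach mentioned in the abstract is typically a linear backward model: a decoder is trained (e.g. with ridge regression) to map multi-channel EEG, over a window of time lags, back to an audio representation such as the stimulus envelope; attention is then decoded by comparing the correlation of the reconstruction with the attended versus the unattended source. The sketch below illustrates this pipeline on synthetic data only — the signals, lag count, and regularization value are illustrative assumptions, not the paper's actual parameters or data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels, n_lags = 2000, 20, 10

# Hypothetical stimulus envelopes for the two competing sources
attended = rng.standard_normal(n_samples)
unattended = rng.standard_normal(n_samples)

# Synthetic EEG: a lagged linear response to the attended envelope plus noise
eeg = np.zeros((n_samples, n_channels))
weights = rng.standard_normal((n_lags, n_channels))
for lag in range(n_lags):
    eeg[lag:] += np.outer(attended[:n_samples - lag], weights[lag])
eeg += 0.5 * rng.standard_normal(eeg.shape)

def lagged(x, n_lags):
    """Stack EEG at lags 0..n_lags-1 ahead of each time point,
    since the neural response follows the stimulus in time."""
    T, C = x.shape
    X = np.zeros((T, C * n_lags))
    for l in range(n_lags):
        X[:T - l, l * C:(l + 1) * C] = x[l:]
    return X

X = lagged(eeg, n_lags)

# Train the decoder on the first half, evaluate on the second half
split = n_samples // 2
Xtr, Xte, ytr = X[:split], X[split:], attended[:split]

# Ridge regression: g = (X'X + lambda*I)^-1 X'y (lambda is an assumed value)
lam = 1e2
g = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(X.shape[1]), Xtr.T @ ytr)
recon = Xte @ g

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

# Attention decoding: the reconstruction should correlate more strongly
# with the attended source than with the unattended one
r_att = corr(recon, attended[split:])
r_unatt = corr(recon, unattended[split:])
print(r_att > r_unatt)
```

On this synthetic example the attended-source correlation is clearly higher, mirroring the comparison used in the preliminary results; real EEG would of course require preprocessing, cross-validation, and tuning of the lag window and regularization.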

https://hal.archives-ouvertes.fr/hal-02291882
Contributor: Giorgia Cantisani
Submitted on: Thursday, September 19, 2019 - 11:57:25 AM
Last modification on: Tuesday, September 24, 2019 - 1:23:07 AM

File

MAD-EEG.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-02291882, version 1

Citation

Giorgia Cantisani, Gabriel Trégoat, Slim Essid, Gaël Richard. MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music. 2019. ⟨hal-02291882⟩
