

Music Information Retrieval
Kristian Nymoen, Universitetet i Oslo

RITMO

• RITMO is an interdisciplinary research centre focused on rhythm as a structuring mechanism for the temporal dimensions of human life.

• Funded by the Research Council of Norway's Centres of Excellence scheme

• 13 permanent faculty members

• 23 PhD/Postdocs

• 10 more PhD/Postdoc positions recently announced

https://www.hf.uio.no/ritmo/english/about/vacancies/

• 4 administration / technical support staff members

Dept. of Psychology · Dept. of Musicology · Dept. of Informatics

Outline

• Musical information

• First example: signal processing approach

• Machine learning applications in music information retrieval

• Understanding listeners

Information in a sound/music signal

[Diagram: the kinds of information carried by a sound/music signal]
• Single events: intensity, pitch, attack time, decay time, duration, sound source (type, location, motion)
• Grouping in time and frequency: harmony, melody, rhythm
• Affordances: movement/dance, emotions

First example: Chord recognition

Chord prediction
• Offline: create lead sheets from audio
• Real time: predict chords and generate automated accompaniment (a minimal chroma-based sketch follows below)
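To make this concrete, below is a minimal sketch of one classic signal-processing approach: matching chroma frames against binary major/minor triad templates. It assumes Python with numpy and librosa installed; 'song.wav' is a placeholder path, and real systems add beat-synchronous smoothing, HMMs or neural networks on top of this.

# Minimal chord-recognition sketch: match each chroma frame against
# binary major/minor triad templates. Illustrative only.
import numpy as np
import librosa

NOTES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']

def chord_templates():
    """Normalized binary chroma templates for the 24 major/minor triads."""
    labels, templates = [], []
    for root in range(12):
        for quality, intervals in (('', [0, 4, 7]), ('m', [0, 3, 7])):
            t = np.zeros(12)
            t[[(root + i) % 12 for i in intervals]] = 1.0
            labels.append(NOTES[root] + quality)
            templates.append(t / np.linalg.norm(t))
    return labels, np.array(templates)

def recognize_chords(path):
    y, sr = librosa.load(path)                       # audio -> samples
    chroma = librosa.feature.chroma_cqt(y=y, sr=sr)  # 12 x n_frames
    labels, templates = chord_templates()
    # Cosine similarity of every template with every (normalized) frame
    scores = templates @ (chroma / (np.linalg.norm(chroma, axis=0) + 1e-9))
    return [labels[i] for i in scores.argmax(axis=0)]  # best chord per frame

print(recognize_chords('song.wav'))  # 'song.wav' is a placeholder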

Score following
• Offline: create more advanced music databases
• Real time: music education (a DTW-based alignment sketch follows below)

https://www.youtube.com/watch?v=COPNciY510g
(Dorfer, Henkel & Widmer, 2018)
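For the offline case, a common baseline is dynamic time warping (DTW) between chroma features of the recorded performance and an audio rendition of the score. The sketch below uses librosa's DTW implementation; 'performance.wav' and 'score_synth.wav' are placeholder file names, and this is a deliberately simple stand-in, not the reinforcement-learning score follower of Dorfer et al. (2018).

# Offline score-to-audio alignment via DTW on chroma features.
# Assumes the score has been synthesized to audio beforehand.
import librosa

def align(performance_path, score_audio_path, hop=512):
    y_p, sr = librosa.load(performance_path)
    y_s, _ = librosa.load(score_audio_path, sr=sr)
    c_p = librosa.feature.chroma_cqt(y=y_p, sr=sr, hop_length=hop)
    c_s = librosa.feature.chroma_cqt(y=y_s, sr=sr, hop_length=hop)
    # D: accumulated cost matrix; wp: optimal warping path (frame pairs)
    D, wp = librosa.sequence.dtw(X=c_p, Y=c_s, metric='cosine')
    # Convert the path from frame indices to seconds in each recording
    return [(p * hop / sr, s * hop / sr) for p, s in wp[::-1]]

pairs = align('performance.wav', 'score_synth.wav')  # placeholder files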

Music classification
• Genre, artist, instrument, composer…
• Fill in missing metadata in music databases
• Recommendation systems, automated playlist generation (a simple feature-based sketch follows below)
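As a minimal illustration of feature-based classification (assuming librosa and scikit-learn; the file names and genre labels below are placeholders): summarize each track by MFCC statistics and train a nearest-neighbour classifier. Real systems use far larger training sets and richer, often learned, representations.

# Toy genre classifier: mean/std MFCC vectors + 1-nearest-neighbour.
import numpy as np
import librosa
from sklearn.neighbors import KNeighborsClassifier

def mfcc_features(path):
    y, sr = librosa.load(path, duration=30.0)           # 30 s excerpt
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)  # 20 x n_frames
    return np.hstack([mfcc.mean(axis=1), mfcc.std(axis=1)])

train_files = ['rock1.wav', 'jazz1.wav']   # placeholder training data
train_genres = ['rock', 'jazz']

X = np.array([mfcc_features(f) for f in train_files])
clf = KNeighborsClassifier(n_neighbors=1).fit(X, train_genres)
print(clf.predict([mfcc_features('unknown.wav')]))  # placeholder query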

Pattern matching and detection
• Audio fingerprinting (a peak-hashing sketch follows after this list)
• Protect against copyright violations
• Cover song detection
• Query by example (e.g. Shazam)
• Query by humming (e.g. SoundHound)
• Query by tapping (e.g. SongTapper)
• Query by gesture (e.g. SoundTracer)
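To make the fingerprinting idea concrete, here is a heavily simplified Shazam-style sketch: keep only spectrogram peaks and hash pairs of nearby peaks into (freq1, freq2, time-delta) landmarks, then match songs by counting shared hashes. The published algorithm (Wang, 2003) refines peak selection and database lookup considerably; 'track.wav' is a placeholder, and the thresholds are arbitrary choices for illustration.

# Simplified audio-fingerprinting sketch: spectrogram peak pairing.
import numpy as np
import librosa
from scipy.ndimage import maximum_filter

def fingerprints(path, fan_out=5):
    y, _ = librosa.load(path)
    S = np.abs(librosa.stft(y))
    # Keep only local maxima of the magnitude spectrogram
    peaks = (S == maximum_filter(S, size=20)) & (S > np.median(S))
    freqs, times = np.nonzero(peaks)
    order = np.argsort(times)
    freqs, times = freqs[order], times[order]
    hashes = set()
    for i in range(len(times)):          # pair each peak with the next few
        for j in range(i + 1, min(i + 1 + fan_out, len(times))):
            dt = times[j] - times[i]
            if 0 < dt <= 200:            # only pairs within a time window
                hashes.add((freqs[i], freqs[j], dt))
    return hashes

# Matching: count hashes a query shares with each database track and
# report the best-scoring track.
query = fingerprints('track.wav')  # placeholder file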

Sound source separation
• Create music notation from audio files
• Music analysis, music education (a minimal separation sketch follows after the video link below)

Chandna, Miron, Janer & Gómez (2017)

https://www.youtube.com/watch?v=71WwHyNaDfE
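Chandna, Miron, Janer & Gómez (2017) separate sources with a convolutional network. As a much simpler stand-in that still shows the task (one mixture in, several stems out), the sketch below uses librosa's median-filtering harmonic/percussive separation; 'mix.wav' is a placeholder.

# Minimal source-separation sketch: harmonic/percussive split (HPSS).
import librosa
import soundfile as sf

y, sr = librosa.load('mix.wav')            # placeholder mixture file
y_harm, y_perc = librosa.effects.hpss(y)   # median-filtering separation
sf.write('harmonic.wav', y_harm, sr)       # sustained, pitched content
sf.write('percussive.wav', y_perc, sr)     # transient, drum-like content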

Understanding listeners
• Listening patterns
• Emotional response
• Psychoacoustics
• Behavioural response

• MIRtoolbox function: miremotion

“Simple visualisation of Valence and Arousal in music. The model is based on work by Tuomas Eerola, Petri Toiviainen and Olivier Lartillot (see e.g. Eerola et al., 2009) and the real-time implementation was made by Petri Toiviainen in MAX/MSP.”

https://www.youtube.com/watch?v=JqYoA3OM-b4
https://www.youtube.com/watch?v=EJQw5XGK3tI
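The same valence/arousal idea can be illustrated outside Matlab. In the sketch below, the librosa feature extractors are real, but the linear weights are invented for illustration; they are not the coefficients of the Eerola et al. (2009) model that miremotion implements, and 'song.wav' is a placeholder.

# Crude valence/arousal illustration: map simple audio features onto
# the two emotion dimensions. The weights below are made up.
import numpy as np
import librosa

def crude_arousal_valence(path):
    y, sr = librosa.load(path)
    rms = librosa.feature.rms(y=y).mean()       # loudness proxy
    tempo = librosa.beat.tempo(y=y, sr=sr)[0]   # speed proxy
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr).mean()
    # Invented linear combinations, for illustration only:
    arousal = 0.5 * np.log1p(1000 * rms) + 0.3 * tempo / 200 + 0.2 * centroid / 4000
    valence = 1.0 - centroid / 8000
    return float(arousal), float(valence)

print(crude_arousal_valence('song.wav'))  # placeholder file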

More material:

• MIR communities:
- The International Society for Music Information Retrieval: www.ismir.net
- Music Information Retrieval Evaluation eXchange: www.music-ir.org
- Sound and Music Computing Network: http://www.smcnetwork.org
- Nordic Sound and Music Computing Network: https://nordicsmc.create.aau.dk

• Open Source Toolboxes:

- essentia.upf.edu (C++/Python library; a short usage sketch follows below)

- https://github.com/olivierlar/miningsuite/wiki (Matlab)

- https://www.jyu.fi/hytk/fi/laitokset/mutku/en/research/materials/mirtoolbox (Matlab)
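As a quick taste of one of these toolboxes, here is a short Essentia sketch following the pattern in its documentation (assuming the Python bindings are installed; 'audio.wav' is a placeholder):

# Tempo and beat estimation with Essentia's multifeature beat tracker.
import essentia.standard as es

audio = es.MonoLoader(filename='audio.wav')()   # load as mono float signal
bpm, beats, confidence, _, intervals = es.RhythmExtractor2013(
    method='multifeature')(audio)
print('Estimated tempo: %.1f BPM (confidence %.2f)' % (bpm, confidence))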

• Book:
- Müller, Fundamentals of Music Processing:
https://www.audiolabs-erlangen.de/fau/professor/mueller/bookFMP