Music mood and theme classification

A hybrid approach

Written by
Kerstin Bischoff, Claudiu S. Firan, Raluca Paiu, Wolfgang Nejdl, Cyril Laurier, Mohamed Sordo
Abstract

Music perception is highly intertwined with both emotions and context. Not surprisingly, many of the users' information-seeking actions aim at retrieving songs based on these perceptual dimensions, moods and themes, which express how people feel about music or which situations they associate it with. To successfully support music retrieval along these dimensions, powerful methods are needed. Yet most existing approaches to inferring songs' latent characteristics focus on identifying musical genres. In this paper we aim to bridge this gap between users' information needs and indexed music features by developing algorithms for classifying songs by mood and theme. We extend existing approaches by also considering the songs' thematic dimensions and by using social data from the Last.fm music portal as support for the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation against the AllMusic.com ground truth shows that the two kinds of information are complementary and should be merged for enhanced classification accuracy.
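The fusion of audio-based and tag-based classifiers described above can be illustrated with a minimal weighted late-fusion sketch. This is a common technique, not the paper's exact method; the mood labels, scores, and the weight `alpha` below are illustrative assumptions:

```python
def fuse_scores(audio_scores, tag_scores, alpha=0.5):
    # Weighted late fusion: linearly combine per-class scores
    # from two independent classifiers.
    return [alpha * a + (1 - alpha) * t
            for a, t in zip(audio_scores, tag_scores)]

moods = ["happy", "sad", "aggressive"]  # illustrative mood classes
audio_scores = [0.2, 0.7, 0.1]          # hypothetical audio-feature classifier output
tag_scores   = [0.6, 0.3, 0.1]          # hypothetical Last.fm tag classifier output

fused = fuse_scores(audio_scores, tag_scores, alpha=0.4)
predicted = max(zip(fused, moods))[1]   # mood with the highest fused score
```

Here the two classifiers disagree ("sad" vs. "happy"), and the fused score resolves the conflict; tuning `alpha` on held-out data is the usual way to weight one information source against the other.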

Organizational unit(s)
Forschungszentrum L3S
External organization(s)
Universitat Pompeu Fabra (UPF)
Type
Paper in conference proceedings
Pages
657-662
Number of pages
6
Publication date
2009
Publication status
Published
Peer-reviewed
Yes
ASJC Scopus subject areas
Music, Information Systems