Improving music genre classification using collaborative tagging data

Written by
Ling Chen, Phillip Wright, Wolfgang Nejdl
Abstract

As a fundamental and critical component of music information retrieval (MIR) systems, music genre classification has attracted considerable research attention. Automatically classifying music by genre is, however, a challenging problem because music is an evolving art. While most existing work categorizes music using features extracted from audio signals, in this paper we propose to exploit the semantic information embedded in tags supplied by users of social networking websites. In particular, we incorporate the tag information by creating a graph of tracks in which tracks are neighbors if they are similar in terms of their associated tags. Two classification methods based on the track graph are developed. The first employs a classification scheme that simultaneously considers the audio content and the neighborhood of tracks. In contrast, the second is a two-level classifier that initializes genre labels for unknown tracks using their audio content, and then iteratively updates the genres under the influence of their neighbors. A set of optimization strategies is designed to further enhance the quality of the two-level classifier. Extensive experiments are conducted on real-world data collected from Last.fm. Promising experimental results demonstrate the benefit of using tags for accurate music genre classification.
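The two-level scheme described in the abstract can be sketched as follows. This is a minimal illustration only: the paper's actual tag-similarity measure, audio-based classifier, and update rule are not given here, so Jaccard similarity over tag sets, precomputed audio predictions, and a neighbor majority vote (with the track's own current label included to break ties) are all assumptions.

```python
from collections import Counter

def tag_similarity(tags_a, tags_b):
    # Jaccard similarity between two tag sets (an assumed stand-in
    # for the paper's tag-based similarity measure).
    a, b = set(tags_a), set(tags_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def build_track_graph(track_tags, threshold=0.2):
    # Tracks become neighbors when their tag similarity reaches a
    # threshold; the threshold value is an illustrative choice.
    ids = list(track_tags)
    graph = {t: [] for t in ids}
    for i, u in enumerate(ids):
        for v in ids[i + 1:]:
            if tag_similarity(track_tags[u], track_tags[v]) >= threshold:
                graph[u].append(v)
                graph[v].append(u)
    return graph

def two_level_classify(graph, audio_labels, iterations=10):
    # Level 1: initialize every track with its audio-based prediction.
    labels = dict(audio_labels)
    # Level 2: iteratively update each track's genre by majority vote
    # among its tag neighbors (assumed update rule); the track's own
    # current label is counted once so that ties favor stability.
    for _ in range(iterations):
        updated = {}
        for track, neighbors in graph.items():
            votes = Counter(labels[n] for n in neighbors)
            votes[labels[track]] += 1
            updated[track] = votes.most_common(1)[0][0]
        if updated == labels:  # converged
            break
        labels = updated
    return labels

# Hypothetical toy data: three rock-tagged tracks and one jazz track.
track_tags = {
    "t1": {"rock", "guitar", "90s"},
    "t2": {"rock", "guitar"},
    "t3": {"rock", "90s"},
    "t4": {"jazz", "sax"},
}
audio_labels = {"t1": "rock", "t2": "jazz", "t3": "rock", "t4": "jazz"}
graph = build_track_graph(track_tags)
result = two_level_classify(graph, audio_labels)
# The misclassified "t2" is corrected by its rock-tagged neighbors,
# while the isolated "t4" keeps its audio-based label.
```

In this sketch the audio classifier's (possibly noisy) output for "t2" is overruled by its tag neighborhood, which conveys the intuition behind combining audio content with collaborative tags.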

Organizational unit(s)
Forschungszentrum L3S
External organization(s)
Georgia Institute of Technology
Type
Paper in conference proceedings
Pages
84-93
Number of pages
10
Publication date
09.02.2009
Publication status
Published
Peer-reviewed
Yes
ASJC Scopus subject areas
Computer Networks and Communications, Software
Electronic version(s)
https://doi.org/10.1145/1498759.1498812 (Access: Closed)