Improving music genre classification using collaborative tagging data
- Authored by
- Ling Chen, Phillip Wright, Wolfgang Nejdl
- Abstract
As a fundamental and critical component of music information retrieval (MIR) systems, music genre classification has attracted considerable research attention. Automatically classifying music by genre is, however, a challenging problem because music is an evolving art. While most existing work categorizes music using features extracted from audio signals, in this paper we propose to exploit the semantic information embedded in tags supplied by users of social networking websites. In particular, we incorporate the tag information by creating a graph of tracks in which tracks are neighbors if they are similar in terms of their associated tags. Two classification methods based on the track graph are developed. The first employs a classification scheme that simultaneously considers the audio content and the neighborhood of each track. In contrast, the second is a two-level classifier that initializes genre labels for unknown tracks using their audio content and then iteratively updates the genres under the influence of their neighbors. A set of optimization strategies is designed to further enhance the quality of the two-level classifier. Extensive experiments are conducted on real-world data collected from Last.fm. Promising experimental results demonstrate the benefit of using tags for accurate music genre classification.
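The two-level classifier described above can be illustrated with a minimal sketch. This is not the authors' implementation: the similarity threshold, the majority-vote update rule, and all function names below are assumptions chosen for illustration. Tracks become neighbors when the cosine similarity of their tag vectors exceeds a threshold; genre labels are initialized from an audio-based prediction (here supplied as input) and then iteratively updated by neighbor voting.

```python
from collections import Counter

def cosine(a, b):
    # cosine similarity between two sparse tag-count dicts
    num = sum(a[t] * b.get(t, 0) for t in a)
    den = (sum(v * v for v in a.values()) ** 0.5) * \
          (sum(v * v for v in b.values()) ** 0.5)
    return num / den if den else 0.0

def build_track_graph(tag_vectors, threshold=0.5):
    # tracks are neighbors if their tag vectors are sufficiently similar
    ids = list(tag_vectors)
    graph = {i: [] for i in ids}
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            if cosine(tag_vectors[ids[i]], tag_vectors[ids[j]]) >= threshold:
                graph[ids[i]].append(ids[j])
                graph[ids[j]].append(ids[i])
    return graph

def two_level_classify(graph, audio_labels, iterations=10):
    # level 1: start from the audio-content-based genre predictions;
    # level 2: iteratively relabel each track by a vote over its neighbors,
    # keeping one vote for the track's current label
    labels = dict(audio_labels)
    for _ in range(iterations):
        updated = {}
        for track, neighbors in graph.items():
            votes = Counter(labels[n] for n in neighbors)
            votes[labels[track]] += 1
            updated[track] = votes.most_common(1)[0][0]
        if updated == labels:  # converged
            break
        labels = updated
    return labels

# toy example: t2 is misclassified by audio but corrected by its tag neighbors
tags = {
    "t1": {"rock": 5, "guitar": 3},
    "t2": {"rock": 4, "guitar": 2},
    "t3": {"rock": 5, "guitar": 4},
    "t4": {"jazz": 6, "sax": 2},
}
audio = {"t1": "rock", "t2": "jazz", "t3": "rock", "t4": "jazz"}
result = two_level_classify(build_track_graph(tags), audio)
```

In this sketch, `t2`'s audio-based label ("jazz") is overturned because both of its tag neighbors are labeled "rock", while the isolated track `t4` keeps its initial label.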
- Organisation(s)
- L3S Research Centre
- External Organisation(s)
- Georgia Institute of Technology
- Type
- Conference contribution
- Pages
- 84-93
- No. of pages
- 10
- Publication date
- 09.02.2009
- Publication status
- Published
- Peer reviewed
- Yes
- ASJC Scopus subject areas
- Computer Networks and Communications, Software
- Electronic version(s)
- https://doi.org/10.1145/1498759.1498812 (Access: Closed)