dc.description.abstract |
Music, as an art form, combines rhythm and sound to form a functional melodic structure, uniquely capable of conveying emotions non-verbally. Within the domain of Music Information Retrieval (MIR), Music Emotion Classification (MEC) represents a specialized subset dedicated to the
identification and labeling of emotional attributes in songs. This is achieved by extracting and
comparing features from musical compositions. This research aims to discern the contemporary
landscape of work in this domain and its associated research gaps. The study comprised
publications identified through Google Scholar searches covering the years 2006 to 2023, using
the search terms Music Emotion Classification, Music Emotion Classification in Sri Lanka, Music
Emotion Recognition, and Emotion Classification in Music. The scope was narrowed to studies that
used audio files for classification. Among the initial set of
42 studies, 20 were selected for detailed analysis using the purposive sampling method. The review
encompassed essential aspects, including acoustic feature analysis, emotional modeling,
classification methodologies, and performance evaluation. The findings highlighted a paucity of
research considering cultural, regional, and linguistic variations. The most commonly used acoustic
features were rhythm, pitch, timbre, spectral, and harmony features, whereas the most frequently
used emotion categories for classification were happiness, anger, sadness, and relaxation.
Support Vector Machine (SVM) was the most widely used machine learning algorithm for classification,
although regression methods, neural network-based approaches, and fuzzy classification were also
explored. Notably, the adoption of multi-modal approaches for emotion classification, as well as
multi-labeled emotion classification, remained limited. These insights underscore the need for
future research to address the cultural and linguistic diversity of datasets, explore innovative
classification techniques, and embrace multi-modal and multi-labeled approaches to music emotion
classification within MIR. |
en_US |