Timbre with Note Based Features for Improving Performance of Music Classification

  • Kalyani C. Waghmare, Balwant A. Sonkamble


A huge amount of music data is available in digital form, and people expect efficient and accurate retrieval of this data. In this paper, we classify Indian music based on Raga. The Raga plays an important role in Indian music: it is a melodious combination of notes that is very difficult for a naïve listener to distinguish from another. Much prior work on Raga identification considers either Note features or Mel Frequency Cepstral Coefficient (MFCC) features alone. Combining MFCC and Note features helps in designing a generic model for Raga identification. In this work, the notes in an audio signal are recognized by estimating the fundamental frequency using three different methods: Autocorrelation, Harmonic Product Spectrum, and Cepstrum-based methods. Experiments are performed on a self-generated dataset of 1200 samples from 8 different Ragas, recorded in an isolated soundproof room with Tanpura and Tabla accompaniment. The standard CompMusic dataset is also considered for comparison. Augmented features of Notes and MFCC are computed, and an averaging method with a K-Nearest Neighbor (KNN) classifier is applied to the augmented features. Accuracy and F1-score are used to measure performance. The augmented features outperform the individual Note-based Pitch Class Profile features and the MFCC-based features.
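As context for the note-recognition step, the following is a minimal sketch of autocorrelation-based fundamental-frequency estimation, one of the three methods the abstract names. The function name, frame length, and frequency bounds are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def f0_autocorrelation(frame, sr, fmin=80.0, fmax=1000.0):
    """Estimate the fundamental frequency of one audio frame via autocorrelation.

    The F0 is taken as sr / lag, where lag is the location of the largest
    autocorrelation peak within the plausible pitch range [fmin, fmax].
    """
    frame = frame - np.mean(frame)                     # remove DC offset
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lag_min = int(sr / fmax)                           # shortest allowed period
    lag_max = min(int(sr / fmin), len(ac) - 1)         # longest allowed period
    if lag_max <= lag_min:
        return 0.0
    lag = lag_min + int(np.argmax(ac[lag_min:lag_max]))
    return sr / lag

# Example: a 440 Hz sine sampled at 16 kHz should yield roughly 440 Hz
# (quantized to the nearest integer lag).
sr = 16000
t = np.arange(2048) / sr
frame = np.sin(2 * np.pi * 440.0 * t)
print(round(f0_autocorrelation(frame, sr), 1))
```

The estimate is quantized to integer lags, so a real system would typically add parabolic interpolation around the peak for finer resolution.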
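The feature-augmentation and classification step described in the abstract can be sketched as follows: concatenate the Note-based Pitch Class Profile vector with the averaged MFCC vector, then classify with KNN. This is a minimal illustration assuming the averaging step yields one mean MFCC vector per clip; the helper names and dimensions are assumptions, not the authors' code.

```python
import numpy as np

def augment(pcp, mfcc_mean):
    """Concatenate a 12-dim Pitch Class Profile with an averaged MFCC vector."""
    return np.concatenate([pcp, mfcc_mean])

def knn_predict(query, train_X, train_y, k=3):
    """Return the majority label among the k nearest training vectors (Euclidean)."""
    dists = np.linalg.norm(train_X - query, axis=1)
    nearest = train_y[np.argsort(dists)[:k]]
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]

# Toy usage: an augmented vector has 12 PCP + 13 MFCC = 25 dimensions.
feat = augment(np.zeros(12), np.zeros(13))
print(feat.shape)  # (25,)
```

In the paper each training vector would be the augmented feature of a Raga recording, with the Raga label as the class.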

How to Cite
Kalyani C. Waghmare, Balwant A. Sonkamble. (2020). Timbre with Note Based Features for Improving Performance of Music Classification. International Journal of Advanced Science and Technology, 29(3), 10328–10338. Retrieved from https://sersc.org/journals/index.php/IJAST/article/view/27098