The most fundamental improvement is that EchoNest provides tempo and simple meter information. So I've been able to add tempo information to a number of songs that I didn't have cataloged that way before, and I've also added time signature tags to these songs.
But more importantly, EchoNest does some interesting analysis of the music to come up with some acoustic attributes to describe a song musically. They are:
- Beat: An attempt to characterize the strength and consistency of the beat.
- Energy: The intensity and power of the music – this is probably the most intuitive of the attributes.
- Mood: A measure of the music's emotional tone, from positive or happy down to sad or angry.
I am representing each of these attributes as a graphical column headed with an appropriate icon (drum for beat, flame for energy, and smile for mood). Each column is sortable both from high to low and low to high, and when I have data for a song, the icon in that song's row is 'filled' to a proportional amount. EchoNest provides a number from 0.0 to 1.0, and I translate that into icons that are 0 to 100% filled (in 10% bands).
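For the curious, the banding logic is simple. Here's a minimal sketch in Python; the function name is just for illustration, and the real rendering code in the app looks different:

```python
def icon_fill_percent(score: float) -> int:
    """Quantize an EchoNest attribute score (0.0 to 1.0) into the
    nearest 10% fill band used to draw the column icon."""
    clamped = max(0.0, min(1.0, score))  # guard against out-of-range data
    return round(clamped * 10) * 10      # 0, 10, 20, ... 100

# For example, a beat score of 0.73 renders as a 70%-filled drum icon.
print(icon_fill_percent(0.73))  # 70
```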
For instance, here is a snapshot of a list of songs that are tagged as "First Dance," can be danced to some form of Foxtrot, and contain the word "Love". They are sorted by "Beat" from strongest to weakest. If you're looking for a song to dance your first dance to and aren't an experienced dancer, you probably want a strong beat.
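To give a feel for what that query looks like under the hood, here's a sketch in Python. The `Song` record and its field names are hypothetical, invented for illustration; the app's actual storage is different:

```python
from dataclasses import dataclass, field

@dataclass
class Song:
    # Hypothetical record; field names are illustrative only.
    title: str
    tags: set[str] = field(default_factory=set)
    dances: set[str] = field(default_factory=set)
    beat: float = 0.0  # EchoNest beat score, 0.0 to 1.0

def first_dance_foxtrots(songs: list[Song]) -> list[Song]:
    """Songs tagged 'First Dance' that fit some form of Foxtrot and
    mention 'love' in the title, sorted strongest beat first."""
    matches = [
        s for s in songs
        if "First Dance" in s.tags
        and any("Foxtrot" in d for d in s.dances)
        and "love" in s.title.lower()
    ]
    return sorted(matches, key=lambda s: s.beat, reverse=True)
```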
What do you think? Are there other acoustic attributes that I should include? Would you like to be able to sort on multiple attributes at the same time, or filter on one attribute and sort on another? These are all entirely possible, but I need your help to prioritize these features. Please feel free to reply to this post or leave feedback with any thoughts you have on this set of features.
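In case it helps the discussion, neither idea is hard to build once the scores are in place. A quick sketch, again with hypothetical names, assuming each song carries `beat`, `energy`, and `mood` scores:

```python
def sort_by_beat_then_energy(songs):
    # Tuple keys sort on the first attribute and break ties with the
    # second; negating the scores gives strongest-first order.
    return sorted(songs, key=lambda s: (-s.beat, -s.energy))

def happy_songs_by_energy(songs, min_mood=0.6):
    # Filter on one attribute (mood), sort on another (energy).
    # The 0.6 threshold is arbitrary, just for illustration.
    happy = [s for s in songs if s.mood >= min_mood]
    return sorted(happy, key=lambda s: s.energy, reverse=True)
```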