We can explain the meaning of lyrics by looking at their component words and grammatical structure. But how do we explain the meaning of music? What does the music of, say, Leonard Cohen’s Hallelujah convey? Some people think that if we gather enough data to answer these questions, we might be able to program a machine to work out what our ears tell us with ease: that Jerusalem is rousing and Singin’ in the Rain is joyful.
We should beware the lazy assumption that the words carry the true meaning of a song, while the music and the rest are just feelings, to be applied like cake decorations. Music has its own elements and structures, and it speaks in many ways. The experience of music is so much more than just its sounds.
New research published in the journal Royal Society Open Science attempts to tackle this issue by investigating the links between the emotions of lyrics and the musical elements they are set to. While the methods used are statistically sophisticated, the conclusions are extremely dry. The finding that a single chord type is most associated with positive lyrics is a huge simplification of the way that music works, highlighting the sheer scale of the challenge of creating a machine that could understand and compose music as a human can.
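To get a feel for what a finding like “one chord type is most associated with positive lyrics” involves, here is a minimal sketch of that kind of chord–sentiment association. This is not the study’s actual method or data: the chord labels, sentiment scores, and function names below are all invented for illustration.

```python
# Toy sketch: average lyric sentiment per chord type.
# The (chord_type, sentiment) pairs are hypothetical, where sentiment is
# a score in [-1, 1] assigned to the lyric sung over each chord.
from collections import defaultdict

observations = [
    ("major", 0.8), ("major", 0.5), ("minor", -0.4),
    ("minor", -0.1), ("major", 0.3), ("seventh", 0.1),
]

def mean_sentiment_by_chord(pairs):
    """Group sentiment scores by chord type and average each group."""
    scores = defaultdict(list)
    for chord, score in pairs:
        scores[chord].append(score)
    return {chord: sum(s) / len(s) for chord, s in scores.items()}

print(mean_sentiment_by_chord(observations))
```

A real analysis would of course need far larger corpora and careful statistics, but even then the averages say nothing about harmonic context, melody, or performance, which is precisely the simplification the article objects to.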
Counting things is a proven way of making discoveries in other domains, so we shouldn’t be surprised that it works in music too. And those who are frightened of the musical machines should be aware that it is too late: they are already among us. Look, for example, at Microsoft’s Songsmith. My fear, instead, is that humans will make do with poorly made musical machines. We should not ignore centuries of accumulated music theory just because we have shiny new data science tools.
Quantitative studies have huge potential to help us understand these processes, but they need to treat music in the light of what we know about it as music. After all, the meaning of the music of Leonard Cohen’s Hallelujah seems clear. If only the same could be said about the words.