FILE - In this Oct. 25, 2011 photo provided by the Santa Cruz Conference and Visitors Council, kayaker Alan Brady is surprised by two breaching humpback whales while kayaking off the coast of Seabright State Beach in Santa Cruz, Calif. The U.S. Coast Guard is warning people to stay away from a pod of whales that has settled unusually close to the shore off Santa Cruz or face fines for whale harassment of at least $2,500. The agency plans to monitor the waters on Wednesday, Nov. 2, 2011. (AP Photo/Santa Cruz Conference and Visitors Council, Paul Schraub, file)

Cape Town - Any connection between the songs of whales and those of groups like the Beatles and Queen may seem tenuous.

But computer scientists at Lawrence Technological University in Michigan have used techniques honed while analysing whale vocalisations to develop an “artificial intelligence” computer algorithm that can analyse and compare musical styles.

This has enabled them to research the musical progression of groups like The Beatles. The study, by Assistant Professor Lior Shamir and graduate student Joe George, appears in the journal Pattern Recognition Letters.

According to a media summary, the new algorithm was used to analyse 11 songs from each of the Beatles’ 13 studio albums between 1963 and 1970. The results for the individual songs were then used to compare the similarities between the albums – and the algorithm correctly placed them in chronological recording order.

Let It Be was the last album released by the Beatles, in May 1970, but the algorithm correctly identified its songs as having been recorded earlier than those on Abbey Road, which was released in September 1969.

The algorithm was equally accurate when analysing songs by Queen, U2 and ABBA.

The algorithm works by first converting each song to a spectrogram – a visual representation of the audio content, the researchers explained.

That turns an audio analysis task into an image analysis problem, which is solved by applying algorithms that turn each music spectrogram into a set of almost 3 000 numeric descriptors, reflecting visual aspects such as textures, shapes and the statistical distribution of the pixels. Pattern recognition and statistical methods are then used to detect and quantify the similarities between different pieces of music.
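The idea described above can be illustrated in miniature. The sketch below is not the researchers’ code: it computes a simple magnitude spectrogram, reduces it to a handful of image-style statistics (the published method uses almost 3 000 such descriptors), and scores similarity as the distance between descriptor vectors. All function names and the toy signals are invented for illustration.

```python
import numpy as np

def spectrogram(signal, win=256, hop=128):
    """Magnitude spectrogram via a simple windowed FFT (short-time Fourier transform)."""
    frames = [signal[i:i + win] * np.hanning(win)
              for i in range(0, len(signal) - win, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1)).T  # rows = frequency, cols = time

def descriptors(spec):
    """A tiny subset of image-style numeric descriptors over the spectrogram 'pixels'."""
    p = spec / (spec.sum() + 1e-12)                 # pixel-intensity distribution
    return np.array([
        spec.mean(), spec.std(),                    # overall intensity statistics
        -(p * np.log(p + 1e-12)).sum(),             # entropy of the pixel distribution
        np.abs(np.diff(spec, axis=0)).mean(),       # texture along the frequency axis
        np.abs(np.diff(spec, axis=1)).mean(),       # texture along the time axis
    ])

def dissimilarity(spec_a, spec_b):
    """Distance between descriptor vectors: smaller means more similar pieces."""
    return float(np.linalg.norm(descriptors(spec_a) - descriptors(spec_b)))

# Toy demo: two nearby pure tones should score closer to each other than to noise.
sr = 8000
t = np.arange(sr) / sr
tone_a = np.sin(2 * np.pi * 440 * t)
tone_b = np.sin(2 * np.pi * 450 * t)
noise = np.random.default_rng(0).normal(size=sr)

sa, sb, sn = (spectrogram(x) for x in (tone_a, tone_b, noise))
print(dissimilarity(sa, sb) < dissimilarity(sa, sn))
```

With per-song descriptor vectors like these, comparing whole albums reduces to averaging and ranking pairwise distances, which is how a chronological ordering could fall out of the similarity scores.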

“People who aren’t Beatles fans normally can’t tell that Help! was recorded before Rubber Soul, but the algorithm can,” Shamir said. “This experiment demonstrates that artificial intelligence can identify the changes and progression in musical styles by ‘listening’ to popular music albums in a completely new way.” This type of research would have historical significance, he suggested.

“The Baby Boomers loved the music of the Beatles, I love the Beatles, and now my daughters and their friends love the Beatles. Their music will live on for a very long time. It’s worthwhile to study what makes their music so distinctive, and computer science and big data can help.”

One has to wonder how these algorithms would have measured the music of Bob Marley and “the Whalers”!?

[email protected]

Cape Argus