Machine learning algorithms can 'bust a rhyme' better than humans by 21%

New research paper reveals DeepBeat, a machine-based artist that constructs lyrical patterns in rap

Researchers from Aalto University in Finland have been rapping with machine learning algorithms that can construct lyrical patterns in rap.

The machine rap artist, dubbed DeepBeat, uses the RankSVM algorithm and a deep neural network to achieve a rhyme density – based on the length and frequency of rhymes – that is 21 per cent higher than that of the top-ranked human rapper in the experiment, Inspectah Deck.

Lyrics from more than 100 popular English-speaking rap artists – 583,669 lines from 10,980 songs – were used to build DeepBeat. The researchers claim they achieved an 82 per cent accuracy rate for next-line prediction when all the features were combined.

“We approach the problem of rap lyrics generation as an information-retrieval task. We consider a large repository of rap lyrics, which we treat as training data, and which we use to learn a model between consecutive lines in rap lyrics,” the researchers said in their paper published on 18 May 2015.

The researchers used an open source speech synthesizer, eSpeak, to obtain a phonetic transcription of the rap lyrics. They then used a set of seed lines to predict next lines of text based on previous lines, looking at structural similarity and semantic similarity. This included looking at the number of matching vowel phonemes at the end of lines, as well as the average number of matching vowel phonemes per word and line length.
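The end-of-line vowel matching described above can be sketched roughly as follows. This is a simplified illustration, not the paper's implementation: it matches vowel letters rather than the vowel phonemes an eSpeak transcription would produce, and the function names are hypothetical.

```python
VOWELS = set("aeiou")

def vowel_sequence(line):
    """Vowel letters of a line, in order (a simplified stand-in for the
    vowel phonemes an eSpeak phonetic transcription would give)."""
    return [c for c in line.lower() if c in VOWELS]

def end_rhyme_length(prev_line, cand_line):
    """Count matching vowels at the ends of two lines -- one of the
    structural-similarity features used to score candidate next lines."""
    a, b = vowel_sequence(prev_line), vowel_sequence(cand_line)
    n = 0
    while n < min(len(a), len(b)) and a[-1 - n] == b[-1 - n]:
        n += 1
    return n
```

For example, `end_rhyme_length("cat in the hat", "flat on the mat")` returns 2, since the vowel sequences `a,i,e,a` and `a,o,e,a` share the two-vowel suffix `e,a`.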

“The method can then be used to construct a song line-by-line, appending relevant lines from different songs,” the researchers stated.
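Line-by-line construction can be sketched as a greedy retrieval loop. This is a toy stand-in for DeepBeat's learned ranker: it scores candidates only by end-of-line vowel overlap, and all names here are illustrative, not from the paper.

```python
VOWELS = set("aeiou")

def vowel_sequence(line):
    """Vowel letters of a line, in order (simplified phonetics)."""
    return [c for c in line.lower() if c in VOWELS]

def end_match(a, b):
    """Matching vowels at the ends of two lines."""
    va, vb = vowel_sequence(a), vowel_sequence(b)
    n = 0
    while n < min(len(va), len(vb)) and va[-1 - n] == vb[-1 - n]:
        n += 1
    return n

def build_song(seed, corpus, length=4):
    """Greedily append the corpus line that best end-rhymes with the
    previous line, assembling a song from lines of different songs."""
    song = [seed]
    pool = [line for line in corpus if line != seed]
    for _ in range(length - 1):
        best = max(pool, key=lambda line: end_match(song[-1], line))
        song.append(best)
        pool.remove(best)
    return song
```

A real ranker would combine this with the semantic and line-length features described above rather than relying on rhyme alone.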

The rap’s technical quality, referred to as 'rhyme density' and described as the average length of the longest rhyme per word, was then measured. The lyrics were scanned word by word, finding the longest matching vowel sequence ending with one of the 15 previous words.

“If they are the same, we proceed to the second to last vowels, third to last, and so on. We proceed, ignoring word boundaries, until the first non-matching vowels have been encountered,” the researchers stated.

The rhyme density is computed by averaging the lengths of the longest matching vowel sequences of all words.
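The rhyme density computation described above can be sketched as follows. This is a simplified reading of the procedure, not the authors' code: it uses vowel letters instead of vowel phonemes, and ignores word boundaries only in the earlier text, not within the current word's extension.

```python
VOWELS = set("aeiou")

def vowel_seq(text):
    """Vowel letters in order (simplified: letters, not the vowel
    phonemes a phonetic transcription would give)."""
    return [c for c in text.lower() if c in VOWELS]

def rhyme_density(lyrics, window=15):
    """Average, over all words, of the longest vowel suffix the word
    shares with a vowel stream ending at one of the `window` previous
    words, ignoring word boundaries in the earlier text."""
    words = lyrics.lower().split()
    if not words:
        return 0.0
    lengths = []
    for i, word in enumerate(words):
        wv = vowel_seq(word)
        best = 0
        for j in range(max(0, i - window), i):
            # vowel stream of everything up to and including word j
            prev = vowel_seq(" ".join(words[:j + 1]))
            k = 0
            while k < min(len(wv), len(prev)) and wv[-1 - k] == prev[-1 - k]:
                k += 1
            best = max(best, k)
        lengths.append(best)
    return sum(lengths) / len(lengths)
```

For example, `rhyme_density("the cat sat on the mat")` returns 0.5: "sat", "the" and "mat" each match one earlier vowel, while the other three words match none.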

The idea for the machine learning program came after the researchers published average rhyme densities for a range of popular artists like Jay-Z and Nicki Minaj. This spurred one rap artist to ask the researchers if they could compute the rhyme density for his debut album.

The researchers asked the rapper to rank his own lyrics, then compared his ranking with the algorithm's. The results suggest that rhyme density adequately captures the technical quality of the lyrics.

“Our work has great commercial potential as it can provide the basis for a tool that would either generate rap lyrics or assist the writer to produce good lyrics.

“Second, as the interface between humans and machines becomes more intelligent, there is increasing demand for systems that interact with humans with non-mechanical and rather pleasant ways,” said the researchers.

Future development of DeepBeat could include refining storylines and integrating a speech synthesizer to give it a “voice”, the researchers added.
