Album cover generation from genre tags

Research output: Contribution to conference › Conference Paper


Abstract

This paper presents a method for generating album cover art by incorporating side information about the music content. In this preliminary work, a state-of-the-art Generative Adversarial Network (GAN) generates album cover art conditioned on a genre tag. To obtain a sufficiently large dataset containing both album covers and genre labels, the Spotify API was used to create a dataset of 50,000 images separated into 5 genres. The main network was pre-trained on the One Million Audio Cover Images for Research (OMACIR) dataset and then trained on the Spotify dataset. This approach is shown to be successful: the generated images have distinct characteristics for each genre and minimal repeated textures. The network can also identify which genre a generated image belongs to with an accuracy of 35%.
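The conditioning step described above — supplying a genre tag to the generator alongside its latent noise — is commonly implemented by concatenating a one-hot genre vector to the noise input. The sketch below illustrates that idea only; the genre names, latent dimension, and function names are assumptions for illustration, not details taken from the paper.

```python
import random

# Hypothetical genre labels: the paper uses 5 Spotify genres,
# but does not list them here, so these names are placeholders.
GENRES = ["rock", "pop", "jazz", "electronic", "classical"]
LATENT_DIM = 100  # a typical GAN latent size (assumption)

def one_hot(genre):
    """Encode a genre tag as a one-hot vector over the genre list."""
    return [1.0 if g == genre else 0.0 for g in GENRES]

def generator_input(genre, seed=0):
    """Build a conditional-GAN generator input: latent Gaussian
    noise concatenated with the one-hot genre condition."""
    rng = random.Random(seed)
    z = [rng.gauss(0.0, 1.0) for _ in range(LATENT_DIM)]
    return z + one_hot(genre)

x = generator_input("jazz")
print(len(x))  # LATENT_DIM + number of genres = 105
```

The generator then maps this combined vector to an image, and the discriminator receives the same genre condition so it can judge both realism and genre consistency.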
Original language: English
Number of pages: 7
Publication status: Published - 6 Oct 2017
Event: 10th International Workshop on Machine Learning and Music - Barcelona, Spain
Duration: 6 Oct 2017 → …

Conference

Conference: 10th International Workshop on Machine Learning and Music
Country: Spain
City: Barcelona
Period: 6/10/17 → …


Cite this

    Hepburn, A., McConville, R., & Santos-Rodriguez, R. (2017). Album cover generation from genre tags. Paper presented at 10th International Workshop on Machine Learning and Music, Barcelona, Spain.