Proceedings of the 46th IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Loops, seamlessly repeatable musical segments, are a cornerstone of modern music production. Contemporary artists often mix and match sampled or pre-recorded loops based on musical criteria such as rhythm, harmony, and timbral texture to create compositions. Taking such criteria into account, we present LoopNet, a feed-forward generative model for creating loops conditioned on intuitive parameters. We leverage Music Information Retrieval (MIR) models and a large collection of public loop samples, and use the Wave-U-Net architecture to map control parameters to audio. We also evaluate the quality of the generated audio and propose intuitive controls that let composers map the ideas in their minds to an audio loop.
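The abstract describes mapping control parameters to audio with a Wave-U-Net, an encoder-decoder over raw waveforms with skip connections. As a rough illustration only (not the authors' implementation), the sketch below shows the shape of such a network in plain NumPy: two downsampling stages, a conditioning vector broadcast along time and injected at the bottleneck, and two upsampling stages with skip connections. All sizes, the pooling/repeat resampling, and the bottleneck mixing step are hypothetical stand-ins for learned convolutions.

```python
import numpy as np

def downsample(x):
    # Halve the time axis by averaging adjacent samples
    # (stand-in for a learned strided convolution).
    return x.reshape(x.shape[0], -1, 2).mean(axis=2)

def upsample(x):
    # Double the time axis by repeating each sample
    # (stand-in for a learned transposed convolution).
    return np.repeat(x, 2, axis=1)

def wave_unet_skeleton(x, cond):
    """Parameter-free Wave-U-Net-style skeleton (illustrative only).

    x    : (channels, T) waveform, T divisible by 4
    cond : (cond_dim,) control-parameter vector
    """
    d1 = downsample(x)             # (C, T/2)  encoder stage 1
    d2 = downsample(d1)            # (C, T/4)  encoder stage 2

    # Broadcast the condition along time and concatenate at the bottleneck.
    c = np.broadcast_to(cond[:, None], (cond.shape[0], d2.shape[1]))
    b = np.concatenate([d2, c], axis=0)

    # Mix channels back to C (stand-in for a bottleneck convolution).
    b = b.mean(axis=0, keepdims=True) * np.ones_like(d2)

    u2 = upsample(b) + d1          # decoder stage 2 with skip connection
    u1 = upsample(u2) + x          # decoder stage 1 with skip connection
    return u1                      # (C, T), same shape as the input
```

In a trained model each resampling and mixing step would be a learned convolution; the sketch only shows how the conditioning vector and skip connections fit into the encoder-decoder topology.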
Chandna P, Ramires A, Serra X, Gómez E. LoopNet: Musical Loop Synthesis Conditioned on Intuitive Musical Parameters. In: Androutsos D, Plataniotis K, Zhang XP, editors. Proceedings of the 46th IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). 1st ed. 2021. p. 3395-3399.