Multi-channel U-Net for Music Source Separation

  • Authors
  • Kadandale VS, Montesinos JF, Haro G, Gómez E
  • UPF authors
  • SHENOY KADANDALE, VENKATESH; MONTESINOS GARCIA, JUAN FELIPE; GÓMEZ GUTIÉRREZ, EMILIA; HARO ORTEGA, GLORIA
  • Authors of the book
  • AA. VV. (various authors)
  • Book title
  • IEEE MMSP 2020: IEEE 22nd International Workshop on Multimedia Signal Processing (MMSP)
  • Publisher
  • IEEE
  • Publication year
  • 2020
  • Pages
  • 1-6
  • Abstract
  • A fairly straightforward approach to music source separation is to train independent models, wherein each model is dedicated to estimating only a specific source. Training a single model to estimate multiple sources generally does not perform as well as the independent dedicated models. However, Conditioned U-Net (C-U-Net) uses a control mechanism to train a single model for multi-source separation and attempts to achieve performance comparable to that of the dedicated models. We propose a multi-channel U-Net (M-U-Net) trained with a weighted multi-task loss as an alternative to C-U-Net. We investigate two weighting strategies for our multi-task loss: 1) Dynamic Weighted Average (DWA), and 2) Energy Based Weighting (EBW). DWA determines the weights by tracking the rate of change of each task's loss during training. EBW aims to neutralize the training bias arising from the difference in energy levels of the sources in a mixture. Our methods provide three advantages over C-U-Net: 1) fewer effective training iterations per epoch, 2) fewer trainable network parameters (no control parameters), and 3) faster processing at inference. Our methods achieve performance comparable to that of C-U-Net and the dedicated U-Nets at a much lower training cost.
  • Complete citation
  • Kadandale VS, Montesinos JF, Haro G, Gómez E. Multi-channel U-Net for Music Source Separation. In: AA. VV. IEEE MMSP 2020: IEEE 22nd International Workshop on Multimedia Signal Processing (MMSP). 1st ed. IEEE; 2020. p. 1-6.
Bibliometric indicators
  • Cited 2 times in Scopus
  • Scimago index: 0
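The Dynamic Weighted Average (DWA) strategy mentioned in the abstract weights each task's loss according to how quickly that loss has been decreasing, so that tasks whose losses stagnate receive more attention. The sketch below follows the standard DWA formulation from the multi-task learning literature (a softmax over the ratio of each task's last two epoch losses, scaled so the weights sum to the number of tasks); the paper's exact variant, the `temperature` value, and the function name are assumptions, not taken from the record above.

```python
import math

def dwa_weights(loss_history, temperature=2.0):
    """Dynamic Weighted Average: compute one weight per task (source).

    loss_history is a list of per-epoch loss lists, one loss value per
    task. A task whose loss decreased more slowly (higher ratio of its
    last two losses) gets a larger weight. Weights sum to n_tasks.
    NOTE: a hedged sketch of the generic DWA scheme, not the paper's code.
    """
    n_tasks = len(loss_history[-1])
    if len(loss_history) < 2:
        # Not enough history yet: weight all tasks equally.
        return [1.0] * n_tasks
    # Ratio of the last two epoch losses per task; > 1 means loss went up.
    ratios = [loss_history[-1][k] / loss_history[-2][k]
              for k in range(n_tasks)]
    # Softmax over the ratios, scaled so the weights sum to n_tasks.
    exps = [math.exp(r / temperature) for r in ratios]
    total = sum(exps)
    return [n_tasks * e / total for e in exps]
```

For example, if both sources' losses halved over the last epoch, both weights come out equal; if one source's loss barely moved while the other's dropped sharply, the stagnating source receives the larger weight.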