List of published results directly linked to the projects co-funded by the Spanish Ministry of Economy and Competitiveness under the María de Maeztu Units of Excellence Program (MDM-2015-0502).

List of publications acknowledging the funding, as indexed in Scopus.

The record for each publication will include access to postprints (following the Open Access policy of the program), as well as the datasets and software used. Ongoing work with the UPF Library and Informatics will soon improve the interface and automate the retrieval of this information.

The MdM Strategic Research Program has its own community in Zenodo for material available in this repository, as well as at the UPF e-repository.

Oramas S., Espinosa-Anke L., Sordo M., Saggion H., Serra X. Information extraction for knowledge base construction in the music domain. Data and Knowledge Engineering.

The rate at which information about music is being created and shared on the web is growing exponentially. However, the challenge of making sense of all this data remains an open problem. In this paper, we present and evaluate an Information Extraction pipeline aimed at the construction of a Music Knowledge Base. Our approach starts off by collecting thousands of stories about songs from the songfacts.com website. Then, we combine a state-of-the-art Entity Linking tool and a linguistically motivated rule-based algorithm to extract semantic relations between entity pairs. Next, relations with similar semantics are grouped into clusters by exploiting syntactic dependencies. These relations are ranked thanks to a novel confidence measure based on statistical and linguistic evidence. Evaluation is carried out intrinsically, by assessing each component of the pipeline, as well as in an extrinsic task, in which we evaluate the contribution of natural language explanations in music recommendation. We demonstrate that our method is able to discover novel facts with high precision, which are missing in current generic as well as music-specific knowledge repositories.
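
As a rough illustration of the pipeline stages summarized in the abstract (collecting song stories, rule-based relation extraction, grouping of relations, and confidence-based ranking), the following minimal Python sketch walks through a toy example. The stories, patterns, and frequency-based scoring are hypothetical stand-ins for exposition only; the actual system described in the paper combines a state-of-the-art Entity Linking tool, a linguistically motivated rule-based extractor, clustering over syntactic dependencies, and a confidence measure based on statistical and linguistic evidence.

    # Hypothetical sketch of the pipeline stages; not the authors' implementation.
    import re
    from collections import Counter, defaultdict

    # Toy "song stories" standing in for text collected from songfacts.com.
    stories = [
        "Imagine was written by John Lennon.",
        "Hey Jude was written by Paul McCartney.",
        "Respect was recorded by Aretha Franklin.",
    ]

    # Rule-based relation extraction: one simple lexical pattern per relation type.
    patterns = {
        "written_by": re.compile(r"^(?P<song>.+?) was written by (?P<artist>.+?)\.$"),
        "recorded_by": re.compile(r"^(?P<song>.+?) was recorded by (?P<artist>.+?)\.$"),
    }

    triples = []
    for story in stories:
        for relation, pattern in patterns.items():
            match = pattern.match(story)
            if match:
                triples.append((match["song"], relation, match["artist"]))

    # Group extractions by relation (a stand-in for clustering relations with
    # similar semantics) and rank relations by a simple frequency-based score
    # (a stand-in for the statistical/linguistic confidence measure).
    clusters = defaultdict(list)
    for song, relation, artist in triples:
        clusters[relation].append((song, artist))

    confidence = Counter({rel: len(pairs) for rel, pairs in clusters.items()})

    for relation, score in confidence.most_common():
        print(relation, score, clusters[relation])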

Additional material: