Technology Enhanced Learning for Instrument learning & analytics tools for assessment
Using machine learning techniques, the Music and Machine Learning Lab at the Music Technology Group investigates the creative process of manipulating the sound properties of a music performance, in order to understand, re-create and teach expression in performance.
In this context, the Lab coordinates the H2020 TELMI project, whose aim is to study how we learn musical instruments, taking the violin as a case study, from a pedagogical and scientific perspective, and to create new interactive, assistive, self-learning, augmented-feedback and socially aware systems complementary to traditional teaching. To this end, new interaction paradigms for music learning and training (student-teacher, student-only and collaborative training) based on state-of-the-art multimodal (audio, image, video and motion) technologies are designed and implemented.
This work includes the creation of a large-scale, publicly available, pedagogy-oriented reference database of multimodal music recordings by master performers, together with a set of data analytics tools for analysing both the recordings and the users' interactions with them (in cooperation with the Educational Technologies research line of the Interactive Technologies Group). The database will be built on RepoVizz, and its creation and use will be fostered through cooperation with international music institutions such as the Royal College of Music (TELMI partner), McGill University and Berklee School of Music.
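The assessment side of such analytics can be illustrated with a small example. The sketch below is only an illustration, not part of the TELMI software: it uses the librosa library to extract pitch contours from a student recording and a reference recording and reports the mean intonation deviation in cents. The file names and the naive frame-by-frame comparison are assumptions made for brevity.

```python
# A minimal, hypothetical sketch (not the TELMI toolchain): compare the pitch
# contour of a student violin recording against a reference recording and
# report the average intonation deviation in cents. File names are invented.
import numpy as np
import librosa


def pitch_track(path, fmin_note="G3", fmax_note="E7"):
    """Extract an f0 contour in Hz with pYIN; unvoiced frames are NaN."""
    y, sr = librosa.load(path, sr=None, mono=True)
    f0, _, _ = librosa.pyin(
        y,
        fmin=librosa.note_to_hz(fmin_note),  # lowest open violin string (G3)
        fmax=librosa.note_to_hz(fmax_note),  # generous upper bound
        sr=sr,
    )
    return f0


def mean_cent_deviation(student_f0, reference_f0):
    """Mean absolute pitch deviation (cents) over frames voiced in both tracks.

    Frames are compared index by index; a real tool would first time-align
    the two performances (e.g. with dynamic time warping).
    """
    n = min(len(student_f0), len(reference_f0))
    s, r = student_f0[:n], reference_f0[:n]
    mask = ~np.isnan(s) & ~np.isnan(r)
    cents = 1200.0 * np.log2(s[mask] / r[mask])
    return float(np.mean(np.abs(cents)))


if __name__ == "__main__":
    student = pitch_track("student_take.wav")        # hypothetical recording
    reference = pitch_track("master_reference.wav")  # hypothetical recording
    deviation = mean_cent_deviation(student, reference)
    print(f"Mean intonation deviation: {deviation:.1f} cents")
```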
To learn more:
- Presentation of the project at the Data-driven Knowledge Extraction Workshop, June 2016 (Slides)