Intelligent music learning system prototype based on state-of-the-art real-time audio analysis, motion capture, and artificial intelligence techniques, providing real-time feedback on the sound and gestures of music students.

Learning to play a musical instrument is a highly complex task. It requires mastering 1) quality sound production, 2) timing and pitch accuracy (i.e. playing the correct note at the right time), 3) correct intonation (e.g. fine-tuning the pitch of violin notes), 4) correct gestural technique, and 5) musical expression (e.g. playing with intended emotion). Learning such skills is mostly based on the master-apprentice model, in which the student observes and imitates the teacher, the teacher provides verbal feedback on the student's performance, and the student engages in long periods of self-study without teacher supervision. An important limitation of this model is the time lag between the student's performance and the teacher's feedback, especially since most of the student's practice takes place long after the teacher's feedback.

SkyNote is an intelligent learning system that complements and overcomes the limitations of the master-apprentice model by providing real-time feedback on the sound and gestures of music students, allowing them to acquire skills more efficiently [1]. It uses state-of-the-art real-time audio analysis, motion capture and artificial intelligence techniques. SkyNote allows students to detect and correct errors, build awareness of their technique, and practice more efficiently. It may be incorporated into the master-apprentice model or used for self-practice.
Strongly grounded in music pedagogy, SkyNote addresses all stages, from beginner to advanced levels, and all aspects of music instrument learning. By applying state-of-the-art audio analysis and artificial intelligence techniques, SkyNote provides real-time feedback about sound quality [2], timing and pitch accuracy, dynamics, intonation, and music expression [3], and applies motion capture techniques to provide real-time feedback on motion and gestural technique [4]. The feedback can be displayed in customized widgets or directly on the musical score, allowing for real-time experimentation and overall performance evaluation.
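To illustrate the kind of signal-level analysis that pitch and intonation feedback builds on, the sketch below (an illustrative example, not SkyNote's actual implementation) estimates the fundamental frequency of an audio frame by autocorrelation and reports its deviation in cents from the nearest equal-tempered note:

```python
import numpy as np

def estimate_pitch(frame, sr):
    """Estimate the fundamental frequency (Hz) of a frame via autocorrelation."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    # Skip the zero-lag peak: find where the autocorrelation starts rising
    # again, then take the strongest peak after that point.
    rising = np.nonzero(np.diff(corr) > 0)[0][0]
    lag = rising + np.argmax(corr[rising:])
    return sr / lag

def cents_off(freq, ref=440.0):
    """Deviation in cents from the nearest equal-tempered note (A4 = ref)."""
    semitones = 12 * np.log2(freq / ref)
    return 100 * (semitones - round(semitones))

sr = 44100
t = np.arange(sr // 10) / sr            # a 100 ms analysis frame
tone = np.sin(2 * np.pi * 443.0 * t)    # a slightly sharp A4
f0 = estimate_pitch(tone, sr)
print(f"estimated {f0:.1f} Hz, {cents_off(f0):+.1f} cents from nearest note")
```

A real-time system would run such an estimator on overlapping frames of the microphone stream and map the cent deviation onto a visual intonation indicator; production systems typically use more robust estimators than plain autocorrelation.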

The research supporting SkyNote goes beyond the state of the art in technology-enhanced music learning by combining audio analysis, motion capture and artificial intelligence. By applying artificial intelligence techniques to audio and motion data, SkyNote is able to learn from expert multimodal recordings and compare them with the student's performance, as well as to personalise the feedback to the student's learning progress. All this gives SkyNote a competitive advantage over existing music learning systems, whose feedback is limited to basic pitch, timing and intonation.

Starting date: June 2021

Duration: 6 months

With the support of: