Towards the next generation of multimodal interactive expressive technologies
RAPID-MIX was a three-year project, starting on 1 February 2015, coordinated by the Music Technology Group with the following partner entities:
- GOLDSMITHS' COLLEGE (UK)
- INSTITUT DE RECHERCHE ET DE COORDINATION ACOUSTIQUE MUSIQUE - IRCAM (France)
- PLUX - WIRELESS BIOSIGNALS S.A. (Portugal)
- REACTABLE SYSTEMS SL (Spain)
- SOMETHIN' ELSE SOUND DIRECTIONS LIMITED (UK)
- ROLI Ltd (UK)
- ORBE (France)
Abstract: RAPID-MIX brings together 3 leading research institutions with 4 dynamic creative-industries SMEs and 1 leading wearable-technology SME in a technology transfer consortium, to bring to market innovative interface products for music, gaming, and e-Health applications.
RAPID-MIX uses an intensely user-centric development process to gauge industry pull and end-user desire for new modes of interaction that integrate physiological human sensing, gesture and body language, and smart information analysis and adaptation. Physiological biosignals (EEG, EMG) are used in multimodal hardware configurations with motion sensors and haptic actuators. Advanced machine learning software adapts to expressive human variation, allowing fluid interaction and personalized experiences. An iterative, rapid development cycle of hardware prototyping, software development, and application integration accelerates the availability of advanced interface technologies to industry partners. An equally user-centric evaluation phase ensures market validation, end-user relevance, and usability, feeding back into subsequent design cycles and informing ultimate market deployment.
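The idea of machine learning software that adapts to expressive human variation can be illustrated with a minimal sketch. This is not the actual RAPID-MIX API; all names, feature vectors, and parameter values below are hypothetical. A k-nearest-neighbour regressor maps a sensor feature vector (e.g. accelerometer readings) to sound-synthesis parameters, trained from a handful of examples a user demonstrates:

```python
import math

def knn_map(examples, query, k=2):
    """Map a sensor feature vector to output parameters by averaging
    the outputs of the k nearest training examples (Euclidean distance)."""
    ranked = sorted(examples, key=lambda ex: math.dist(ex[0], query))[:k]
    n_out = len(ranked[0][1])
    return [sum(out[i] for _, out in ranked) / k for i in range(n_out)]

# Hypothetical training pairs recorded by demonstration:
# (accelerometer features) -> (synth pitch in Hz, synth volume 0..1)
examples = [
    ([0.0, 0.0, 1.0], [220.0, 0.2]),   # hand at rest   -> low, quiet
    ([1.0, 0.0, 0.5], [440.0, 0.6]),   # tilt right     -> mid, louder
    ([0.0, 1.0, 0.5], [880.0, 0.9]),   # tilt forward   -> high, loud
]

# A new, slightly different gesture still maps to a sensible output.
print(knn_map(examples, [0.9, 0.1, 0.5], k=1))  # -> [440.0, 0.6]
```

Because the mapping is learned from the user's own demonstrations rather than hard-coded, each performer's personal variation in gesture is absorbed into the model, which is the kind of fluid, personalized interaction the project describes.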
The RAPID-MIX consortium leverages contemporary dissemination channels such as crowdfunding, industry trade shows, and contributions to the DIY community to raise awareness of novel interface technologies across the professional and consumer landscapes. Project output is encapsulated in an open-source RAPID-API exposing application-level access to software libraries, hardware designs, and middleware layers. This will enable creative partner SMEs to build a new range of products called Multimodal Interactive eXpressive (MIX) systems. It also allows broader industries, such as the quantified-self sector, and DIY communities to use the API in their own products in cost-effective ways. This ensures the legacy of RAPID-MIX and marks its contribution to European competitiveness in rapidly evolving markets for embodied interaction technologies.
Our role in the RAPID-MIX project: We are the project coordinators (PI: Sergi Jordà). In terms of areas of expertise, we are involved in tasks related to user-centred design (UCD) and research data management using repoVizz, and we are the partner in charge of developing the API through which a number of technologies will be accessed (from our side, Essentia & Gaia, Freesound, repoVizz and GestureAgents, among others).
Keywords: multimodal interfaces, embodiment, biosensing, user-centred design, non-verbal communication, expressiveness, music, video games, wellbeing, wearables, motion, mobile, audiovisual, open access
Further information: official website