The MULTIDATA project, Skills and resources for the multimodal turn: Unlocking audiovisual datasets for research and learning, is funded by an ERASMUS+ KA220-HED grant awarded to the University of Murcia (coordinator: Cristóbal Pagán Cánovas), in partnership with FAU Erlangen-Nürnberg and Radboud University Nijmegen. The project also has two esteemed associated partners: the Red Hen Lab™, an international consortium for research into multimodal communication, and the Department of Multimodal Language at the Max Planck Institute for Psycholinguistics.
MULTIDATA is a free online platform for the study of multimodal communication. It provides an AI-based pipeline for analyzing speech and gesture data from videos, as well as other resources for developing audiovisual collections and exploiting them for education, research and professional applications. The MULTIDATA team is constantly integrating the most relevant open-source tools and developing tools of its own.

Currently, the MULTIDATA platform is in its testing phase. As a result, access to the pipeline is limited to selected users with research profiles. Don’t worry! In the coming months, MULTIDATA will be available to anyone interested in multimodal data analysis. In the meantime, you can visit the project website (www.multi-data.eu) for more details. There, you’ll find a link to the GitHub forum where you can engage with other MULTIDATA users and exchange ideas. You’ll also find information on how to register for free MULTIDATA webinars, as well as the dates of upcoming presentations at conferences and other events.
Over the next few months, our resources will come with explanatory video tutorials and user guidelines, as well as didactic materials for the use of multimodal data across disciplines. By subscribing to the MULTIDATA newsletter, you’ll stay informed about new resources and every online and onsite activity we organize.
Interested in learning more about MULTIDATA? Don’t hesitate to reach out via email at hello@multi-data.eu.