AISB event Bulletin Item

LECTURE: "Gesture-sound interaction in digital media", 30 March 2011, Goldsmiths College, LONDON

by Frederic Bevilacqua, Head of the Real-Time Musical Interactions Team, IRCAM - Centre Pompidou, STMS-CNRS UPMC, Paris, France
Location: LG01, New Academic Building
Department: Computing
Time: 30 March 2011, 14:00 - 15:00

I will present an overview of the research and applications performed by the Real-Time Musical 
Interactions Team of IRCAM (Paris). We have developed for the last seven years various methods
and tools for computer-based gesture analysis, with the general goal of using body movements to 
interact with sonic and/or visual environments. This research has largely been influenced by 
sustained collaborations with musicians/composers and dancers/choreographers. We will present 
some of these works, focusing on gesture research and interfaces. In particular, we will present
the cases of musical interfaces and various experiments we have carried out in music pedagogy.
We will also present dance performances and interactive installations we have collaborated on. 

In music, we studied physical gestures of musicians such as the bow movement of violin players. 
This allowed us to formalize key concepts about continuous gesture control, gesture vocabulary 
and co-articulation (similarly to speech production). This fundamental research led us to design 
augmented instruments that incorporate these concepts. In parallel, we are designing 
new interfaces and paradigms to control sonic environments, individually or collectively; in 
particular, we are developing tools to re-perform sound and music with such interfaces. 
For example, we developed a "gesture follower" system that allows gestures to be recognized 
and synchronized with sound materials. 
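The gesture-following idea can be sketched in a few lines. Note that this is an illustrative assumption, not the IRCAM implementation (which is based on probabilistic models): here a simple online DTW-style alignment estimates, after each incoming sample, how far along a pre-recorded template gesture the performer currently is, so that sound playback could be synchronized to that position. The class name, the 1-D feature representation, and the distance measure are all hypothetical simplifications.

```python
import numpy as np

class GestureFollower:
    """Illustrative sketch of gesture following (not the IRCAM system):
    align a live gesture, sample by sample, to a pre-recorded template,
    so sound playback can be synchronized to the estimated position."""

    def __init__(self, template):
        # Template gesture as a 1-D sequence of feature values
        # (e.g. one accelerometer axis); real systems use richer features.
        self.template = np.asarray(template, dtype=float)
        self.cost = None  # running alignment cost, one entry per template frame

    def feed(self, sample):
        """Consume one live sample; return the estimated template index."""
        d = np.abs(self.template - sample)  # local distance to each frame
        n = len(self.template)
        new = np.empty(n)
        if self.cost is None:
            # First sample: the gesture is assumed to start at frame 0;
            # accumulate costs for paths that fast-forward through frames.
            new[0] = d[0]
            for j in range(1, n):
                new[j] = d[j] + new[j - 1]
        else:
            # Standard DTW recursion, computed online one column at a time.
            new[0] = d[0] + self.cost[0]
            for j in range(1, n):
                new[j] = d[j] + min(self.cost[j],      # hold this frame
                                    self.cost[j - 1],  # advance one frame
                                    new[j - 1])        # skip ahead
        self.cost = new
        return int(np.argmin(self.cost))  # most likely current position
```

For instance, feeding the samples 0, 1, 2, 3, 4 against the template [0, 1, 2, 3, 4] yields the positions [0, 1, 2, 3, 4]: the follower tracks the gesture frame by frame, and a position estimate like this is what would drive the synchronized sound playback.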

In dance, we will present performances and installations in which we used the same technology 
as for music. While designed with different goals and aesthetics, two of them use a similar 
interaction principle: the visitor is invited to dance by imitating dance material displayed on 
a large screen. This brings us back to open questions with musical interfaces: how do we learn 
gestures and interaction with digital media, and how does this affect our perception of gesture 
and sound?