AISB Convention 2015

The AISB Convention is an annual conference covering the range of AI and Cognitive Science, organised by the Society for the Study of Artificial Intelligence and Simulation of Behaviour. The 2015 Convention will be held at the Uni...


Yasemin Erden on BBC

AISB Committee member, and Philosophy Programme Director and Lecturer, Dr Yasemin J. Erden interviewed for the BBC on 29 October 2013. Speaking on the Today programme for BBC Radio 4, as well as the Business Report for BBC world N...


Mark Bishop on BBC ...

Mark Bishop, Chair of the Society for the Study of Artificial Intelligence and the Simulation of Behaviour, appeared on Newsnight to discuss the ethics of ‘killer robots’. He was approached to give his view on a report raising questions on the et...


AISB YouTube Channel

The AISB has launched a YouTube channel. The channel currently holds a number of videos from the AISB 2010 Convention. Videos include the AISB round t...


Lighthill Debates

The Lighthill debates from 1973 are now available on YouTube.



AISB event Bulletin Item

CALL FOR PARTICIPATION: WHITEHEAD LECTURE - 'Motion, Sound, and Interaction', Wed 5th Dec, Goldsmiths, London, UK


The eighth Whitehead Lecture of autumn term 2012 will be given by Frédéric Bevilacqua, 
head of the Real-Time Musical Interactions team at IRCAM in Paris.

An abstract for the lecture and short biography for the speaker are appended below.

The lecture will take place at 4pm on Wednesday 5th December in room 256 of the Richard Hoggart 
Building, Goldsmiths College.

Motion, Sound, and Interaction

ABSTRACT: I will present an overview of research performed by the Real-Time Musical Interactions 
team at IRCAM (Paris). We have developed various methods and systems that allow for interaction 
between gesture, motion and digital media. This research has been influenced by sustained 
collaborations with musicians/composers and dancers/choreographers. For example, the study of 
musicians' gestures allowed us to formalize key concepts about continuous gesture control, movement 
segmentation and co-articulation. This guided us in designing various real-time gesture analysis 
systems using machine learning techniques, such as the gesture follower, which enables gesture 
synchronization with sound synthesis. Concrete applications concerning augmented musical 
instruments, new music interfaces and music games will be described. Finally, recent research 
on sensorimotor learning will be presented, which opens novel perspectives for the design of 
musical interfaces and medical applications.
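
The "gesture follower" mentioned in the abstract aligns an incoming movement with a pre-recorded template gesture in real time, estimating how far through the template the performer currently is so that sound playback can stay synchronized. The sketch below is illustrative only: IRCAM's actual system is based on probabilistic (HMM-style) models, whereas this toy version uses online dynamic time warping, and all names and parameters here are assumptions, not the team's implementation.

```python
import numpy as np

def follow_gesture(template, stream):
    """Toy online-DTW gesture follower (illustrative, not IRCAM's method).

    template: (n, d) array of feature vectors for the reference gesture.
    stream:   iterable of incoming (d,) feature frames.
    Returns, for each incoming frame, the estimated position within the
    template as a fraction in [0, 1], e.g. to drive sound playback.
    """
    n = len(template)
    prev = np.full(n, np.inf)   # previous column of the cumulative cost matrix
    positions = []
    for t, frame in enumerate(stream):
        # Local distance between this frame and every template sample.
        dist = np.linalg.norm(template - frame, axis=1)
        cur = np.empty(n)
        cur[0] = dist[0] + (0.0 if t == 0 else prev[0])
        for i in range(1, n):
            # Standard DTW recursion: insertion, deletion, or match step.
            best = min(cur[i - 1], prev[i], prev[i - 1]) if t > 0 else cur[i - 1]
            cur[i] = dist[i] + best
        prev = cur
        # The cheapest alignment endpoint is the estimated template position.
        positions.append(int(np.argmin(cur)) / (n - 1))
    return positions

# Toy example: the "performance" is the template played at half speed,
# so the estimated position should advance from 0 toward 1.
template = np.linspace(0.0, 1.0, 10).reshape(-1, 1)
stream = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
pos = follow_gesture(template, stream)
```

In a musical application, each position estimate would be mapped to a playback time in a sound file, so the audio speeds up or slows down with the performer's movement.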
BRIEF BIO: Frédéric Bevilacqua is the head of the Real-Time Musical Interactions team at IRCAM 
(Institute for Research and Coordination in Acoustics/Music) in Paris. His research interests concern 
gestural interactive systems and interfaces for musical expression. He holds a master's degree in 
physics and a Ph.D. in biomedical optics from EPFL (Swiss Federal Institute of Technology in 
Lausanne). He also studied music at the Berklee College of Music in Boston and has participated 
in several music and media arts projects. From 1999 to 2003 he conducted research at the Beckman 
Laser Institute, University of California, Irvine. He joined IRCAM in October 2003 as a researcher 
on gesture analysis for music and performing arts.