AISB event Bulletin Item

CALL FOR PAPERS: Gaze in HRI: From Modeling to Communication Workshop, March 5, 2012, Boston, MA USA

Important Dates:

Papers Due - January 20, 2012
Notifications - January 30, 2012
HRI Early registration deadline - January 31, 2012
Final paper versions due - February 24, 2012
Workshop - March 5, 2012

Call for papers - 1st call:

We are now accepting submissions of both long (up to 6 pages) and short (up to 3 pages) papers for
presentation at the workshop, "Gaze in HRI: From Modeling to Communication." Papers will be 
subject to a peer-review process. Accepted papers will be archived online on the workshop website.

We are interested in submissions of original HRI research, or research in related fields that is 
relevant to the HRI community. Of particular interest are papers that:

   *Propose or describe novel methodologies for collecting or analyzing human gaze during interaction
   *Investigate a specific aspect of social gaze behavior through interaction with a robot
   *Propose or describe robot gaze controllers based on empirical interaction data (see the sketch after this list)
   *Investigate social gaze in human-human interaction, with a discussion of the relevance of the 
results to human-robot interaction
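
For the controller topic above, a common pattern is to alternate between partner-directed gaze and 
gaze aversion, with the time spent in each state sampled from distributions fitted to recorded 
interaction data. The Python sketch below is illustrative only, not a system presented at the 
workshop; the state names and log-normal parameters are hypothetical placeholders for values that 
would be estimated from such data.

import random

# Minimal sketch of a data-driven social gaze controller (illustrative only).
# The log-normal parameters are placeholders; in a real controller they would
# be fitted to gaze timings measured in human-human or human-robot interaction.
GAZE_PARAMS = {
    "look_at_partner": {"mu": 0.8, "sigma": 0.4},  # hypothetical values
    "avert_gaze":      {"mu": 0.1, "sigma": 0.5},
}

def sample_duration(state):
    """Sample how long to remain in a gaze state, in seconds."""
    p = GAZE_PARAMS[state]
    return random.lognormvariate(p["mu"], p["sigma"])

def gaze_schedule(total_time):
    """Yield (state, duration) pairs, alternating partner-directed gaze and aversion."""
    t, state = 0.0, "look_at_partner"
    while t < total_time:
        duration = sample_duration(state)
        yield state, duration
        t += duration
        state = "avert_gaze" if state == "look_at_partner" else "look_at_partner"

if __name__ == "__main__":
    for state, duration in gaze_schedule(10.0):
        print(f"{state:16s} {duration:5.2f} s")

A real controller would, in addition, react to events such as the partner's speech, gaze shifts, or 
task actions rather than following a fixed alternating schedule.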

About the workshop:

The purpose of this workshop is to explore the role of social gaze in human-robot interaction, 
both how to measure gaze behavior by humans and how to implement it in robots that interact with 
them. Social gaze, gaze directed at an interaction partner, has received increasing attention in 
human-robot interaction research. While traditional robotics research has treated robot gaze solely 
as a means of identifying and manipulating objects, researchers in HRI have come to recognize that 
gaze is a social behavior in addition to a sensing channel.

This workshop will approach the problem of understanding the role of social gaze in human-robot 
interaction from two complementary perspectives: investigating human gaze to derive design 
principles for robots, and experimentally evaluating human-robot gaze interaction to assess how 
humans engage in gaze behavior with robots.

The goal of the workshop is to exchange ideas and to develop and improve methodologies for this 
growing area of research. We hope to bring together researchers in the field of human-robot 
interaction who work on gaze behavior to find answers to a variety of questions, for example:

*How do people perceive robot gaze behavior during human-robot interaction?
*How can human gaze behavior provide a robot with information about the state of the interaction?
*Which properties of human-human gaze transfer to human-robot gaze and which don't?
*How should one model human-robot gaze for autonomous robots?
*What techniques can be used to establish and measure mutual gaze and joint attention between humans and robots? (A simple geometric check is sketched below.)
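
For the last question, a minimal geometric check is to treat each partner's gaze as a ray from the 
eyes and test whether it falls on the other's head (mutual gaze) or whether both rays fall on the 
same object (joint attention). The Python sketch below is an assumption-laden illustration, not a 
technique endorsed by the workshop: the 10-degree threshold and the coordinates are placeholders, 
and in practice the human's eye position and gaze direction would come from an eye- or head-tracker 
while the robot's would come from its own kinematics.

import numpy as np

def angle_between(v1, v2):
    """Angle in radians between two 3D direction vectors."""
    v1 = np.asarray(v1, dtype=float)
    v2 = np.asarray(v2, dtype=float)
    v1 = v1 / np.linalg.norm(v1)
    v2 = v2 / np.linalg.norm(v2)
    return np.arccos(np.clip(np.dot(v1, v2), -1.0, 1.0))

def looking_at(eye_pos, gaze_dir, target_pos, threshold_rad=np.deg2rad(10)):
    """True if the gaze ray from eye_pos points at target_pos within the threshold."""
    to_target = np.asarray(target_pos, dtype=float) - np.asarray(eye_pos, dtype=float)
    return angle_between(gaze_dir, to_target) < threshold_rad

def mutual_gaze(human_eye, human_gaze, robot_eye, robot_gaze):
    """Mutual gaze: each partner's gaze ray is directed at the other's head."""
    return (looking_at(human_eye, human_gaze, robot_eye) and
            looking_at(robot_eye, robot_gaze, human_eye))

def joint_attention(human_eye, human_gaze, robot_eye, robot_gaze, object_pos):
    """Joint attention: both gaze rays are directed at the same object."""
    return (looking_at(human_eye, human_gaze, object_pos) and
            looking_at(robot_eye, robot_gaze, object_pos))

# Made-up coordinates (metres): a human and a robot facing each other over a table.
human_eye = np.array([0.0, 0.0, 1.6])
robot_eye = np.array([1.0, 0.0, 1.2])
obj = np.array([0.5, 0.4, 0.9])

print(mutual_gaze(human_eye, robot_eye - human_eye, robot_eye, human_eye - robot_eye))
print(joint_attention(human_eye, obj - human_eye, robot_eye, obj - robot_eye))

The angular threshold would normally be calibrated against annotated interaction data, and the same 
ray-target test can be applied over time to measure how often and for how long mutual gaze or joint 
attention episodes occur.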

Organizing committee:

Frank Broz, University of Hertfordshire
Hagen Lehmann, University of Hertfordshire
Yukiko Nakano, Seikei University
Bilge Mutlu, University of Wisconsin-Madison