Yasemin Erden on BBC

AISB Committee member, Philosophy Programme Director and Lecturer Dr Yasemin J. Erden was interviewed for the BBC on 29 October 2013. Speaking on the Today programme for BBC Radio 4, as well as the Business Report for BBC World N...


AISB Convention 2014

AISB-50: a convention commemorating both 50 years since the founding of the Society for the Study of Artificial Intelligence and the Simulation of Behaviour (the AISB) and 60 years since the death of Alan Turing, founding fathe...


Mark Bishop on BBC ...

Mark Bishop, Chair of the Society for the Study of Artificial Intelligence and the Simulation of Behaviour, appeared on Newsnight to discuss the ethics of ‘killer robots’. He was approached to give his view on a report raising questions on the et...


AISB YouTube Channel

The AISB has launched a YouTube channel: http://www.youtube.com/user/AISBTube. The channel currently holds a number of videos from the AISB 2010 Convention. Videos include the AISB round t...


Lighthill Debates

The Lighthill debates from 1973 are now available on YouTube.


Notice

AISB Miscellaneous Bulletin Item

1st CFP: JMUI Special Issue "Nonverbal Behavior Synthesis For Embodied Agents"

http://www.editorialmanager.com/jmui/

First Call for Papers
for a Special Issue of
the Journal on Multimodal User Interfaces (JMUI), Springer, on

NONVERBAL BEHAVIOR SYNTHESIS FOR EMBODIED AGENTS:
Producing Expressive, Individual, Interpersonally and Socio-Culturally
Aware Behavior

===================================================================

Deadline for paper submission: 4 September 2009

This special issue will address the question of how to synthesize
realistic, contextually appropriate nonverbal behavior for embodied
agents. Such techniques may be based on empirical studies, cognitive
models or statistical modeling of observed behaviors. The range of
nonverbal behaviors includes all modalities of human communication,
most notably speech, gesture, facial expression and posture. Key
challenges include how to model cross-modal synchronization and how to
take into account the world context, especially that of the
participating interlocutors - both human and artificial. Current
research aims at producing nonverbal behavior that tightly coheres
across modalities while relating to the behavior of other
interlocutors with respect to timing/form and taking social factors
like status and intercultural differences into account. Approaches
range from using low-level data such as motion capture, prosody or
physical forces to drive the synthesis to more high-level, functional
models employing grammars, semantics, information structure and
discourse knowledge.

The articles of this special issue will share the vision of making
virtual agents more believable as a crucial step in making them an
acceptable interface metaphor. Thus deployed, such agents have
considerable potential as virtual service or tutoring assistants,
avatars in online social worlds, or non-player characters in
computer games.

We encourage submissions from AI researchers working on models for
multimodal behavior planning, as well as interdisciplinary research
covering psychology and the social sciences with a focus on evaluation
studies. We specifically welcome contributions from computer graphics
with a focus on efficient, data-driven algorithms. Finally, we also
invite more theoretical contributions that clarify the conceptual
levels of intra-agent processing as well as the contextual factors
that affect the communicative setting and thus must be included in any
model of behavior synthesis.

===================================================================
Guest editors
===================================================================

Michael Kipp, DFKI, Germany
Michael Neff, UC Davis, USA
Jean-Claude Martin, LIMSI-CNRS, France

===================================================================
Important Dates
===================================================================

Deadline for paper submission:    4 September 2009
Notification of acceptance:       9 October 2009
Camera-ready version:             26 October 2009
Publication date:                 December 2009

===================================================================
Topics
===================================================================

- Efficient methods and tools for real-time nonverbal behavior
synthesis for agents, including novel computer animation techniques

- Representations for nonverbal behavior; both for empirical research
and as an interface between planning and realization modules

- Virtual rapport and social resonance

- Social and cultural side-conditions of nonverbal behavior production

- Constructing and operationalizing cognitive models of nonverbal
behavior synthesis

- Interpersonal relation modeling

- Evaluation studies of artificial nonverbal behavior

- Empirical models for behavior synthesis

- Modeling idiosyncratic nonverbal behavior, modeling style

- Expressive embodied agents as a research tool for the social
sciences

- Applications: Multimodal dialogue systems with embodied agents,
tools for creating nonverbal behavior, multimodal interaction in
assisting, tutoring, therapy systems and computer games

===================================================================
Instructions for Authors
===================================================================

Submissions should be 8-12 pages long and must be written in English.

Formatting instructions and templates are available on:
http://www.jmui.org

Authors should register and upload their submission on the following
website: http://www.editorialmanager.com/jmui/

During the submission process, please select "NONVERBAL special issue"
as article type.

Authors are encouraged to send a brief email to kipp@dfki.de as soon
as possible, indicating their intention to participate and including
their contact information and the topic they intend to address in
their submission.