AISB event Bulletin Item

CFP: Workshop on Cores, Clusters, and Clouds


           Learning on Cores, Clusters, and Clouds
            NIPS 2010 Workshop, Whistler, British Columbia, Canada


         -- Submission Deadline: October 17, 2010 --


In the current era of web-scale datasets, high-throughput biology, and multilingual machine translation, modern datasets no longer fit on a single computer and traditional machine learning algorithms often have prohibitively long running times. Parallel and distributed machine learning is no longer a luxury; it has become a necessity. Moreover, industry leaders have already declared that clouds are the future of computing, and new computing platforms such as Microsoft's Azure and Amazon's EC2 are bringing distributed computing to the masses.

The machine learning community is reacting to this trend in computing by developing new parallel and distributed machine learning techniques. However, many important challenges remain unaddressed. Practical distributed learning algorithms must deal with limited network resources, node failures and nonuniform network latencies. In cloud environments, where network latencies are especially large, distributed learning algorithms should take advantage of asynchronous updates.
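The asynchronous-update style mentioned above can be illustrated with a minimal, hypothetical sketch: several workers apply lock-free SGD updates to a shared parameter, tolerating the stale reads that asynchrony implies (the function name `async_sgd` and the toy squared-loss objective are illustrative, not drawn from any submission):

```python
import threading
import random

def async_sgd(data, n_workers=4, epochs=50, lr=0.01):
    """Asynchronous SGD sketch: workers update a shared parameter
    without locks, minimising sum_i (w - x_i)^2 over the data."""
    w = [0.0]  # shared parameter, updated lock-free by all workers

    def worker(shard):
        for _ in range(epochs):
            for x in shard:
                grad = 2.0 * (w[0] - x)  # gradient of (w - x)^2
                w[0] -= lr * grad        # in-place update, no lock

    # Split the data round-robin into one shard per worker.
    shards = [data[i::n_workers] for i in range(n_workers)]
    threads = [threading.Thread(target=worker, args=(s,)) for s in shards]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return w[0]

data = [random.gauss(5.0, 1.0) for _ in range(400)]
w = async_sgd(data)
# w ends up near the data mean despite unsynchronised updates
```

Despite interleaved, unsynchronised writes, the estimate converges close to the optimum here; handling node failures and large cloud latencies on top of this is exactly the kind of open problem the workshop targets.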

Many similar issues have been addressed in other fields, where distributed computation is more mature, such as convex optimization and numerical computation. We can learn from their successes and their failures.

The one day workshop on "Learning on Cores, Clusters, and Clouds" aims to bring together experts in the field and curious newcomers, to present the state-of-the-art in applied and theoretical distributed learning, and to map out the challenges ahead. The workshop will include invited and contributed presentations from leaders in distributed learning and adjacent fields.

We would like to invite short high-quality submissions on the following topics:

  * Distributed algorithms for online and batch learning
  * Parallel (multicore) algorithms for online and batch learning
  * Computational models and theoretical analysis of distributed and
    parallel learning
  * Communication avoiding algorithms
  * Learning algorithms that are robust to hardware failures
  * Experimental results and interesting applications

    Interesting submissions on other relevant topics not listed above
    are also welcome. Due to time constraints, most accepted
    submissions will be presented as poster spotlights.

          _Submission guidelines:_

    Submissions should be written as extended abstracts, no longer
    than 4 pages in the NIPS LaTeX style. NIPS style files and
    formatting instructions can be found at . Submissions should
    include the authors' names and affiliations, since the review
    process will not be double blind. The extended abstract may be
    accompanied by an unlimited appendix and other supplementary
    material, with the understanding that anything beyond 4 pages may
    be ignored by the program committee. Please send your submission
    by email to  before October 17 at midnight PST. Notifications
    will be given on or before November 7. Work that was recently
    published or presented elsewhere is allowed, provided that the
    extended abstract mentions this explicitly; work that was
    presented at non-machine-learning conferences is especially
    encouraged.


          _Organizers:_

    Alekh Agarwal (UC Berkeley), Ofer Dekel (Microsoft), John Duchi
    (UC Berkeley), John Langford (Yahoo!)

          _Program Committee:_

    Ron Bekkerman (LinkedIn), Misha Bilenko (Microsoft), Ran
    Gilad-Bachrach (Microsoft), Guy Lebanon (Georgia Tech), Ilan Lobel
    (NYU), Gideon Mann (Google), Ryan McDonald (Google), Ohad Shamir
    (Microsoft), Alex Smola (Yahoo!), S V N Vishwanathan (Purdue),
    Martin Wainwright (UC Berkeley), Lin Xiao (Microsoft)