
Notice

AISB event Bulletin Item

CFP: Workshop on Cores, Clusters, and Clouds

http://lccc.eecs.berkeley.edu/
Contact:

CALL FOR PAPERS

           Learning on Cores, Clusters, and Clouds
            NIPS 2010 Workshop, Whistler, British Columbia, Canada

               http://lccc.eecs.berkeley.edu/

         -- Submission Deadline: October 17, 2010 --

=========================================================================

In the current era of web-scale datasets, high-throughput biology, and multilingual machine translation, modern datasets no longer fit on a single computer and traditional machine learning algorithms often have prohibitively long running times. Parallel and distributed machine learning is no longer a luxury; it has become a necessity. Moreover, industry leaders have already declared that clouds are the future of computing, and new computing platforms such as Microsoft's Azure and Amazon's EC2 are bringing distributed computing to the masses.

The machine learning community is reacting to this trend in computing by developing new parallel and distributed machine learning techniques. However, many important challenges remain unaddressed. Practical distributed learning algorithms must deal with limited network resources, node failures, and nonuniform network latencies. In cloud environments, where network latencies are especially large, distributed learning algorithms should take advantage of asynchronous updates.
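
As a rough illustration of the asynchronous-update idea mentioned above (this sketch is not part of the call itself), the following Python example runs several worker threads that apply stochastic gradient steps to a shared parameter vector without synchronising with one another, in the spirit of lock-free asynchronous SGD on a multicore machine. The synthetic least-squares problem and all names (make_data, async_worker, the learning rate) are assumptions chosen for the example.

import threading
import numpy as np

def make_data(n=1000, d=10, seed=0):
    # Synthetic least-squares problem: y = X w_true + small noise.
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(n, d))
    w_true = rng.normal(size=d)
    y = X @ w_true + 0.01 * rng.normal(size=n)
    return X, y

def async_worker(w, X, y, steps, lr):
    # Each worker samples examples and applies gradient updates to the
    # shared vector w in place; there are no locks, so updates from
    # different workers interleave asynchronously.
    rng = np.random.default_rng()
    n = len(y)
    for _ in range(steps):
        i = rng.integers(n)
        grad = (X[i] @ w - y[i]) * X[i]   # gradient of 0.5*(x_i.w - y_i)^2
        w -= lr * grad                    # in-place, unsynchronised update

if __name__ == "__main__":
    X, y = make_data()
    w = np.zeros(X.shape[1])
    threads = [threading.Thread(target=async_worker, args=(w, X, y, 2000, 0.01))
               for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print("final mean squared error:", float(np.mean((X @ w - y) ** 2)))

In a cloud setting the shared vector would live on a parameter server and the workers on separate nodes, but the asynchronous pattern is the same.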

Many similar issues have been addressed in other fields, where distributed computation is more mature, such as convex optimization and numerical computation. We can learn from their successes and their failures.

The one-day workshop on "Learning on Cores, Clusters, and Clouds" aims to bring together experts in the field and curious newcomers, to present the state of the art in applied and theoretical distributed learning, and to map out the challenges ahead. The workshop will include invited and contributed presentations from leaders in distributed learning and adjacent fields.

We would like to invite short high-quality submissions on the following topics:

  * Distributed algorithms for online and batch learning
  * Parallel (multicore) algorithms for online and batch learning
  * Computational models and theoretical analysis of distributed and
    parallel learning
  * Communication avoiding algorithms
  * Learning algorithms that are robust to hardware failures
  * Experimental results and interesting applications

    Interesting submissions on other relevant topics not listed above
    are also welcome. Due to time constraints, most accepted
    submissions will be presented as poster spotlights.


          _Submission guidelines:_

    Submissions should be written as extended abstracts, no longer
    than 4 pages in the NIPS LaTeX style. NIPS style files and
    formatting instructions can be found at
    http://nips.cc/PaperInformation/StyleFiles. Submissions should
    include the authors' names and affiliations, since the review
    process will not be double blind. The extended abstract may be
    accompanied by an unlimited appendix and other supplementary
    material, with the understanding that anything beyond 4 pages may
    be ignored by the program committee. Please send your submission
    by email to submit.lccc@gmail.com before October 17 at midnight
    PST. Notifications will be given on or before November 7. Work
    that was recently published or presented elsewhere is allowed,
    provided that the extended abstract mentions this explicitly;
    work that was presented at non-machine-learning conferences is
    especially encouraged.


          _Organizers:_


    Alekh Agarwal (UC Berkeley), Ofer Dekel (Microsoft), John Duchi
    (UC Berkeley), John Langford (Yahoo!)


          _Program Committee:_


    Ron Bekkerman (LinkedIn), Misha Bilenko (Microsoft), Ran
    Gilad-Bachrach (Microsoft), Guy Lebanon (Georgia Tech), Ilan Lobel
    (NYU), Gideon Mann (Google), Ryan McDonald (Google), Ohad Shamir
    (Microsoft), Alex Smola (Yahoo!), S V N Vishwanathan (Purdue),
    Martin Wainwright (UC Berkeley), Lin Xiao (Microsoft)