<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=utf-8">
</head>
<body text="#000000" bgcolor="#FFFFFF">
<p>The submission deadline for contributions to our workshop has
been extended to
<br>
<br>
*August 29, 2018*
<br>
<br>
See you in Madrid!
<br>
<br>
Pablo<br>
<br>
CALL FOR PAPERS for the international workshop:
<br>
<br>
* Crossmodal Learning for Intelligent Robotics * in conjunction
with
<br>
IEEE/RSJ IROS 2018
<br>
<br>
* Madrid, Spain - Friday 5 October 2018 *
<br>
<br>
* Website: <a class="moz-txt-link-freetext"
href="http://www.informatik.uni-hamburg.de/wtm/WorkshopCLIR18/index.php">http://www.informatik.uni-hamburg.de/wtm/WorkshopCLIR18/index.php</a>
*
<br>
<br>
I. Aim and Scope
<br>
<br>
The ability to efficiently process crossmodal information is a key
feature of the human brain that provides a robust perceptual
experience and guides behavioural responses. Consequently, the
processing and integration of multisensory information streams such as
vision, audio, haptics and proprioception play a crucial role in the
development of autonomous agents and cognitive robots, enabling
efficient interaction with the environment even under conditions of
sensory uncertainty.
<br>
<br>
Multisensory representations have been shown to improve performance in
the research areas of human-robot interaction and sensory-driven motor
behaviour. The perception, integration, and segregation of multisensory
cues improve the capability to physically interact with objects and
persons with higher levels of autonomy. However, the multisensory input
must be represented and integrated in an appropriate way so that it
results in a reliable perceptual experience that triggers adequate
behavioural responses. The interplay of multisensory representations
can be used to resolve stimulus-driven conflicts for executive control.
Embodied agents can develop complex sensorimotor behaviour through
interaction with a crossmodal environment, leading to the development
and evaluation of scenarios that better reflect the challenges faced by
robots operating in the real world.
<br>
<br>
This half-day workshop focuses on presenting and discussing new
findings, theories, systems, and trends in crossmodal learning applied
to neurocognitive robotics. The workshop will feature invited speakers
with outstanding expertise in crossmodal learning.
<br>
<br>
II. Target Audience
<br>
<br>
This workshop is open to doctoral students and senior researchers
working in computer science, cognitive science, psychology,
neuroscience, and related areas with a focus on crossmodal learning.
<br>
<br>
III. Confirmed Speakers
<br>
<br>
1. * Yulia Sandamirskaya *
<br>
Institute of Neuroinformatics (INI), University of Zurich and ETH
Zurich, Switzerland
<br>
2. * Angelo Cangelosi *
<br>
Plymouth University and University of Manchester, UK
<br>
3. * Stefan Wermter *
<br>
University of Hamburg, Germany
<br>
<br>
IV. Submission
<br>
<br>
1. Topics of interest:
<br>
<br>
- New methods and applications for crossmodal processing
<br>
(e.g., integrating vision, audio, haptics, proprioception)
<br>
- Machine learning and neural networks for multisensory robot
perception
<br>
- Computational models of crossmodal attention and perception
<br>
- Bio-inspired approaches for crossmodal learning
<br>
- Crossmodal conflict resolution and executive control
<br>
- Sensorimotor learning for autonomous agents and robots
<br>
- Crossmodal learning for embodied and cognitive robots
<br>
<br>
2. For paper submission, use the following IEEE template:
<br>
<a class="moz-txt-link-rfc2396E"
href="http://ras.papercept.net/conferences/support/support.php"><http://ras.papercept.net/conferences/support/support.php></a>
<br>
<br>
3. Submitted papers should be limited to *2 pages (extended abstract)*
or *4 pages (short paper)*.
<br>
<br>
4. Send your PDF file to <a class="moz-txt-link-abbreviated"
href="mailto:barros@informatik.uni-hamburg.de">barros@informatik.uni-hamburg.de</a>
AND
<br>
<a class="moz-txt-link-abbreviated"
href="mailto:jirak@informatik.uni-hamburg.de">jirak@informatik.uni-hamburg.de</a>
<br>
<br>
Selected contributions will be presented during the workshop as
spotlight talks and in a poster session.
<br>
<br>
Contributors to the workshop will be invited to submit extended
versions of their manuscripts to a special issue (to be arranged).
Submissions will be peer-reviewed in accordance with the journal's
practices.
<br>
<br>
V. Important Dates
<br>
<br>
* Paper submission deadline: August 29, 2018 (extended from August 15, 2018)
<br>
* Notification of acceptance: September 5, 2018
<br>
* Camera-ready version: September 15, 2018
<br>
* Workshop: Friday 5 October 2018
<br>
<br>
VI. Organizers
<br>
<br>
* German I. Parisi * University of Hamburg, Germany
<br>
* Pablo Barros * University of Hamburg, Germany
<br>
* Doreen Jirak * University of Hamburg, Germany
<br>
* Jun Tani * Okinawa Institute of Science and Technology, Japan
<br>
* Yoonsuck Choe * Samsung Research &amp; Texas A&amp;M University, TX, USA
<br>
</p>
<pre class="moz-signature" cols="72">--
Dr.rer.nat. Pablo Barros
Postdoctoral Research Associate - Crossmodal Learning Project (CML)
Knowledge Technology
Department of Informatics
University of Hamburg
Vogt-Koelln-Str. 30
22527 Hamburg, Germany
Phone: +49 40 42883 2535
Fax: +49 40 42883 2515
barros at informatik.uni-hamburg.de
<a class="moz-txt-link-freetext" href="https://www.inf.uni-hamburg.de/en/inst/ab/wtm/people/barros.html">https://www.inf.uni-hamburg.de/en/inst/ab/wtm/people/barros.html</a>
<a class="moz-txt-link-freetext" href="https://www.inf.uni-hamburg.de/en/inst/ab/wtm/">https://www.inf.uni-hamburg.de/en/inst/ab/wtm/</a></pre>
</body>
</html>