<div dir="ltr"><div><p class="MsoNormal"><span style="font-size:12pt;line-height:107%;font-family:arial,sans-serif;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial">--------------------------------------------</span><span style="font-size:12pt;line-height:107%;font-family:arial,sans-serif;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal"><span style="font-size:12pt;font-family:arial,sans-serif">Due to several requests, we would
like to inform that the submission deadline of our workshop will be extended as
follow.<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">***<b>Important Dates (Extended)**</b>*:<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Paper submission
deadline: <b>5 May 2017</b><span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Notification of
acceptance: <b>9 May 2017</b><span></span></span></p>

<p class="MsoNormal"><span style="font-size:12pt;line-height:107%;font-family:arial,sans-serif;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial">--------------------------------------------<span></span></span></p>

<p class="MsoNormal" style="text-align:justify"><span style="font-size:12pt;line-height:107%;font-family:arial,sans-serif">We are planning to organize
a special issue in a top journal where extended version of the accepted
submission will be invited to participate.<span></span></span></p>

<p class="MsoNormal"><span style="font-size:12pt;line-height:107%;font-family:arial,sans-serif;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial">--------------------------------------------<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal"><span style="font-size:12pt;font-family:arial,sans-serif">Dear Colleagues,<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal"><span style="font-size:12pt;font-family:arial,sans-serif"><span> </span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal"><span style="font-size:12pt;font-family:arial,sans-serif">We are calling for contributions on
our ICRA 2017 workshop on "The Robotic Sense of Touch: From Sensing to
Understanding"<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal"><span style="font-size:12pt;font-family:arial,sans-serif"> <span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">This workshop aims to
bring together experts in the fields of tactile sensor design, tactile data analysis,
machine learning and cognitive modeling to share their knowledge, through a mix
of oral presentations, round-table discussions, poster sessions and live
demonstrations.<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif"> <span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">***Call for
contributions***<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">We are soliciting the
submission of extended abstracts *<b>(1-3 pages PDF, IEEE template, optional
videos)</b> to <a href="mailto:rsot.icra2017@gmail.com" target="_blank"><b>rsot.icra2017@gmail.com</b></a>*.
*<b>The submission deadline is extended to 5 May 2017</b>*. Furthermore,
we particularly call for live demonstrations to be presented on the workshop
day. We encourage researchers as well as companies (both hardware and software
companies) to contribute to the workshop.<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif"> <span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Accepted submissions
will be presented during the “Demonstrations and poster session” of the
workshop. Selected submissions will be given time for oral presentations (15
minutes including Q&A). Please indicate in your email if you want to
present as a poster, oral, and/or live demonstration.<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif"> <span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">***<b>Important Dates
(Extended)**</b>*:<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Paper submission
deadline: <b>5 May 2017</b><span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Notification of
acceptance: <b>9 May 2017</b><span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Workshop day: 29
May 2017<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif"> <span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Abstract:<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">This workshop (<a href="https://roboticsenseoftouchws.wordpress.com/" target="_blank">https://roboticsenseoftouchws.wordpress.com</a>)
focuses on the development of novel tactile sensors (i.e. the bodyware) and how
they can contribute to robot intelligence (i.e. the mindware). Robots need
touch to interact with the surroundings (humans and/or objects) safely and
effectively, to learn about the outside world and to develop self-awareness. To
achieve these goals, the artificial skin of the next generation should measure
temperature, vibration, proximity and the complete force vectors on multiple
contact points; also, it should be both soft and robust to facilitate long-term
interactions. Still, major challenges are posed by the need to analyze and
interpret massive amounts of data in a fast and accurate way, and to combine
such sensing information with other cognitive and perceptual processes to
achieve real understanding. While advanced computational techniques (e.g. deep
learning, Bayesian inference) can critically improve data representations,
bio-inspired strategies for multimodal integration, prediction and reasoning
seem to be necessary as well to revolutionize the robotic sense of touch.
Therefore, the goal of this workshop is to discuss if and how the recent
advancements in tactile technology and data analysis have been accompanied by
an increased understanding of the ways in which tactile perception can support
robot autonomy and cognition.<span></span></span></p>
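
As a toy illustration of the Bayesian integration mentioned above, consider the textbook Gaussian case of cue combination, in which a visual and a tactile estimate of the same property are fused by precision weighting. The Python sketch below is only a minimal illustration; the property ("roughness") and all numbers are made-up assumptions, not methods or results of the workshop.

# Minimal sketch of Bayesian cue combination: fusing a visual and a
# tactile estimate of the same property ("roughness" is a hypothetical
# example). Assumes independent Gaussian noise on each sensory estimate.

def fuse_gaussian(mu_v, var_v, mu_t, var_t):
    """Precision-weighted fusion of two independent Gaussian estimates."""
    w_v = 1.0 / var_v          # precision (inverse variance) of vision
    w_t = 1.0 / var_t          # precision of touch
    var = 1.0 / (w_v + w_t)    # fused variance: smaller than either input
    mu = var * (w_v * mu_v + w_t * mu_t)
    return mu, var

# Vision is noisy (var 0.20), touch is precise (var 0.05): the fused
# estimate is pulled toward the tactile reading.
mu, var = fuse_gaussian(0.6, 0.20, 0.8, 0.05)
print(mu, var)  # -> 0.76, 0.04

Because the fused variance is always lower than either unimodal variance, touch can formally sharpen, rather than merely duplicate, what vision provides.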

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif"> <span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Content:<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Part of human
intelligence comes from using the sense of touch to explore the own-body and
learn about the world. Similarly, a humanoid robot can use tactile sensing to
learn about itself and its surroundings, and to improve its ability in
interacting with humans and objects. Indeed, there has been a remarkable
progress in tactile sensors for robotics. Several robotic platforms and hands
are equipped with sensors which allow measuring pressure, proximity and
temperature; also, novel technologies (soft, small, distributed, stretchable
sensors) can be used to cover the entire body of a humanoid. These sensors can
allow robots to actively explore objects and extract information which is hidden
or difficult to extract from vision, like texture, material and weight;
however, this requires the development of appropriate learning strategies which
incorporate explorative behaviors, signal processing and machine learning.<span></span></span></p>
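
To make the last point concrete, below is a minimal, self-contained Python sketch of such a pipeline, under stated assumptions: the tactile signals are synthetic stand-ins (two hypothetical "materials" with different dominant vibration frequencies), the features are coarse spectral-band energies, and the classifier is an off-the-shelf SVM from scikit-learn. A real system would use recorded sensor data and task-appropriate features.

# Sketch: classify surface "material" from a tactile vibration signal.
# Data, features and labels are illustrative assumptions only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def spectral_features(signal):
    """Mean energy in 8 coarse frequency bands of one vibration recording."""
    spectrum = np.abs(np.fft.rfft(signal))
    return np.array([band.mean() for band in np.array_split(spectrum, 8)])

def make_sample(material):
    """Synthetic stand-in for a tactile vibration signal (1 s at 256 Hz)."""
    t = np.linspace(0.0, 1.0, 256)
    freq = 20 if material == 0 else 45  # dominant frequency, made up
    return np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(t.size)

labels = np.array((0, 1) * 100)  # 200 alternating samples of two materials
X = np.array([spectral_features(make_sample(m)) for m in labels])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)
clf = SVC().fit(X_train, y_train)  # off-the-shelf classifier, no tuning
print("held-out accuracy:", clf.score(X_test, y_test))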

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif"> <span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Topics of interest
include, but are not limited to:<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">- Multimodal
distributed skin sensors;<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">- Modular skin
sensors;<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">- Force sensing;<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">-
Soft/flexible/stretchable sensors;<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">- Large-scale data
processing of tactile information;<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">- Use of tactile data
for interaction control, tactile servoing, tactile feature recognition, object
handling;<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">- Integration of touch
with other sensing modalities (e.g. vision, inertial sensing);<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">- Whole-body multiple
contact behaviors of humanoid robots;<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">- Predictive models
for touch perception;<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">- Slip prediction,
detection and control;<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">- Tactile sensing for
self-contact, self-perception and self-calibration in humanoid robots;<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">- Haptic
representations of objects and their affordances.<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif"> <span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Organizers:<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Sophon Somlor (contact
person), Waseda University<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Alexander Schmitz,
Waseda University<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Lorenzo Jamone, The
Queen Mary University of London<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span lang="IT" style="font-size:12pt;font-family:arial,sans-serif">Lorenzo Natale, Istituto Italiano di Tecnologia</span><span style="font-size:12pt;font-family:arial,sans-serif"><span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">Gordon Cheng,
Technical University of Munich<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif"> <span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif">***URL***<span></span></span></p>

<p class="MsoNormal" style="margin-bottom:0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="font-size:12pt;font-family:arial,sans-serif"><a href="https://roboticsenseoftouchws.wordpress.com/" target="_blank">https://roboticsenseoftouchws.wordpress.com/</a></span></p></div><br clear="all"><div><div class="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div style="font-size:small"><div>================================================</div><div>Dr. Sophon Somlor<br></div><div>Junior Researcher</div><div>Top Global University Project<br>Unit for Frontier of Embodiment Informatics:ICT and Robotics<br>Faculty of Science and Engineering<br></div><div>Waseda University<br></div><div><br></div><div>早稲田大学 理工学術院</div><div>スーパーグローバル大学創生支援<br>ICT・ロボット工学拠点</div><div>次席研究員<br>ソムロア ソフォン</div><div>================================================</div></div></div></div></div></div></div></div></div></div></div></div></div>
</div>