<div dir="ltr"><div>Dear Colleagues,</div><div><br></div><div>We would like to invite you to contribute a chapter to the upcoming volume entitled “Neural and Machine Learning for Emotion and Empathy Recognition: Experiences from the OMG-Challenges”, to be published in the Springer Series on Competitions in Machine Learning. Our book will be available by mid-2020.</div><div><br>Website: <a href="https://easychair.org/cfp/OMGBook2019" target="_blank">https://easychair.org/cfp/OMGBook2019</a><br></div><div><br></div><div>A short description of our volume follows:<br></div><div><br></div><div>Emotional expression perception and categorization are extremely popular topics in the affective computing community. However, most research in this field does not consider the inclusion of emotions in an agent's decision-making process. Treating emotion expressions as the final goal, although necessary, reduces the usability of such solutions in more complex scenarios. To create a general affective model that serves as a modulator for learning different cognitive tasks, such as modeling intrinsic motivation, creativity, dialog processing, grounded learning, and human-level communication, instantaneous emotion perception cannot be the pivotal focus.</div><div><br></div><div>This book aims to present recent contributions to multimodal emotion recognition and empathy prediction that take into account the long-term development of affective concepts. In this regard, we provide access to two datasets: the OMG-Emotion Behavior Recognition and OMG-Empathy Prediction datasets. 
These datasets were designed, collected, and formalized to be used in the OMG-Emotion Recognition Challenge and the OMG-Empathy Prediction Challenge, respectively. All participants of our challenges are invited to submit their contributions to our book. We also invite interested authors to use our datasets in the development of inspiring and innovative research on affective computing. By compiling these solutions and editing this book, we hope to inspire further research in affective and cognitive computing over longer timescales.</div><div><br></div><div>TOPICS OF INTEREST</div><div><br></div><div>The topics of interest for this call for chapters include, but are not limited to:</div><div>- New theories and findings on continuous emotion recognition</div><div>- Multi- and cross-modal emotion perception and interpretation</div><div>- Novel neural network models for affective processing</div><div>- Lifelong affect analysis, perception, and interpretation</div><div>- New neuroscientific and psychological findings on continuous emotion representation</div><div>- Embodied artificial agents for empathy and emotion appraisal</div><div>- Machine learning for affect-driven interventions</div><div>- Socially intelligent human-robot interaction</div><div>- Personalized systems for human affect recognition</div><div>- New theories and findings on empathy modeling</div><div>- Multimodal processing of empathetic and social signals</div><div>- Novel neural network models for empathy understanding</div><div>- Lifelong models for empathetic interactions</div><div>- Empathetic human-robot interaction scenarios</div><div>- New neuroscientific and psychological findings on empathy 
representation</div><div>- Multi-agent communication for empathetic interactions</div><div>- Empathy as a decision-making modulator</div><div>- Personalized systems for empathy prediction</div><div><br></div><div>Each contributed chapter is expected to present a novel research study, a comparative study, or a survey of the literature.</div><div><br></div><div>SUBMISSIONS</div><div><br></div><div>All submissions should be made via EasyChair:</div><div><br></div><div> <a href="https://easychair.org/cfp/OMGBook2019" target="_blank">https://easychair.org/cfp/OMGBook2019</a><br></div><div><br></div><div>Original artwork and a signed copyright release form will be required for all accepted chapters. For author instructions, please visit:<br></div><div> <a href="https://www.springer.com/us/authors-editors/book-authors-editors/resources-guidelines/book-manuscript-guidelines" target="_blank">https://www.springer.com/us/authors-editors/book-authors-editors/resources-guidelines/book-manuscript-guidelines</a></div><div><br></div><div>We would also like to announce that our two datasets, related to emotion expressions and empathy prediction, are now fully available. 
You can access them and find more information on their websites:<br></div><div><div><br></div><div>- OMG-EMOTION - <a href="https://www2.informatik.uni-hamburg.de/wtm/omgchallenges/omg_emotion.html" target="_blank">https://www2.informatik.uni-hamburg.de/wtm/omgchallenges/omg_emotion.html</a><br></div><div><div>- OMG-EMPATHY - <a href="https://www2.informatik.uni-hamburg.de/wtm/omgchallenges/omg_empathy.html" target="_blank">https://www2.informatik.uni-hamburg.de/wtm/omgchallenges/omg_empathy.html</a></div></div></div><div><br></div><div>If you would like more information, please do not hesitate to contact me: <a href="mailto:barros@informatik.uni-hamburg.de" target="_blank">barros@informatik.uni-hamburg.de</a></div><div><br></div><div>IMPORTANT DATES:</div><div><br></div><div>- Submission of full-length chapters: 31st of October 2019<br></div><div><br></div>-- <br><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div><div dir="ltr"><pre cols="72">Dr. Pablo Barros
Postdoctoral Research Associate - Crossmodal Learning Project (CML)
Knowledge Technology
Department of Informatics
University of Hamburg
Vogt-Koelln-Str. 30
22527 Hamburg, Germany
Phone: +49 40 42883 2535
Fax: +49 40 42883 2515
barros at <a href="http://informatik.uni-hamburg.de" target="_blank">informatik.uni-hamburg.de</a>
<a href="http://www.pablobarros.net" target="_blank">http://www.pablobarros.net</a>
<a href="https://www.inf.uni-hamburg.de/en/inst/ab/wtm/people/barros.html" target="_blank">https://www.inf.uni-hamburg.de/en/inst/ab/wtm/people/barros.html</a>
<a href="https://www.inf.uni-hamburg.de/en/inst/ab/wtm/" target="_blank">https://www.inf.uni-hamburg.de/en/inst/ab/wtm/</a></pre></div></div></div></div></div>