<div dir="ltr"><div dir="ltr"><span id="gmail-docs-internal-guid-5f866c14-7fff-ac1e-5fc1-096190e29ba5" style="color:rgb(0,0,0)"><p dir="ltr" style="line-height:1.2;margin-top:0pt;margin-bottom:0pt"><span style="font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap"><font face="arial, sans-serif">Dear Colleagues,</font></span></p><font face="arial, sans-serif"><br></font><p dir="ltr" style="line-height:1.2;text-align:justify;margin-top:0pt;margin-bottom:0pt"><font face="arial, sans-serif"><span style="font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap">We have extended the deadline for our workshop titled “Socially-Informed AI for Healthcare: Understanding and Generating Multimodal Nonverbal Cues” which takes place as part of the 23</span><span style="font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap"><span style="vertical-align:super">rd</span></span><span style="font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap"> ACM International Conference on Multimodal Interaction 2021 in October 2021 in Montreal, Canada. </span></font></p><font face="arial, sans-serif"><br></font><p dir="ltr" style="line-height:1.2;text-align:justify;margin-top:0pt;margin-bottom:0pt"><font face="arial, sans-serif"><span style="font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap">The new deadline is </span><span style="font-weight:700;font-variant-ligatures:normal;font-variant-east-asian:normal;text-decoration:underline;text-decoration-skip:none;vertical-align:baseline;white-space:pre-wrap">June 27, 2021</span><span style="font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap">. For further information, please see </span><a href="https://social-ai-for-healthcare.github.io/" style="text-decoration:none"><span style="font-variant-ligatures:normal;font-variant-east-asian:normal;text-decoration:underline;text-decoration-skip:none;vertical-align:baseline;white-space:pre-wrap">https://social-ai-for-healthcare.github.io/</span></a><span style="font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap">. </span></font></p><font face="arial, sans-serif"><br></font><p dir="ltr" style="line-height:1.2;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap"><font face="arial, sans-serif"><b>CALL FOR PAPERS</b></font></span></p><p dir="ltr" style="line-height:1.2;text-align:justify;margin-top:0pt;margin-bottom:0pt"><span style="font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap"><font face="arial, sans-serif">Advances in the areas of face and gesture analysis, computational paralinguistics, multimodal interaction, and human-computer interaction have all played a major role in shaping research into assistive technologies over the last decade, resulting in a breadth of practical applications ranging from diagnosis and treatment tools to social companion technologies. While nonverbal cues play an essential role, there are still many key issues to overcome, which affect both the development and the deployment of multimodal technologies in real-world settings. 
The key aim of this multidisciplinary workshop is to foster cross-pollination by bringing together computer scientists and social psychologists to discuss innovative ideas, challenges and opportunities for understanding and generating multimodal nonverbal cues within the scope of healthcare applications. Topics include, but are not limited to:</font></span></p><font face="arial, sans-serif"><br></font><p dir="ltr" style="line-height:1.2;text-align:justify;margin-top:0pt;margin-bottom:0pt"></p><ul><li><span style="font-family:arial,sans-serif;white-space:pre-wrap">Analysis and synthesis of multimodal nonverbal cues, including facial expressions, paralinguistics, eye gaze and head movements, body postures and hand gestures, audio (e.g., turn taking, vocal outbursts, etc.) within an interaction context, </span></li><li><span style="font-family:arial,sans-serif;white-space:pre-wrap"> Co-modelling of nonverbal and verbal cues,</span></li><li><span style="font-family:arial,sans-serif;white-space:pre-wrap">Cross-modality learning for the analysis and synthesis of nonverbal cues,</span></li><li><span style="font-family:arial,sans-serif;white-space:pre-wrap">Novel methodologies for modelling long-term interactions through multiple modalities,</span></li><li><span style="font-family:arial,sans-serif;white-space:pre-wrap">Modelling of interpersonal coordination such as convergence, synchrony or mimicry,</span></li><li><span style="font-family:arial,sans-serif;white-space:pre-wrap">Personalisation and adaptation mechanisms for healthcare applications,</span></li><li><span style="font-family:arial,sans-serif;white-space:pre-wrap">Automatic detection of non-conforming patterns and/or violations in social norms and expectations (expectancy violations theory) in nonverbal interaction,</span></li><li><span style="font-family:arial,sans-serif;white-space:pre-wrap">Clinical applications (e.g., autism, depression, anxiety, etc.), including defining appropriate qualitative and quantitative evaluation methods,</span></li><li><span style="font-family:arial,sans-serif;white-space:pre-wrap">Privacy preserving approaches to data collection or synthetic data generation,</span></li><li><span style="font-family:arial,sans-serif;white-space:pre-wrap">Novel technological devices or robotics platforms specifically for healthcare applications,</span></li><li><span style="font-family:arial,sans-serif;white-space:pre-wrap">Explainable AI techniques focused towards clinical applications.</span></li></ul><p></p><p dir="ltr" style="line-height:1.2;margin-top:0pt;margin-bottom:0pt"><span style="font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap"><font face="arial, sans-serif"><b>KEYNOTE SPEAKERS</b></font></span></p><p dir="ltr" style="line-height:1.2;margin-top:0pt;margin-bottom:0pt"><span style="font-variant-ligatures:normal;font-variant-east-asian:normal;vertical-align:baseline;white-space:pre-wrap"><font face="arial, sans-serif">Prof. 
Prof. Antonia Hamilton, University College London, UK
Dr. Stefan Scherer, Embodied, Inc., USA
Prof. Laurel Riek, University of California, San Diego, USA

KEY DATES
Paper submission: June 27, 2021
Paper notification: August 9, 2021
Camera-ready paper submission: September 10, 2021
Workshop day: TBA (held in conjunction with ACM ICMI 2021)

INFORMATION FOR AUTHORS
Submissions will be made via EasyChair and should follow the ICMI template.
Please submit your paper at https://easychair.org/conferences/?conf=socialaiforhealthcar and see https://social-ai-for-healthcare.github.io/ for more information. We welcome research papers in the following formats:

- Workshop Full Paper: 8-page limit + extra pages for references
- Workshop Short Paper: 4-page limit + extra pages for references
- Workshop Poster Abstract: 3-page limit + extra pages for references

Please share this message with interested colleagues.

With many thanks and best wishes,

Oya Celiktutan
Alexandra Georgescu
Nicholas Cummins
Organisers, Socially-Informed AI for Healthcare 2021