Connectionists: CFP: 2nd Workshop "Multimodal, Affective and Interactive Explainable AI" (MAI-XAI 25), collocated with ECAI, October 25-30, Bologna, Italy

Philipp Cimiano cimiano at techfak.uni-bielefeld.de
Mon Mar 10 16:28:38 EDT 2025


* Apologies for cross-postings *

Call for Papers 

2nd Workshop on Multimodal, Affective and Interactive Explainable AI (MAI-XAI-25), collocated with the European Conference on Artificial Intelligence (ECAI), October 25-30, Bologna, Italy

https://sites.google.com/view/mai-xai25 


The field of eXplainable Artificial Intelligence (XAI) is concerned with developing methods that make the decisions and predictions of machine-learned models accessible and understandable to different stakeholders, ranging from machine learning experts to lay users. An important goal is to design systems in a human-centered manner, ensuring that explanations are effective in enhancing human users' understanding of the model and empowering them to take appropriate action. 
Yet, the current state of the art in XAI is limited in this respect. Many studies in the field evaluate technology in an intrinsic fashion, using measures such as validity and proximity that tell us little about the actual effectiveness of explanations from an end-user perspective. Further, there is a lack of methods that allow explanations to be interactively tailored to the (evolving) needs of explainees, as well as methods to measure the effectiveness of the provided explanations in terms of enhancing user understanding. 
The MAI-XAI workshop focuses on improving the effectiveness of explanations by moving to “natural” explanations that are more accessible to a non-technical audience. Natural explanations leverage multiple modalities (text, speech, visual, tabular, ...) to select the form of presentation that best suits the context and the explanatory needs of an explainee. XAI systems providing natural explanations might react to affective aspects and emotions, e.g. to identify dissatisfaction with an explanation and respond accordingly. Finally, XAI systems should be able to interact effectively with the user, moving from one-shot static explanations to dynamically adapted explanations informed by the reactions or feedback of a user during the interaction. 

We aim to offer researchers and practitioners the opportunity to identify promising new research directions in XAI along the above-mentioned lines, focusing on how to provide “natural explanations”. Attendees are encouraged to present case studies of real-world applications where XAI has been successfully applied, emphasizing the practical benefits and challenges encountered. 

The topics of interest include (but are not limited to):

Multimodal XAI 
XAI for multi-modal data retrieval, collection, augmentation, generation, and validation
XAI for Human-Computer Interaction (HCI)
Augmented reality for multi-modal XAI
XAI approaches leveraging application-specific domain knowledge
Design and validation of multi-modal explainers
Quantifying XAI: From defining metrics and methodologies to assessing the effectiveness of explanations in enhancing user understanding, reliance, and trust
Large knowledge bases and graphs that can be used for multi-modal explanation generation
Large language models and their generative power for multi-modal XAI
Proof-of-concepts and demonstrators of how to integrate effective and efficient XAI into real-world human decision-making processes
Ethical, Legal, Socio-Economic and Cultural (ELSEC) considerations in XAI: Examining ethical implications surrounding the use of high-risk AI applications, including potential biases and the responsible deployment of sustainable “green” AI in sensitive domains  


Affective XAI
Explainable affective computing in healthcare, psychology, physiology, education, entertainment, and gaming
Privacy, fairness, and ethical considerations in affective computing
Multimodal (textual, visual, vocal, physiological) emotion recognition systems
User environments for the design of systems to better detect and classify affect
Sentiment analysis and explainability
Social robots and explainability
Emotion-aware XAI systems
Accuracy and explainability in emotion recognition
Machine learning using biometric data to classify biosignals
Virtual reality in affective computing
Human–Computer Interaction (HCI) and Human in the Loop (HITL) approaches in affective computing

 
Interactive XAI
Dialogue-based approaches to XAI
Approaches to dynamically adapt explanations in interaction with a user
XAI approaches that use a model of the partner to adapt explanations
XAI approaches for collaborative decision-making between humans and AI models
Methods to measure and evaluate the understanding of the users about a model
Methods to measure and evaluate the ability to use models effectively in downstream tasks
Interactive methods by which a system and a user can negotiate what is to be explained
Modelling the social functions and aspects of an explanation
Methods to identify users’ information and explainability needs

Papers submitted to ECAI that are under review for the conference cannot be submitted to the workshop. If rejected from ECAI, authors can submit a request for their paper to be considered for the workshop by July 18th to one of the emails from the Contact page.

Accepted manuscripts will be published in CEUR Workshop Proceedings (http://ceur-ws.org/). Papers must be written in English and prepared for double-blind review using the CEUR-WS template (https://www.overleaf.com/latex/templates/template-for-submissions-to-ceur-workshop-proceedings-ceur-ws-dot-org/wqyfdgftmcfw). 

The following types of submissions are allowed:
Regular/Long Papers (10 - 15 pages): describing novel and original technical contributions enhancing our understanding of multimodality, affectiveness and interaction in XAI. 
Short Papers (5 - 9 pages): describing work in progress, a case study, etc. 


Submissions should be made through EasyChair: https://easychair.org/conferences/?conf=maixai25. 

Registering an abstract of your paper (around 100-300 words, in plain text) is mandatory in advance of the paper submission deadline, and you will be asked to provide additional information (such as keywords) at that time. Please do not leave things to the very last moment; you can resubmit any number of times until the submission deadline.

The workshop is planned as an in-person event. Each accepted paper will be assigned either an oral presentation slot or a combined poster/spotlight presentation slot.

Important Dates:
Abstract registration: May 15th, 2025
Paper submission: May 21st, 2025
Acceptance/rejection notification: July 24th, 2025
Camera-ready paper submission: September 11th, 2025
Conference dates: October 25-30, 2025 (the MAI-XAI 25 workshop will take place on October 25-26, 2025)

Organizers:
Philipp Cimiano, Bielefeld University, Germany
Fosca Giannotti, Scuola Normale Superiore, Pisa, Italy
Tim Miller, The University of Queensland, Australia
Bárbara Hammer, Bielefeld University, Germany
Alejandro Catalá, Universidade de Santiago de Compostela (USC), Spain
Peter Flach, University of Bristol, UK
Jose M. Alonso-Moral, Universidade de Santiago de Compostela (USC), Spain

Contact Details: 
Jose M. Alonso-Moral, https://citius.gal/es/team/jose-maria-alonso-moral 
Philipp Cimiano, https://ekvv.uni-bielefeld.de/pers_publ/publ/PersonDetail.jsp?personId=15020699&lang=EN 



Prof. Dr. Philipp Cimiano
AG Semantic Computing
Coordinator of the Cognitive Interaction Technology Center (CITEC)
Co-Director of the Joint Artificial Intelligence Institute (JAII)
Universität Bielefeld

Tel: +49 521 106 12249
Fax: +49 521 106 6560
Mail: cimiano at cit-ec.uni-bielefeld.de
Personal Zoom Room: https://uni-bielefeld.zoom-x.de/my/pcimiano

Office CITEC-2.307
Universitätsstr. 21-25
33615 Bielefeld, NRW
Germany
