Connectionists: MSPARC 2026: Workshop on Multimodal Signal Processing for Attentional Resource Cognition (MSPARC) @ IEEE ICASSP. May 4-8, Barcelona, Spain
Muhammad Aqdus Ilyas
engr.aqdus at gmail.com
Wed Oct 15 06:49:47 EDT 2025
[Apologies if you got multiple copies of this invitation]
*Multimodal Signal Processing for Attentional Resource Cognition (MSPARC)*
<https://msparc.compute.dtu.dk/>
at
*IEEE ICASSP 2026* <https://2026.ieeeicassp.org/>
May 4-8, Barcelona, Spain
MSPARC investigates how modeling eye, brain, speech, and behavioral signals
for cognitive resource allocation can deepen our understanding of
attentional resource management in human cognition. By integrating signals
from brain activity (EEG/fMRI), eye movements, pupillometry, speech, and
behavior, we can build comprehensive models of how cognitive resources are
distributed and modulated in real time.
We invite submissions of original papers on all topics related to signal
processing, cognitive neuroscience, AI/ML, and human-computer interaction
(HCI), with special interest in, but not limited to:
- Multimodal Signal Processing: Integration of EEG, fMRI, MEG,
eye-tracking, pupillometry, speech, and behavior for attention and resource
models
- Real-Time Cognitive Monitoring: Low-latency algorithms, edge
computing, wearables for continuous assessment of attention, load, and
workload
- Machine Learning for Attention: Deep learning, transformers, LLMs for
lapse prediction, individual modeling, and personalized resource management
- Clinical Tools: Biomarkers, diagnostics (ADHD, TBI), neurofeedback,
cognitive rehabilitation, and attention-enhancement interventions
- Educational Applications: Attention-aware LMS (learning management
systems), adaptive content, engagement monitoring, and personalized pacing
- Workplace Systems: Overload prevention in aviation, surgery, and
transport; driver monitoring, fatigue detection, and team load balancing
- Immersive Tech: Attention-adaptive AR/VR, BCIs, gaze-based
interaction, and cognitive load-responsive mixed reality
- Auditory Attention: Hearing aids, attended speech enhancement with
EEG/eye-tracking, adaptive acoustic processing, and scene analysis
- Theoretical Frameworks: Attention as a resource, standardized
protocols, benchmark datasets, ecological validity, and cross-modal
validation
*Submission Guidelines and Proceedings*
Manuscripts should be prepared according to the ICASSP 2026 format.
Submissions may be a maximum of 4 pages plus 1 page for references. Please
follow the templates provided in the Paper Kit
<https://cmsworkshops.com/ICASSP2026/papers/paper_kit.php#Templates>. Do
not use the templates from ICASSP 2025. Accepted papers will appear in the
proceedings of ICASSP 2026 and will be published by IEEE Xplore.
*Contact:*
Please send any inquiries to the MSPARC organizers: *msparc at compute.dtu.dk*