MSPARC 2026: Workshop on Multimodal Signal Processing for Attentional Resource Cognition @ IEEE ICASSP 2026, May 4-8, Barcelona, Spain

Muhammad Aqdus Ilyas

Oct 17, 2025, 2:09:14 AM
to comp-...@lists.cnsorg.org, systems-ne...@googlegroups.com, connect...@cs.cmu.edu, cosyne-di...@googlegroups.com, cv...@lists.auth.gr
[Apologies if you receive multiple copies of this invitation]


MSPARC 2026: Workshop on Multimodal Signal Processing for Attentional Resource Cognition
at IEEE ICASSP 2026
May 4-8, Barcelona, Spain

MSPARC investigates how modeling eye, brain, speech, and behavioral signals for cognitive resource allocation can deepen our understanding of attentional resource management in human cognition. By integrating signals from brain activity (EEG/fMRI), eye movements, pupillometry, speech, and behavior, we can build comprehensive models of how cognitive resources are distributed and modulated in real time.

We invite the submission of original papers on all topics related to signal processing, cognitive neuroscience, AI/ML, and human-computer interaction (HCI), with special interest in, but not limited to:
  • Multimodal Signal Processing: Integration of EEG, fMRI, MEG, eye-tracking, pupillometry, speech, and behavior for attention and resource models
  • Real-Time Cognitive Monitoring: Low-latency algorithms, edge computing, wearables for continuous assessment of attention, load, and workload
  • Machine Learning for Attention: Deep learning, transformers, LLMs for lapse prediction, individual modeling, and personalized resource management
  • Clinical Tools: Biomarkers, diagnostics (ADHD, TBI), neurofeedback, cognitive rehabilitation, and attention-enhancement interventions
  • Educational Applications: Attention-aware LMS (learning management systems), adaptive content, engagement monitoring, and personalized pacing
  • Workplace Systems: Overload prevention in aviation, surgery, and transport; driver monitoring, fatigue detection, and team load balancing
  • Immersive Tech: Attention-adaptive AR/VR, BCIs, gaze-based interaction, and cognitive load-responsive mixed reality
  • Auditory Attention: Hearing aids, attended speech enhancement with EEG/eye-tracking, adaptive acoustic processing, and scene analysis
  • Theoretical Frameworks: Attention as a resource, standardized protocols, benchmark datasets, ecological validity, and cross-modal validation
Submission Guidelines and Proceedings
Manuscripts should be prepared according to the ICASSP 2026 format. Submissions are limited to a maximum of 4 pages plus 1 page for references. Please use the templates provided in the ICASSP 2026 Paper Kit; do not use the templates from ICASSP 2025. Accepted papers will appear in the proceedings of ICASSP 2026 and will be published on IEEE Xplore.

Contact:
Please send any inquiries to MSPARC Organizers: msp...@compute.dtu.dk



