Connectionists: NeurIPS 2022 Gaze Meets ML Workshop

Alex Karargyris akarargyris at gmail.com
Fri Aug 19 11:19:30 EDT 2022


Dear all,

I am very excited to share that we will be organizing our new Gaze Meets ML
workshop at NeurIPS 2022 (*in person*). We hope that this workshop will
provide a unique opportunity to bring experts from various backgrounds
(e.g. neuroscience, machine learning, computer vision, medical imaging,
NLP, etc.) together to discuss ideas and ways to bridge human and machine
attention that can help make machine learning more efficient. Please find
more information in the Call for Papers below.

Sincerely,
Alexandros Karargyris, on behalf of the organizing committee

------------------------------
*****************************************************************

NeurIPS 2022 Gaze Meets ML Workshop

*****************************************************************

*Webpage: https://gaze-meets-ml.github.io/*

*Submission site:
https://openreview.net/group?id=NeurIPS.cc/2022/Workshop/GMML*

*Submission Deadline: September 22nd, 2022*

*Date: December 3rd, 2022*

*Location: New Orleans Convention Center, New Orleans, LA*

** Overview **

Eye gaze has proven to be a cost-efficient way to collect large-scale
physiological data that can reveal underlying human attentional patterns in
real-life workflows, and it has therefore long been explored as a signal
for directly measuring human cognition across various domains.
Physiological data (including but not limited to eye gaze) offer new
perception capabilities that could be used in several ML domains, e.g.,
egocentric perception, embodied AI, and NLP. They can help infer human
perception, intentions, beliefs, goals, and other cognitive properties that
are much needed for human-AI interaction and agent coordination. In
addition, large collections of eye-tracking data have enabled data-driven
modeling of human visual attention mechanisms, for both saliency and
scan-path prediction, with a twofold advantage: from the neuroscientific
perspective, to better understand biological mechanisms; and from the AI
perspective, to equip agents with the ability to mimic or predict human
behavior and to improve interpretability and interaction.

With the emergence of immersive technologies, there is now, more than ever,
a need for experts from various backgrounds (e.g., the machine learning,
vision, and neuroscience communities) to share expertise and contribute to
a deeper understanding of the intricacies of cost-efficient human
supervision signals (e.g., eye gaze) and their utilization toward bridging
human cognition and AI in machine learning research and development. The
goal of this workshop is to bring together an active research community to
collectively drive progress in defining and addressing core problems in
gaze-assisted machine learning.
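As a toy illustration of the kind of gaze-assisted supervision discussed
above (a hypothetical sketch, not a method proposed by the workshop), one
common idea is to treat a human gaze-derived saliency map as an auxiliary
training target for a model's spatial attention map, e.g., via a
KL-divergence penalty. All names and the setup below are illustrative
assumptions:

```python
import numpy as np

def normalize(att, eps=1e-8):
    """Flatten a 2-D attention map and normalize it to a probability
    distribution over image locations (eps avoids log(0))."""
    flat = att.astype(np.float64).ravel() + eps
    return flat / flat.sum()

def gaze_supervision_loss(model_attention, gaze_saliency):
    """KL(gaze || model): penalizes model attention that ignores
    regions humans actually fixated. Purely illustrative."""
    p = normalize(gaze_saliency)    # human gaze distribution
    q = normalize(model_attention)  # model attention distribution
    return float(np.sum(p * np.log(p / q)))

# Toy example: a 4x4 gaze map peaked at one corner, compared against a
# uniform attention map and a perfectly aligned one.
gaze = np.zeros((4, 4)); gaze[0, 0] = 1.0
model_uniform = np.ones((4, 4))
model_aligned = gaze.copy()

loss_uniform = gaze_supervision_loss(model_uniform, gaze)
loss_aligned = gaze_supervision_loss(model_aligned, gaze)
```

In practice such a term would be added to the task loss, so the model is
rewarded for attending where humans looked while still solving its task.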

** Call for Papers **

We welcome submissions that present aspects of eye gaze in relation to
cognitive science, psychophysiology, and computer science; propose methods
for integrating eye gaze into machine learning; or introduce methods and
models utilizing eye-gaze technology in application domains such as
radiology, AR/VR, and autonomous driving.

Topics of interest include but are not limited to the following:

   - Understanding the neuroscience of eye gaze and perception.
   - State of the art in incorporating machine learning and eye-tracking.
   - Data annotation and ML supervision with eye gaze.
   - Attention mechanisms and their correlation with eye gaze.
   - Methods for gaze estimation and prediction using machine learning.
   - Unsupervised ML using eye-gaze information for feature
     importance/selection.
   - Understanding human intention and goal inference.
   - Using saccadic vision for ML applications.
   - Use of gaze for human-AI interaction and agent coordination in
     multi-agent environments.
   - Eye gaze used for AI, e.g., NLP, Computer Vision, RL, Explainable AI,
     Embodied AI, Trustworthy AI.
   - Gaze applications in cognitive psychology, radiology, neuroscience,
     AR/VR, autonomous cars, privacy, etc.


** Submission Guidelines **

Submissions must be written in English and submitted in PDF format. Each
submitted paper must be no longer than nine (9) pages, excluding appendices
and references. Please refer to the NeurIPS 2022 Call for Papers
<https://nips.cc/Conferences/2022/CallForPapers> for formatting
instructions, templates, and policies. Submissions will be peer-reviewed by
the program committee, and accepted papers will be presented as lightning
talks during the workshop.

Submit your paper at
https://openreview.net/group?id=NeurIPS.cc/2022/Workshop/GMML before the
*September 22 deadline*.

** Awards and Funding **

We may award prizes for best papers and/or cover the registration fees of
presenting authors, with a focus on underrepresented minorities.

** Important dates for Workshop paper submission **

   - Paper submission deadline: *September 22, 2022*
   - Notification of acceptance: *October 14, 2022*
   - Workshop: *December 3, 2022 (in person)*


** Organizing Committee **

Ismini Lourentzou <https://isminoula.github.io/> (Virginia Tech)

Joy Tzung-yu Wu
<https://scholar.google.com/citations?user=03O8mIMAAAAJ&hl=en> (Stanford,
IBM Research)

Satyananda Kashyap
<https://researcher.watson.ibm.com/researcher/view.php?person=ibm-Satyananda.Kashyap>
(IBM Research)

Alexandros Karargyris <https://www.linkedin.com/in/alexandroskarargyris/>
(IHU Strasbourg)

Leo Anthony Celi <https://imes.mit.edu/research-staff-prof/leo-anthony-celi/>
(MIT)

Ban Kawas
<https://www.re-work.co/events/trusted-ai-summit-2022/speakers/ban-kawas>
(Meta, Reality Labs Research)

Sachin Talathi <https://www.linkedin.com/in/sachin-t-0b8b608/> (Meta,
Reality Labs Research)

** Contact **
Organizing Committee gaze.neurips at gmail.com