Connectionists: CFP: D-7 reminder; Learning on Distributions, Functions, Graphs and Groups workshop @ NIPS-2017
Zoltan Szabo
zoltan.szabo.list at gmail.com
Tue Oct 3 15:18:12 EDT 2017
Dear All,
This is a gentle reminder: submissions to the 'Learning on
Distributions, Functions, Graphs and Groups' workshop @ NIPS-2017
close in 7 days.
=========================================================================
CALL FOR PAPERS:
Learning on Distributions, Functions, Graphs and Groups workshop @ NIPS-2017
December 8th, 2017
Long Beach, CA, U.S.
https://sites.google.com/site/nips2017learningon/
Important dates:
- Submission deadline: Oct. 10, 2017 (5pm, Pacific Time).
- Notification of acceptance: Oct. 20, 2017 (5pm, Pacific Time).
Confirmed speakers:
- Kenji Fukumizu (Institute for Statistical Mathematics, Tokyo)
- Hachem Kadri (Aix-Marseille University)
- Risi Kondor (University of Chicago)
- Simon Lacoste-Julien (University of Montreal)
- Barnabás Póczos (Carnegie Mellon University)
Description:
The increasing variety of acquired data has recently pushed the field
of machine learning to extend its scope to non-standard data, including,
for example, functional, distributional, graph, and topological data.
Successful applications span across a wide range of disciplines such as
healthcare, action recognition from iPod/iPhone accelerometer data,
causal inference, bioinformatics, cosmology, acoustic-to-articulatory
speech inversion, network inference, climate research, and ecological
inference.
Leveraging the underlying structure of these non-standard data types
often leads to a significant boost in prediction accuracy and inference
performance. To achieve these compelling improvements, however,
numerous challenges and questions have to be addressed:
- choosing an adequate representation of the data,
- constructing appropriate similarity measures (inner product, norm, or
metric) on these representations (a minimal sketch follows this list),
- efficiently exploiting their intrinsic structure such as multi-scale
nature or invariances,
- designing affordable computational schemes (relying, e.g., on surrogate
losses),
- understanding the computational-statistical tradeoffs of the resulting
algorithms, and
- exploring novel application domains.
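To make the representation and similarity items above concrete, here is a
minimal sketch. It assumes distributional inputs given as i.i.d. samples, a
Gaussian kernel, and a random Fourier feature approximation of its mean
embeddings; all function names and parameter values are illustrative
assumptions.

import numpy as np

def random_fourier_features(X, W, b):
    # Map samples X (n x d) to D-dimensional random Fourier features
    # approximating a Gaussian kernel; W has i.i.d. N(0, 1/sigma^2) entries,
    # b is uniform on [0, 2*pi).
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def mean_embedding_similarity(X, Y, sigma=1.0, D=500, seed=0):
    # Approximate inner product between the kernel mean embeddings of the
    # empirical distributions of X and Y -- a building block of MMD-type
    # similarity measures between distributions.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, D))  # spectral samples of the Gaussian kernel
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    mu_X = random_fourier_features(X, W, b).mean(axis=0)  # empirical mean embedding of X
    mu_Y = random_fourier_features(Y, W, b).mean(axis=0)  # empirical mean embedding of Y
    return float(mu_X @ mu_Y)

# Toy usage: two Gaussian samples in R^3.
X = np.random.default_rng(1).normal(0.0, 1.0, size=(200, 3))
Y = np.random.default_rng(2).normal(0.5, 1.0, size=(200, 3))
print(mean_embedding_similarity(X, Y))

Replacing the inner product by the induced distance yields an approximate
maximum mean discrepancy (MMD), and the feature dimension D controls the
computational-statistical tradeoff mentioned above.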
The goals of this workshop are
- to discuss new theoretical considerations and applications related to
learning with non-standard data,
- to explore future research directions by bringing together
practitioners with various domain expertise and algorithmic tools, and
theoreticians interested in providing sound methodology,
- to accelerate progress in this emerging area and broaden its arsenal of
applications.
We encourage submissions on a variety of topics, including but not
limited to:
- Novel applications for learning on non-standard objects
- Learning theory/algorithms on distributions
- Topological and geometric data analysis
- Functional data analysis
- Multi-task learning, structured output prediction, and surrogate losses
- Vector-valued learning (e.g., operator-valued kernel)
- Gaussian processes
- Learning on graphs and networks
- Group theoretic methods and invariances in learning
- Learning with non-standard input/output data
- Large-scale approximations (e.g., sketching, random Fourier features,
hashing, Nyström method, inducing point methods), and
statistical-computational efficiency tradeoffs.
Organizers:
- Florence d'Alché-Buc (Télécom ParisTech, Paris-Saclay University)
- Krikamol Muandet (Mahidol University, MPI Tübingen)
- Bharath K. Sriperumbudur (Pennsylvania State University)
- Zoltán Szabó (École Polytechnique)
=========================================================================
Best,
Workshop Organizers