<div dir="ltr">This talk could be of interest to all of us who work in the ML4HC space and are curious about causality in the context of training predictive models.<div><br><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">---------- Forwarded message ---------<br>From: <strong class="gmail_sendername" dir="auto">Sharon Cavlovich</strong> <span dir="auto"><<a href="mailto:sharonw@cs.cmu.edu">sharonw@cs.cmu.edu</a>></span><br>Date: Wed, Oct 26, 2022 at 12:37 PM<br>Subject: ML/Duolingo Seminar - Michael Oberst<br>To: <<a href="mailto:ml-seminar@cs.cmu.edu">ml-seminar@cs.cmu.edu</a>><br></div><br><br><div dir="ltr"><div>Please join us for an ML/Duolingo Seminar!</div><div><br></div><div>Tuesday, Nov. 1, 2022</div><div>NSH 4305</div><div>10:30 AM</div><div><br></div><div>Michael Oberst, PhD Candidate, MIT<br></div><div><br></div><div>Title: What is the role of causality in reliable prediction?<br><br>Abstract:
How should we incorporate causal knowledge into the development of
predictive models in high-risk domains like healthcare? Rather than
attempting to learn "causal" models, I present an alternative viewpoint:
Partial causal knowledge can be used to anticipate how model
performance will change in novel (but plausible) scenarios, and can be
used as a guide for developing reliable models.<br><br>First, I will
discuss my work on learning linear predictors that are worst-case
optimal under a set of user-specified interventions on unobserved
variables (e.g., moving from a hospital with high-income patients to one
with lower-income patients). This work assumes access to noisy
proxies for those background variables at training time and an
underlying linear causal model over all variables. A key insight is that
the optimal predictor is not necessarily a "causal" predictor, but
depends on the scale (and direction) of plausible interventions.<br><br>Second,
I will demonstrate how similar ideas can be extended to more general
settings, including computer vision. Here, I will discuss work on
evaluating the worst-case performance of predictive models under a set
of user-specified, causally interpretable changes in distribution (e.g.,
a change in X-ray scanning policies). In contrast to work that
considers the worst case over subpopulations or over distributions in an
f-divergence ball, we consider parametric shifts in the distribution of a
subset of variables. This allows us to further constrain the space of
plausible shifts, and in some cases directly interpret the worst-case
shift to build intuition for model vulnerabilities.<br><br>This talk is based on joint work with Nikolaj Thams, David Sontag, and Jonas Peters.<br>(<a href="https://arxiv.org/abs/2103.02477" target="_blank">https://arxiv.org/abs/2103.02477</a>, <a href="https://arxiv.org/abs/2205.15947" target="_blank">https://arxiv.org/abs/2205.15947</a>)<br><br>Bio:
Michael Oberst is a PhD Candidate in EECS at MIT, advised by David
Sontag. His research lies at the intersection of causality, machine
learning, and healthcare, with an emphasis on improving the reliability
of both causal inference and prediction models. His work has been
published at a range of machine learning venues (NeurIPS / ICML /
AISTATS / KDD), including work with clinical collaborators from NYU
Langone, Beth Israel Deaconess Medical Center, and Mass General Brigham.
He has also worked on clinical applications of machine learning,
including work on learning effective antibiotic treatment policies
(published in Science Translational Medicine). He earned his
undergraduate degree in Statistics at Harvard.</div><div><br>-- <br><div dir="ltr" data-smartmail="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><pre style="text-transform:none;text-indent:0px;letter-spacing:normal;font-size:12px;font-style:normal;font-weight:normal;word-spacing:0px;background-color:rgb(255,255,255)" cols="72">--
Sharon Cavlovich
Senior Department Administrative Assistant | Machine Learning Department
Carnegie Mellon University
5000 Forbes Avenue | Gates Hillman Complex 8215
Pittsburgh, PA 15213
412.268.5196 (office) | 412.268.3431 (fax)</pre></div></div></div></div></div></div></div></div></div></div>
</div></div></div>