Fwd: RI Ph.D. Thesis Proposal: Jack Good

Artur Dubrawski awd at cs.cmu.edu
Thu Nov 2 11:44:10 EDT 2023


Please mark your calendars and plan to attend a very amusing presentation
by Jack!

Artur

---------- Forwarded message ---------
From: Suzanne Muth <lyonsmuth at cmu.edu>
Date: Thu, Nov 2, 2023 at 11:42 AM
Subject: RI Ph.D. Thesis Proposal: Jack Good
To: RI People <ri-people at andrew.cmu.edu>


Date: 14 November 2023
Time: 4:30 p.m. (ET)
Location: GHC 8102

Zoom Link:
https://cmu.zoom.us/j/96881722005?pwd=NjhCb2gwQ3ZzSmtlVlJ3Qnp5QTd1Zz09

Type: Ph.D. Thesis Proposal
Who: Jack Good
Title: Trustworthy Learning using Uncertain Interpretation of Data



Abstract:

Non-parametric models are popular in real-world applications of machine
learning. However, many modern ML methods that make models pragmatic, safe,
robust, fair, and otherwise trustworthy in increasingly critical
applications assume parametric, differentiable models. We show that, by
interpreting data as locally uncertain, we can achieve many of these
properties without being limited to parametric or inherently differentiable
models. In particular, we focus on decision trees, which are popular for
their strong performance on tabular data as well as their ease of use, low
design cost, low computational requirements, fast inference, and
interpretability. We propose a new kind of fuzzy decision tree that we call
a kernel density decision tree (KDDT), because its uncertain interpretation
of the input is similar to kernel density estimation.
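To make the "uncertain interpretation" concrete, here is a minimal sketch
(not taken from the proposal; the Gaussian kernel, the bandwidth h, and the
tiny hand-built tree are all illustrative assumptions): instead of routing a
point down exactly one side of a split, we treat the input as a kernel
centered at the point and route the kernel's probability mass fractionally.

```python
from math import erf, sqrt

def soft_split(x, threshold, h):
    """Mass of a Gaussian kernel N(x, h^2) that falls left of threshold."""
    return 0.5 * (1.0 + erf((threshold - x) / (h * sqrt(2.0))))

def kddt_predict(x, h=0.5):
    """Tiny hand-built fuzzy tree over one feature (illustrative only).

    Internal nodes split at 0.0 and 1.0; the three leaves predict
    class-1 probabilities 0.1, 0.5, and 0.9. A point's membership is
    distributed fractionally across leaves, so the prediction varies
    smoothly with x (and is differentiable in x and the thresholds).
    """
    p_left = soft_split(x, 0.0, h)            # mass routed left
    p_right = 1.0 - p_left
    s = soft_split(x, 1.0, h)                 # right subtree splits again
    p_mid, p_far = p_right * s, p_right * (1.0 - s)
    return 0.1 * p_left + 0.5 * p_mid + 0.9 * p_far
```

As the bandwidth h shrinks toward zero, the memberships become 0/1 and the
model reduces to an ordinary crisp decision tree.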

We organize the completed and proposed contributions of this thesis into
three pillars. The first pillar is robustness and verification: we show
improved robustness under various adverse conditions and discuss
verification of safety properties for fuzzy decision trees (FDTs) and
KDDTs. The second pillar is interpretability: leveraging the efficient
fitting and differentiability of our trees, we alternately optimize a
parametric feature transformation by gradient descent and the tree by
refitting, obtaining compact, interpretable single-tree models with
competitive performance. The third pillar is pragmatic advancements: we
advance semi-supervised learning, federated learning, and ensemble merging
for decision trees.
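The alternating optimization in the second pillar can be loosely sketched as
follows. Everything here is an assumption for illustration, not the
proposal's actual method: a linear transform w stands in for the parametric
feature transformation, a single soft split stands in for the tree, the
Gaussian bandwidth h is fixed, and finite-difference gradients stand in for
true differentiation through the fuzzy tree.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # toy labels
h = 0.5                                      # fixed kernel bandwidth

def soft_left(z, t):
    """Kernel mass left of threshold t for each transformed input."""
    return 0.5 * (1.0 + np.array([erf((t - zi) / (h * sqrt(2.0)))
                                  for zi in z]))

def refit_tree(z):
    """'Refit' step: pick the threshold and leaf values minimizing
    squared loss over a grid of candidate splits (a depth-1 tree)."""
    best = None
    for t in np.quantile(z, np.linspace(0.1, 0.9, 17)):
        m = soft_left(z, t)
        mu_l = (m * y).sum() / max(m.sum(), 1e-9)
        mu_r = ((1 - m) * y).sum() / max((1 - m).sum(), 1e-9)
        loss = np.mean((m * mu_l + (1 - m) * mu_r - y) ** 2)
        if best is None or loss < best[0]:
            best = (loss, t, mu_l, mu_r)
    return best

def loss_of(w):
    return refit_tree(X @ w)[0]

w = np.array([1.0, 0.0])
for _ in range(30):
    # Alternate: gradient step on the feature transform (finite
    # differences for brevity), then the tree is refit inside loss_of.
    g = np.array([(loss_of(w + 1e-4 * e) - loss_of(w - 1e-4 * e)) / 2e-4
                  for e in np.eye(2)])
    w -= 0.5 * g
```

The loop gradually rotates the transform toward the direction that the toy
labels actually depend on, while the tree is cheaply refit to the
transformed feature at every evaluation.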



Thesis Committee Members:

Artur Dubrawski, Chair

Jeff Schneider

Tom Mitchell

Gilles Clermont, University of Pittsburgh

