Fwd: PhD Speaking Qualifier: Kernel Density Decision Trees
Artur Dubrawski
awd at cs.cmu.edu
Fri Apr 8 13:02:00 EDT 2022
fyi, if you have not seen this work from Jack, you've got to catch up!
Artur
---------- Forwarded message ---------
From: Jack Good <jhgood at cs.cmu.edu>
Date: Fri, Apr 8, 2022 at 12:46 PM
Subject: PhD Speaking Qualifier: Kernel Density Decision Trees
To: <ri-people at lists.andrew.cmu.edu>
Hi everyone,
I'll be presenting my speaking qualifier *Tuesday, April 19 at 10:00 am* on
Zoom. Everyone is welcome.
*Date:* Tuesday, April 19
*Time:* 10:00 am EDT
*Zoom:* https://cmu.zoom.us/j/93872260456?pwd=SHVWRjZXTW9aZHB6U1lTWG9TUTcvUT09
*Meeting ID:* 938 7226 0456
*Passcode:* 790431
*Title:* Kernel Density Decision Trees
*Abstract*
We propose kernel density decision trees (KDDTs), a novel fuzzy decision
tree (FDT) formalism based on kernel density estimation that improves the
robustness of decision trees and ensembles and offers additional utility.
FDTs mitigate the sensitivity of decision trees to uncertainty by
representing uncertainty through fuzzy partitions. However, compared to
conventional, crisp decision trees, FDTs are generally complex to apply,
sensitive to design choices, slow to fit and make predictions, and
difficult to interpret. Moreover, finding the optimal threshold for a given
fuzzy split is challenging, resulting in methods that discretize data,
settle for near-optimal thresholds, or fuzzify crisp trees. Our KDDTs
address these shortcomings by representing uncertainty intuitively through
kernel distributions and by using a novel, scalable generalization of the
CART algorithm for finding optimal partitions for FDTs with
piecewise-linear splitting functions or KDDTs with piecewise-constant
fitting kernels. KDDTs can improve robustness to random or adversarial
perturbation, reduce or eliminate the need for ensemble methods, enable
smooth regression with trees, and give access to gradient-based feature
learning methods that can improve prediction performance and reduce model
complexity.
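For readers unfamiliar with fuzzy splits, a minimal sketch of the underlying idea: a kernel centered at the input spreads its probability mass across a split threshold, so each child of the split receives fractional membership and the prediction blends smoothly between leaves. The function, bandwidth, and leaf values below are illustrative assumptions only, not the KDDT fitting algorithm presented in the talk.

    # Toy one-feature, one-split illustration of a kernel-smoothed (fuzzy) split.
    # NOT the KDDT implementation; names and parameters are hypothetical.
    from math import erf, sqrt

    def soft_split_predict(x, threshold, bandwidth, left_value, right_value):
        """Predict with a single fuzzy split on a scalar feature x.

        A Gaussian kernel of width `bandwidth` is centered at x; the kernel
        mass below `threshold` is the membership of the left child, and the
        remainder goes to the right child.
        """
        # P(N(x, bandwidth^2) <= threshold) via the error function
        p_left = 0.5 * (1.0 + erf((threshold - x) / (bandwidth * sqrt(2.0))))
        return p_left * left_value + (1.0 - p_left) * right_value

    # Near the threshold the prediction blends both children smoothly, which is
    # the source of the robustness to small input perturbations described above.
    for x in [0.0, 0.45, 0.5, 0.55, 1.0]:
        print(x, soft_split_predict(x, threshold=0.5, bandwidth=0.1,
                                    left_value=0.0, right_value=1.0))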
*Committee*
Artur Dubrawski (Chair)
Barnabas Poczos
Mario Berges
Nick Gisolfi