Connectionists: ICBINB Monthly Seminar Series Talk: Thomas Dietterich

Francisco J. Rodríguez Ruiz franrruiz87 at gmail.com
Fri Jul 1 10:20:52 EDT 2022


Dear all,

We are pleased to announce that the next speaker of the *"I Can't Believe
It's Not Better!" (ICBINB)* virtual seminar series will be *Thomas
Dietterich* (*Oregon State University*). More details about this series
and the talk are below.

The *"I Can't Believe It's Not Better!" (ICBINB) monthly online seminar
series* seeks to shine a light on the "stuck" phase of research. Speakers
will tell us about their most beautiful ideas that didn't "work", about
when theory didn't match practice, or perhaps just when the going got
tough. These talks will let us peek inside the file drawer of unexpected
results and peer behind the curtain to see the real story of *how real
researchers did real research*.

*When: *July 7th, 2022 at 11am EDT / 5pm CEST
(*Note*: this differs from the usual time.)

*Where: *RSVP for the Zoom link here:
https://us02web.zoom.us/meeting/register/tZUuf–hpzgvEtxEIOcuo1-PJ8wDkvqmR8L6

*Title:* *Struggling to Achieve Novelty Detection in Deep Learning*

*Abstract:* In 2005, motivated by an open-world computer vision
application, I became interested in novelty detection. However, there were
few methods available in computer vision at that time, and my research
turned to studying anomaly detection in standard feature vector data. In
that arena, many good algorithms were being published. Fundamentally, these
methods rely on a notion of distance or density in feature space and detect
anomalies as outliers in that space.


Returning to deep learning 10 years later, my students and I attempted,
without much success, to apply these methods to the latent representations
in deep learning. Other groups attempted to apply deep density models,
again with limited success. Summary: I couldn't believe it was not better.
In the meantime, simple anomaly scores such as the maximum softmax
probability or the max logit score were shown to be doing very well. We
decided that we had reached the limits of what macro-level analysis (error
rates, AUC scores) could tell us about these techniques. It was time to
look closely at the actual feature values. In this talk, I'll show our
analysis of feature activations and introduce the Familiarity Hypothesis,
which states that the max logit/max softmax score is measuring the amount
of familiarity in an image rather than the amount of novelty. This is a
direct consequence of the fact that the only features that are learned are
ones that capture variability in the training data. Hence, deep nets can
only represent images that fall within this variability. Novel images are
mapped into this representation, and hence cannot be detected as
outliers. I'll close with some potential directions to overcome this
limitation.
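
As a concrete illustration (a minimal sketch, not the speaker's code), the two baseline anomaly scores named in the abstract can be computed directly from a classifier's logits: the maximum softmax probability and the maximum logit, with low values of either read as evidence of novelty. The example logits below are made up for illustration.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def novelty_scores(logits):
    """Return (max softmax probability, max logit) per row of logits.

    Lower scores suggest the input is less "familiar" to the network.
    """
    msp = softmax(logits).max(axis=-1)
    max_logit = logits.max(axis=-1)
    return msp, max_logit

# Hypothetical logits: one confidently classified input, one flat response.
logits = np.array([[8.0, 1.0, 0.5],    # familiar: one class dominates
                   [1.1, 1.0, 0.9]])   # unfamiliar: no class stands out
msp, ml = novelty_scores(logits)
```

Under the Familiarity Hypothesis, these scores are high when the learned features fire strongly, so a novel image that activates few features is flagged not because it looks novel but because it fails to look familiar.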

*Bio:*

Dr. Dietterich (AB Oberlin College 1977; MS University of Illinois 1979;
PhD Stanford University 1984) is Distinguished Professor Emeritus in the
School of Electrical Engineering and Computer Science at Oregon State
University. Dietterich is one of the pioneers of the field of Machine
Learning and has authored more than 225 refereed publications and two
books. His current research topics include robust artificial intelligence,
robust human-AI systems, and applications in sustainability. Dietterich has
devoted many years of service to the research community and was recently
given the ACML and AAAI distinguished service awards. He is a former
President of the Association for the Advancement of Artificial Intelligence
and the founding president of the International Machine Learning Society.
Other major roles include Executive Editor of the journal Machine Learning,
co-founder of the Journal of Machine Learning Research, and program chair
of AAAI 1990 and NIPS 2000. He currently serves as one of the moderators
for the cs.LG category on arXiv.

For more information and for ways to get involved, please visit us at
http://icbinb.cc/, Tweet to us @ICBINBWorkshop
<https://twitter.com/ICBINBWorkshop>, or email us at
cant.believe.it.is.not.better at gmail.com.

--
Best wishes,
The ICBINB Organizers