[CMU AI Seminar] October 10 at 12pm (GHC 6115 & Zoom) -- Jane Lange (MIT) -- Agnostic Proper Learning of Monotone Functions: Beyond the Black-Box Correction Barrier -- AI Seminar sponsored by SambaNova Systems

Asher Trockman ashert at cs.cmu.edu
Fri Oct 6 16:54:30 EDT 2023


Dear all,

We look forward to seeing you *this Tuesday (10/10)* from *12:00-1:00 PM
(U.S. Eastern time)* for the next talk of this semester's *CMU AI Seminar*,
sponsored by SambaNova Systems <https://sambanova.ai/>. The seminar will be
held in GHC 6115 *with pizza provided* and will be streamed on Zoom.

To learn more about the seminar series or to see the future schedule,
please visit the seminar website <http://www.cs.cmu.edu/~aiseminar/>.

This Tuesday (10/10), *Jane Lange* (MIT) will be giving a talk titled
*"Agnostic Proper Learning of Monotone Functions: Beyond the Black-Box
Correction Barrier"*, presenting an agnostic, efficient, proper learning
algorithm for monotone Boolean functions.

*Title*: Agnostic Proper Learning of Monotone Functions: Beyond the
Black-Box Correction Barrier

*Talk Abstract*: We give the first agnostic, efficient, proper learning
algorithm for monotone Boolean functions. Given 2^Õ(sqrt(n)/ε) uniformly
random examples of an unknown function f:{±1}^n→{±1}, our algorithm outputs
a hypothesis g:{±1}^n→{±1} that is monotone and (opt+ε)-close to f, where
opt is the distance from f to the closest monotone function. The running
time of the algorithm (and consequently the size and evaluation time of the
hypothesis) is also 2^Õ(sqrt(n)/ε), nearly matching the lower bound of
Blais et al. (RANDOM '15). We also give an algorithm for estimating, up to
additive error ε, the distance of an unknown function f to monotone, with a
run-time of 2^Õ(sqrt(n)/ε). Previously, sample-efficient algorithms were
known for both of these problems, but those algorithms were not run-time
efficient. Our work thus closes this gap between run-time and sample
complexity. This work builds upon the improper
learning algorithm of Bshouty and Tamon (JACM '96) and the proper
semiagnostic learning algorithm of Lange, Rubinfeld, and Vasilyan (FOCS
'22), which obtains a non-monotone Boolean-valued hypothesis, then
"corrects" it to monotone using query-efficient local computation
algorithms on graphs. This black-box correction approach can achieve no
error better than 2opt+ε information-theoretically; we bypass this barrier
by a) augmenting the improper learner with a convex optimization step, and
b) learning and correcting a real-valued function before rounding its
values to Boolean. Our real-valued correction algorithm solves the "poset
sorting" problem of [LRV22] for functions over general posets with
non-Boolean labels.
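
For readers less familiar with the setup, here is a tiny, purely
illustrative Python sketch (not the algorithm from the talk; all names are
made up for the example) that brute-forces opt, the distance from a
function f on {±1}^n to its nearest monotone function, for very small n:

    # Illustrative only: brute-force opt = distance from f to the closest
    # monotone function over {±1}^n; feasible only for tiny n.
    from itertools import product

    def is_monotone(g, points):
        # Monotone: x <= y coordinatewise (with -1 < +1) implies g(x) <= g(y).
        return all(g[x] <= g[y]
                   for x in points for y in points
                   if all(a <= b for a, b in zip(x, y)))

    def distance(f, g, points):
        # Fraction of points (uniform distribution) where f and g disagree.
        return sum(f[x] != g[x] for x in points) / len(points)

    def opt_distance_to_monotone(f, n):
        points = list(product((-1, 1), repeat=n))
        best = 1.0
        # Enumerate all 2^(2^n) Boolean functions g and keep the monotone ones.
        for values in product((-1, 1), repeat=len(points)):
            g = dict(zip(points, values))
            if is_monotone(g, points):
                best = min(best, distance(f, g, points))
        return best

    if __name__ == "__main__":
        n = 2
        points = list(product((-1, 1), repeat=n))
        f = {x: -x[0] for x in points}   # f(x) = -x_1 is "anti-monotone"
        print(opt_distance_to_monotone(f, n))  # 0.5

The point of the talk, of course, is achieving (opt+ε) properly and
agnostically in time 2^Õ(sqrt(n)/ε), rather than by exhaustive search.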

*Speaker Bio:* Jane Lange is a PhD student at MIT CSAIL studying
theoretical computer science under Ronitt Rubinfeld. She is broadly
interested in algorithms related to machine learning and property testing.
More specifically, she is interested in applying techniques from sublinear
algorithms and the analysis of Boolean functions to design efficient
algorithms for learning, property testing, and related problems.

*In person:* GHC 6115
*Zoom Link*:
https://cmu.zoom.us/j/99510233317?pwd=ZGx4aExNZ1FNaGY4SHI3Qlh0YjNWUT09

Thanks,
Asher Trockman

