NIPS*95 workshop on Optimization
Jagota Arun Kumar
jagota at ponder.csci.unt.edu
Tue Oct 17 13:10:48 EDT 1995
Dear Connectionists:
Attached find a description of the NIPS*95 workshop on optimization.
For up-to-date information, including abstracts of talks, see the URL.
We might be able to fit in one or two more talks. Send me a title and
abstract by e-mail if you'd like to give a talk.
Arun Jagota
----------------------------------------------------------------------
OPTIMIZATION PROBLEM SOLVING WITH NEURAL NETS
NIPS95 Workshop, Organizer: Arun Jagota
Friday Dec 1 1995, 7:30--9:30 AM and 4:30--6:30 PM
E-mail: jagota at cs.unt.edu
Workshop URL: http://www.msci.memphis.edu/~jagota/NIPS95
Ever since the work of Hopfield and Tank, neural nets have found increasing use
in the approximate solution of difficult optimization problems, arising in many
applications. Such neural nets are well-suited in principle to these problems,
because they minimize, in parallel form, an energy function into which an
optimization problem's objective and constraints can be mapped. Unfortunately,
they have often not worked well in practice, for two reasons. First, mapping
the objective and constraints of a problem onto a single good energy function
has turned out to be difficult for certain problems, for example the Travelling
Salesman Problem. Moreover, the ease or difficulty of the mapping has proved
problem-dependent, making it hard to find a good general mapping
methodology. Second, the dynamical algorithms have often been limited to some
form of local search or gradient descent. In recent years, there have been
significant advances on both fronts. Provably good mappings of several
optimization problems have been found. Powerful dynamical algorithms that go
beyond gradient-descent have also been developed, with ideas borrowed from
different fields. Examples are Mean Field Annealing, Simulated Annealing,
Projection Methods, and Randomized Multi-Start Algorithms. This workshop aims
to take stock of the state of the art on this topic, and to study directions
for future research and applications.
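The paragraph above can be made concrete with a minimal sketch (not from the workshop, and not any speaker's method): a discrete Hopfield-style network minimizing the standard energy E(s) = -1/2 s'Ws - theta's by asynchronous single-unit updates, i.e. the simple local-search dynamics the description says such nets have often been limited to. All names and values below are illustrative.

```python
import numpy as np

def energy(W, theta, s):
    """Hopfield energy E(s) = -1/2 s'Ws - theta's."""
    return -0.5 * s @ W @ s - theta @ s

def hopfield_descent(W, theta, s, sweeps=50):
    """Asynchronously set each unit to whichever 0/1 state lowers the energy.

    For symmetric W with zero diagonal, flipping unit i on when its local
    field W[i]@s + theta[i] is positive never increases E, so this is a
    form of gradient descent that halts at a local minimum (fixed point).
    """
    s = s.copy()
    for _ in range(sweeps):
        changed = False
        for i in range(len(s)):
            local_field = W[i] @ s - W[i, i] * s[i] + theta[i]
            new_si = 1.0 if local_field > 0 else 0.0
            if new_si != s[i]:
                s[i] = new_si
                changed = True
        if not changed:  # fixed point: no single flip lowers the energy
            break
    return s

# Tiny example: two mutually excitatory units, small positive biases.
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])
theta = np.array([0.1, 0.1])
s0 = np.array([0.0, 0.0])
s = hopfield_descent(W, theta, s0)
```

Mapping an actual optimization problem means choosing W and theta so that low-energy states correspond to good feasible solutions; as the paragraph notes, finding such a mapping is itself the hard, problem-dependent part.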
Target Audience
Both topics, neural networks and optimization, are relevant to a wide range
of disciplines, and we hope that several of these will be represented at
this workshop. These include Cognitive Science, Computer Science,
Engineering, Mathematics, Neurobiology, Physics, Chemistry, and Psychology.
Format
Six to eight 30-minute talks, each including 5 minutes for discussion,
followed by 30 minutes of general discussion at the end.
Talks
The Complexity of Stability in Hopfield Networks
Ian Parberry, University of North Texas
Title to be announced
Anand Rangarajan, Yale University
Performance of Neural Network Algorithms for
Maximum Clique on Highly Compressible Graphs
Arun Jagota, University of North Texas
Population-based Incremental Learning
Shumeet Baluja, Carnegie-Mellon University
How Good are Neural Network Algorithms for the Travelling Salesman Problem?
Marco Budinich, Dipartimento di Fisica, Via Valerio 2, 34127 Trieste ITALY
Relaxation Labeling Networks for the Maximum Clique Problem
Marcello Pelillo, University of Venice, Italy
----------------------------------------------------------------------