UAI '97 program and registration information

Eric Horvitz horvitz at MICROSOFT.com
Mon Jun 5 16:42:55 EDT 2006


Dear Colleague:

I have appended program and registration information for the Thirteenth
Conference on Uncertainty in Artificial Intelligence (UAI '97).  More
details and an online registration form are linked to the UAI '97 home
page at http://cuai97.microsoft.com.  UAI '97 will be held at Brown
University in Providence, Rhode Island, August 1-3.  In addition to the
main program, you may find interesting the Full-Day Course on Uncertain
Reasoning, which will be held on Thursday, July 31.  Details on the
course can be found at http://cuai97.microsoft.com/course.htm.  Please
register for the conference and/or the course before early registration
ends on May 31, 1997.

I would be happy to answer any additional questions about the
conference. 

    Best regards,

         Eric Horvitz
         Conference Chair

  ====================================================
  
  Thirteenth Conference on Uncertainty in Artificial Intelligence
                                   (UAI '97)

                           http://cuai97.microsoft.com

                              August 1-3, 1997
                              Brown University
                      Providence, Rhode Island, USA

      =============================================

                    ** UAI '97 Conference Program **

      =============================================


Thursday, July 31, 1997


Conference and Course Registration 8:00-8:30am
http://cuai97.microsoft.com/register/reg.htm


Full-Day Course on Uncertain Reasoning 8:30-6:00pm
http://cuai97.microsoft.com/course.htm

_____________________________________________

Friday, August 1, 1997

Main Conference Registration 8:00-8:25am 

Opening Remarks 
Dan Geiger and Prakash P. Shenoy
8:25-8:30am

Invited talk I: Local Computation Algorithms 
Steffen L. Lauritzen 
8:30-9:30am

Abstract: Inference in probabilistic expert systems has been made
possible through the development of efficient algorithms that in one way
or another involve message passing between local entities arranged to
form a junction tree. Many of these algorithms have a common structure
which can be partly formalized in abstract axioms with an algebraic
flavor. However, the existing abstract frameworks do not fully capture
all interesting cases of such local computation algorithms. The lecture
will describe the basic elements of the algorithms, give examples of
interesting local computations that are covered by current abstract
frameworks, and also examples of interesting computations that are not,
with a view towards reaching a fuller exploitation of the potential in
these ideas. 
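The flavor of the local computations described above can be conveyed with a minimal sketch (mine, not from the talk): a chain-structured junction tree with cliques {A,B} and {B,C} sharing separator {B}, where marginalizing one variable locally and passing the result as a message yields a global marginal. The binary variables and potential values are illustrative assumptions.

```python
# Illustrative sketch of junction-tree message passing on a chain:
# cliques {A,B} and {B,C}, separator {B}. All numbers are made up.

# Pairwise potentials as dicts: phi[(x, y)] = value
phi_ab = {(0, 0): 0.3, (0, 1): 0.7, (1, 0): 0.6, (1, 1): 0.4}
phi_bc = {(0, 0): 0.5, (0, 1): 0.5, (1, 0): 0.2, (1, 1): 0.8}

# Message from clique {A,B} to clique {B,C}: marginalize A out locally.
msg_b = {b: sum(phi_ab[(a, b)] for a in (0, 1)) for b in (0, 1)}

# Absorb the message in the receiving clique and marginalize B out.
unnorm_c = {c: sum(msg_b[b] * phi_bc[(b, c)] for b in (0, 1)) for c in (0, 1)}

# Normalize to obtain the marginal over C.
z = sum(unnorm_c.values())
p_c = {c: v / z for c, v in unnorm_c.items()}
print(p_c)
```

The point of the abstract axiomatic frameworks is that only two operations are needed of the potentials here: combination (the product) and marginalization (the sum), which is why the same message-passing scheme covers so many different local computations.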

Invited talk II: Coding Theory and Probability Propagation in Loopy
Bayesian Networks 
Robert J. McEliece 
9:30-10:30am

Abstract: In 1993, a group of coding researchers in France devised, as part
of their astonishing "turbo code" breakthrough, a remarkable iterative
decoding algorithm. This algorithm can be viewed as an inference
algorithm on a Bayesian network, but (a) it is approximate, not exact,
and (b) it violates a sacred assumption in Bayesian analysis, viz., that
the network should have no loops. Indeed, it is accurate to say that the
turbo decoding algorithm is functionally equivalent to Pearl's algorithm
applied to a certain directed bipartite graph in which the messages
circulate indefinitely, either until convergence is reached or (more
realistically) for a fixed number of cycles. With hindsight, it is
possible to trace a continuous chain of "loopy" belief propagation
algorithms within the coding community beginning in 1962 (with
Gallager's iterative decoding algorithm for low density parity check
codes), continued in 1981 by Tanner and much more recently (1995-1996)
by Wiberg and MacKay-Neal. In this talk I'd like to challenge the UAI
community to reassess the conventional wisdom that probability
propagation only works in trees, since the coding community has now
accumulated considerable experimental evidence that in some cases at
least, "loopy" belief propagation works, at least approximately. Along
the way, I'll do my best to bring the AI audience up to speed on the
latest developments in coding. My emphasis will be on convolutional
codes, since they are the building blocks for turbo-codes. I will
mention that two of the most important (pre-turbo) decoding algorithms,
viz. Viterbi (1967) and BCJR (1974) can be stated in orthodox Bayesian
network terms. BCJR, for example, is an anticipation of Pearl's
algorithm on a special kind of tree, and Viterbi's algorithm gives a
solution to the "most probable explanation" problem on the same
structure. Thus coding theorists and AI people have been working on, and
solving, similar problems for a long time. It would be nice if they
became more aware of each other's work. 
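The "fixed number of cycles" scheme described above can be sketched in a few lines (the tiny 3-node cycle, its potentials, and the evidence at node A are my own illustrative assumptions, not anything from the talk): sum-product messages are simply iterated on a graph with a loop, and the resulting beliefs are approximate rather than exact.

```python
import math

# Cycle A - B - C - A: pairwise potentials favor agreement between
# neighbors; a unary "evidence" potential at A favors state 0.
edges = [("A", "B"), ("B", "C"), ("C", "A")]
vals = (0, 1)

def pair(x, y):
    return 0.9 if x == y else 0.1

unary = {"A": {0: 0.8, 1: 0.2}, "B": {0: 1.0, 1: 1.0}, "C": {0: 1.0, 1: 1.0}}

def neighbors(n):
    return [b for a, b in edges if a == n] + [a for a, b in edges if b == n]

# messages[(i, j)][x]: message from node i to node j, initialized uniform.
messages = {(i, j): {v: 0.5 for v in vals}
            for a, b in edges for (i, j) in ((a, b), (b, a))}

for _ in range(50):               # run for a fixed number of sweeps
    new = {}
    for (i, j) in messages:
        # Sum-product update: combine unary evidence at i with incoming
        # messages (excluding the one from j), then sum out x_i.
        m = {xj: sum(unary[i][xi] * pair(xi, xj) *
                     math.prod(messages[(k, i)][xi]
                               for k in neighbors(i) if k != j)
                     for xi in vals)
             for xj in vals}
        s = sum(m.values())
        new[(i, j)] = {x: v / s for x, v in m.items()}  # normalize
    messages = new

def belief(n):
    # Approximate marginal at n: evidence times all incoming messages.
    b = {x: unary[n][x] * math.prod(messages[(k, n)][x]
                                    for k in neighbors(n)) for x in vals}
    z = sum(b.values())
    return {x: v / z for x, v in b.items()}

print(belief("C"))
```

Despite the loop, the evidence at A pulls the belief at C toward state 0, which is the kind of useful approximate answer the coding experiments report.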

Break 10:30-11:00am

** Plenary Session I: Modeling
11:00am-12:00pm

Object-Oriented Bayesian Networks 
Daphne Koller and Avi Pfeffer 
     (winner of the best student paper award) 

Problem-Focused Incremental Elicitation of Multi-Attribute Utility
Models 
Vu Ha and Peter Haddawy 

Representing Aggregate Belief through the Competitive Equilibrium of a
Securities Market 
David M. Pennock and Michael P. Wellman 


Lunch 12:00-1:30pm


** Plenary Session II: Learning & Clustering
1:30-3:00pm

A Bayesian Approach to Learning Bayesian Networks with Local Structure 
David Maxwell Chickering and David Heckerman 

Batch and On-line Parameter Estimation in Bayesian Networks 
Eric Bauer, Daphne Koller, and Yoram Singer 

Sequential Update of Bayesian Network Structure 
Nir Friedman and Moises Goldszmidt 

An Information-Theoretic Analysis of Hard and Soft Assignment Methods
for Clustering 
Michael Kearns, Yishay Mansour, and Andrew Ng 


** Poster Session I: Overview Presentations
3:00-3:30pm

* Poster Session I 
3:30-5:30pm

Algorithms for Learning Decomposable Models and Chordal Graphs 
Luis M. de Campos and Juan F. Huete 

Defining Explanation in Probabilistic Systems 
Urszula Chajewska and Joseph Y. Halpern 

Exploring Parallelism in Learning Belief Networks 
T. Chu and Yang Xiang 

Efficient Induction of Finite State Automata 
Matthew S. Collins and Jonathon J. Oliver 

A Scheme for Approximating Probabilistic Inference 
Rina Dechter and Irina Rish 

Limitations of Skeptical Default Reasoning 
Jens Doerpmund 

The Complexity of Plan Existence and Evaluation in Probabilistic Domains 
Judy Goldsmith, Michael L. Littman, and Martin Mundhenk 

Learning Bayesian Nets that Perform Well 
Russell Greiner 

Model Selection for Bayesian-Network Classifiers 
David Heckerman and Christopher Meek 

Time-Critical Action 
Eric Horvitz and Adam Seiver 

Composition of Probability Measures on Finite Spaces 
Radim Jirousek 

Computational Advantages of Relevance Reasoning in Bayesian Belief
Networks 
Yan Lin and Marek J. Druzdzel 

Support and Plausibility Degrees in Generalized Functional Models 
Paul-Andre Monney 

On Stable Multi-Agent Behavior in Face of Uncertainty 
Moshe Tennenholtz 

Cost-Sharing in Bayesian Knowledge Bases 
Solomon Eyal Shimony, Carmel Domshlak and Eugene Santos Jr. 

Independence of Causal Influence and Clique Tree Propagation 
Nevin L. Zhang and Li Yan 
__________________________________________________________

Saturday, August 2, 1997

Invited talk III: Genetic Linkage Analysis 
Alejandro A. Schaffer 
8:30-9:30am

Abstract: Genetic linkage analysis is a collection of statistical
techniques used to infer the approximate chromosomal location of disease
susceptibility genes using family tree data. Among the widely publicized
linkage discoveries in 1996 were the approximate locations of genes
conferring susceptibility to Parkinson's disease, prostate cancer,
Crohn's disease, and adult-onset diabetes. Most linkage analysis methods
are based on maximum likelihood estimation. Parametric linkage analysis
methods use probabilistic inference on Bayesian networks, which is also
used in the UAI community. I will give a self-contained overview of the
genetics, statistics, algorithms, and software used in real linkage
analysis studies. 


** Plenary Session III: Markov Decision Processes
9:30-10:30am

Model Reduction Techniques for Computing Approximately Optimal Solutions
for Markov Decision Processes 
Thomas Dean, Robert Givan and Sonia Leach 

Incremental Pruning: A Simple, Fast, Exact Algorithm for Partially
Observable Markov Decision Processes 
Anthony Cassandra, Michael L. Littman and Nevin L. Zhang 

Region-based Approximations for Planning in Stochastic Domains 
Nevin L. Zhang and Wenju Liu 

Break 10:30-11:00am

* Panel Discussion: 11:00am-12:00pm

Lunch 12:00-1:30pm


** Plenary Session IV: Foundations
1:30-3:00pm

Two Senses of Utility Independence 
Yoav Shoham 

Probability Update: Conditioning vs. Cross-Entropy 
Adam J. Grove and Joseph Y. Halpern 

Probabilistic Acceptance 
Henry E. Kyburg Jr. 

Estimation of Effects of Sequential Treatments By Reparameterizing
Directed Acyclic Graphs 
James M. Robins and Larry Wasserman 


** Poster Session II: Overview Presentations
3:00-3:30pm

* Poster Session II 
3:30-5:30pm

Network Fragments: Representing Knowledge for Probabilistic Models 
Kathryn Blackmond Laskey and Suzanne M. Mahoney 

Correlated Action Effects in Decision Theoretic Regression 
Craig Boutilier 

A Standard Approach for Optimizing Belief-Network Inference 
Adnan Darwiche and Gregory Provan 

Myopic Value of Information for Influence Diagrams 
Soren L. Dittmer and Finn V. Jensen 

Algorithm Portfolio Design: Theory vs. Practice 
Carla P. Gomes and Bart Selman 

Learning Belief Networks in Domains with Recursively Embedded Pseudo
Independent Submodels 
J. Hu and Yang Xiang 

Relational Bayesian Networks 
Manfred Jaeger 

A Target Classification Decision Aid 
Todd Michael Mansell 

Structure and Parameter Learning for Causal Independence and Causal
Interactions Models 
Christopher Meek and David Heckerman 

An Investigation into the Cognitive Processing of Causal Knowledge 
Richard E. Neapolitan, Scott B. Morris, and Doug Cork 

Learning Bayesian Networks from Incomplete Databases 
Marco Ramoni and Paola Sebastiani 

Incremental Map Generation by Low Cost Robots Based on
Possibility/Necessity Grids 
M. Lopez Sanchez, R. Lopez de Mantaras, and C. Sierra 

Sequential Thresholds: Evolving Context of Default Extensions 
Choh Man Teng 

Score and Information for Recursive Exponential Models with Incomplete
Data 
Bo Thiesson 

Fast Value Iteration for Goal-Directed Markov Decision Processes 
Nevin L. Zhang and Weihong Zhang 
__________________________________________________________

Sunday, August 3, 1997

Invited talk IV: Gaussian processes - a replacement for supervised
neural networks? 
David J.C. MacKay
8:20-9:20am

Abstract: Feedforward neural networks such as multilayer perceptrons are
popular tools for nonlinear regression and classification problems. From
a Bayesian perspective, a choice of a neural network model can be viewed
as defining a prior probability distribution over non-linear functions,
and the neural network's learning process can be interpreted in terms of
the posterior probability distribution over the unknown function. (Some
learning algorithms search for the function with maximum posterior
probability; others, such as Monte Carlo methods, draw samples from this
posterior distribution.) In the limit of large but otherwise standard
networks, Neal (1996) has shown that the prior distribution over
non-linear functions implied by the Bayesian neural network falls in a
class of probability distributions known as Gaussian processes. The
hyperparameters of the neural network model determine the characteristic
lengthscales of the Gaussian process. Neal's observation motivates the
idea of discarding parameterized networks and working directly with
Gaussian processes. Computations in which the parameters of the network
are optimized are then replaced by simple matrix operations using the
covariance matrix of the Gaussian process. In this talk I will review
work on this idea by Neal, Williams, Rasmussen, Barber, Gibbs and
MacKay, and will assess whether, for supervised regression and
classification tasks, the feedforward network has been superseded. 
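The "simple matrix operations" mentioned above can be sketched for a toy 1-D regression problem (the data, squared-exponential kernel, and noise level are illustrative assumptions, and the 2x2 covariance matrix is inverted in closed form to keep the sketch self-contained): the predictive mean and variance at a test input come directly from the covariance matrix of the Gaussian process.

```python
import math

# Two illustrative training points and one test input.
X = [0.0, 1.0]
y = [0.0, 1.0]
x_star = 0.5

def k(a, b, lengthscale=1.0):
    # Squared-exponential covariance; the lengthscale is the kind of
    # hyperparameter that, in Neal's result, the network's
    # hyperparameters determine.
    return math.exp(-0.5 * ((a - b) / lengthscale) ** 2)

noise = 1e-6
# Covariance matrix K + noise*I, inverted in closed form (2x2 case).
a = k(X[0], X[0]) + noise
b = k(X[0], X[1])
c = k(X[1], X[0])
d = k(X[1], X[1]) + noise
det = a * d - b * c
Kinv = [[d / det, -b / det], [-c / det, a / det]]

ks = [k(x_star, X[0]), k(x_star, X[1])]

# Predictive mean: k_*^T (K + noise*I)^{-1} y
alpha = [sum(Kinv[i][j] * y[j] for j in range(2)) for i in range(2)]
mean = sum(ks[i] * alpha[i] for i in range(2))

# Predictive variance: k(x*, x*) - k_*^T (K + noise*I)^{-1} k_*
q = [sum(Kinv[i][j] * ks[j] for j in range(2)) for i in range(2)]
var = k(x_star, x_star) - sum(ks[i] * q[i] for i in range(2))
print(mean, var)
```

No parameterized network is trained anywhere in this computation; the covariance function alone plays the role of the network, which is exactly the trade the talk proposes to assess.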

* Plenary Session V: Applications of Uncertain Reasoning
9:20-10:40am

Bayes Networks for Sonar Sensor Fusion 
Ami Berler and Solomon Eyal Shimony 

Image Segmentation in Video Sequences: A Probabilistic Approach 
Nir Friedman and Stuart Russell 

Lexical Access for Speech Understanding using Minimum Message Length
Encoding 
Ian Thomas, Ingrid Zukerman, Bhavani Raskutti, Jonathan Oliver, David
Albrecht 

A Decision-Theoretic Approach to Graphics Rendering 
Eric Horvitz and Jed Lengyel 

* Break 10:40-11:00am

* Panel Discussion: 11:00am-12:00pm

Lunch 12:00-1:30pm


** Plenary Session VI: Developments in Belief and Possibility
1:30-3:00pm

Decision-making under Ordinal Preferences and Comparative Uncertainty 
D. Dubois, H. Fargier, and H. Prade 

Inference with Idempotent Valuations 
Luis D. Hernandez and Serafin Moral 

Corporate Evidential Decision Making in Performance Prediction Domains 
A.G. Buchner, W. Dubitzky, A. Schuster, P. Lopes, P.G. O'Donoghue, J.G.
Hughes, D.A. Bell, K. Adamson, J.A. White, J. Anderson, M.D. Mulvenna 

Exploiting Uncertain and Temporal Information in Correlation 
John Bigham 


Break 3:00-3:30pm


** Plenary Session VII: Topics on Inference 
3:30-5:00pm

Nonuniform Dynamic Discretization in Hybrid Networks 
Alexander V. Kozlov and Daphne Koller 

Robustness Analysis of Bayesian Networks with Local Convex Sets of
Distributions 
Fabio Cozman 

Structured Arc Reversal and Simulation of Dynamic Probabilistic Networks 
Adrian Y. W. Cheuk and Craig Boutilier 

Nested Junction Trees 
Uffe Kjaerulff 

__________________________________________________________

If you have questions about the UAI '97 program, contact the UAI '97
Program Chairs, Dan Geiger and Prakash P. Shenoy. For other questions
about UAI '97, please contact the Conference Chair, Eric Horvitz.

                            *  *  *

   UAI '97 Conference Chair 

Eric Horvitz (horvitz at microsoft.com)
Microsoft Research, 9S
Redmond, WA, USA 
http://research.microsoft.com/~horvitz


   UAI '97 Program Chairs 

Dan Geiger (dang at cs.technion.ac.il)
Computer Science Department
Technion, Israel Institute of Technology

Prakash Shenoy (pshenoy at ukans.edu)
School of Business
University of Kansas
http://stat1.cc.ukans.edu/~pshenoy/


====================================================

To register for UAI '97, please use the online registration form at:
http://cuai97.microsoft.com/register/reg.htm

If you do not have access to the web, please use the appended ASCII
form.

Detailed information on accommodations can be found at
http://cuai97.microsoft.com/#lodge. Several blocks of rooms in on-campus
housing at Brown University have been reserved for UAI attendees on a
first-come, first-served basis. In addition, there are five hotels within
a one-mile radius of the UAI Conference (see
http://www.providenceri.com/as220/hotels.html for additional information
on hotels).

Travel information is available at:
http://cuai97.microsoft.com/#trav


======================================================

                *****  UAI '97 Registration Form *****

(If possible, please use the online form at
http://cuai97.microsoft.com/register/reg.htm)

------------------------------------------------------------------------

* Name (Last, First): _____________________________

* Affiliation: ___________________________

* Email address:  ___________________________

* Mailing address: ___________________________

* Telephone: ___________________________

------------------------------------------------------------------------

** Registration Fees:


>>> Main Conference <<<

Fees (please circle and tally below):

Early Registration: $225, Late Registration (After May 31): $285

Student Registration (certify below): $125,  Late Registration (After
May 31):  $150


                               *  *  *  

>>> Full-Day Course on Uncertain Reasoning (July 31, 1997) <<<

* Fees:

With Conference Registration: $75, Without Conference: $125

Student (certify below): 

With Conference Registration: $35, Without Conference: $55

The registration fee includes the conference banquet on August 2nd and a
package of three lunches which will be served on campus. 


* Student certification 

I am a full-time student at the following
institution:____________________

Academic advisor's name:____________________
 
Academic advisor's email:____________________


* Conference Registration Fees: U.S. $ ________________________

Full-Day Course: U.S. $ ________________________

TOTAL: U.S. $ ________________________ 

______________________________________________________

Please make check payable to: AUAI or Association for Uncertainty in
Artificial Intelligence

Or

Indicate credit card payment(s) enclosed:

______ Mastercard ______ Visa

Credit Card No.: _____________________________________________ 

Exp. Date:
________________________


Signature: ____________________________

For credit card payment, you may fax this form to: (206) 936-1616 

Registrations by check/money order should be mailed to: 

Eric Horvitz
Microsoft Research, 9S
Redmond, WA 98052-6399
Fax: 206-936-1616

