NIPS*93 program

Bartlett Mel mel@cns.caltech.edu
Mon Oct 4 17:41:11 EDT 1993



	NIPS*93  MEETING PROGRAM and REGISTRATION REMINDER

The 1993 Neural Information Processing Systems (NIPS*93) meeting is
the seventh in a series of interdisciplinary conferences that bring
together neuroscientists, engineers, computer scientists, cognitive
scientists, physicists, and mathematicians interested in all aspects
of neural processing and computation.  There will be an afternoon of
tutorial presentations (Nov. 29), two and a half days of regular
meeting sessions (Nov. 30 - Dec. 2), and two days of focused workshops
at a nearby ski area (Dec. 3-4).

An electronic copy of the 1993 NIPS registration brochure is available
in PostScript format via anonymous ftp from helper.systems.caltech.edu
in /pub/nips/NIPS_93_brochure.ps.Z.  For a hardcopy of the brochure or
other information, please send a request to nips93@systems.caltech.edu
or to: NIPS Foundation, P.O. Box 60035, Pasadena, CA 91116-6035.
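
For those who prefer to script the retrieval, a minimal sketch in
Python using the standard ftplib module (the host and path are as
given above; everything else is illustrative):

	# Anonymous-ftp retrieval of the NIPS*93 brochure (sketch).
	from ftplib import FTP

	ftp = FTP('helper.systems.caltech.edu')   # archive host given above
	ftp.login()                               # anonymous login
	ftp.cwd('/pub/nips')                      # directory with the brochure
	with open('NIPS_93_brochure.ps.Z', 'wb') as f:
	    ftp.retrbinary('RETR NIPS_93_brochure.ps.Z', f.write)
	ftp.quit()

The brochure is PostScript compressed with the Unix compress utility;
run "uncompress NIPS_93_brochure.ps.Z" before viewing or printing.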

EARLY REGISTRATION DEADLINE (for $100 discount): Oct. 30

_________________


NIPS*93 ORAL PRESENTATIONS PROGRAM


Tues. AM: Cognitive Science

8:30	Invited Talk: Jeff Elman, UC San Diego:
	From Weared to Wore: A Connectionist Account of the
	History of the Past Tense
9:00	Richard O. Duda, San Jose State Univ.:
	Connectionist Models for Auditory Scene Analysis
9:20	Reza Shadmehr and Ferdinando A. Mussa-Ivaldi, MIT:
	Computational Elements of the Adaptive Controller of the Human Arm
9:40	Catherine Stevens and Janet Wiles, University of Queensland:
	Tonal Music as a Componential Code: Learning Temporal Relationships
	Between and Within Pitch and Timing Components
10:00	Poster Spotlights:
	Thea B. Ghiselli-Crippa and Paul Munro, Univ. of Pittsburgh:
	Emergence of Global Structure from Local Associations
	Tony A. Plate, University of Toronto:
	Estimating Structural Similarity by Vector Dot Products of
	Holographic Reduced Representations
10:10	BREAK

	Speech Recognition

10:40 	Jose C. Principe, Hui-H. Hsu and Jyh-M. Kuo, Univ. of Florida:
	Analysis of Short Term Neural Memory Structures for Nonlinear Prediction
11:00 	Eric I. Chang and Richard P. Lippmann, MIT Lincoln Laboratory:
	Figure of Merit Training for Detection and Spotting
11:20	Gregory J. Wolff, K. Venkatesh Prasad, David G. Stork and
        Marcus Hennecke, Ricoh California Research Center:
 	Lipreading by Neural Networks: Visual Preprocessing, Learning and
	Sensory Integration
11:40	Poster Spotlights:
	Steve Renals, Mike Hochberg and Tony Robinson, Cambridge University:
	Learning Temporal Dependencies In Large-Scale Connectionist
	Speech Recognition
	Ying Zhao, John Makhoul, Richard Schwartz and George
        Zavaliagkos, BBN Systems and Technologies:
	Segmental Neural Net Optimization for Continuous Speech Recognition
11:50	Rod Goodman, Caltech: Posner Memorial Lecture


Tues. PM: Temporal Prediction and Control

2:00 	Invited Talk: Doyne Farmer, Prediction Co.:
	Time Series Analysis of Nonlinear and Chaotic Time Series:  State Space
	Reconstruction and the Curse of Dimensionality
2:30	Kenneth M. Buckland and Peter D. Lawrence, Univ. of British Columbia:
	Transition Point Dynamic Programming
2:50	Gary W. Flake, Guo-Zhen Sun, Yee-Chun Lee and Hsing-Hen Chen,
        University of Maryland:
 	Exploiting Chaos to Control The Future
3:10	Satinder P. Singh, Andrew G. Barto, Roderic Grupen and
        Christopher Connolly, University of Massachusetts:
	Robust Reinforcement Learning in Motion Planning
3:30	BREAK

	Theoretical Analysis

4:00	Scott Kirkpatrick, Naftali Tishby, Lidror Troyansky,
        The Hebrew Univ. of Jerusalem, and Geza Gyorgyi, Eotvos Univ.:
	The Statistical Mechanics of K-Satisfaction
4:20	Santosh S. Venkatesh, Changfeng Wang, Univ. of Pennsylvania,
        and Stephen Judd, Siemens Corporate Research:
	When To Stop: On Optimal Stopping And Effective Machine Size In Learning
4:40	Wolfgang Maass, Technische Univ. Graz:
	Agnostic PAC-Learning Functions on Analog Neural Nets
5:00	H.N. Mhaskar, California State Univ. and Charles A. Micchelli, IBM:
	How To Choose An Activation Function
5:20	Poster Spotlights
	Iris Ginzburg, Tel Aviv Univ. and Haim Sompolinsky, Hebrew Univ.:
	Correlation Functions on a Large Stochastic Neural Network
	Xin Wang, Qingnan Li and Edward K. Blum, USC:
	Asynchronous Dynamics of Continuous-Time Neural Networks
	Tal Grossman and Alan Lapedes, Los Alamos National Laboratory:
	Use of Bad Training Data for Better Predictions


Wed. AM: Learning Algorithms

8:30	Invited Talk: Geoff Hinton, Univ. of Toronto:
	Using the Minimum Description Length Principle to Discover Factorial
	Codes
9:00	Richard S. Zemel, Salk Institute, and G. Hinton, Univ. of Toronto:
	Developing Population Codes By Minimizing Description Length
9:20	Sreerupa Das and Michael C. Mozer, University of Colorado:
	A Hybrid Gradient-Descent/Clustering Technique for Finite State
	Machine Induction
9:40	Eric Saund, Xerox Palo Alto Research Center:
	Unsupervised Learning of Mixtures of Multiple Causes in Binary Data
10:00	BREAK
10:30	A. Uzi Levin and Todd Leen, Oregon Graduate Institute:
	Fast Pruning Using Principal Components
10:50	Christoph Bregler and Stephen Omohundro, ICSI:
	Surface Learning with Applications to Lip Reading
11:10	Melanie Mitchell, Santa Fe Inst. and John H. Holland, Univ. Michigan:
	When Will a Genetic Algorithm Outperform Hill Climbing
11:30	Oded Maron and Andrew W. Moore, MIT:
	Hoeffding Races: Accelerating Model Selection Search for Classification
	and Function Approximation
11:50	Poster Spotlights:
	Zoubin Ghahramani and Michael I. Jordan, MIT:
	Supervised Learning from Incomplete Data via an EM Approach
	Mats Osterberg and Reiner Lenz, Linkoping Univ.:
	Unsupervised Parallel Feature Extraction from First Principles
	Terence D. Sanger, LAC-USC Medical Center:
	Two Iterative Algorithms for Computing the Singular Value Decomposition
	from Input/Output Samples
	Patrice Y. Simard and Edi Sackinger, AT&T Bell Laboratories:
	Efficient Computation of Complex Distance Metrics Using Hierarchical
	Filtering


Wed. PM: Neuroscience

2:00	Invited Talk: Eve Marder, Brandeis Univ.:
	Dynamic Modulation of Neurons and Networks
2:30	Ojvind Bernander, Rodney Douglas and Christof Koch, Caltech:
	Amplifying and Linearizing Apical Synaptic Inputs to
	Cortical Pyramidal Cells
2:50	Christiane Linster and David Marsan, ESPCI,
        Claudine Masson and Michel Kerszberg, CNRS:
	Odor Processing in the Bee: a Preliminary Study of the
	Role of Central Input to the Antennal Lobe
3:10	M.G. Maltenfort, R. E. Druzinsky, C. J. Heckman and
        W. Z. Rymer, Northwestern Univ.:
	Lower Boundaries of Motoneuron Desynchronization Via
	Renshaw Interneurons
3:30	BREAK

	Visual Processing

4:00	K. Obermayer, The Salk Institute, L. Kiorpes, NYU and
        Gary G. Blasdel, Harvard Medical School:
	Development of Orientation and Ocular Dominance Columns
	in Infant Macaques
4:20	Yoshua Bengio, Yann Le Cun and Donnie Henderson, AT&T Bell Labs:
	Globally Trained Handwritten Word Recognizer using Spatial
	Representation, Spatial Displacement Neural Networks and
	Hidden Markov Models
4:40	Trevor Darrell and A. P. Pentland, MIT:
	Classification of Hand Gestures using a View-based
	Distributed Representation
5:00	Ko Sakai and Leif H. Finkel, Univ. of Pennsylvania:
	A Network Mechanism for the Determination of Shape-from-Texture
5:20	Video Poster Spotlights (to be announced)


Thurs. AM: Implementations and Applications

8:30	Invited Talk: Dan Seligson, Intel:
	A Radial Basis Function Classifier with On-chip Learning
9:00	Michael A. Glover, Current Technology, Inc. and
        W. Thomas Miller III, University of New Hampshire:
	A Massively-Parallel SIMD Processor for Neural Network and
	Machine Vision Applications
9:20	Steven S. Watkins, Paul M. Chau, and Mark Plutowski, UCSD,
        Raoul Tawel and Bjorn Lambrigsten, JPL:
	A Hybrid Radial Basis Function Neurocomputer
9:40	Gert Cauwenberghs, Caltech:
	A Learning Analog Neural Network Chip with Continuous-Time
	Recurrent Dynamics
10:00	BREAK
10:30	Invited Talk: Paul Refenes, University College London:
	Neural Network Applications in the Capital Markets
11:00	Jane Bromley, Isabelle Guyon, Yann Le Cun, Eduard Sackinger
        and Roopak Shah, AT&T Bell Laboratories:
	Signature Verification using a "Siamese" Time Delay Neural Network
11:20	John Platt and Ralph Wolf, Synaptics, Inc.:
	Postal Address Block Location Using a Convolutional Locator Network
11:40	Shumeet Baluja and Dean Pomerleau, Carnegie Mellon University:
	Non-Intrusive Gaze Tracking Using Artificial Neural Networks
12:00	Adjourn to Vail for Workshops


_____________________


NIPS*93 POSTER PROGRAM


Tues. PM Posters:

Cognitive Science (CS)
CS-1 Blasig	Using Backpropagation to Automatically Generate Symbolic Classification Rules
CS-2 Munro, Ghiselli-Crippa	Emergence of Global Structure from Local Associations
CS-3 Plate	Estimating structural similarity by vector dot products of Holographic Reduced Representations
CS-4 Shultz, Elman	Analyzing Cross Connected Networks
CS-5 Sperduti	Encoding of Labeled Graphs by Labeling RAAM

Speech Processing (SP)
SP-1 Farrell, Mammone	Speaker Recognition Using Neural Tree Networks
SP-2 Hirayama, Vatikiotis-Bateson, Kawato	Inverse Dynamics of Speech Motor Control
SP-3 Renals, Hochberg, Robinson	Learning Temporal Dependencies In Large-Scale Connectionist Speech Recognition
SP-4 Zhao, Makhoul, Schwartz, Zavaliagkos	Segmental Neural Net Optimization for Continuous Speech Recognition

Control, Navigation and Planning (CT)
CT-1 Atkeson	Using Local Trajectory Optimizers To Speed Up Global Optimization In Dynamic Programming
CT-2 Boyan, Littman	A Reinforcement Learning Scheme for Packet Routing Using a Network of Neural Networks
CT-3 Cohn	Queries and Exploration Using Optimal Experiment Design
CT-4 Duff, Barto	Monte Carlo Matrix Inversion and Reinforcement Learning
CT-5 Gullapalli, Barto	Convergence of Indirect Adaptive Asynchronous Dynamic Programming Algorithms
CT-6 Jaakkola, Jordan, Singh	Stochastic Convergence Of Iterative DP Algorithms
CT-7 Moore	The Parti-game Algorithm for Variable Resolution Reinforcement Learning in Multidimensional State-spaces
CT-8 Nowlan, Cacciatore	Mixtures of Controllers for Jump Linear and Non-linear Plants
CT-9 Wada, Koike, Vatikiotis-Bateson, Kawato	A Computational Model for Cursive Handwriting Based on the Minimization Principle

Learning Theory, Generalization and Complexity (LT)
LT-01 Cortes, Jackel, Solla, Vapnik, Denker	Learning Curves: Asymptotic Values and Rates of Convergence
LT-02 Fefferman	Recovering A Feed-Forward Net From Its Output
LT-03 Grossman, Lapedes	Use of Bad Training Data for Better Predictions
LT-04 Hassibi, Sayed, Kailath	H-inf Optimality Criteria for LMS and Backpropagation
LT-05 Hush, Horne	Bounds on the complexity of recurrent neural network implementations of finite state machines
LT-06 Ji 	A Bound on Generalization Error Using Network-Parameter-Dependent Information and Its Applications
LT-07 Kowalczyk	Counting function theorem for multi-layer networks
LT-08 Mangasarian, Solodov	Backpropagation Convergence Via Deterministic Nonmonotone Perturbed Minimization
LT-09 Plutowski, White	Delete-1 Cross-Validation Estimates IMSE
LT-10 Schwarze, Hertz	Discontinuous Generalization in Large Committee Machines
LT-11 Shapiro, Prugel-Bennett	Non-Linear Statistical Analysis and Self-Organizing Competitive Networks
LT-12 Wahba	Structured Machine Learning for 'Soft' Classification, with Smoothing Spline ANOVA Models and Stacked Tuning, Testing and Evaluation
LT-13 Watanabe	Solvable models of artificial neural networks
LT-14 Wiklicky	On the Non-Existence of a Universal Learning Algorithm for Recurrent Neural Networks

Dynamics/Statistical Analysis (DS)
DS-1 Coolen, Penney, Sherrington	Coupled Dynamics of Fast Neurons and Slow Interactions
DS-2 Garzon, Botelho	Observability of neural network behavior
DS-3 Gerstner, van Hemmen	How to Describe Neuronal Activity: Spikes, Rates, or Assemblies?
DS-4 Ginzburg, Sompolinsky	Correlation Functions on a Large Stochastic Neural Network
DS-5 Leen, Orr	Momentum and Optimal Stochastic Search
DS-6 Ruppin, Meilijson	Optimal signalling in Attractor Neural Networks
DS-7 Wang, Li, Blum	Asynchronous Dynamics of Continuous-Time Neural Networks

Recurrent Networks (RN)
RN-1 Baird, Troyer, Eeckman	Grammatical Inference by Attentional Control of Synchronization in an Oscillating Elman Net
RN-2 Bengio, Frasconi	Credit Assignment through Time: Alternatives to Backpropagation
RN-3 Kolen	Fool's Gold: Extracting Finite State Machines From Recurrent Network Dynamics
RN-4 Movellan	A Reinforcement Algorithm to Learn Trajectories with Stochastic Neural Networks
RN-5 Saunders, Angeline, Pollack	 Structural and behavioral evolution of recurrent networks

Applications (AP)
AP-01 Baldi, Brunak, Chauvin, Krogh	Hidden Markov Models in Molecular Biology: Parsing the Human Genome
AP-02 Eeckman, Buhmann, Lades	A Silicon Retina for Face Recognition
AP-03 Flann	A Hierarchical Approach to Recognizing On-line Cursive Handwriting
AP-04 Graf, Cosatto, Ting	Locating Address Blocks with a Neural Net System
AP-05 Karunanithi	Identifying Fault-Prone Software Modules Using Feed-Forward Networks: A Case Study
AP-06 Keymeulen	Comparison Training for a Rescheduling Problem in Neural Networks
AP-07 Lapedes, Steeg	Use of Adaptive Networks to Find Highly Predictable Protein Structure Classes
AP-08 Schraudolph, Dayan, Sejnowski	Using the TD(lambda) Algorithm to Learn an Evaluation Function for the Game of Go
AP-09 Smyth	Probabilistic Anomaly Detection in Dynamic Systems
AP-10 Tishby, Singer	Decoding Cursive Scripts


Wed. PM posters:


Learning Algorithms (LA)
LA-01 Gold, Mjolsness	Clustering with a Domain-Specific Distance Metric
LA-02 Buhmann	Central and Pairwise Data Clustering by Competitive Neural Networks
LA-03 de Sa	Learning Classification without Labeled Data
LA-04 Ghahramani, Jordan	Supervised learning from incomplete data via an EM approach
LA-05 Tresp, Ahmad, Neuneier	Training Neural Networks with Deficient Data
LA-06 Osterberg, Lenz	Unsupervised Parallel Feature Extraction from First Principles
LA-07 Sanger	Two Iterative Algorithms for Computing the Singular Value Decomposition from Input/Output Samples
LA-08 Leen, Kambhatla	Fast Non-Linear Dimension Reduction
LA-09 Schaal, Atkeson	Assessing The Quality of Learned Local Models
LA-10 Simard, Sackinger	Efficient computation of complex distance metrics using hierarchical filtering
LA-11 Tishby, Ron, Singer	The Power of Amnesia
LA-12 Wettschereck, Dietterich	Locally Adaptive Nearest Neighbor Algorithms
LA-13 Liu	Robust Parameter Estimation and Model Selection for Neural Network Regression
LA-14 Wolpert	Bayesian Backpropagation Over Functions Rather Than Weights
LA-15 Thodberg	Bayesian Backprop in Action: Pruning, Ensembles, Error Bars and Application to Spectroscopy
LA-16 Dietterich, Jain, Lathrop	Dynamic Reposing for Drug Activity Prediction
LA-17 Ginzburg, Horn 	Combined Neural Networks For Time Series Analysis
LA-18 Graf, Simard	Backpropagation without Multiplication
LA-19 Harget, Bostock	A Comparative Study of the Performance of a Modified Bumptree with Radial Basis Function Networks and the Standard Multi-Layer Perceptron
LA-20 Najafi, Cherkassky	Adaptive Knot Placement Based on Estimated Second Derivative of Regression Surface

Constructive/Pruning Algorithms (CP)
CP-1 Fritzke	Supervised Learning with Growing Cell Structures
CP-2 Hassibi, Stork, Wolff	Optimal Brain Surgeon: Extensions, streamlining and performance comparisons
CP-3 Kamimura	Generation of Internal Representations by alpha-transformation
CP-4 Leerink, Jabri	Constructive Learning Using Internal Representation Conflicts
CP-5 Utans	Learning in Compositional Hierarchies: Inducing the Structure of Objects from Data
CP-6 Watanabe	An Optimization Method of Layered Neural Networks Based on the Modified Information Criterion

Neuroscience (NS)
NS-01 Bialek, Ruderman 	Statistics of Natural Images: Scaling in the Woods
NS-02 Boussard, Vibert	Dopaminergic neuromodulation brings a dynamical plasticity to the retina
NS-03 Doya, Selverston, Rowat	A Hodgkin-Huxley Type Neuron Model that Learns Slow Non-Spike Oscillation
NS-04 Guzik, Eaton	Directional Hearing by the Mauthner System
NS-05 Horiuchi, Bishofberger, Koch	Building an Analog VLSI Saccadic Eye Movement System
NS-06 Lewicki	Bayesian Modeling and Classification of Neural Signals
NS-07 Montague, Dayan, Sejnowski	Foraging in an Uncertain Environment Using Predictive Hebbian Learning
NS-08 Rosen, Rumelhart, Knudsen	A Connectionist Model of the Owl's Sound Localization System
NS-09 Sanger	Optimal Unsupervised Motor Learning Predicts the Internal Representation of Barn Owl Head Movements
NS-10 Siegel	An Analog VLSI Model Of Central Pattern Generation In The Medicinal Leech
NS-11 Usher, Stemmler, Koch	High spike rate variability as a consequence of network amplification of local fluctuations

Visual Processing (VP)
VP-1 Ahmad	Feature Densities are Required for Computing Feature Correspondences
VP-2 Buracas, Albright 	Proposed function of MT neurons' receptive field surrounds: computing shapes of objects from velocity fields
VP-3 Geiger, Diamantaras	Resolving motion ambiguities
VP-4 Mjolsness	Two-Dimensional Object Localization by Coarse-to-fine Correlation Matching
VP-5 Sajda, Finkel	Dual Mechanisms for Neural Binding and Segmentation and Their Role in Cortical Integration
VP-6 Yuille, Smirnakis, Xu	Bayesian Self-Organization

Implementations (IM)
IM-01 Andreou, Edwards	VLSI Phase Locking Architecture for Feature Linking in Multiple Target Tracking Systems
IM-02 Coggins, Jabri	WATTLE: A Trainable Gain Analogue VLSI Neural Network
IM-03 Elfadel, Wyatt	The "Softmax" Nonlinearity: Derivation Using Statistical Mechanics and Useful Properties as a Multiterminal Analog Circuit Element
IM-04 Muller, Kocheisen, Gunzinger	High Performance Neural Net Simulation on a Multiprocessor System with "Intelligent" Communication
IM-05 Murray, Burr, Stork, et al.	Digital Boltzmann VLSI for constraint satisfaction and learning
IM-06 Niebur, Brettle	Efficient Simulation of Biological Neural Networks on Massively Parallel Supercomputers with Hypercube Architecture
IM-07 Oliveira, Sangiovanni-Vincentelli	Learning Complex Boolean Functions: Algorithms and Applications
IM-08 Shibata, Kotani, Yamashita et al.	Implementing Intelligence on Silicon Using Neuron-Like Functional MOS Transistors
IM-09 Watts	Event-Driven Simulation of Networks of Spiking Neurons



