Neural Network Course (Announcement)
Hong Pi
pihong at merlot.cse.ogi.edu
Wed Jul 19 16:00:42 EDT 1995
Oregon Graduate Institute of Science & Technology, Office of Continuing
Education, offers the short course:
NEURAL NETWORKS: ALGORITHMS AND APPLICATIONS
September 25-29, 1995, at the OGI campus near Portland, Oregon.
Course Organizer: John E. Moody
Lead Instructor: Hong Pi
With Lectures By:
Todd K. Leen
John E. Moody
Thorsteinn S. Rognvaldsson
Eric A. Wan
Artificial neural networks (ANNs) have emerged as a new information
processing technique and an effective computational model for solving
pattern recognition and completion, feature extraction, optimization, and
function approximation problems. This course introduces participants to
the neural network paradigms and their applications in pattern
classification; system identification; signal processing and image
analysis; control engineering; diagnosis; time series prediction; and
financial analysis and trading. An introduction to fuzzy logic and
fuzzy control systems is also given.
Designing a neural network application involves steps from data
preprocessing to network tuning and selection. This course, with many
examples, application demos, and hands-on lab practice, will familiarize
participants with the techniques necessary for building successful
applications. About 50 percent of the class time is assigned to lab
sessions. The simulations will be based on Matlab, the Matlab Neural Net
Toolbox, and other software running on Windows NT workstations.
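To make the workflow described above concrete, here is a minimal,
self-contained sketch of the same steps: preprocess and split the data,
train a small feed-forward network by gradient descent with
back-propagation, and use a held-out validation set for early stopping and
model selection. It is written in Python/NumPy purely for illustration (the
course labs themselves use Matlab and the Matlab Neural Net Toolbox), and
every name, network size, and learning-rate setting below is an assumption
made for this example, not course material.

import numpy as np

rng = np.random.default_rng(0)

# Toy data set: a noisy sine curve standing in for real application data.
x = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(x) + 0.1 * rng.standard_normal(x.shape)

# Preprocessing: split the data and standardize inputs with training statistics.
x_tr, y_tr = x[:150], y[:150]
x_va, y_va = x[150:], y[150:]
mu, sd = x_tr.mean(axis=0), x_tr.std(axis=0)
x_tr, x_va = (x_tr - mu) / sd, (x_va - mu) / sd

# A single hidden layer of tanh units with a linear output unit.
n_hidden = 10
W1 = 0.5 * rng.standard_normal((1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = 0.5 * rng.standard_normal((n_hidden, 1))
b2 = np.zeros(1)

def forward(inputs):
    hidden = np.tanh(inputs @ W1 + b1)
    return hidden, hidden @ W2 + b2

lr = 0.05                       # gradient-descent step size
best_val, best_weights = np.inf, None
patience, wait = 50, 0          # stop after 50 epochs without improvement

for epoch in range(2000):
    # Back-propagation of the mean-squared error through the two layers.
    h, pred = forward(x_tr)
    err = pred - y_tr
    gW2 = h.T @ err / len(x_tr)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    gW1 = x_tr.T @ dh / len(x_tr)
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1
    b1 -= lr * gb1
    W2 -= lr * gW2
    b2 -= lr * gb2

    # Early stopping: keep the weights with the lowest validation error.
    val_mse = np.mean((forward(x_va)[1] - y_va) ** 2)
    if val_mse < best_val:
        best_val, wait = val_mse, 0
        best_weights = (W1.copy(), b1.copy(), W2.copy(), b2.copy())
    else:
        wait += 1
        if wait > patience:
            break

W1, b1, W2, b2 = best_weights
print(f"selected model: validation MSE = {best_val:.4f}")

Keeping a copy of the best-so-far weights and restoring them at the end is
the simplest form of the early stopping via validation listed in the course
outline below; the lab sessions go through this and the other design steps
in more detail.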
Prerequisites: Linear algebra and calculus. Previous experience with
Matlab is helpful but not required.
Who will benefit:
Technical professionals, business analysts, financial market
practitioners, and other individuals who wish to gain a basic understanding
of the theory and algorithms of neural computation or who are interested in
applying ANN techniques to real-world, data-driven modeling problems.
Course Objectives:
After completing the course, students will:
- Understand the basic neural network paradigms
- Be familiar with the range of ANN applications
- Have a good understanding of the techniques for designing
successful applications
- Gain hands-on experience with ANN modeling.
Course Outline (8:30am - 5:00pm September 25 - 28, and
8:30am - 12:30pm September 29):
Neural Networks: Biological and Artificial
Biological inspirations. Basic models of a neuron.
Types of architectures and learning paradigms.
Simple Perceptrons and Adalines
Decision surfaces. Linear separability.
Perceptron learning rules. Linear units.
Gradient descent learning.
Multi-Layer Feed-Forward Networks I
Multi-Layer perceptrons. Back-propagation learning.
Generalization. Early stopping via validation.
Momentum and adaptive learning rate.
Examples and applications.
Multi-Layer Feed-Forward Networks II
Newton's method. Conjugate gradient. Levenberg-Marquardt.
Radial basis function networks. Projection pursuit regression.
Neural Networks for Pattern Recognition and Classification
Bayes decision theory. The Bayes risk.
Non-neural and neural methods for classification.
Neural networks as estimators of the posterior probability.
Methods for improving the classification performance.
Benchmark tests of neural networks vs. other methods.
Some applications.
Improving the Generalization Performance
Model bias and model variance.
Weight decay. Regularizers. Optimal brain surgeon.
Learning from hints. Sensitivity analysis.
Input variable selection. The delta-test.
Time Series Prediction: Classical and Nonlinear Approaches
Linear time series models. Simple nonlinear models.
Recurrent network models and training algorithms.
Case studies: sunspots, economic forecasting.
Self-Organized Networks and Unsupervised Learning
K-means clustering. Kohonen feature maps. Learning
vector quantization. Adaptive principal components analysis.
Neural Networks for Adaptive Control
What is control? Heuristic, open-loop, and inverse control.
Feedback algorithms for control. Neural network feedback control.
Reinforcement learning.
Survey of Neural Network Applications in Financial Markets
Bond and stock valuation. Currency rate forecasting.
Trading systems. Commodity price forecasting.
Risk management. Option pricing.
Fuzzy Systems
Fuzzy logic. Fuzzy control systems.
Adaptive fuzzy and neural-fuzzy.
About the Instructors
Todd K. Leen is associate professor of Computer Science and Engineering at
Oregon Graduate Institute of Science & Technology. He received his Ph.D. in
theoretical physics from the University of Wisconsin in 1982. From 1982
to 1987 he worked at IBM Corporation, and then pursued research in
mathematical biology at Good Samaritan Hospital's Neurological Sciences
Institute. He joined OGI in 1989. Dr. Leen's current research interests
include neural learning, algorithms and architectures, stochastic
optimization, model constraints and pruning, and neural and non-neural
approaches to data representation and coding. He is particularly
interested in fast, local modeling approaches, and applications to image
and speech processing. Dr. Leen served as theory program chair for the 1993
Neural Information Processing Systems (NIPS) conference, and workshops
chair for the 1994 NIPS conference.
John E. Moody is associate professor of Computer Science and Engineering at
Oregon Graduate Institute of Science & Technology. His current research
focuses on neural network learning theory and algorithms in its many
manifestations. He is particularly interested in statistical learning
theory, the dynamics of learning, and learning in dynamical contexts. Key
application areas of his work are adaptive signal processing, adaptive
control, time series analysis, forecasting, economics and finance. Moody
has authored over 35 scientific papers, more than 25 of which concern the
theory, algorithms, and applications of neural networks. Prior to joining
the Oregon Graduate Institute, Moody was a member of the Computer Science
and Neuroscience faculties at Yale University. Moody received his Ph.D.
and M.A. degrees in theoretical physics from Princeton University, and
graduated summa cum laude with a B.A. in physics from the University of
Chicago.
Hong Pi is a senior research associate at Oregon Graduate Institute. He
received his Ph.D. in theoretical physics from the University of Wisconsin
in 1989. Prior to joining OGI in 1994, he was a postdoctoral fellow and
research scientist at Lund University, Sweden. His research interests
include nonlinear modeling, neural network algorithms and applications.
Thorsteinn S. Rognvaldsson received the Ph.D. degree in theoretical physics
from Lund University, Sweden, in 1994. His research interests are neural
networks for prediction and classification. He is currently a postdoctoral
research associate at Oregon Graduate Institute.
Eric A. Wan is assistant professor of Electrical Engineering and Applied
Physics at Oregon Graduate Institute of Science & Technology. He received
his Ph.D. in electrical engineering from Stanford University in 1994. His
research interests include learning algorithms and architectures for neural
networks and adaptive signal processing. He is particularly interested in
neural applications to time series prediction, speech enhancement, system
identification, and adaptive control. He is a member of IEEE, INNS, Tau
Beta Pi, Sigma Xi, and Phi Beta Kappa.
For a complete course brochure contact:
Linda M. Pease, Director
Office of Continuing Education
Oregon Graduate Institute of Science & Technology
20000 N.W. Walker Road, Beaverton, OR 97006 USA (shipping)
P.O. Box 91000, Portland, OR 97291-1000 USA (mailing)
+1-503-690-1259
+1-503-690-1686 (fax)
e-mail: continuinged at admin.ogi.edu
WWW home page: http://www.ogi.edu