new book

Achilles D. Zapranis achilles at uom.gr
Fri Jul 16 04:50:54 EDT 1999


      New book (monograph):

      "Principles of Neural Model Identification, Selection and Adequacy -

      With Applications to Financial Econometrics"

      Springer-Verlag

      ISBN 1-85233-139-9

      Achilleas Zapranis and Apostolos-Paul Refenes

      Neural networks are receiving much attention because of their
powerful universal approximation properties. They are essentially
devices for non-parametric statistical inference, providing an elegant
formalism for unifying different non-parametric paradigms, such as
nearest neighbours, kernel smoothers and projection pursuit. Neural
networks have shown considerable success in a variety of disciplines,
ranging from engineering and control to financial modelling. However,
a major weakness of neural modelling is the lack of established
procedures for performing tests for misspecified models and tests of
statistical significance for the various parameters that have been
estimated. This is a serious disadvantage in applications where there
is a strong culture of testing not only the predictive power of a
model, or the sensitivity of the dependent variable to changes in the
inputs, but also the statistical significance of the findings at a
specified level of confidence. This is particularly important in the
majority of financial applications, where the data-generating
processes are dominantly stochastic and only partially deterministic.
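      To make the "non-parametric regression" view of neural networks
concrete, the following self-contained sketch fits a one-hidden-layer
tanh network to noisy data by plain gradient descent. It is only an
illustration of the idea described above, not code from the book; the
data, network size and learning rate are all arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sample: a smooth nonlinear target observed with additive noise.
n = 200
X = rng.uniform(-2, 2, size=(n, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

# One-hidden-layer network: yhat = V . tanh(W x + b) + c
H = 8                                    # hidden units (arbitrary choice)
W = 0.5 * rng.standard_normal((H, 1))
b = np.zeros(H)
V = 0.5 * rng.standard_normal(H)
c = 0.0

lr = 0.1
for epoch in range(4000):
    Z = np.tanh(X @ W.T + b)             # (n, H) hidden activations
    err = Z @ V + c - y                  # residuals
    # Gradients of the mean squared error (constant factor absorbed in lr)
    gV = Z.T @ err / n
    gc = err.mean()
    gZ = np.outer(err, V) * (1 - Z**2)   # back-propagate through tanh
    gW = gZ.T @ X / n
    gb = gZ.mean(axis=0)
    V -= lr * gV; c -= lr * gc; W -= lr * gW; b -= lr * gb

# In-sample fit of the trained network
yhat = np.tanh(X @ W.T + b) @ V + c
mse = float(np.mean((yhat - y) ** 2))
```

The point of the sketch is that nothing about the regression function
is specified in advance: the network recovers the shape of the target
from the data alone, which is exactly what makes significance testing
for its inputs and parameters non-trivial.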


      In this book we investigate a broad range of issues arising in
relation to the use of neural networks as non-parametric statistical
tools, including controlling the bias and variance components of the
estimation error, eliminating parameter and explanatory-variable
redundancy, assessing model adequacy, and estimating sampling
variability. Building upon the latest and most significant
developments in estimation theory, model selection and the theory of
misspecified models, this book develops neural networks into an
advanced tool for non-parametric modelling in financial econometrics.
It provides the theoretical framework and demonstrates, through a
selected case study and examples, the efficient use of neural networks
for modelling complex financial phenomena.
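      One of the themes above, estimating the sampling variability of
a fitted model, can be sketched with the pairs bootstrap (resampling
(x, y) pairs with replacement and refitting). To keep the sketch short
it uses linear least squares as a stand-in for the neural model; the
synthetic data and the bootstrap size B are illustrative assumptions,
not material from the book.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic sample: y depends on the first input but not the second.
n = 120
X = rng.standard_normal((n, 2))
y = 1.5 * X[:, 0] + 0.5 * rng.standard_normal(n)

def fit_and_coefs(Xb, yb):
    """Least-squares fit (stand-in for refitting the neural model);
    returns the input coefficients, dropping the intercept."""
    A = np.column_stack([np.ones(len(yb)), Xb])
    beta, *_ = np.linalg.lstsq(A, yb, rcond=None)
    return beta[1:]

B = 500                                  # bootstrap replications
boot = np.empty((B, 2))
for i in range(B):
    idx = rng.integers(0, n, size=n)     # resample pairs with replacement
    boot[i] = fit_and_coefs(X[idx], y[idx])

se = boot.std(axis=0, ddof=1)            # bootstrap standard errors
t = fit_and_coefs(X, y) / se             # crude t-ratios per input
```

The bootstrap standard errors turn a point estimate of each input's
effect into a significance statement: here the relevant input yields a
large t-ratio while the irrelevant one does not, which is the kind of
variable-significance test the book develops for neural models.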


      The majority of existing books on neural networks and their
application to finance concentrate on the intricate algorithmic
aspects of neural networks, much of which is irrelevant to
practitioners in this field. They use terminology that is
incomprehensible to professional financial engineers, statisticians
and econometricians, who are the natural readership on this
subject. Neural networks are essentially statistical devices for
non-linear, non-parametric regression analysis, yet most of the
existing literature discusses them as a form of artificial
intelligence. In our opinion this work meets an urgent demand for a
textbook illustrating how to use neural networks in real-life
financial contexts, and provides methodological guidelines on how to
develop robust applications that work from a platform of statistical
insight.


      Contents:

      1 INTRODUCTION
    
      1.1 Overview
    
      1.2 Active Asset Management, Neural Networks and Risk
    
      1.2.1 Factor Analysis
    
      1.2.2 Estimating Returns
    
      1.2.3 Portfolio Optimisation
    
      1.3 Non-Parametric Estimation with Neural Networks
    
      1.3.1 Sources of Specification Bias
    
      1.3.2 Principles of Neural Model Identification
    
      1.4 Overview of the Remaining Chapters
    
       
      2 NEURAL MODEL IDENTIFICATION
    
      2.1 Overview
    
      2.2 Neural Model Selection
    
      2.2.1 Model Specification
    
      2.2.2 Fitness Criteria
    
      2.2.3 Parameter Estimation Procedures
    
      2.2.4 Consistency and the Bias-Variance Dilemma
    
      2.3 Variable Significance Testing
    
      2.3.1 Relevance Quantification
    
      2.3.2 Sampling Variability Estimation
    
      2.3.3 Hypothesis Testing
    
      2.4 Model Adequacy Testing
    
      2.5 Summary
    
       
      3 REVIEW OF CURRENT PRACTICE IN NEURAL MODEL IDENTIFICATION
    
      3.1 Overview
    
      3.2 Current Practice in Neural Model Selection
    
      3.2.1 Regularisation
    
      3.2.2 Topology-Modifying Algorithms
    
      3.2.3 The Structural Risk Minimisation (SRM) Principle
    
      3.2.4 The Minimum Description Length (MDL) Principle
    
      3.2.5 The Maximum a-Posteriori Probability (MAP) Principle
    
      3.2.6 The Minimum Prediction Risk (MPR) Principle
    
      3.3 Variable Significance Testing
    
      3.3.1 Common Relevance Criteria
    
      3.3.1.1 Criteria based on the Derivative dy/dx
    
      3.3.1.2 Alternative Criteria
    
      3.3.1.3 Comparing Different Relevance Criteria
    
      3.3.2 Sampling Variability and Bias Estimation with Bootstrap
    
      3.3.2.1 Pairs Bootstrap
    
      3.3.2.2 Residuals Bootstrap
    
      3.3.2.3 Bias Estimation
    
      3.3.3 Hypothesis Tests for Variable Selection
    
      3.4 Model Adequacy Testing : Misspecification Tests
    
      3.5 Summary
    
       
      4 NEURAL MODEL SELECTION : THE MINIMUM PREDICTION RISK PRINCIPLE
    
      4.1 Overview
    
      4.2 Algebraic Estimation of Prediction Risk
    
      4.3 Estimation of Prediction Risk with Resampling Methods
    
      4.3.1 The Bootstrap and Jack-knife Methods for Estimating Prediction Risk
    
      4.3.2 Cross-Validatory Methods for Estimating Prediction Risk
    
      4.4 Evaluation of Model Selection Procedures
    
      4.4.1 Experimental Set-Up
    
      4.4.2 Algebraic Estimates
    
      4.4.3 Bootstrap Estimates
    
      4.4.4 Discussion
   
      4.5 Summary
    
       
      5 VARIABLE SIGNIFICANCE TESTING : A STATISTICAL APPROACH
    
      5.1 Overview
    
      5.2 Relevance Quantification
    
      5.2.1 Sensitivity Criteria
    
      5.2.2 Model-Fitness Sensitivity Criteria
    
      5.2.2.1 The Effect on the Empirical Loss of a Small Perturbation of x
    
      5.2.2.2 The Effect on the Empirical Loss of Replacing x by its Mean
    
      5.2.2.3 Effect on the Coefficient of Determination of a Small Perturbation of x
    
      5.3 Sampling Variability Estimation
    
      5.3.1 Local Bootstrap for Neural Models
    
      5.3.2 Stochastic Sampling from the Asymptotic Distribution of the Network's Parameters (Parametric Sampling)
    
      5.3.3 Evaluation of Bootstrap Schemes for Sampling Variability Estimation
    
      5.3.3.1 Example 1 : The Burning Ethanol Sample
    
      5.3.3.2 Example 2 : Wahba's Function
    
      5.3.3.3 Example 3 : Network Generated Data
    
      5.4 Hypothesis Testing
    
      5.4.1 Confidence Intervals
    
      5.4.2 Evaluating the Effect of a Variable's Removal
    
      5.4.3 Variable Selection with Backwards Elimination
    
      5.5 Evaluation of Variable Significance Testing
    
      5.6 Summary
    
       
      6 MODEL ADEQUACY TESTING
    
      6.1 Overview
    
      6.2 Testing for Serial Correlation in the Residuals
    
      6.2.1 The Correlogram
    
      6.2.2 The Box-Pierce Q-Statistic
    
      6.2.3 The Ljung-Box LB-Statistic
    
      6.2.4 The Durbin-Watson Test
    
      6.3 An F-test for Model Adequacy
    
      6.4 Summary
    
      7 NEURAL NETWORKS IN TACTICAL ASSET ALLOCATION
    
      7.1 Overview
    
      7.2 Quantitative Models for Tactical Asset Allocation
    
      7.3 Data Pre-Processing
    
      7.4 Forecasting the Equity Premium with Linear Models
    
      7.4.1 Model Estimation
    
      7.4.2 Model Adequacy Testing
    
      7.4.2.1 Testing the Assumptions of Linear Regression
    
      7.4.2.2 The Effect of Influential Observations
    
      7.4.3 Variable Selection
    
      7.5 Forecasting the Equity Premium with Neural Models
    
      7.5.1 Model Selection and Adequacy Testing
    
      7.5.2 Variable Selection
    
      7.5.2.1 Relevance Quantification
    
      7.5.2.2 Sampling Variability Estimation
    
      7.5.2.3 Backwards Variable Elimination
    
      7.6 Comparative Performance Evaluation
    
      7.7 Summary
    
       
      8 CONCLUSIONS
    
       
      Appendix I : Computing Network Derivatives
    
      Appendix II : Generating Random Deviates
    
      Bibliography


---------------------------------------------------------------------------------------------
Dr Achilleas D. Zapranis
University of Macedonia of Economic and Social Sciences
156 Egnatia St. PO BOX 1591, 540 06 Thessaloniki
Greece
Tel. 00-30-(0)31-891690, Fax 00-30-(0)31-844536
e-mail : achilles at macedonia.uom.gr
--------------------------------------------------------------------------------------------


More information about the Connectionists mailing list