Introduction

Artificial neural networks (ANNs) are well suited to the prediction of chaotic time series. After a sufficient amount of training they can approximate any function, which makes them an alternative to the classical prediction methods. Today ANNs are already applied to the calculation of the demand for electrical power and to the forecasting of economic data [1], [2], [3].

ANNs learn to approximate a function from discrete values of this function that are presented to the net. For the prediction of time series, values from the past - the so-called training set - are given to the net. With n successive values in the input layer, the net is trained to calculate the (n+1)-th value in the output layer (cf. Figure 1).
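The sliding-window scheme described above can be sketched as follows. The window length and the example series are illustrative choices, not values taken from the paper:

```python
# Each pattern pairs a window of successive values (the input layer)
# with the value that immediately follows it (the output layer).

def make_training_set(series, n):
    """Split a time series into (input window, target) training pairs."""
    patterns = []
    for i in range(len(series) - n):
        window = series[i:i + n]   # n successive values -> input layer
        target = series[i + n]     # the next value      -> output layer
        patterns.append((window, target))
    return patterns

series = [0.1, 0.4, 0.3, 0.6, 0.5, 0.8]
training_set = make_training_set(series, n=3)
for window, target in training_set:
    print(window, "->", target)
```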

The presentation of the entire training set is repeated until the error falls below a given bound; one pass through the training set is called an epoch. A trained net can be used for the prediction of a time series by presenting the most recent known values; the net then determines the next, future value.
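A minimal sketch of this epoch-wise training, assuming a one-hidden-layer net with sigmoid units trained by per-pattern gradient descent; the network size, learning rate, error bound, and example series are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n, hidden = 3, 5                      # window length, hidden units (assumed)
W1 = rng.normal(scale=0.5, size=(hidden, n))
W2 = rng.normal(scale=0.5, size=(1, hidden))
lr, bound = 0.5, 0.05                 # learning rate and error bound (assumed)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Example series scaled into (0, 1), split with the sliding window.
series = np.sin(np.linspace(0.0, 4.0, 40)) * 0.4 + 0.5
X = np.array([series[i:i + n] for i in range(len(series) - n)])
y = series[n:]

for epoch in range(2000):             # one pass over the set = one epoch
    err = 0.0
    for x, t in zip(X, y):
        h = sigmoid(W1 @ x)           # forward pass
        o = sigmoid(W2 @ h)[0]
        e = t - o
        err += e * e
        # back-propagation of the error, immediate weight update
        d_o = e * o * (1 - o)
        d_h = (W2[0] * d_o) * h * (1 - h)
        W2 += lr * d_o * h[None, :]
        W1 += lr * np.outer(d_h, x)
    if err < bound:                   # stop once the summed squared
        break                         # error is below the bound

# prediction: present the last n known values to the trained net
next_value = sigmoid(W2 @ sigmoid(W1 @ series[-n:]))[0]
print(epoch, err, next_value)
```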

 
Figure 1: Prediction of Time Series

The feed-forward multilayer perceptron (FMP) network is used together with the back-propagation algorithm [4]. The error minimization by back-propagation takes an enormous amount of time, but the training can be accelerated by efficient parallelizations with PVM and PARIX.
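One common way to parallelize such training is to split the training set among workers, let each accumulate a partial weight gradient, and sum the partial gradients before one global update. The sketch below simulates two workers sequentially on a linear single-layer "net"; it is an illustrative stand-in for the paper's PVM/PARIX implementation, not a reproduction of it:

```python
import numpy as np

def partial_gradient(W, patterns):
    """Gradient of the summed squared error on one worker's share."""
    g = np.zeros_like(W)
    for x, t in patterns:
        e = W @ x - t                # forward pass, linear output unit
        g += 2.0 * np.outer(e, x)    # back-propagated gradient contribution
    return g

rng = np.random.default_rng(1)
W = rng.normal(size=(1, 3))
data = [(rng.normal(size=3), rng.normal(size=1)) for _ in range(8)]

shards = [data[0::2], data[1::2]]    # distribute patterns to 2 "workers"
grad = sum(partial_gradient(W, s) for s in shards)  # global gradient sum
serial = partial_gradient(W, data)   # reference: serial pass over all data

print(np.allclose(grad, serial))     # prints True: same gradient as serial
```

Because the per-pattern gradients simply add up, the result matches the serial computation, which is what makes this training-set partitioning attractive on message-passing machines.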



Frank M. Thiesing
Mon Dec 19 16:19:41 MET 1994