
Conclusions and future research

It has been shown that feedforward multilayer perceptron networks can learn to approximate the time series of supermarket sales. For a selected group of articles, neural networks can be trained to forecast future demand on the basis of past sales data.
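To make the setting concrete, the following minimal sketch trains a one-hidden-layer perceptron by plain back-propagation to map a window of past sales values onto the next value. It is an illustration only, not the code used in this work: the helper name make_windows, the window width, the network size, and the synthetic sales series are all assumptions.

    # Hypothetical illustration: one hidden layer, plain back-propagation,
    # on-line (pattern-by-pattern) weight updates.
    import numpy as np

    rng = np.random.default_rng(0)

    def make_windows(sales, width):
        # slice the series into (past window, next value) training pairs
        X = np.array([sales[i:i + width] for i in range(len(sales) - width)])
        return X, sales[width:]

    sales = rng.random(200)                  # stand-in for a scaled sales series
    X, y = make_windows(sales, width=10)

    n_hid = 8
    W1 = rng.normal(0.0, 0.1, (X.shape[1], n_hid))   # input -> hidden weights
    W2 = rng.normal(0.0, 0.1, n_hid)                 # hidden -> output weights
    eta = 0.01                                       # learning rate

    for epoch in range(100):
        for x, t in zip(X, y):
            h = np.tanh(x @ W1)              # hidden activations
            err = h @ W2 - t                 # forecast error for this pattern
            grad_W2 = err * h                # gradient w.r.t. output weights
            grad_W1 = err * np.outer(x, W2 * (1.0 - h**2))
            W2 -= eta * grad_W2
            W1 -= eta * grad_W1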

The back-propagation algorithm has been parallelized to reduce the enormous training times. Three different parallelizations have been implemented on PARSYTEC parallel systems. Batch learning is best suited to large training sets on systems with poor communication performance. The on-line parallelization works well for large numbers of neurons and scales on systems whose communication performance is very good in proportion to their computing performance.
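As a rough sketch of why the batch variant tolerates slow links: each process computes the gradient over its own share of the training patterns, and a single collective sum per epoch combines the partial gradients before one identical weight update everywhere. The example below is my illustration under assumptions, not the paper's transputer implementation; it uses mpi4py and a plain linear model for brevity.

    # Hypothetical batch-parallel sketch: one all-reduce per epoch.
    # Run with e.g.:  mpiexec -n 4 python batch.py
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    rng = np.random.default_rng(rank)        # each process gets its own shard
    X = rng.random((50, 10))                 # local training inputs (stand-ins)
    y = rng.random(50)                       # local targets (stand-ins)
    w = np.zeros(10)                         # weights, kept in sync everywhere
    eta = 0.01

    for epoch in range(20):
        # local gradient over this process's patterns only
        grad = ((X @ w - y) @ X) / (size * len(y))
        # one collective per epoch: sum the partial gradients of all processes
        grad = comm.allreduce(grad, op=MPI.SUM)
        w -= eta * grad                      # identical update on all processes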

The variant of Yoon et al. has lower communication demands than that of Morgan et al.; this advantage is especially pronounced for small nets.

In future work the modelling of the input vectors should be improved: in particular, season and holiday information has to be given to the net, and the effect of price changes can be modelled quantitatively. One important aim will be reducing the number of input neurons: by correlation analysis, some of the hundreds of individual time series should be merged or discarded (see the sketch below). This will lead to smaller nets with shorter training times.
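One way such a correlation analysis could look is sketched below. This is a hedged illustration: the correlation cutoff and the rule of summing merged series into one input are my assumptions, not results from this work.

    # Hypothetical input reduction: series whose pairwise correlation exceeds
    # a cutoff are summed into a single input neuron; the rest stay separate.
    import numpy as np

    def merge_correlated(series, cutoff=0.9):
        # series: array of shape (n_series, n_timesteps)
        corr = np.corrcoef(series)
        merged, used = [], set()
        for i in range(len(series)):
            if i in used:
                continue
            group = [j for j in range(i, len(series))
                     if j not in used and corr[i, j] >= cutoff]
            used.update(group)
            merged.append(series[group].sum(axis=0))  # one input per group
        return np.array(merged)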

To handle this large amount of data within an acceptable time, the presented parallelizations of the back-propagation algorithm are worthwhile and necessary.


