Parallelization




Because training an MFP takes a long time, parallelizing the algorithm is worthwhile. There are two fundamentally different methods for training an MFP, each with its own consequences for parallelization: on-line training and batch learning.
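The key difference between the two methods can be illustrated with a minimal sketch (a single linear neuron under squared error stands in for a full network; the function names and the tiny data set are illustrative, not from the paper). On-line training updates the weights after every pattern, so the updates form a sequential chain; batch learning first sums the gradients over all patterns, and that sum can be computed in parallel by partitioning the training set across processors.

```python
def gradient(w, x, t):
    # Squared-error gradient for one pattern: y = w*x, E = (y - t)^2 / 2
    y = w * x
    return (y - t) * x

def online_training(w, patterns, lr):
    # On-line: the weight changes after every pattern, so each gradient
    # depends on the previous update -- inherently sequential.
    for x, t in patterns:
        w -= lr * gradient(w, x, t)
    return w

def batch_training(w, patterns, lr):
    # Batch: all gradients are taken at the same weight, then summed;
    # the sum decomposes over any partition of the training set.
    g = sum(gradient(w, x, t) for x, t in patterns)
    return w - lr * g

# Hypothetical training set: (input, target) pairs.
patterns = [(1.0, 2.0), (2.0, 3.0), (0.5, 1.0)]
w0, lr = 0.0, 0.1

# Batch gradient computed on two partitions agrees with the full sum,
# which is what makes data-parallel batch learning possible.
part1, part2 = patterns[:2], patterns[2:]
g_full = sum(gradient(w0, x, t) for x, t in patterns)
g_split = (sum(gradient(w0, x, t) for x, t in part1)
           + sum(gradient(w0, x, t) for x, t in part2))
```

Note that the two methods generally produce different weight trajectories, so the choice between them is not only a parallelization question but also affects convergence behavior.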





Frank M. Thiesing
Mon Dec 19 16:19:41 MET 1994