Computational Intelligence - Learning with Neural Methods on Structured Data
Supervised Relevance Neural Gas
Main contributors:
Thorsten Bojer, Barbara Hammer, Marc Strickert, Thomas Villmann (University of Leipzig); industrial cooperation with PROGNOST DIAGNOSTIC SYSTEMS.
Publications:
See the publications related to vector quantization and relevance determination on Barbara's or Marc's page. A download is available.
Main idea:
Learning Vector Quantization (LVQ) as proposed by Kohonen is a very intuitive prototype-based supervised classification algorithm trained with Hebbian learning. Inputs are vectors from a fixed, finite-dimensional vector space.
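For concreteness, here is a minimal Python sketch of the basic LVQ1 update described above (names and learning rate are illustrative, not part of this page): the closest prototype is attracted towards the input if its class label matches and repelled otherwise.

import numpy as np

def lvq1_step(x, y, prototypes, labels, lr=0.05):
    # One Hebbian LVQ1 update for a single training pair (x, y).
    # prototypes: (m, d) array of prototype vectors, labels: (m,) array of their classes.
    dists = np.sum((prototypes - x) ** 2, axis=1)   # squared Euclidean distances
    w = np.argmin(dists)                            # winning (closest) prototype
    sign = 1.0 if labels[w] == y else -1.0          # attract on correct class, repel otherwise
    prototypes[w] += sign * lr * (x - prototypes[w])
    return prototypes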
If high-dimensional or hybrid data are dealt with, several problems arise:
- LVQ is not stable for noisy data and overlapping classes;
- LVQ is based on the Euclidean metric, which is not appropriate for data where individual input dimensions may be noisy or less relevant than others;
- LVQ is very sensitive to the initialization of the prototypes.
SRNG combines several ideas to overcome these problems, which are particularly severe for standard vector representations of non-standard and structural data:
- Instead of the LVQ updates, a variant is used which can be interpreted as stochastic gradient descent on an appropriate cost function, based on GLVQ as proposed by Sato and Yamada.
- The Euclidean metric is replaced by an adaptive metric which involves relevance terms for the input dimensions. The relevance terms are automatically optimized during training.
- Neighborhood cooperation is included so that the initialization is less critical. For this purpose, the neighborhood dynamics of Neural Gas as proposed by Martinetz et al. are incorporated into the training. All three ingredients are sketched in the code example below.
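The following Python sketch shows how these three ingredients can be combined in a single stochastic update: a relevance-weighted distance, the GLVQ-style cost term (d+ - d-)/(d+ + d-), and neural-gas ranking over the prototypes of the correct class. It is a minimal illustration under assumed names, learning rates, and a simple normalization of the relevance terms; it is not the authors' reference implementation (see the download above for that), and the sigmoidal transfer function of the original GLVQ cost is omitted for brevity.

import numpy as np

def srng_step(x, y, W, c, lam, lr_w=0.05, lr_lam=0.005, gamma=1.0):
    # One stochastic-gradient step of an SRNG-style update (illustrative sketch).
    # W: (m, d) prototype matrix, c: (m,) prototype labels,
    # lam: (d,) non-negative relevance weights summing to 1.
    # Assumes at least one prototype of the correct and one of a wrong class.

    # adaptive, relevance-weighted squared distances
    d = np.sum(lam * (W - x) ** 2, axis=1)

    correct = np.where(c == y)[0]
    wrong = np.where(c != y)[0]
    j_minus = wrong[np.argmin(d[wrong])]       # closest prototype with a wrong label
    d_minus = d[j_minus]

    # neural-gas ranks of the correct prototypes (0 = closest)
    order = correct[np.argsort(d[correct])]
    grad_lam = np.zeros_like(lam)

    for rank, j in enumerate(order):
        h = np.exp(-rank / gamma)              # neighborhood cooperation
        d_plus = d[j]
        denom = (d_plus + d_minus) ** 2
        # derivatives of the GLVQ-style term mu = (d_plus - d_minus) / (d_plus + d_minus)
        xi_plus = 2.0 * d_minus / denom        # strength of attraction
        xi_minus = 2.0 * d_plus / denom        # strength of repulsion
        diff_plus = x - W[j]
        diff_minus = x - W[j_minus]
        # gradient of mu with respect to the relevance terms
        grad_lam += h * (xi_plus * diff_plus ** 2 - xi_minus * diff_minus ** 2)
        # move the correct prototype towards x, the closest wrong one away from x
        W[j] += lr_w * h * xi_plus * 2.0 * lam * diff_plus
        W[j_minus] -= lr_w * h * xi_minus * 2.0 * lam * diff_minus

    # gradient step on the relevance terms, then keep them non-negative and normalized
    lam -= lr_lam * grad_lam
    lam = np.clip(lam, 0.0, None)
    lam /= lam.sum()
    return W, lam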
Successful applications include:
- time series prediction,
- satellite images,
- an industrial cooperation on the supervision of piston engines,
- various benchmarks, mostly from the UCI repository.
B. Hammer