dlvq

Create and train a dlvq network


Description

Dynamic learning vector quantization (DLVQ) networks are similar to self-organizing maps (SOM, see som), but they perform supervised learning and lack a neighborhood relationship between the prototypes.

Usage

dlvq(x, ...)

## Default S3 method:
dlvq(x, y, initFunc = "DLVQ_Weights",
  initFuncParams = c(1, -1), learnFunc = "Dynamic_LVQ",
  learnFuncParams = c(0.03, 0.03, 10), updateFunc = "Dynamic_LVQ",
  updateFuncParams = c(0), shufflePatterns = TRUE, ...)

Arguments

x

a matrix with training inputs for the network

...

additional function parameters (currently not used)

y

the corresponding target values

initFunc

the initialization function to use

initFuncParams

the parameters for the initialization function

learnFunc

the learning function to use

learnFuncParams

the parameters for the learning function

updateFunc

the update function to use

updateFuncParams

the parameters for the update function

shufflePatterns

should the patterns be shuffled?

Details

The input data has to be normalized in order to use DLVQ.
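For example, one common normalization is to scale every input pattern to unit Euclidean length. The following is a minimal sketch of that scaling, not the package's own preprocessing; the function name normalizeToUnitLength is hypothetical, and whether unit-length scaling is the exact normalization required is an assumption to verify against the SNNS manual.

## Minimal sketch: scale each input pattern (row) to unit Euclidean length.
## `normalizeToUnitLength` is a hypothetical helper, not part of RSNNS.
normalizeToUnitLength <- function(x) {
  lens <- sqrt(rowSums(x^2))
  lens[lens == 0] <- 1  # leave all-zero rows unchanged instead of dividing by zero
  x / lens
}

normalizedInputs <- normalizeToUnitLength(as.matrix(inputs))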

Learning in DLVQ: For each class, a mean vector (prototype) is calculated and stored in a (newly generated) hidden unit. Then, the net is used to classify every pattern by using the nearest prototype. If a pattern gets misclassified as class y instead of class x, the prototype of class y is moved away from the pattern, and the prototype of class x is moved towards the pattern. This procedure is repeated iteratively until no more changes in classification take place. Then, new prototypes are introduced in the net per class as new hidden units, and initialized by the mean vector of misclassified patterns in that class.
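To illustrate just the update step described above, the sketch below moves the wrongly winning prototype away from the pattern and the correct class's prototype towards it. This is a schematic illustration, not the SNNS implementation; the names dlvqUpdateStep, protoCorrect, protoWrong, and lr are all hypothetical.

## Hypothetical sketch of one DLVQ update for a misclassified pattern.
## protoWrong is the nearest (wrongly winning) prototype, protoCorrect
## the prototype of the pattern's true class, lr a learning rate.
dlvqUpdateStep <- function(pattern, protoCorrect, protoWrong, lr) {
  list(
    protoCorrect = protoCorrect + lr * (pattern - protoCorrect),  # pull towards pattern
    protoWrong   = protoWrong   - lr * (pattern - protoWrong)     # push away from pattern
  )
}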

Network architecture: The network only has one hidden layer, containing one unit for each prototype. The prototypes/hidden units are also called codebook vectors. Because SNNS generates the units automatically and does not require their number to be specified in advance, the procedure is called dynamic LVQ in SNNS.

The default initialization, learning, and update functions are the only ones suitable for this kind of network. The three parameters of the learning function specify two learning rates (for the correctly and incorrectly classified cases) and the number of cycles the net is trained before mean vectors are calculated.
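For example, a call with smaller learning rates and more training cycles per step could look as follows. The parameter values are chosen purely for illustration; x and y stand for the training inputs and targets, as in the Usage section above.

## illustration only: two learning rates, then the number of cycles
model <- dlvq(x, y, learnFuncParams = c(0.01, 0.01, 20))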

A detailed description of the theory and the parameters is available in the SNNS documentation and the other referenced literature.

Value

an rsnns object. The fitted.values member contains the activation patterns for all inputs.
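Both of the following are equivalent ways to retrieve those activation patterns from a fitted model:

fitted(model)        # standard S3 accessor
model$fitted.values  # direct access to the member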

References

Kohonen, T. (1988), Self-organization and associative memory, Vol. 8, Springer-Verlag.

Zell, A. et al. (1998), 'SNNS Stuttgart Neural Network Simulator User Manual, Version 4.2', IPVR, University of Stuttgart and WSI, University of Tübingen. http://www.ra.cs.uni-tuebingen.de/SNNS/welcome.html

Zell, A. (1994), Simulation Neuronaler Netze, Addison-Wesley. (in German)

Examples

## Not run: demo(dlvq_ziff)
## Not run: demo(dlvq_ziffSnnsR)


library(RSNNS)

data(snnsData)
dataset <- snnsData$dlvq_ziff_100.pat

inputs <- dataset[, inputColumns(dataset)]
outputs <- dataset[, outputColumns(dataset)]

model <- dlvq(inputs, outputs)

## element-wise agreement between the fitted activations and the targets
fitted(model) == outputs

## average deviation of the fitted activations from the targets
mean(fitted(model) - outputs)

