BATCHgd.MLPnet

Batch gradient descent training


Description

Updates the weights and biases of the neural network by batch gradient descent on the supplied training set.

Usage

BATCHgd.MLPnet(net, P, T, n.epochs, n.threads = 0L)

Arguments

net

The neural network to train.

P

Input data set.

T

Target output data set.

n.epochs

Number of training epochs.

n.threads

Number of threads to spawn. If less than 1, NumberProcessors - 1 threads are spawned, one fewer than the number of available processors. If OpenMP support is not available, this argument is ignored.

Value

A neural network object whose weights and biases have been updated according to the training data.

Author(s)

Manuel Castejón Limas. manuel.castejon@gmail.com
Joaquin Ordieres Meré. j.ordieres@upm.es
Ana González Marcos. ana.gonzalez@unirioja.es
Alpha V. Pernía Espinoza. alpha.pernia@unirioja.es
Francisco Javier Martinez de Pisón. fjmartin@unirioja.es
Fernando Alba Elías. fernando.alba@unavarra.es

References

Simon Haykin. Neural Networks: A Comprehensive Foundation. Prentice Hall, New Jersey, 2nd edition, 1999. ISBN 0-13-273350-1.

See Also


AMORE: Artificial Neural Network Training and Simulating.
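
Examples

A minimal sketch of one possible workflow, not taken from the package documentation: the toy data, layer sizes, learning rate, and epoch count are illustrative assumptions. The network is built with newff() using method = "BATCHgd" and then passed directly to BATCHgd.MLPnet(); the higher-level train() function is typically the more common entry point.

library(AMORE)

# Toy problem: approximate y = x^2 on [-1, 1].
P      <- matrix(seq(-1, 1, length.out = 500), ncol = 1)  # one input column
target <- P^2

# Multilayer perceptron with one hidden layer of 5 tansig neurons,
# configured for batch gradient descent ("BATCHgd").
net <- newff(n.neurons            = c(1, 5, 1),
             learning.rate.global = 1e-2,
             momentum.global      = 0.5,       # only used by the *gdwm methods
             error.criterium      = "LMS",
             hidden.layer         = "tansig",
             output.layer         = "purelin",
             method               = "BATCHgd")

# Train with batch gradient descent for 1000 epochs.
# n.threads = 0L requests NumberProcessors - 1 threads when OpenMP is available.
net <- BATCHgd.MLPnet(net, P, target, n.epochs = 1000, n.threads = 0L)

# Simulate the trained network and compare its output with the targets.
y <- sim(net, P)
plot(P, target, col = "red", pch = "x")
points(P, y, col = "blue", pch = "+")

For training with a momentum term, the analogous BATCHgdwm variant of the method can be selected instead.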

Package: AMORE, version 0.2-16
License: GPL (>= 2)
Authors: Manuel Castejon Limas, Joaquin B. Ordieres Mere, Ana Gonzalez Marcos, Francisco Javier Martinez de Pison Ascacibar, Alpha V. Pernia Espinoza, Fernando Alba Elias, Jose Maria Perez Ramos
Initial release: 2020-02-11
