wmultinomial function
A function to evaluate the weighted multinomial loss function and its derivative, for use when training a neural network. This is equivalent to a multinomial cost function employing a Dirichlet prior on the probabilities. Its effect is to regularise the estimation so that if, a priori, we expect more of one particular category than another, then this expectation can be built into the objective.
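To make the Dirichlet-prior interpretation concrete, here is a minimal Python sketch of one plausible formulation of such a loss/derivative pair. The function name `wmultinomial_sketch`, the division of the prior term by `batchsize`, and the clipping constant are assumptions for illustration; the exact form used in the package may differ.

```python
import numpy as np

def wmultinomial_sketch(w, batchsize):
    """Hypothetical sketch of a weighted multinomial loss and its gradient.

    Under a Dirichlet(w) prior on the class probabilities, the MAP
    objective adds a (w_k - 1) * log p_k term to the usual multinomial
    log-likelihood; here the prior term is spread over the mini-batch
    by dividing by batchsize (an assumption, not the package's code).
    Returns a dict of two functions, mirroring the list of functions
    the R version returns.
    """
    w = np.asarray(w, dtype=float)

    def loss(truth, pred, eps=1e-12):
        # Clip predictions away from zero so log() stays finite.
        p = np.clip(pred, eps, 1.0)
        # Data term: -sum_k y_k log p_k; prior term: -(w_k - 1)/batchsize * log p_k.
        return -np.sum((truth + (w - 1.0) / batchsize) * np.log(p))

    def grad(truth, pred, eps=1e-12):
        # Derivative of the loss above with respect to the predictions p.
        p = np.clip(pred, eps, 1.0)
        return -(truth + (w - 1.0) / batchsize) / p

    return {"loss": loss, "grad": grad}
```

Note that with a uniform weight vector (all ones) the prior term vanishes and the sketch reduces to the ordinary multinomial (categorical cross-entropy) loss.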
wmultinomial(w, batchsize)
w |
a vector of weights, whose length is equal to the output length of the net |
batchsize |
the size of the batch used in inference. WARNING: ensure this matches the actual batch size used! |
a list object whose elements are functions that evaluate the loss and its derivative
Ian Goodfellow, Yoshua Bengio, Aaron Courville, Francis Bach. Deep Learning. (2016)
Terrence J. Sejnowski. The Deep Learning Revolution (The MIT Press). (2018)
Neural Networks YouTube playlist by 3Blue1Brown: https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi
http://neuralnetworksanddeeplearning.com/
netwts <- train(dat=train_set, truth=truth, net=net, eps=0.001, tol=0.95,
                loss=wmultinomial(c(10,5,6,9), batchsize=100)) # here assuming output of length 4