Initializers

Define the way to set the initial random weights of Keras layers.


Description

These functions are used to set the initial weights and biases in a Keras model.

Usage

Zeros()

Ones()

Constant(value = 0)

RandomNormal(mean = 0, stddev = 0.05, seed = NULL)

RandomUniform(minval = -0.05, maxval = 0.05, seed = NULL)

TruncatedNormal(mean = 0, stddev = 0.05, seed = NULL)

VarianceScaling(scale = 1, mode = "fan_in", distribution = "normal",
  seed = NULL)

Orthogonal(gain = 1, seed = NULL)

Identity(gain = 1)

lecun_uniform(seed = NULL)

glorot_normal(seed = NULL)

glorot_uniform(seed = NULL)

he_normal(seed = NULL)

he_uniform(seed = NULL)
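
Each call above returns an initializer object that is passed to a layer through its kernel_initializer or bias_initializer argument. A minimal sketch (illustration only, assuming kerasR is loaded and keras_available() returns TRUE):

# Sketch: build initializer objects and attach them to a single layer
init_w <- RandomNormal(mean = 0, stddev = 0.05, seed = 42)
init_b <- Zeros()
layer <- Dense(units = 16, kernel_initializer = init_w,
               bias_initializer = init_b, input_shape = 10)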

Arguments

value

Constant value at which to start all weights.

mean

Mean of the normal distribution to sample from.

stddev

Standard deviation of the normal distribution to sample from.

seed

Integer. Used to seed the random generator.

minval

Lower bound of the range of random values to generate.

maxval

Upper bound of the range of random values to generate.

scale

Scaling factor (positive float).

mode

One of "fan_in", "fan_out", "fan_avg".

distribution

Distribution to use. One of "normal" or "uniform".

gain

Multiplicative factor applied to the orthogonal or identity matrix.
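
For instance (an illustrative sketch, not part of the package reference), the scale, mode, and distribution arguments of VarianceScaling() combine to control how strongly weights are scaled relative to the layer's fan-in and fan-out:

# Sketch: variance scaling based on the average of fan_in and fan_out,
# drawing from a uniform rather than a normal distribution
init <- VarianceScaling(scale = 2, mode = "fan_avg",
                        distribution = "uniform", seed = 1)
layer <- Dense(units = 32, kernel_initializer = init, input_shape = 10)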

Author(s)

Taylor B. Arnold, taylor.arnold@acm.org

Examples

if (keras_available()) {
  # Simulated training data: 100 observations, 10 features, 3 classes
  X_train <- matrix(rnorm(100 * 10), nrow = 100)
  Y_train <- to_categorical(matrix(sample(0:2, 100, TRUE), ncol = 1), 3)

  # Each dense layer below demonstrates a different pair of kernel and
  # bias initializers
  mod <- Sequential()
  mod$add(Dense(units = 50, input_shape = dim(X_train)[2]))
  mod$add(Activation("relu"))
  mod$add(Dense(units = 3, kernel_initializer = Zeros(),
                bias_initializer = Ones()))
  mod$add(Dense(units = 3, kernel_initializer = Constant(),
                bias_initializer = RandomNormal()))
  mod$add(Dense(units = 3, kernel_initializer = RandomUniform(),
                bias_initializer = TruncatedNormal()))
  mod$add(Dense(units = 3, kernel_initializer = Orthogonal(),
                bias_initializer = VarianceScaling()))
  mod$add(Dense(units = 3, kernel_initializer = Identity(),
                bias_initializer = lecun_uniform()))
  mod$add(Dense(units = 3, kernel_initializer = glorot_normal(),
                bias_initializer = glorot_uniform()))
  mod$add(Dense(units = 3, kernel_initializer = he_normal(),
                bias_initializer = he_uniform()))
  mod$add(Activation("softmax"))

  # Compile and fit the model
  keras_compile(mod, loss = 'categorical_crossentropy', optimizer = RMSprop())
  keras_fit(mod, X_train, Y_train, batch_size = 32, epochs = 5, verbose = 0)
}
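
As a general note (standard background on these schemes, not specific to kerasR): the Glorot/Xavier initializers are commonly paired with tanh or sigmoid activations, while the He initializers were designed with relu-family activations in mind. A short sketch in the same style as the example above:

if (keras_available()) {
  # He initialization for the relu hidden layer, Glorot for the output layer
  mod2 <- Sequential()
  mod2$add(Dense(units = 50, kernel_initializer = he_normal(seed = 1),
                 input_shape = 10))
  mod2$add(Activation("relu"))
  mod2$add(Dense(units = 3, kernel_initializer = glorot_uniform(seed = 1)))
  mod2$add(Activation("softmax"))
}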
