Layer wrappers
Apply a layer to every temporal slice of an input (TimeDistributed), or wrap a recurrent layer so that it processes its input sequence in both the forward and backward directions (Bidirectional). A minimal TimeDistributed sketch follows the arguments below.

Usage

TimeDistributed(layer)

Bidirectional(layer, merge_mode = "concat")
Arguments

layer
    a layer instance (must be a recurrent layer for the bidirectional case)

merge_mode
    mode by which the outputs of the forward and backward RNNs are combined. One of "sum", "mul", "concat", "ave", or NULL. If NULL, the outputs are not combined; they are returned as a list.
Author

Taylor B. Arnold, taylor.arnold@acm.org

References

Chollet, François. 2015. Keras: Deep Learning library for Theano and TensorFlow.
See Also

Other layers: Activation, ActivityRegularization, AdvancedActivation, BatchNormalization, Conv, Dense, Dropout, Embedding, Flatten, GaussianNoise, LocallyConnected, Masking, MaxPooling, Permute, RNN, RepeatVector, Reshape, Sequential
Examples

if (keras_available()) {
  # 100 training sequences, each 100 integer tokens from a vocabulary of 20
  X_train <- matrix(sample(0:19, 100 * 100, TRUE), ncol = 100)
  Y_train <- rnorm(100)

  mod <- Sequential()
  mod$add(Embedding(input_dim = 20, output_dim = 10, input_length = 100))
  mod$add(Dropout(0.5))
  # read the sequence in both directions; with the default
  # merge_mode = "concat" the wrapped output has 2 * 16 = 32 units
  mod$add(Bidirectional(LSTM(16)))
  mod$add(Dense(1))
  mod$add(Activation("sigmoid"))

  keras_compile(mod, loss = "mse", optimizer = RMSprop())
  keras_fit(mod, X_train, Y_train, epochs = 3, verbose = 0)
}
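To use a non-default merge mode, pass merge_mode explicitly. As a hedged variant of the model above (same assumptions as the example), summing the two directions adds the forward and backward outputs elementwise, so the wrapped layer keeps 16 units instead of 32:

# replaces the Bidirectional line in the example above
mod$add(Bidirectional(LSTM(16), merge_mode = "sum"))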