tf.contrib.layers.layer_norm(*args, **kwargs)

See the guide: Layers (contrib) > Higher level ops for building neural network layers

Adds a Layer Normalization layer, as described in "Layer Normalization" by Jimmy Lei Ba, Jamie Ryan Kiros, and Geoffrey E. Hinton (https://arxiv.org/abs/1607.06450).
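For reference, the transformation the layer applies to each example can be written as follows, where the mean and variance are taken over the H normalized units (all but the first dimension of the input), gamma and beta are the learned scale and center parameters, and epsilon is a small constant for numerical stability (its exact value is not stated in this doc):

    \[
    \mu = \frac{1}{H}\sum_{i=1}^{H} x_i, \qquad
    \sigma^2 = \frac{1}{H}\sum_{i=1}^{H} (x_i - \mu)^2, \qquad
    y_i = \gamma \, \frac{x_i - \mu}{\sqrt{\sigma^2 + \epsilon}} + \beta
    \]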

Can be used as a normalizer function for conv2d and fully_connected.
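A minimal sketch of that usage, assuming a TensorFlow 1.x environment where tf.contrib is available (the input shape and layer size are illustrative only):

    import tensorflow as tf

    # Illustrative input: a batch of 64-dimensional feature vectors.
    inputs = tf.placeholder(tf.float32, shape=[None, 64])

    # layer_norm is applied to the pre-activation output of the fully
    # connected layer, before its activation function runs.
    net = tf.contrib.layers.fully_connected(
        inputs,
        num_outputs=128,
        normalizer_fn=tf.contrib.layers.layer_norm)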

Args:

  • inputs: A tensor with 2 or more dimensions. The normalization occurs over all but the first dimension.
  • center: If True, add the offset beta to the normalized tensor. If False, beta is ignored.
  • scale: If True, multiply by gamma. If False, gamma is not used. When the next layer is linear (this also holds for e.g. nn.relu), scaling can be disabled, since it can be done by the next layer.
  • activation_fn: Activation function. The default of None skips it, keeping a linear activation.
  • reuse: Whether the layer and its variables should be reused. To reuse the layer, scope must be given (see the sketch after this list).
  • variables_collections: Optional collections for the variables.
  • outputs_collections: Collections to add the outputs to.
  • trainable: If True, also add variables to the graph collection GraphKeys.TRAINABLE_VARIABLES (see tf.Variable).
  • scope: Optional scope for variable_scope.
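A sketch of calling the layer directly, showing how scope and reuse interact (TensorFlow 1.x assumed; names and shapes are illustrative):

    import tensorflow as tf

    x = tf.placeholder(tf.float32, shape=[None, 32])

    # First call: creates the beta (center) and gamma (scale) variables
    # under the scope 'ln'.
    y = tf.contrib.layers.layer_norm(x, center=True, scale=True, scope='ln')

    # Second call: shares the same beta and gamma. This requires passing
    # reuse=True together with the same scope.
    y_shared = tf.contrib.layers.layer_norm(
        x, center=True, scale=True, scope='ln', reuse=True)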

Returns:

A Tensor representing the output of the operation.

Raises:

  • ValueError: If the rank or the last dimension of inputs is undefined.
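For instance, an input whose last dimension is not statically known triggers this error, since the layer cannot size beta and gamma (a sketch, assuming TensorFlow 1.x):

    import tensorflow as tf

    # The last dimension is None, so the parameter shapes are undefined.
    x = tf.placeholder(tf.float32, shape=[None, None])
    tf.contrib.layers.layer_norm(x)  # raises ValueError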

Defined in tensorflow/contrib/framework/python/ops/arg_scope.py.

© 2017 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/contrib/layers/layer_norm