tf.contrib.layers.variance_scaling_initializer(factor=2.0, mode='FAN_IN', uniform=False, seed=None, dtype=tf.float32)
See the guide: Layers (contrib) > Initializers
Returns an initializer that generates tensors without scaling variance.
When initializing a deep network, it is in principle advantageous to keep the scale of the input variance constant, so that it neither explodes nor diminishes by the time it reaches the final layer. This initializer uses the following formula:
  if mode == 'FAN_IN':     # Count only number of input connections.
    n = fan_in
  elif mode == 'FAN_OUT':  # Count only number of output connections.
    n = fan_out
  elif mode == 'FAN_AVG':  # Average of input and output connections.
    n = (fan_in + fan_out) / 2.0

  truncated_normal(shape, 0.0, stddev=sqrt(factor / n))
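As a concrete illustration (plain Python, with an arbitrarily chosen 256-by-128 weight matrix), the resulting standard deviations work out as follows:

  import math

  fan_in, fan_out = 256, 128
  # Default: factor=2.0, mode='FAN_IN'  ->  sqrt(2.0 / 256) ~= 0.088
  print(math.sqrt(2.0 / fan_in))
  # factor=1.0, mode='FAN_AVG'          ->  sqrt(1.0 / 192) ~= 0.072
  print(math.sqrt(1.0 / ((fan_in + fan_out) / 2.0)))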
* To get Delving Deep into Rectifiers (also known as the "MSRA initialization"), use (the default): factor=2.0 mode='FAN_IN' uniform=False
* To get Convolutional Architecture for Fast Feature Embedding, use: factor=1.0 mode='FAN_IN' uniform=True
* To get Understanding the difficulty of training deep feedforward neural networks, use: factor=1.0 mode='FAN_AVG' uniform=True.
* To get xavier_initializer, use either: factor=1.0 mode='FAN_AVG' uniform=True, or factor=1.0 mode='FAN_AVG' uniform=False.
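For example, a brief usage sketch (assuming TensorFlow 1.x with tf.contrib available) that constructs the default MSRA-style initializer and an Xavier-equivalent one, and attaches one of them to a fully connected layer:

  import tensorflow as tf

  # Default arguments reproduce the MSRA/He initialization.
  he_init = tf.contrib.layers.variance_scaling_initializer()

  # factor=1.0, mode='FAN_AVG', uniform=True matches xavier_initializer.
  xavier_like_init = tf.contrib.layers.variance_scaling_initializer(
      factor=1.0, mode='FAN_AVG', uniform=True)

  x = tf.placeholder(tf.float32, shape=[None, 256])
  h = tf.contrib.layers.fully_connected(
      x, num_outputs=128, weights_initializer=he_init)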
Args:
* factor: Float. A multiplicative factor.
* mode: String. 'FAN_IN', 'FAN_OUT', 'FAN_AVG'.
* uniform: Whether to use uniform or normal distributed random initialization.
* seed: A Python integer. Used to create random seeds. See set_random_seed for behavior.
* dtype: The data type. Only floating point types are supported.

Returns:
An initializer that generates tensors with unit variance.
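The returned initializer can be passed anywhere a variable initializer is accepted, e.g. tf.get_variable. A minimal sketch, assuming TensorFlow 1.x:

  import numpy as np
  import tensorflow as tf

  init = tf.contrib.layers.variance_scaling_initializer(
      factor=2.0, mode='FAN_IN', uniform=False, seed=1, dtype=tf.float32)

  # get_variable calls the initializer with the variable's shape and dtype.
  w = tf.get_variable('w', shape=[256, 128], initializer=init)

  with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Empirical standard deviation should be roughly sqrt(2.0 / 256) ~= 0.09
    # (up to truncation effects of the truncated normal).
    print(np.std(sess.run(w)))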
Raises:
* ValueError: if dtype is not a floating point type.
* TypeError: if mode is not in ['FAN_IN', 'FAN_OUT', 'FAN_AVG'].

Defined in tensorflow/contrib/layers/python/layers/initializers.py.
© 2017 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/contrib/layers/variance_scaling_initializer