tf.nn.elu(features, name=None)

See the guide: Neural Network > Activation Functions
Computes the exponential linear unit: exp(features) - 1 if features < 0, features otherwise.
See Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs) (Clevert et al., 2015, arXiv:1511.07289).
Args:
  features: A Tensor. Must be one of the following types: float32, float64, int32, int64, uint8, int16, int8, uint16, half.
  name: A name for the operation (optional).

Returns:
  A Tensor. Has the same type as features.
Defined in tensorflow/python/ops/gen_nn_ops.py.
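Below is a minimal usage sketch, not part of the original page; it assumes the TF 1.x graph/Session API from the same era as this documentation, and the printed values are approximate.

```python
import tensorflow as tf

# Apply ELU element-wise: negative inputs map to exp(x) - 1
# (saturating toward -1), non-negative inputs pass through unchanged.
x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])
y = tf.nn.elu(x)

with tf.Session() as sess:  # TF 1.x-style execution, matching this doc's era
    print(sess.run(y))      # approx. [-0.865, -0.393, 0.0, 0.5, 2.0]
```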
https://www.tensorflow.org/api_docs/python/tf/nn/elu