tf.nn.crelu(features, name=None)
See the guide: Neural Network > Activation Functions
Computes Concatenated ReLU.
Concatenates a ReLU which selects only the positive part of the activation with a ReLU which selects only the negative part of the activation. Note that as a result this non-linearity doubles the depth of the activations. Source: https://arxiv.org/abs/1603.05201
Args:
  features: A Tensor with type float, double, int32, int64, uint8, int16, or int8.
  name: A name for the operation (optional).

Returns:
  A Tensor with the same type as features.
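As described above, CReLU concatenates the positive part relu(x) with the positive part of the negation, relu(-x), doubling the size of the concatenation dimension. The following is a minimal NumPy sketch of that computation (the helper name `crelu` and the `axis` parameter are illustrative, not part of the TensorFlow API shown here):

```python
import numpy as np

def crelu(features, axis=-1):
    # Concatenated ReLU: stack relu(x) and relu(-x) along `axis`,
    # doubling the size of that dimension.
    return np.concatenate(
        [np.maximum(features, 0.0), np.maximum(-features, 0.0)],
        axis=axis,
    )

x = np.array([[-1.0, 2.0],
              [ 3.0, -4.0]])
y = crelu(x)
# y has shape (2, 4): the positive parts of x followed by the
# positive parts of -x.
# y == [[0., 2., 1., 0.],
#       [3., 0., 0., 4.]]
```

Because no information is discarded (both the positive and negative halves of the activation survive), a layer followed by CReLU produces twice as many output channels as the same layer followed by an ordinary ReLU.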
Defined in tensorflow/python/ops/nn_ops.py.
© 2017 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/nn/crelu