tf.contrib.losses.softmax_cross_entropy(*args, **kwargs)
See the guide: Losses (contrib) > Loss operations for use in neural networks.
Creates a cross-entropy loss using tf.nn.softmax_cross_entropy_with_logits. (deprecated)
THIS FUNCTION IS DEPRECATED. It will be removed after 2016-12-30. Instructions for updating: Use tf.losses.softmax_cross_entropy instead.
weights acts as a coefficient for the loss. If a scalar is provided, then the loss is simply scaled by the given value. If weights is a tensor of size [batch_size], then the loss weights apply to each corresponding sample.
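For illustration, here is a minimal sketch of both weights forms. It assumes TensorFlow 1.x (where tf.contrib and tf.Session are available); the tensor values are made up for the example:

```python
import tensorflow as tf

logits = tf.constant([[2.0, 0.5], [0.3, 1.8]])          # [batch_size=2, num_classes=2]
onehot_labels = tf.constant([[1.0, 0.0], [0.0, 1.0]])

# Scalar weights: the mean loss is simply scaled by 0.5.
loss_scalar = tf.contrib.losses.softmax_cross_entropy(
    logits, onehot_labels, weights=0.5)

# Per-sample weights of shape [batch_size]: the second sample counts twice.
loss_per_sample = tf.contrib.losses.softmax_cross_entropy(
    logits, onehot_labels, weights=tf.constant([1.0, 2.0]))

with tf.Session() as sess:
    print(sess.run([loss_scalar, loss_per_sample]))
```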
If label_smoothing is nonzero, smooth the labels towards 1/num_classes: new_onehot_labels = onehot_labels * (1 - label_smoothing) + label_smoothing / num_classes
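For example, with num_classes = 4 and label_smoothing = 0.1, the target 1 becomes 0.925 and each 0 becomes 0.025. A plain NumPy sketch of the formula:

```python
import numpy as np

label_smoothing = 0.1
num_classes = 4
onehot_labels = np.array([[0.0, 1.0, 0.0, 0.0]])

new_onehot_labels = (onehot_labels * (1 - label_smoothing)
                     + label_smoothing / num_classes)
print(new_onehot_labels)  # [[0.025 0.925 0.025 0.025]] -- each row still sums to 1
```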
Args:
  logits: [batch_size, num_classes] logits outputs of the network.
  onehot_labels: [batch_size, num_classes] one-hot-encoded labels.
  weights: Coefficients for the loss. The tensor must be a scalar or a tensor of shape [batch_size].
  label_smoothing: If greater than 0 then smooth the labels.
  scope: The scope for the operations performed in computing the loss.

Returns:
  A scalar Tensor representing the mean loss value.
Raises:
  ValueError: If the shape of logits doesn't match that of onehot_labels, or if the shape of weights is invalid, or if weights is None.

Defined in tensorflow/python/util/deprecation.py.
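As a migration sketch (an assumption based on the TF 1.x API, not stated on this page): the replacement tf.losses.softmax_cross_entropy takes onehot_labels before logits, the reverse of the contrib argument order, so the arguments must be swapped when updating:

```python
import tensorflow as tf  # assumes TensorFlow 1.x

logits = tf.constant([[2.0, 0.5], [0.3, 1.8]])
onehot_labels = tf.constant([[1.0, 0.0], [0.0, 1.0]])

# Deprecated contrib op: logits first.
old_loss = tf.contrib.losses.softmax_cross_entropy(
    logits, onehot_labels, label_smoothing=0.1)

# Replacement op: onehot_labels first (note the swapped argument order).
new_loss = tf.losses.softmax_cross_entropy(
    onehot_labels, logits, label_smoothing=0.1)

with tf.Session() as sess:
    # With default weights, both reduce to a mean, so the values should match.
    print(sess.run([old_loss, new_loss]))
```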