
tf.losses.log_loss(labels, predictions, weights=1.0, epsilon=1e-07, scope=None, loss_collection=tf.GraphKeys.LOSSES)

Adds a Log Loss term to the training procedure.

weights acts as a coefficient for the loss. If a scalar is provided, then the loss is simply scaled by the given value. If weights is a tensor of size [batch_size], then the total loss for each sample of the batch is rescaled by the corresponding element in the weights vector. If the shape of weights matches the shape of predictions, then the loss of each measurable element of predictions is scaled by the corresponding value of weights.
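
For example (a minimal sketch, assuming a TensorFlow 1.x environment where tf.losses and tf.Session are available; the tensor values are made up for illustration):

  import tensorflow as tf

  labels = tf.constant([[1.0, 0.0], [0.0, 1.0]])
  predictions = tf.constant([[0.9, 0.2], [0.3, 0.8]])

  # Scalar weight: the whole loss is simply scaled by 0.5.
  scaled = tf.losses.log_loss(labels, predictions, weights=0.5)

  # Weights of shape [batch_size, 1]: each sample's loss is rescaled by the
  # corresponding row before the reduction to a scalar.
  per_sample = tf.losses.log_loss(
      labels, predictions, weights=tf.constant([[1.0], [2.0]]))

  with tf.Session() as sess:
      print(sess.run([scaled, per_sample]))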

Args:

  • labels: The ground truth output tensor, same dimensions as 'predictions'.
  • predictions: The predicted outputs.
  • weights: Optional Tensor whose rank is either 0, or the same rank as labels, and must be broadcastable to labels (i.e., all dimensions must be either 1, or the same as the corresponding losses dimension).
  • epsilon: A small increment to add to avoid taking a log of zero.
  • scope: The scope for the operations performed in computing the loss.
  • loss_collection: Collection to which the loss will be added.

Returns:

A scalar Tensor representing the loss value.

Raises:

  • ValueError: If the shape of predictions doesn't match that of labels or if the shape of weights is invalid.
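
By default the returned scalar is also added to the tf.GraphKeys.LOSSES collection, so it is picked up by tf.losses.get_total_loss. A minimal sketch of that behaviour, again assuming TensorFlow 1.x and illustrative tensor values:

  import tensorflow as tf

  labels = tf.constant([1.0, 0.0, 1.0])
  predictions = tf.constant([0.8, 0.1, 0.6])

  # Returns a scalar Tensor and registers it in tf.GraphKeys.LOSSES.
  loss = tf.losses.log_loss(labels, predictions)

  # get_total_loss sums everything in the LOSSES collection; here it equals
  # the single log loss term above.
  total = tf.losses.get_total_loss(add_regularization_losses=False)

  with tf.Session() as sess:
      print(sess.run([loss, total]))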

Defined in tensorflow/python/ops/losses/losses_impl.py.

© 2017 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/losses/log_loss