tf.contrib.seq2seq.sequence_loss(logits, targets, weights, average_across_timesteps=True, average_across_batch=True, softmax_loss_function=None, name=None)
Weighted cross-entropy loss for a sequence of logits (per example).
logits
: A 3D Tensor of shape [batch_size x sequence_length x num_decoder_symbols] and dtype float. The logits correspond to the prediction across all classes at each timestep.
targets
: A 2D Tensor of shape [batch_size x sequence_length] and dtype int. The target represents the true class at each timestep.
weights
: A 2D Tensor of shape [batch_size x sequence_length] and dtype float. weights constitutes the weighting of each prediction in the sequence. When using weights as a mask, set all valid timesteps to 1 and all padded timesteps to 0 (see the usage sketch below).
average_across_timesteps
: If set, sum the cost across the sequence dimension and divide the cost by the total label weight across timesteps.
average_across_batch
: If set, sum the cost across the batch dimension and divide the returned cost by the batch size.
softmax_loss_function
: Function (inputs-batch, labels-batch) -> loss-batch to be used instead of the standard softmax (the default if this is None). A hedged example appears after the usage sketch below.
name
: Optional name for this operation; defaults to "sequence_loss".
Returns
: A scalar float Tensor: the average log-perplexity per symbol (weighted).
ValueError
: logits does not have 3 dimensions, or targets does not have 2 dimensions, or weights does not have 2 dimensions.
Defined in tensorflow/contrib/seq2seq/python/ops/loss.py.
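A minimal usage sketch under TensorFlow 1.x, assuming tf.contrib is available; the batch size, sequence lengths, and vocabulary size here are illustrative assumptions. It demonstrates the weights-as-mask pattern from the weights argument above, built with tf.sequence_mask, and checks the documented averaging behavior against a hand-rolled weighted mean.

```python
import numpy as np
import tensorflow as tf

batch_size, seq_len, vocab_size = 2, 5, 7  # illustrative sizes

# Stand-ins for decoder outputs and gold labels.
logits = tf.random_normal([batch_size, seq_len, vocab_size])
targets = tf.constant(
    np.random.randint(vocab_size, size=(batch_size, seq_len)), dtype=tf.int32)

# weights as a mask: 1.0 on valid timesteps, 0.0 on padding.
# Here the second example is padded after 3 steps.
lengths = tf.constant([5, 3], dtype=tf.int32)
weights = tf.sequence_mask(lengths, maxlen=seq_len, dtype=tf.float32)

loss = tf.contrib.seq2seq.sequence_loss(
    logits=logits,
    targets=targets,
    weights=weights,
    average_across_timesteps=True,
    average_across_batch=True)

# With both averaging flags set, the result should match a hand-rolled
# weighted average of the per-timestep cross-entropy.
crossent = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=targets, logits=logits)                 # [batch_size, seq_len]
manual = tf.reduce_sum(crossent * weights) / tf.reduce_sum(weights)

with tf.Session() as sess:
    print(sess.run([loss, manual]))  # two (nearly) identical scalars
```

Exponentiating the returned loss gives the per-symbol perplexity, which is why the return value is described as an average log-perplexity.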
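The softmax_loss_function hook can swap in any per-example loss. Below is a hedged sketch of a label-smoothed replacement; smoothed_loss and the smoothing value are illustrative assumptions, not part of the API. Depending on the TensorFlow version the hook may be called positionally or by keyword, so the parameter names logits/labels are chosen to satisfy both the documented (inputs-batch, labels-batch) order and keyword invocation.

```python
def smoothed_loss(logits, labels, smoothing=0.1):
    # ASSUMPTION: illustrative label smoothing, not the library default.
    # Spread `smoothing` probability mass uniformly over the other classes.
    vocab = tf.shape(logits)[-1]
    off_value = smoothing / tf.cast(vocab - 1, tf.float32)
    soft_targets = tf.one_hot(labels, depth=vocab,
                              on_value=1.0 - smoothing, off_value=off_value)
    return tf.nn.softmax_cross_entropy_with_logits(
        labels=soft_targets, logits=logits)

loss = tf.contrib.seq2seq.sequence_loss(
    logits, targets, weights, softmax_loss_function=smoothed_loss)
```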