tf.contrib.legacy_seq2seq.sequence_loss_by_example(logits, targets, weights, average_across_timesteps=True, softmax_loss_function=None, name=None)
Weighted cross-entropy loss for a sequence of logits (per example).
Args:

logits
: List of 2D Tensors of shape [batch_size x num_decoder_symbols].

targets
: List of 1D batch-sized int32 Tensors of the same length as logits.

weights
: List of 1D batch-sized float Tensors of the same length as logits.

average_across_timesteps
: If set, divide the returned cost by the total label weight.

softmax_loss_function
: Function (labels-batch, inputs-batch) -> loss-batch to be used instead of the standard softmax (the default if this is None).

name
: Optional name for this operation, default: "sequence_loss_by_example".

Returns:

1D batch-sized float Tensor: The log-perplexity for each sequence.
Raises:

ValueError
: If len(logits) is different from len(targets) or len(weights).

Defined in tensorflow/contrib/legacy_seq2seq/python/ops/seq2seq.py.
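A minimal usage sketch (not part of the original page), assuming TensorFlow 1.x, where tf.contrib.legacy_seq2seq is available (the contrib module was removed in TensorFlow 2.x). The shapes, the helper my_loss, and all variable names are illustrative assumptions.

```python
# A minimal sketch, assuming TensorFlow 1.x; tf.contrib does not exist in 2.x.
import tensorflow as tf

batch_size, num_steps, num_decoder_symbols = 4, 3, 10

# One logits Tensor per timestep, each of shape [batch_size x num_decoder_symbols].
logits = [tf.random_normal([batch_size, num_decoder_symbols])
          for _ in range(num_steps)]
# Matching lists of batch-sized int32 targets and float weights.
targets = [tf.constant([1, 2, 3, 4], dtype=tf.int32)
           for _ in range(num_steps)]
weights = [tf.ones([batch_size]) for _ in range(num_steps)]

# Default softmax loss: returns the per-example log-perplexity, shape [batch_size].
loss = tf.contrib.legacy_seq2seq.sequence_loss_by_example(
    logits, targets, weights, average_across_timesteps=True)

# Hypothetical custom hook following the (labels-batch, inputs-batch) signature
# described above; this one simply reproduces the default behavior.
def my_loss(labels, logits):
    return tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=labels, logits=logits)

loss_custom = tf.contrib.legacy_seq2seq.sequence_loss_by_example(
    logits, targets, weights, softmax_loss_function=my_loss)

with tf.Session() as sess:
    print(sess.run([loss, loss_custom]))  # two [batch_size] arrays
```

Per the description above, with average_across_timesteps=True each example's summed weighted cross-entropy is divided by that example's total label weight; with False the raw weighted sum is returned.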
© 2017 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/contrib/legacy_seq2seq/sequence_loss_by_example