tf.contrib.seq2seq.prepare_attention(attention_states, attention_option, num_units, reuse=False)
Prepare keys/values/functions for attention.
Args:

attention_states
: hidden states to attend over.

attention_option
: how to compute attention, either "luong" or "bahdanau".

num_units
: hidden state dimension.

reuse
: whether to reuse variable scope.

Returns:

attention_keys
: to be compared with target states.

attention_values
: to be used to construct context vectors.

attention_score_fn
: to compute similarity between key and target states.

attention_construct_fn
: to build attention states.

Defined in tensorflow/contrib/seq2seq/python/ops/attention_decoder_fn.py.
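A minimal usage sketch, assuming a TF 1.x environment where `tf.contrib.seq2seq` is still available; the batch size, sequence length, and hidden size below are illustrative placeholders, and the returned keys, values, and functions would typically be passed on to an attention decoder function from the same module:

```python
import tensorflow as tf

batch_size, max_time, num_units = 32, 20, 128

# Encoder outputs to attend over: [batch_size, max_time, num_units].
attention_states = tf.placeholder(
    tf.float32, shape=[batch_size, max_time, num_units])

# Prepare attention keys/values and the score/construct functions
# once, before building the decoder.
(attention_keys,
 attention_values,
 attention_score_fn,
 attention_construct_fn) = tf.contrib.seq2seq.prepare_attention(
    attention_states=attention_states,
    attention_option="bahdanau",  # additive attention; "luong" for multiplicative
    num_units=num_units)
```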
https://www.tensorflow.org/api_docs/python/tf/contrib/seq2seq/prepare_attention