
tf.contrib.opt.ScipyOptimizerInterface

class tf.contrib.opt.ScipyOptimizerInterface

Wrapper allowing scipy.optimize.minimize to operate a tf.Session.

Example:

vector = tf.Variable([7., 7.], name='vector')

# Make vector norm as small as possible.
loss = tf.reduce_sum(tf.square(vector))

optimizer = ScipyOptimizerInterface(loss, options={'maxiter': 100})

with tf.Session() as session:
  # Variables must be initialized before the optimizer can read them.
  session.run(tf.global_variables_initializer())
  optimizer.minimize(session)

# The value of vector should now be [0., 0.].

Example with constraints:

vector = tf.Variable([7., 7.], name='vector')

# Make vector norm as small as possible.
loss = tf.reduce_sum(tf.square(vector))
# Ensure the vector's y component equals 1.
equalities = [vector[1] - 1.]
# Ensure the vector's x component is >= 1.
inequalities = [vector[0] - 1.]

# Our default SciPy optimization algorithm, L-BFGS-B, does not support
# general constraints. Thus we use SLSQP instead.
optimizer = ScipyOptimizerInterface(
    loss, equalities=equalities, inequalities=inequalities, method='SLSQP')

with tf.Session() as session:
  # Variables must be initialized before the optimizer can read them.
  session.run(tf.global_variables_initializer())
  optimizer.minimize(session)

# The value of vector should now be [1., 1.].

Methods

__init__(loss, var_list=None, equalities=None, inequalities=None, **optimizer_kwargs)

Initialize a new interface instance.

Args:

  • loss: A scalar Tensor to be minimized.
  • var_list: Optional list of Variable objects to update to minimize loss. Defaults to the list of variables collected in the graph under the key GraphKeys.TRAINABLE_VARIABLES.
  • equalities: Optional list of equality constraint scalar Tensors to be held equal to zero.
  • inequalities: Optional list of inequality constraint scalar Tensors to be kept nonnegative.
  • **optimizer_kwargs: Other subclass-specific keyword arguments (see the sketch below).
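
A minimal sketch of how these arguments fit together, restricting optimization to a subset of variables via var_list and forwarding SciPy keyword arguments through **optimizer_kwargs. The variable names and option values below are illustrative assumptions, not part of the API:

import tensorflow as tf

# Two variables, but only `weights` is handed to the optimizer via var_list;
# `bias` is left untouched by minimization.
weights = tf.Variable(tf.zeros([3]), name='weights')
bias = tf.Variable(0., name='bias')

loss = tf.reduce_sum(tf.square(weights)) + tf.square(bias - 2.)

# Keyword arguments such as `method` and `options` are forwarded to
# scipy.optimize.minimize.
optimizer = tf.contrib.opt.ScipyOptimizerInterface(
    loss,
    var_list=[weights],
    method='L-BFGS-B',
    options={'maxiter': 50})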

minimize(session=None, feed_dict=None, fetches=None, step_callback=None, loss_callback=None)

Minimize a scalar Tensor.

Variables subject to optimization are updated in-place at the end of optimization.

Note that this method does not just return a minimization Op, unlike Optimizer.minimize(); instead it actually performs minimization by executing commands to control a Session.

Args:

  • session: A Session instance.
  • feed_dict: A feed dict to be passed to calls to session.run.
  • fetches: A list of Tensors to fetch and supply to loss_callback as positional arguments.
  • step_callback: A function to be called at each optimization step; arguments are the current values of all optimization variables flattened into a single vector.
  • loss_callback: A function to be called every time the loss and gradients are computed, with evaluated fetches supplied as positional arguments.
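
As a sketch of how these arguments can be combined, the snippet below feeds a placeholder, fetches the loss for loss_callback, and tracks the packed variable vector in step_callback. The placeholder, fetch choices, and callback bodies are illustrative assumptions, not part of the API:

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[2], name='x')
vector = tf.Variable([7., 7.], name='vector')
loss = tf.reduce_sum(tf.square(vector - x))

optimizer = tf.contrib.opt.ScipyOptimizerInterface(loss, options={'maxiter': 100})

def loss_callback(loss_value):
  # Receives the evaluated `fetches` as positional arguments.
  print('loss:', loss_value)

def step_callback(packed_vars):
  # Receives all optimization variables flattened into a single vector.
  print('current variables:', packed_vars)

with tf.Session() as session:
  session.run(tf.global_variables_initializer())
  optimizer.minimize(session,
                     feed_dict={x: [1., 2.]},
                     fetches=[loss],
                     loss_callback=loss_callback,
                     step_callback=step_callback)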

Defined in tensorflow/contrib/opt/python/training/external_optimizer.py.

© 2017 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/contrib/opt/ScipyOptimizerInterface