Optimisers
- class openqaoa.optimizers.training_vqa.OptimizeVQA(vqa_object, variational_params, optimizer_dict)[source]
Bases:
ABC
Training class for optimizing a VQA algorithm that wraps around VQABaseBackend and QAOAVariationalBaseParams objects. This class uses the update_from_raw method of the QAOAVariationalBaseParams class and the expectation method of the VQABaseBackend class to create a wrapper callable, which is passed into scipy.optimize.minimize for minimization. Only the trainable parameters should be passed, instead of the complete AbstractParams object. The construction is completely agnostic to the backend and the type of VQA.
This class is an AbstractBaseClass on top of which other specific Optimizer classes are built.
Tip
Optimizers that usually work best for quantum optimization problems:
Gradient-free optimizer: Cobyla
- Parameters:
vqa_object (Type[VQABaseBackend]) – Backend object of class VQABaseBackend which contains information on the backend used to perform computations, and the VQA circuit.
variational_params (Type[QAOAVariationalBaseParams]) – Object of class QAOAVariationalBaseParams, which contains information on the circuit to be executed, the type of parametrisation, and the angles of the VQA circuit.
method – The method to use for optimization. Choose a method from the list of methods supported by scipy.optimize, or from the list of custom gradient optimisers.
optimizer_dict (dict) – All extra parameters needed for customising the optimisation, as a dictionary.
Optimizers that usually work best for quantum optimization problems:
Gradient-free optimizers: BOBYQA, ImFil, Cobyla
Gradient-based optimizers: L-BFGS, ADAM (with parameter-shift gradients)
Note: ADAM is not part of scipy; it will be added in a future version.
- abstract optimize()[source]
Main method which implements the optimization process. Child classes must implement this method according to their respective optimization process.
- Returns:
The optimized return object from the scipy.optimize package; the result is assigned to the attribute opt_result.
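As an illustration of the pattern the abstract base class expects, below is a minimal, hypothetical child class whose optimize() hands optimize_this to scipy.optimize.minimize. The attribute name initial_params is a placeholder for the initial raw angles, not an OpenQAOA internal.

# Minimal sketch of an OptimizeVQA child class; `initial_params` is a
# placeholder for the initial raw angles, not an OpenQAOA attribute.
from scipy.optimize import minimize

from openqaoa.optimizers.training_vqa import OptimizeVQA


class MinimalOptimizer(OptimizeVQA):
    def optimize(self):
        # optimize_this wraps update_from_raw + expectation, so scipy only
        # ever sees a flat list of trainable parameters.
        result = minimize(
            self.optimize_this,
            x0=self.initial_params,   # placeholder initial angles
            method="COBYLA",
            options={"maxiter": 100},
        )
        self.opt_result = result
        return self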
- optimize_this(x, n_shots=None)[source]
A function wrapper to execute the circuit in the backend. This function will be passed as argument to be optimized by scipy optimize.
- Parameters:
x – Parameters (a list of floats) over which optimization is performed.
n_shots – Number of shots to be used for the backend when computing the expectation. If None, nothing is passed to the backend.
- Returns:
Cost value evaluated on the declared backend, or on the Wavefunction Simulator if so specified.
- Return type:
cost value
- results_dictionary(file_path=None, file_name=None)[source]
This method formats a dictionary containing all the results from the optimization process and returns it. The results can also be saved as a pickled file by providing a file path.
Important
Child classes must implement this method so that the returned object, a Dictionary, is consistent across all Optimizers.
SciPy Optimizers
- class openqaoa.optimizers.training_vqa.ScipyOptimizer(vqa_object, variational_params, optimizer_dict)[source]
Bases:
OptimizeVQA
Vanilla scipy-based optimizer for the VQA class.
Tip
Using bounds may result in lower optimization performance
- Parameters:
vqa_object (Type[VQABaseBackend]) – Backend object of class VQABaseBackend which contains information on the backend used to perform computations, and the VQA circuit.
variational_params (Type[QAOAVariationalBaseParams]) – Object of class QAOAVariationalBaseParams, which contains information on the circuit to be executed, the type of parametrisation, and the angles of the VQA circuit.
optimizer_dict (dict) –
'jac': gradient as Callable, if defined, else None
'hess': hessian as Callable, if defined, else None
'bounds': parameter bounds while training, defaults to None
'constraints': linear/non-linear constraints (only for COBYLA, SLSQP and trust-constr)
'tol': tolerance for termination
'maxiter': defaults to maxiter = 100 if not specified
'maxfev': defaults to maxfev = 100 if not specified
'optimizer_options': dictionary of optimiser-specific arguments, defaults to None
- GRADIENT_FREE = ['cobyla', 'nelder-mead', 'powell', 'slsqp']
- SCIPY_METHODS = ['nelder-mead', 'powell', 'cg', 'bfgs', 'newton-cg', 'l-bfgs-b', 'tnc', 'cobyla', 'slsqp', 'trust-constr', 'dogleg', 'trust-ncg', 'trust-exact', 'trust-krylov']
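Below is a hedged sketch of constructing a ScipyOptimizer directly. Here vqa_backend and variational_params stand for pre-built VQABaseBackend and QAOAVariationalBaseParams objects, and it is assumed that the optimization method is read from the 'method' key of optimizer_dict, in line with the get_optimizer selector documented at the end of this page.

# Sketch only: constructing a ScipyOptimizer with an optimizer_dict built from
# the keys listed above. `vqa_backend` and `variational_params` are assumed
# to be pre-built VQABaseBackend / QAOAVariationalBaseParams objects.
from openqaoa.optimizers.training_vqa import ScipyOptimizer

optimizer_dict = {
    "method": "cobyla",         # assumed to be one of SCIPY_METHODS
    "tol": 1e-4,                # tolerance for termination
    "maxiter": 200,             # overrides the default of 100
    "optimizer_options": None,  # optimiser-specific arguments
}

optimizer = ScipyOptimizer(vqa_backend, variational_params, optimizer_dict)
optimizer.optimize()                      # runs scipy.optimize.minimize
results = optimizer.results_dictionary()  # formatted results dictionary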
- optimize()[source]
Main method which implements the optimization process using scipy.minimize.
- Returns:
Returns self after the optimization process is completed.
- optimize_this(x, n_shots=None)
A function wrapper to execute the circuit in the backend. This function will be passed as argument to be optimized by scipy optimize.
- Parameters:
x – Parameters (a list of floats) over which optimization is performed.
n_shots – Number of shots to be used for the backend when computing the expectation. If None, nothing is passed to the backend.
- Returns:
Cost value evaluated on the declared backend, or on the Wavefunction Simulator if so specified.
- Return type:
cost value
- results_dictionary(file_path=None, file_name=None)
This method formats a dictionary containing all the results from the optimization process and returns it. The results can also be saved as a pickled file by providing a file path.
Important
Child classes must implement this method so that the returned object, a Dictionary, is consistent across all Optimizers.
- class openqaoa.optimizers.training_vqa.CustomScipyGradientOptimizer(vqa_object, variational_params, optimizer_dict)[source]
Bases:
OptimizeVQA
Custom scipy gradient-based optimizer for the VQA class.
Tip
Using bounds may result in lower optimization performance
- Parameters:
vqa_object (Type[VQABaseBackend]) – Backend object of class VQABaseBackend which contains information on the backend used to perform computations, and the VQA circuit.
variational_params (Type[QAOAVariationalBaseParams]) – Object of class QAOAVariationalBaseParams, which contains information on the circuit to be executed, the type of parametrisation, and the angles of the VQA circuit.
optimizer_dict (dict) –
'jac': gradient as Callable, if defined, else None
'hess': hessian as Callable, if defined, else None
'bounds': parameter bounds while training, defaults to None
'constraints': linear/non-linear constraints (only for COBYLA, SLSQP and trust-constr)
'tol': tolerance for termination
'maxiter': defaults to maxiter = 100 if not specified
'maxfev': defaults to maxfev = 100 if not specified
'optimizer_options': dictionary of optimiser-specific arguments, defaults to None
- CUSTOM_GRADIENT_OPTIMIZERS = ['vgd', 'newton', 'rmsprop', 'natural_grad_descent', 'spsa', 'cans', 'icans']
- CUSTOM_GRADIENT_OPTIMIZERS_MAPPER = {'cans': <function CANS>, 'icans': <function iCANS>, 'natural_grad_descent': <function natural_grad_descent>, 'newton': <function newton_descent>, 'rmsprop': <function rmsprop>, 'spsa': <function SPSA>, 'vgd': <function grad_descent>}
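Below is a hedged sketch of pairing one of the custom gradient methods above with a gradient callable built by the derivative() function documented under "Derivative functions" further down. Here vqa_backend, variational_params and logger stand for pre-built backend, parameter and Logger objects.

# Sketch: a custom gradient-descent method ('vgd') driven by a finite-difference
# gradient callable. `vqa_backend`, `variational_params` and `logger` are
# assumed pre-built objects.
from openqaoa.optimizers.training_vqa import CustomScipyGradientOptimizer
from openqaoa.derivatives.derivative_functions import derivative

grad_fn = derivative(
    vqa_backend, variational_params, logger,
    derivative_type="gradient",
    derivative_method="finite_difference",
    derivative_options={"stepsize": 1e-3},
)

optimizer_dict = {
    "method": "vgd",  # one of CUSTOM_GRADIENT_OPTIMIZERS
    "jac": grad_fn,   # gradient as a Callable
    "maxiter": 100,
}

optimizer = CustomScipyGradientOptimizer(vqa_backend, variational_params, optimizer_dict)
optimizer.optimize()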
- optimize()[source]
Main method which implements the optimization process using scipy.minimize.
- Returns:
Returns self after the optimization process is completed. The optimized result is assigned to the attribute opt_result.
- optimize_this(x, n_shots=None)
A function wrapper to execute the circuit in the backend. This function will be passed as argument to be optimized by scipy optimize.
- Parameters:
x – Parameters (a list of floats) over which optimization is performed.
n_shots – Number of shots to be used for the backend when computing the expectation. If None, nothing is passed to the backend.
- Returns:
Cost value evaluated on the declared backend, or on the Wavefunction Simulator if so specified.
- Return type:
cost value
- results_dictionary(file_path=None, file_name=None)
This method formats a dictionary containing all the results from the optimization process and returns it. The results can also be saved as a pickled file by providing a file path.
Important
Child classes must implement this method so that the returned object, a Dictionary, is consistent across all Optimizers.
- class openqaoa.optimizers.training_vqa.PennyLaneOptimizer(vqa_object, variational_params, optimizer_dict)[source]
Bases:
OptimizeVQA
Custom scipy optimization using PennyLane optimizers for the VQA class.
Tip
Using bounds may result in lower optimization performance
- Parameters:
vqa_object (Type[VQABaseBackend]) – Backend object of class VQABaseBackend which contains information on the backend used to perform computations, and the VQA circuit.
variational_params (Type[QAOAVariationalBaseParams]) – Object of class QAOAVariationalBaseParams, which contains information on the circuit to be executed, the type of parametrisation, and the angles of the VQA circuit.
optimizer_dict (dict) –
'jac': gradient as Callable, if defined, else None
'hess': hessian as Callable, if defined, else None
'bounds': parameter bounds while training, defaults to None
'constraints': linear/non-linear constraints (only for COBYLA, SLSQP and trust-constr)
'tol': tolerance for termination
'maxiter': defaults to maxiter = 100 if not specified
'maxfev': defaults to maxfev = 100 if not specified
'optimizer_options': dictionary of optimiser-specific arguments, defaults to None. Also used for the arguments of the PennyLane optimizers (and their step functions).
- PENNYLANE_OPTIMIZERS = ['pennylane_adagrad', 'pennylane_adam', 'pennylane_vgd', 'pennylane_momentum', 'pennylane_nesterov_momentum', 'pennylane_rmsprop', 'pennylane_rotosolve', 'pennylane_spsa']
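Below is a hedged sketch of an optimizer_dict for PennyLaneOptimizer. Here gradient_callable stands for a gradient Callable (for instance one built with derivative(), as in the previous sketch), and the stepsize value in optimizer_options is only illustrative.

# Sketch: selecting a PennyLane optimizer and forwarding its arguments through
# 'optimizer_options'. `vqa_backend`, `variational_params` and
# `gradient_callable` are assumed pre-built objects.
from openqaoa.optimizers.training_vqa import PennyLaneOptimizer

optimizer_dict = {
    "method": "pennylane_adam",               # one of PENNYLANE_OPTIMIZERS
    "maxiter": 150,
    "jac": gradient_callable,                 # gradient Callable
    "optimizer_options": {"stepsize": 0.05},  # illustrative PennyLane step size
}

optimizer = PennyLaneOptimizer(vqa_backend, variational_params, optimizer_dict)
optimizer.optimize()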
- optimize()[source]
Main method which implements the optimization process using scipy.minimize.
- Returns:
Returns self after the optimization process is completed. The optimized result is assigned to the attribute opt_result.
- optimize_this(x, n_shots=None)
A function wrapper to execute the circuit in the backend. This function will be passed as argument to be optimized by scipy optimize.
- Parameters:
x – Parameters (a list of floats) over which optimization is performed.
n_shots – Number of shots to be used for the backend when computing the expectation. If None, nothing is passed to the backend.
- Returns:
Cost value evaluated on the declared backend, or on the Wavefunction Simulator if so specified.
- Return type:
cost value
- results_dictionary(file_path=None, file_name=None)
This method formats a dictionary containing all the results from the optimization process and returns it. The results can also be saved as a pickled file by providing a file path.
Important
Child classes must implement this method so that the returned object, a Dictionary, is consistent across all Optimizers.
Optimization Methods
A set of functions implementing PennyLane optimization algorithms. See https://docs.pennylane.ai/en/stable/introduction/interfaces.html#optimizers. Optimisers requiring a PennyLane backend have not been implemented yet. As with the custom optimization methods, SciPy minimize is used; this extends the available scipy methods.
- openqaoa.optimizers.pennylane.optimization_methods_pennylane.pennylane_optimizer(fun, x0, args=(), maxfev=None, pennylane_method='vgd', maxiter=100, tol=1e-06, jac=None, callback=None, nums_frequency=None, spectra=None, shifts=None, **options)[source]
Minimize a function fun using a PennyLane method. To check the available methods, look at the available_methods_dict variable. See https://docs.pennylane.ai/en/stable/introduction/interfaces.html#optimizers.
- Parameters:
fun (callable) – Function to minimize
x0 (ndarray) – Initial guess.
args (sequence, optional) – Arguments to pass to func.
maxfev (int, optional) – Maximum number of function evaluations.
pennylane_method (string, optional) – Optimizer method to compute the steps.
maxiter (int, optional) – Maximum number of iterations.
tol (float) – Tolerance before the optimizer terminates; if tol is larger than the difference between two steps, terminate optimization.
jac (callable, optional) – Callable gradient function. Required for all methods except rotosolve and spsa.
callback (callable, optional) – Called after each iteration, as callback(xk), where xk is the current parameter vector.
options (dict, optional) – Dictionary where keys are the arguments for the optimizer object, and the values are the values to pass to these arguments. For all possible options, see https://docs.pennylane.ai/en/stable/introduction/interfaces.html#optimizers.
nums_frequency (dict[dict], optional) – Required for the rotosolve method. The number of frequencies in fun per parameter.
spectra (dict[dict], optional) – Required for the rotosolve method. Frequency spectra in the objective_fn per parameter.
shifts (dict[dict], optional) – Required for the rotosolve method. Shift angles for the reconstruction per parameter. Read https://docs.pennylane.ai/en/stable/code/api/pennylane.RotosolveOptimizer.html#pennylane.RotosolveOptimizer.step for more information.
- Returns:
OptimizeResult – Scipy OptimizeResult object.
- Return type:
OptimizeResult
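Because pennylane_optimizer follows the calling convention of a SciPy custom minimizer and returns a SciPy OptimizeResult, it can also be exercised directly on a toy cost function. The sketch below assumes PennyLane is installed; keyword handling in the installed OpenQAOA version may differ slightly.

# Sketch: calling pennylane_optimizer directly on a toy quadratic. Assumes
# PennyLane is installed; behaviour of the installed version may differ.
import numpy as np

from openqaoa.optimizers.pennylane.optimization_methods_pennylane import (
    pennylane_optimizer,
)

fun = lambda x: float(np.sum((x - 1.0) ** 2))  # toy cost function
jac = lambda x: 2.0 * (x - 1.0)                # its analytic gradient

res = pennylane_optimizer(
    fun,
    x0=np.zeros(2),
    jac=jac,                   # required for gradient-based methods
    pennylane_method="vgd",    # documented default: gradient descent
    maxiter=50,
    tol=1e-6,
)
print(res.x)  # should approach [1., 1.]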
Derivative functions
Collection of functions to return derivative computation functions. Usually called from the derivative_function method of a QAOABaseBackend object. New gradient/higher-order derivative computation methods can be added here. To add a new computation method:
1. Write a function in the format new_function(backend_obj, params_std, params_ext, gradient_options), or with fewer arguments.
2. Give this function a string identifier (e.g. 'param_shift'), add it to the list derivative_methods of the function derivative, and add it as a possible 'out'.
- openqaoa.derivatives.derivative_functions.derivative(backend_obj, params, logger, derivative_type=None, derivative_method=None, derivative_options=None)[source]
Returns a callable function that calculates the gradient according to the specified gradient_method.
- Parameters:
backend_obj (QAOABaseBackend) – QAOABaseBackend object that contains information about the backend that is being used to perform the QAOA circuit
params (QAOAVariationalBaseParams) – QAOAVariationalBaseParams object containing variational angles.
logger (Logger) – Logger Class required to log information from the evaluations required for the jacobian/hessian computation.
derivative_type (str) – Type of derivative to compute. Either gradient or hessian.
derivative_method (str) – Computational method of the derivative. Either finite_difference, param_shift, stoch_param_shift, or grad_spsa.
derivative_options (dict) – Dictionary containing options specific to each derivative_method.
cost_std – object that computes expectation values when executed. Standard parametrisation.
cost_ext – object that computes expectation values when executed. Extended parametrisation. Mainly used to compute parameter shifts at each individual gate, which is summed to recover the parameter shift for a parametrised layer.
- Returns:
The callable derivative function of the cost function, generated based on the derivative_type, derivative_method, and derivative_options specified.
- Return type:
out
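Below is a hedged sketch of requesting both a gradient and a Hessian callable from derivative(); backend_obj, params and logger stand for pre-built QAOABaseBackend, QAOAVariationalBaseParams and Logger objects, and raw_angles for a flat list of variational angles.

# Sketch: building derivative callables. `backend_obj`, `params`, `logger`
# and `raw_angles` are assumed pre-built objects/values.
from openqaoa.derivatives.derivative_functions import derivative

grad_fn = derivative(
    backend_obj, params, logger,
    derivative_type="gradient",
    derivative_method="finite_difference",
    derivative_options={"stepsize": 1e-3},
)

hess_fn = derivative(
    backend_obj, params, logger,
    derivative_type="hessian",
    derivative_method="finite_difference",
    derivative_options={"hessian_stepsize": 1e-3},
)

g = grad_fn(raw_angles)  # gradient vector at the current angles
H = hess_fn(raw_angles)  # Hessian matrix at the current angles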
- openqaoa.derivatives.derivative_functions.grad_fd(backend_obj, params, gradient_options, logger, variance=False)[source]
Returns a callable function that calculates the gradient (and its variance if variance=True) with the finite difference method.
- Parameters:
backend_obj (QAOABaseBackend) – backend object that computes expectation values when executed.
params (QAOAVariationalBaseParams) – parameters of the variational circuit.
gradient_options (dict) – 'stepsize': stepsize of the finite difference.
logger (Logger) – logger object to log the number of function evaluations.
variance (bool) – If True, the variance of the gradient is also computed. If False, only the gradient is computed.
- Returns:
grad_fd_func – Callable derivative function.
- Return type:
Callable
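For reference, the central-difference approximation that a finite-difference gradient is based on can be sketched generically; the library's grad_fd may differ in details such as forward versus central differences and shot-variance handling.

# Conceptual sketch of a central finite-difference gradient; not the library
# implementation, which may differ in its difference scheme and logging.
import numpy as np

def finite_difference_gradient(cost, x, stepsize=1e-3):
    x = np.asarray(x, dtype=float)
    grad = np.zeros_like(x)
    for i in range(x.size):
        shift = np.zeros_like(x)
        shift[i] = stepsize
        grad[i] = (cost(x + shift) - cost(x - shift)) / (2 * stepsize)
    return grad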
- openqaoa.derivatives.derivative_functions.grad_ps(backend_obj, params, params_ext, gradient_options, logger, stochastic=False, variance=False)[source]
If stochastic=False returns a callable function that calculates the gradient (and its variance if variance=True) with the parameter shift method. If stochastic=True returns a callable function that approximates the gradient (and its variance if variance=True) with the stochastic parameter shift method, which samples (n_beta_single, n_beta_pair, n_gamma_single, n_gamma_pair) gates at each layer instead of all gates. See “Algorithm 4” of https://arxiv.org/pdf/1910.01155.pdf. By convention, (n_beta_single, n_beta_pair, n_gamma_single, n_gamma_pair) = (-1, -1, -1, -1) will sample all gates (which is then equivalent to the full parameter shift rule).
- Parameters:
backend_obj (QAOABaseBackend) – backend object that computes expectation values when executed.
params (QAOAVariationalStandardParams) – variational parameters object, standard parametrisation.
params_ext (QAOAVariationalExtendedParams) – variational parameters object, extended parametrisation.
gradient_options –
‘n_beta_single’: Number of single-qubit mixer gates to sample for the stochastic parameter shift.
’n_beta_pair’: Number of two-qubit mixer gates to sample for the stochastic parameter shift.
’n_gamma_single’: Number of single-qubit cost gates to sample for the stochastic parameter shift.
’n_gamma_pair’: Number of two-qubit cost gates to sample for the stochastic parameter shift.
logger (Logger) – logger object to log the number of function evaluations.
variance (bool) – If True, the variance of the gradient is also computed. If False, only the gradient is computed.
- Returns:
grad_ps_func – Callable derivative function.
- Return type:
Callable
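For orientation, the textbook two-term parameter-shift rule for a single angle of a Pauli-generated gate is sketched below; grad_ps generalises this over all gates and layers and, in the stochastic variant, samples only a subset of gates.

# Conceptual sketch of the two-term parameter-shift rule for one angle of a
# Pauli-generated gate; grad_ps applies this per gate/layer in the circuit.
import numpy as np

def param_shift_partial(cost, x, i, shift=np.pi / 2):
    x_plus = np.array(x, dtype=float)
    x_minus = np.array(x, dtype=float)
    x_plus[i] += shift
    x_minus[i] -= shift
    return 0.5 * (cost(x_plus) - cost(x_minus))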
- openqaoa.derivatives.derivative_functions.grad_spsa(backend_obj, params, gradient_options, logger, variance=False)[source]
Returns a callable function that calculates the gradient approximation with the Simultaneous Perturbation Stochastic Approximation (SPSA) method.
- Parameters:
backend_obj (QAOABaseBackend) – backend object that computes expectation values when executed.
params (QAOAVariationalBaseParams) – variational parameters object.
gradient_options (dict) – 'gradient_stepsize': stepsize of the stochastic shift.
logger (Logger) – logger object to log the number of function evaluations.
- Returns:
grad_spsa_func – Callable derivative function.
- Return type:
Callable
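For orientation, a single SPSA gradient estimate perturbs all parameters simultaneously along a random sign vector; the library's grad_spsa may differ in its step-size handling and variance computation.

# Conceptual sketch of one SPSA gradient estimate; not the library
# implementation, which may handle step sizes and variance differently.
import numpy as np

def spsa_gradient(cost, x, stepsize=0.01, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    delta = rng.choice([-1.0, 1.0], size=x.shape)              # random sign vector
    diff = cost(x + stepsize * delta) - cost(x - stepsize * delta)
    return diff / (2 * stepsize) * delta                       # since 1/delta_i == delta_i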
- openqaoa.derivatives.derivative_functions.hessian_fd(backend_obj, params, hessian_options, logger)[source]
Returns a callable function that calculates the hessian with the finite difference method.
- Parameters:
backend_obj (QAOABaseBackend) – backend object that computes expectation values when executed.
params (QAOAVariationalBaseParams) – variational parameters object.
hessian_options – 'hessian_stepsize': stepsize of the finite difference.
logger (Logger) – logger object to log the number of function evaluations.
- Returns:
hessian_fd_func – Callable derivative function.
- Return type:
Callable
- openqaoa.derivatives.derivative_functions.update_and_compute_expectation(backend_obj, params, logger)[source]
Helper function that returns a callable that takes in a list/nparray of raw parameters. This function will handle:
Updating logger object with logger.log_variables
Updating variational parameters with update_from_raw
Computing expectation with backend_obj.expectation
- Parameters:
backend_obj (QAOABaseBackend) – QAOABaseBackend object that contains information about the backend that is being used to perform the QAOA circuit
params (QAOAVariationalBaseParams) – QAOAVariationalBaseParams object containing variational angles.
logger (Logger) – Logger Class required to log information from the evaluations required for the jacobian/hessian computation.
- Returns:
A callable that accepts a list/array of parameters, and returns the computed expectation value.
- Return type:
out
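The callable returned by update_and_compute_expectation conceptually has the structure sketched below; the exact payload passed to logger.log_variables is an assumption.

# Conceptual sketch of the wrapper returned by update_and_compute_expectation;
# the exact logger payload is an assumption.
def make_expectation_fn(backend_obj, params, logger):
    def expectation_fn(raw_args):
        logger.log_variables({"func_evals": 1})  # assumed logging payload
        params.update_from_raw(raw_args)         # write raw angles into params
        return backend_obj.expectation(params)   # evaluate on the backend
    return expectation_fn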
- openqaoa.derivatives.derivative_functions.update_and_get_counts(backend_obj, params, logger)[source]
Helper function that returns a callable that takes in a list/nparray of raw parameters. This function will handle:
Updating logger object with logger.log_variables
Updating variational parameters with update_from_raw
Getting the counts dictionary with backend_obj.get_counts
- Parameters:
backend_obj (QAOABaseBackend) – QAOABaseBackend object that contains information about the backend that is being used to perform the QAOA circuit
params (QAOAVariationalBaseParams) – QAOAVariationalBaseParams object containing variational angles.
logger (Logger) – Logger Class required to log information from the evaluations required for the jacobian/hessian computation.
- Returns:
A callable that accepts a list/array of parameters, and returns the counts dictionary.
- Return type:
out
QFIM
- openqaoa.derivatives.qfim.qfim(backend_obj, params, logger, eta=1e-08)[source]
Returns a callable qfim_fun(args) that computes the Quantum Fisher Information Matrix at args according to $$[\mathrm{QFI}]_{ij} = \mathrm{Re}\,\langle \partial_i \varphi | \partial_j \varphi \rangle - \langle \partial_i \varphi | \varphi \rangle \langle \varphi | \partial_j \varphi \rangle.$$
- Parameters:
params (QAOAVariationalBaseParams) – The QAOA parameters as a 1D array (derived from an object of one of the parameter classes, containing hyperparameters and variable parameters).
eta (float) – The infinitesimal shift used to compute |∂jφ>, the partial derivative of the wavefunction w.r.t a parameter.
- Returns:
The quantum Fisher information matrix, a 2p x 2p symmetric square matrix with elements $[\mathrm{QFI}]_{ij} = \mathrm{Re}\,\langle \partial_i \varphi | \partial_j \varphi \rangle - \langle \partial_i \varphi | \varphi \rangle \langle \varphi | \partial_j \varphi \rangle$.
- Return type:
qfim_array
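Below is a hedged usage sketch: evaluating the QFIM at the current raw angles and, optionally, using it to precondition a gradient as natural gradient descent does. Here backend_obj, params, logger, raw_angles and grad stand for pre-built objects/values.

# Sketch: evaluating the QFIM and using it to precondition a gradient.
# `backend_obj`, `params`, `logger`, `raw_angles` and `grad` are assumed
# pre-built objects/values.
import numpy as np

from openqaoa.derivatives.qfim import qfim

qfim_fun = qfim(backend_obj, params, logger, eta=1e-8)
F = qfim_fun(raw_angles)  # 2p x 2p symmetric matrix

# Natural-gradient style preconditioning (illustrative; a small ridge term
# keeps the solve well-conditioned):
natural_grad = np.linalg.solve(F + 1e-6 * np.eye(F.shape[0]), grad)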
Optimizer selectors
- openqaoa.optimizers.qaoa_optimizer.available_optimizers()[source]
Return a list of available optimizers.
- openqaoa.optimizers.qaoa_optimizer.get_optimizer(vqa_object, variational_params, optimizer_dict)[source]
Initialise the specified optimizer class with the provided method and optimizer-specific options.
- Parameters:
vqa_object (VQABaseBackend) – Backend object of class VQABaseBackend which contains information on the backend used to perform computations, and the VQA circuit.
variational_params (QAOAVariationalBaseParams) – Object of class QAOAVariationalBaseParams, which contains information on the circuit to be executed, the type of parametrisation, and the angles of the VQA circuit.
optimizer_dict (dict) – Optimizer information dictionary used to construct the optimizer with the specified options.
- Returns:
Optimizer object of the type specified by the given method.
- Return type:
optimizer
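Below is a hedged end-to-end sketch tying the selector functions to the training loop; vqa_backend and variational_params stand for pre-built backend and parameter objects.

# Sketch: selecting and running an optimizer through the selector functions.
# `vqa_backend` and `variational_params` are assumed pre-built objects.
from openqaoa.optimizers.qaoa_optimizer import available_optimizers, get_optimizer

print(available_optimizers())  # names of the supported optimizers

optimizer = get_optimizer(
    vqa_backend,
    variational_params,
    optimizer_dict={"method": "cobyla", "maxiter": 200},
)
optimizer.optimize()
results = optimizer.results_dictionary()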