neuromancer.modules.solvers module

class neuromancer.modules.solvers.GradientProjection(constraints, input_keys, output_keys=[], decay=0.1, num_steps=1, step_size=0.01, energy_update=True, name=None)[source]

Bases: Solver

Implementation of the projected gradient method for gradient-based correction of constraint violations.

Abstract steps of the gradient projection method:

1. compute the aggregated constraint violation penalties (con_viol_energy method)
2. compute the gradient of the constraint violations w.r.t. variables in input_keys (forward method)
3. update the variable values with the negative gradient scaled by step_size (forward method)
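
The following is a minimal PyTorch sketch of these three steps for a hypothetical constraint sum(x) <= 1; the penalty function, variable shapes, and step settings are illustrative assumptions, not the class's actual implementation.

    import torch

    # hypothetical squared penalty for an example constraint sum(x) <= 1
    def con_viol_energy(x):
        violation = torch.relu(x.sum(dim=-1) - 1.0)   # step 1: per-sample constraint violation
        return violation.pow(2).sum()                 # aggregated penalty energy over the batch

    def gradient_projection(x, step_size=0.01, num_steps=1):
        for _ in range(num_steps):
            energy = con_viol_energy(x)                   # step 1
            grad = torch.autograd.grad(energy, x)[0]      # step 2: gradient of the energy w.r.t. x
            x = x - step_size * grad                      # step 3: move against the gradient
        return x

    x = torch.randn(8, 4, requires_grad=True)   # batch of candidate solutions
    x_corrected = gradient_projection(x, step_size=0.01, num_steps=5)

When such a correction is trained end to end, as in the DC3 reference below, the gradient step itself must remain differentiable, e.g., by keeping the correction inside the computational graph (create_graph=True in torch.autograd.grad).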

References

method: https://neos-guide.org/guide/algorithms/gradient-projection/
DC3 paper: https://arxiv.org/abs/2104.12225

con_viol_energy(input_dict)[source]

Calculate the constraint violation potential energy over batches

forward(data)[source]

Forward pass of the projected gradient solver.

Parameters:

data – (dict {str: Tensor})

Returns:

(dict {str: Tensor})
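
A hypothetical usage sketch, assuming neuromancer's variable/constraint API for building the constraint; the variable name 'x', the bound, the batch shape, and the solver settings are illustrative only.

    import torch
    from neuromancer.constraint import variable
    from neuromancer.modules.solvers import GradientProjection

    x = variable('x')        # symbolic variable keyed by 'x'
    con = (x <= 1.0)         # illustrative constraint x <= 1

    projection = GradientProjection(constraints=[con], input_keys=['x'],
                                    num_steps=5, step_size=0.01)
    out = projection({'x': torch.randn(16, 3)})   # dict {str: Tensor} in, dict {str: Tensor} out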

class neuromancer.modules.solvers.IterativeSolver(constraints, input_keys, output_keys=[], num_steps=1, step_size=1.0, name=None)[source]

Bases: Module

TODO: to debug

Class for a family of iterative solvers that find roots, i.e., solutions to the problem:

\(g(x) = 0\)

General iterative solver update rules:

\(x_{k+1} = \phi(x_k)\)

\(x_{k+1} = x_k + \phi(x_k)\)

https://en.wikipedia.org/wiki/Iterative_method
https://en.wikipedia.org/wiki/Root-finding_algorithms

Newton’s method:

\(x_{k+1} = x_k - J_g(x_k)^{-1} g(x_k)\)

\(J_g(x_k)\): Jacobian of \(g(x_k)\) w.r.t. \(x_k\)
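
A minimal PyTorch sketch of the Newton update above; the residual g and the dense Jacobian solve are illustrative assumptions, not the class's implementation.

    import torch

    def g(x):
        return x**3 - 2.0   # example residual with root at 2 ** (1/3)

    def newton_step(x):
        J = torch.autograd.functional.jacobian(g, x)   # J_g(x_k)
        return x - torch.linalg.solve(J, g(x))         # x_{k+1} = x_k - J_g(x_k)^{-1} g(x_k)

    x = torch.ones(3)
    for _ in range(10):
        x = newton_step(x)   # converges to 2 ** (1/3) in each coordinate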

con_values(data)[source]

Calculate the values g(x) of the constraint expressions

forward(data)[source]

Forward pass of the Newton solver.

Parameters:

data – (dict {str: Tensor})

Returns:

(dict {str: Tensor})

newton_step(data, x)[source]

Calculate the Newton step for a given variable x

property num_steps

class neuromancer.modules.solvers.Solver(objectives=[], constraints=[], input_keys=[], output_keys=[], name=None)[source]

Bases: Module, ABC

Abstract class for the differentiable solver implementation

abstract forward(data)[source]

Differentiable solver update to be implemented by the subclass

Parameters:

data – (dict {str: Tensor}) input to the solver with associated input_keys

Returns:

(dict {str: Tensor}) Output of solver with associated output_keys
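
A hypothetical subclass sketch, assuming this dict-in/dict-out contract and that the base class stores input_keys; the clamping rule and key handling are illustrative only.

    import torch
    from neuromancer.modules.solvers import Solver

    class ClampSolver(Solver):
        """Toy solver that projects variables onto the box [0, 1] (illustrative)."""

        def __init__(self, input_keys, output_keys=[], name=None):
            super().__init__(constraints=[], input_keys=input_keys,
                             output_keys=output_keys, name=name)

        def forward(self, data):
            # return corrected values under the same keys as the inputs (assumed convention)
            return {k: torch.clamp(data[k], 0.0, 1.0) for k in self.input_keys}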