neuromancer.modules.solvers module
- class neuromancer.modules.solvers.GradientProjection(constraints, input_keys, output_keys=[], decay=0.1, num_steps=1, step_size=0.01, energy_update=True, name=None)[source]
Bases: Solver
Implementation of the projected gradient method for gradient-based correction of constraint violations. Abstract steps of the gradient projection method (see the sketch after the references below):
1. compute the aggregated constraint violation penalties (con_viol_energy method)
2. compute the gradient of the constraint violations w.r.t. the variables in input_keys (forward method)
3. update the variable values with the negative gradient scaled by step_size (forward method)
References
method: https://neos-guide.org/guide/algorithms/gradient-projection/
DC3 paper: https://arxiv.org/abs/2104.12225
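A minimal PyTorch sketch of the three steps above, assuming simple inequality constraints \(g_i(x) \le 0\) and a squared-ReLU penalty; this is illustrative only, not the class internals::

    import torch

    def gradient_projection_step(x, constraint_fns, step_size=0.01):
        """One correction step for variables x given constraints g_i(x) <= 0."""
        x = x.detach().requires_grad_(True)
        # step 1: aggregate constraint violation penalties (squared ReLU of violations)
        energy = sum(torch.relu(g(x)).pow(2).sum() for g in constraint_fns)
        # step 2: gradient of the violation energy w.r.t. x
        (grad,) = torch.autograd.grad(energy, x)
        # step 3: move x against the gradient, scaled by step_size
        return (x - step_size * grad).detach()

    # usage: push x toward the feasible set {x : x >= 1}
    x = torch.tensor([0.5, 2.0])
    for _ in range(50):
        x = gradient_projection_step(x, [lambda v: 1.0 - v], step_size=0.1)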
- class neuromancer.modules.solvers.IterativeSolver(constraints, input_keys, output_keys=[], num_steps=1, step_size=1.0, name=None)[source]
Bases: Module
TODO: to debug
- Class for a family of iterative solvers that find roots of the problem:
\(g(x) = 0\)
General iterative solver update rules: \(x_{k+1} = \phi(x_k)\) or \(x_{k+1} = x_k + \phi(x_k)\)
https://en.wikipedia.org/wiki/Iterative_method
https://en.wikipedia.org/wiki/Root-finding_algorithms
Newton’s method: \(x_{k+1} = x_k - J_g(x_k)^{-1} g(x_k)\), where \(J_g(x_k)\) is the Jacobian of \(g\) w.r.t. \(x_k\).
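A hedged sketch of this Newton update using torch.autograd for the Jacobian (illustrative only; the example function and its dimensions are assumptions, not the class internals)::

    import torch

    # Newton update x_{k+1} = x_k - J_g(x_k)^{-1} g(x_k)
    def newton_step(g, x):
        J = torch.autograd.functional.jacobian(g, x)  # J_g(x_k)
        return x - torch.linalg.solve(J, g(x))        # linear solve instead of an explicit inverse

    # usage: find the root of g(x) = x^2 - 2, i.e. sqrt(2)
    g = lambda x: x**2 - torch.tensor([2.0])
    x = torch.tensor([1.0])
    for _ in range(6):
        x = newton_step(g, x)
    print(x)  # ~ tensor([1.4142])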
- forward(data)[source]
Forward pass of the Newton solver.
:param data: (dict: {str: Tensor})
:return: (dict: {str: Tensor})
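A hypothetical sketch of the dict-in / dict-out contract described above; the key names and the toy update rule are assumptions for illustration, not part of the library API::

    import torch

    # The module reads tensors under input_keys from the data dict and
    # returns a dict of updated tensors under output_keys.
    class DictSolver(torch.nn.Module):
        def __init__(self, input_keys, output_keys, step):
            super().__init__()
            self.input_keys, self.output_keys, self.step = input_keys, output_keys, step

        def forward(self, data):
            # one update per (input key, output key) pair
            return {k_out: self.step(data[k_in])
                    for k_in, k_out in zip(self.input_keys, self.output_keys)}

    solver = DictSolver(["x"], ["x_new"], step=lambda x: 0.9 * x)  # toy update
    out = solver({"x": torch.randn(4, 2)})                         # dict in, dict out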
- property num_steps