pylops.optimization.cls_sparsity.SPGL1

class pylops.optimization.cls_sparsity.SPGL1(Op, callbacks=None)

Spectral Projected-Gradient for L1 norm.

Solve a constrained system of equations given the operator Op and a sparsifying transform SOp, aiming to retrieve a model that is sparse in the sparsifying domain.

This is a simple wrapper around spgl1.spgl1, a Python port of the well-known SPGL1 MATLAB solver. To use this solver, the spgl1 library must be installed.

Parameters
Op : pylops.LinearOperator

Operator to invert of size \([N \times M]\).

Raises
ModuleNotFoundError

If the spgl1 library is not installed

Notes

Solve different variations of sparsity-promoting inverse problems by imposing sparsity on the retrieved model [1].

The first problem is called basis pursuit denoise (BPDN) and is written as

\[\min_{\mathbf{x}} \|\mathbf{x}\|_1 \quad \text{subject to} \quad \left\|\mathbf{Op}\,\mathbf{S}^H\mathbf{x}-\mathbf{y}\right\|_2^2 \leq \sigma,\]

while the second problem is the ℓ₁-regularized least-squares, or LASSO, problem, written as

\[\min_{\mathbf{x}} \left\|\mathbf{Op}\,\mathbf{S}^H\mathbf{x}-\mathbf{y}\right\|_2^2 \quad \text{subject to} \quad \|\mathbf{x}\|_1 \leq \tau.\]
References

[1] van den Berg, E., and Friedlander, M. P., "Probing the Pareto frontier for basis pursuit solutions", SIAM Journal on Scientific Computing, vol. 31(2), pp. 890-912, 2008.
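
Examples

The snippet below is a minimal sketch of how this solver might be used for a BPDN problem. The toy compressed-sensing setup (random matrix operator, spiky model), the chosen sigma, and the unpacking of solve's outputs (model in the original domain, model in the sparsifying domain, info dictionary, mirroring spgl1.spgl1) are illustrative assumptions, not taken from this page.

import numpy as np
import pylops

np.random.seed(0)

# Toy problem (illustrative): a spiky model observed through a random
# matrix operator of size [N x M]
m, n = 128, 256
A = np.random.randn(m, n) / np.sqrt(m)
Op = pylops.MatrixMult(A)

x_true = np.zeros(n)
x_true[[20, 80, 150]] = [1.0, -0.5, 0.7]
y = Op @ x_true

# BPDN: bound the data misfit by sigma and promote sparsity in x.
# A sparsifying transform could be passed via the SOp argument instead.
solver = pylops.optimization.cls_sparsity.SPGL1(Op)
xinv, xinv_sparse, info = solver.solve(y, sigma=1e-4)  # output structure assumed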

Methods

__init__(Op[, callbacks])

callback(x, *args, **kwargs)
    Callback routine

finalize(*args[, show])
    Finalize solver

run(x[, show])
    Run solver

setup(y[, SOp, tau, sigma, show])
    Setup solver

solve(y[, x0, SOp, tau, sigma, show])
    Run entire solver

step()
    Run one step of solver
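
The sketch below illustrates the lower-level workflow suggested by the method listing above, continuing the example in the Examples section. It is based only on the signatures shown here; whether setup, run and finalize can be called separately from solve, what run returns, and the specific tau and starting model are all assumptions.

# LASSO formulation: constrain the L1 norm of the model by tau rather
# than bounding the data misfit (second problem in the Notes above).
solver = pylops.optimization.cls_sparsity.SPGL1(Op)
solver.setup(y, tau=2.0, show=True)   # configure the problem
out = solver.run(np.zeros(n))         # x in run(x[, show]) is taken as the starting model (assumed)
solver.finalize()                     # wrap up and report final statistics (assumed)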