pylops.optimization.sparsity.SPGL1

pylops.optimization.sparsity.SPGL1(Op, data, SOp=None, tau=0, sigma=0, x0=None, **kwargs_spgl1)

Spectral Projected-Gradient for L1 norm.
Solve a constrained system of equations given the operator Op and a sparsifying transform SOp, aiming to retrieve a model that is sparse in the sparsifying domain.

This is a simple wrapper to spgl1.spgl1, a port of the well-known SPGL1 MATLAB solver into Python. In order to use this solver you need to have the spgl1 library installed.

Parameters:
    Op : pylops.LinearOperator
        Operator to invert
    data : numpy.ndarray
        Data
    SOp : pylops.LinearOperator, optional
        Sparsifying transform
    tau : float, optional
        Non-negative LASSO scalar. If different from 0, SPGL1 will solve the LASSO problem
    sigma : float, optional
        BPDN scalar. If different from 0, SPGL1 will solve the BPDN problem
    x0 : numpy.ndarray, optional
        Initial guess
    **kwargs_spgl1
        Arbitrary keyword arguments for the spgl1.spgl1 solver
Returns:
    xinv : numpy.ndarray
        Inverted model in original domain.
    pinv : numpy.ndarray
        Inverted model in sparse domain.
    info : dict
        Dictionary with the following information:

        - tau, final value of tau (see sigma above)
        - rnorm, two-norm of the optimal residual
        - rgap, relative duality gap (an optimality measure)
        - gnorm, Lagrange multiplier of (LASSO)
        - stat, convergence status: 1: found a BPDN solution; 2: found a BP solution, exit based on small gradient; 3: found a BP solution, exit based on small residual; 4: found a LASSO solution; 5: error, too many iterations; 6: error, linesearch failed; 7: error, found suboptimal BP solution; 8: error, too many matrix-vector products
        - niters, number of iterations
        - nProdA, number of multiplications with A
        - nProdAt, number of multiplications with A'
        - n_newton, number of Newton steps
        - time_project, projection time (seconds)
        - time_matprod, matrix-vector multiplication time (seconds)
        - time_total, total solution time (seconds)
        - niters_lsqr, number of LSQR iterations (if subspace_min=True)
        - xnorm1, L1-norm model solution history through iterations
        - rnorm2, L2-norm residual history through iterations
        - lambdaa, Lagrange multiplier history through iterations
Raises:
    ModuleNotFoundError
        If the spgl1 library is not installed
Notes

Solve different variations of a sparsity-promoting inverse problem by imposing sparsity on the retrieved model [1].

The first problem is called basis pursuit denoise (BPDN) and its cost function is

\[\|\mathbf{x}\|_1 \quad \text{subj. to} \quad \|\mathbf{Op}\,\mathbf{S}^H\mathbf{x}-\mathbf{b}\|_2^2 \leq \sigma,\]

while the second problem is the l1-regularized least-squares or LASSO problem and its cost function is

\[\|\mathbf{Op}\,\mathbf{S}^H\mathbf{x}-\mathbf{b}\|_2^2 \quad \text{subj. to} \quad \|\mathbf{x}\|_1 \leq \tau\]

[1] van den Berg E., Friedlander M.P., "Probing the Pareto frontier for basis pursuit solutions", SIAM J. on Scientific Computing, vol. 31(2), pp. 890-912, 2008.