diff --git a/nightly/.doctrees/environment.pickle b/nightly/.doctrees/environment.pickle
index debbabb2a6..f2ff769e42 100644
Binary files a/nightly/.doctrees/environment.pickle and b/nightly/.doctrees/environment.pickle differ
diff --git a/nightly/.doctrees/optimisation.doctree b/nightly/.doctrees/optimisation.doctree
index 1f850c1ba9..202ddaa0ca 100644
Binary files a/nightly/.doctrees/optimisation.doctree and b/nightly/.doctrees/optimisation.doctree differ
diff --git a/nightly/.doctrees/plugins.doctree b/nightly/.doctrees/plugins.doctree
index 5528d51dc2..701e22d26d 100644
Binary files a/nightly/.doctrees/plugins.doctree and b/nightly/.doctrees/plugins.doctree differ
diff --git a/nightly/_modules/cil/optimisation/algorithms/Algorithm.html b/nightly/_modules/cil/optimisation/algorithms/Algorithm.html
index 5f4871abf7..647b5e0b83 100644
--- a/nightly/_modules/cil/optimisation/algorithms/Algorithm.html
+++ b/nightly/_modules/cil/optimisation/algorithms/Algorithm.html
@@ -232,7 +232,7 @@
run
method will stop when the stopping criterion is met.
-__init__
(**kwargs)[source]#Constructor
-Set the minimal number of parameters:
-max_iteration (int, optional, default 0) – maximum number of iterations
update_objective_interval (int, optional, default 1) – the interval at which to save the current objective: 1 means every iteration, 2 every second iteration, and so forth. The default is 1 and should be increased when evaluating the objective is computationally expensive.
log_file (str, optional, default None) – log verbose output to file
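A minimal usage sketch of these constructor options (illustrative, assuming a CIL install; GD, IdentityOperator and LeastSquares are example choices, not prescribed by this page):
>>> from cil.framework import ImageGeometry
>>> from cil.optimisation.operators import IdentityOperator
>>> from cil.optimisation.functions import LeastSquares
>>> from cil.optimisation.algorithms import GD
>>> ig = ImageGeometry(voxel_num_x=4, voxel_num_y=4)
>>> A, b = IdentityOperator(ig), ig.allocate('random')
>>> # max_iteration and update_objective_interval are forwarded to Algorithm.__init__
>>> algo = GD(initial=ig.allocate(0.0), objective_function=LeastSquares(A, b),
...           max_iteration=100, update_objective_interval=10)
>>> algo.run(100, verbose=0)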
set_up
(*args, **kwargs)[source]#__set_up_logger
(fname)#Set up the logger if desired
-max_iteration_stop_criterion
()[source]#default stop criterion for iterative algorithm: max_iteration reached
next
()[source]#__next__
()[source]#Algorithm is an iterable
-calling this method triggers update and update_objective
-_update_previous_solution
()[source]#Update the previous solution with the current one
-The concrete algorithm calls update_previous_solution. Normally this would -entail the swapping of pointers:
-tmp = self.x_old
-self.x_old = self.x
-self.x = tmp
-
get_output
()[source]#__weakref__
#list of weak references to the object (if defined)
-verbose_output
(verbose=False)[source]#Does a dot linearity test on the operator. Evaluates if the following equivalence holds: .. math:
-Ax\times y = y \times A^Tx
+\(\langle Ax, y \rangle = \langle x, A^{T}y \rangle\)
+\(|\langle Ax, y \rangle - \langle x, A^{T}y \rangle| / (\|A\|\|x\|\|y\| + 10^{-12}) < \text{tolerance}\)
tolerance : float, default 1e-6
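A sketch of running this test (assuming CIL's LinearOperator.dot_test helper; the identity operator is an illustrative choice):
>>> from cil.framework import ImageGeometry
>>> from cil.optimisation.operators import LinearOperator, IdentityOperator
>>> ig = ImageGeometry(voxel_num_x=8, voxel_num_y=8)
>>> A = IdentityOperator(ig)
>>> # checks |<Ax, y> - <x, A^T y>| / (||A|| ||x|| ||y|| + 1e-12) < tolerance
>>> LinearOperator.dot_test(A)
True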
@@ -3264,29 +3210,17 @@
Base classes
- Parameters
-
-L (number, positive, default None) – Lipschitz constant of the gradient of the function F(x), when it is differentiable.
-domain – The domain of the function.
-
+L (number, positive, default None) – Lipschitz constant of the gradient of the function F(x), when it is differentiable.
-Lipschitz of the gradient of the function; it is a positive real number, such that |f'(x) - f'(y)| <= L ||x-y||, assuming f: IG –> R
-
-
-
-
+
+Note
+The Lipschitz constant of the gradient of the function is a positive real number \(L\), such that \(\|f'(x) - f'(y)\| \leq L \|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
+
-
gradient
(x, out=None)[source]#
-Returns the value of the gradient of function F at x, if it is differentiable
+Returns the value of the gradient of function \(F\) at \(x\), if it is differentiable
\[F'(x)\]
@@ -3294,8 +3228,9 @@ Base classes
-
proximal
(x, tau, out=None)[source]#
-Returns the proximal operator of function \(\tau F\) at x
-.. math:: mathrm{prox}_{tau F}(x) = underset{z}{mathrm{argmin}} frac{1}{2}|z - x|^{2} + tau F(z)
+Returns the proximal operator of function \(\tau F\) at x
+
+\[\text{prox}_{\tau F}(x) = \underset{z}{\text{argmin}} \frac{1}{2}\|z - x\|^{2} + \tau F(z)\]
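For instance, for \(F(x)=\|x\|_{2}^{2}\) this minimisation has the closed form \(\text{prox}_{\tau F}(x) = x/(1+2\tau)\); a quick numeric sketch (assuming a CIL install):
>>> from cil.framework import ImageGeometry
>>> from cil.optimisation.functions import L2NormSquared
>>> ig = ImageGeometry(voxel_num_x=4, voxel_num_y=4)
>>> x, tau = ig.allocate('random'), 0.5
>>> p = L2NormSquared().proximal(x, tau)
>>> # compare with the analytic solution x / (1 + 2*tau)
>>> float((p - x / (1 + 2 * tau)).norm()) < 1e-6
True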
@@ -3303,7 +3238,7 @@ Base classes
convex_conjugate
(x)[source]#
Returns the convex conjugate of function \(F\) at \(x^{*}\),
-\[F^{*}(x^{*}) = \underset{x^{*}}{\sup} <x^{*}, x> - F(x)\]
+\[F^{*}(x^{*}) = \underset{x}{\sup} \langle x^{*}, x \rangle - F(x)\]
Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
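The identity in question is \(\text{prox}_{\tau F^{*}}(x) = x - \tau\,\text{prox}_{\tau^{-1} F}(x/\tau)\); a numeric sketch (assuming a CIL install; L2NormSquared is an illustrative choice):
>>> from cil.framework import ImageGeometry
>>> from cil.optimisation.functions import L2NormSquared
>>> ig = ImageGeometry(voxel_num_x=4, voxel_num_y=4)
>>> x, tau, F = ig.allocate('random'), 0.5, L2NormSquared()
>>> lhs = F.proximal_conjugate(x, tau)
>>> rhs = x - tau * F.proximal(x / tau, 1 / tau)
>>> float((lhs - rhs).norm()) < 1e-6
True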
L
#
Lipschitz constant of the gradient of function f.
-L is positive real number, such that |f'(x) - f'(y)| <= L ||x-y||, assuming f: IG –> R
-__weakref__
#list of weak references to the object (if defined)
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
>>> F = SumFunction(*[L2NormSquared(b=i) for i in range(50)])
L
#__call__
(x)[source]#Returns the value of the sum of functions at \(x\).
-gradient
(x, out=None)[source]#__add__
(other)[source]#Addition for the SumFunction.
-SumFunction
+ SumFunction
is a SumFunction
.
SumFunction
+ Function
is a SumFunction
.
centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+convex_conjugate
(x)#Returns the convex conjugate of function \(F\) at \(x^{*}\),
+proximal
(x, tau, out=None)#Returns the proximal operator of function \(\tau F\) at x
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+\(G'(x) = \alpha F'(x)\) ( gradient method )
\(G^{*}(x^{*}) = \alpha F^{*}(\frac{x^{*}}{\alpha})\) ( convex_conjugate method )
-\(\mathrm{prox}_{\tau G}(x) = \mathrm{prox}_{(\tau\alpha) F}(x)\) ( proximal method )
+\(\text{prox}_{\tau G}(x) = \text{prox}_{(\tau\alpha) F}(x)\) ( proximal method )
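A sketch of how such a scaled function arises (assuming CIL's operator overloading for scalar * Function):
>>> from cil.optimisation.functions import L2NormSquared
>>> F = L2NormSquared()
>>> G = 0.5 * F              # a ScaledFunction with scalar alpha = 0.5
>>> G.L == 0.5 * F.L         # the Lipschitz constant scales accordingly
True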
__init__
(function, scalar)[source]#Initialize self. See help(type(self)) for accurate signature.
-L
#Lipschitz constant of the gradient of function f.
-L is positive real number, such that |f'(x) - f'(y)| <= L ||x-y||, assuming f: IG –> R
-__call__
(x, out=None)[source]#Returns the value of the scaled function.
-L is positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
Returns the proximal operator of the scaled function.
centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+__init__
(function, constant)[source]#Initialize self. See help(type(self)) for accurate signature.
-convex_conjugate
(x)[source]#Returns the proximal operator of \(F+scalar\)
Lmax
#Returns the maximum Lipschitz constant for the SumFunction
+where \(L_{i}\) is the Lipschitz constant of the smooth function \(F_{i}\).
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+gradient
(x, out=None)#Returns the value of the sum of the gradients of the functions at \(x\), if all of them are differentiable.
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+\(G'(x) = F'(x - b)\) ( gradient method )
\(G^{*}(x^{*}) = F^{*}(x^{*}) + \langle x^{*}, b \rangle\) ( convex_conjugate method )
-\(\mathrm{prox}_{\tau G}(x) = \mathrm{prox}_{\tau F}(x - b) + b\) ( proximal method )
+\(\text{prox}_{\tau G}(x) = \text{prox}_{\tau F}(x - b) + b\) ( proximal method )
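A sketch of constructing the translated function via centered_at (assuming a CIL install; L2NormSquared is an illustrative choice):
>>> from cil.framework import ImageGeometry
>>> from cil.optimisation.functions import L2NormSquared
>>> ig = ImageGeometry(voxel_num_x=4, voxel_num_y=4)
>>> b = ig.allocate('random')
>>> G = L2NormSquared().centered_at(b)   # TranslateFunction: G(x) = ||x - b||_2^2
>>> abs(float(G(b))) < 1e-10             # the minimum moves to b
True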
__init__
(function, center)[source]#Initialize self. See help(type(self)) for accurate signature.
-gradient
(x, out=None)[source]#Returns the proximal operator of the translated function.
L
#Lipschitz constant of the gradient of function f.
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+cil.optimisation.functions.
ConstantFunction
(constant=0)[source]#
ConstantFunction: \(F(x) = constant, constant\in\mathbb{R}\)
gradient
(x, out=None)[source]#Returns the proximal operator of the constant function, which is the same element, i.e.,
L
#Lipschitz constant of the gradient of function f.
-L is positive real number, such that |f'(x) - f'(y)| <= L ||x-y||, assuming f: IG –> R
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
__rmul__
(other)[source]#defines the right multiplication with a number
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+cil.optimisation.functions.
ZeroFunction
[source]#
ZeroFunction represents the zero function, \(F(x) = 0\)
__init__
()[source]#Initialize self. See help(type(self)) for accurate signature.
+L
#Lipschitz constant of the gradient of function f.
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+convex_conjugate
(x)#The convex conjugate of constant function \(F(x) = c\in\mathbb{R}\) is
+However, \(x^{*} = 0\) only in the limit of iterations, so in fact this can be infinity. We do not want inf values in the convex conjugate, so we penalise this value accordingly. The following penalisation is useful in the PDHG algorithm, when we compute primal and dual objectives for convergence purposes.
+gradient
(x, out=None)#Returns the value of the gradient of the function, \(F'(x)=0\)
+proximal
(x, tau, out=None)#Returns the proximal operator of the constant function, which is the same element, i.e.,
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+__init__
(alpha, beta)[source]#Initialize self. See help(type(self)) for accurate signature.
+gradient
(x, out=None)[source]#Gradient of the Rosenbrock function
+\(\nabla f(x,y) = \left[ 2\left((x-\alpha) - 2\beta x(y-x^{2})\right);\; 2\beta (y - x^{2}) \right]\)
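A plain-Python sketch of this gradient (a hypothetical helper for illustration, not the class itself):
>>> import numpy as np
>>> def rosenbrock_grad(x, y, alpha, beta):
...     # df/dx = 2*((x - alpha) - 2*beta*x*(y - x**2)); df/dy = 2*beta*(y - x**2)
...     return np.array([2 * ((x - alpha) - 2 * beta * x * (y - x**2)),
...                      2 * beta * (y - x**2)])
>>> rosenbrock_grad(1.0, 1.0, 1.0, 100.0)   # zero at the minimiser (alpha, alpha**2)
array([0., 0.])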
__call__
(x)[source]#Returns the value of the function F at x: \(F(x)\)
+L
#Lipschitz constant of the gradient of function f.
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
gradient
(x, out=None)[source]#Gradient of the Rosenbrock function
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+convex_conjugate
(x)#Returns the convex conjugate of function \(F\) at \(x^{*}\),
\(\nabla f(x,y) = \left[ 2\left((x-\alpha) - 2\beta x(y-x^{2})\right);\; 2\beta (y - x^{2}) \right]\)
+\[F^{*}(x^{*}) = \underset{x}{\sup} \langle x^{*}, x \rangle - F(x)\]
proximal
(x, tau, out=None)#Returns the proximal operator of function \(\tau F\) at x
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+where \(A\) is an operator. For instance the least squares function Norm2Sq
can
+
where \(A\) is an operator. For instance the least squares function Norm2Sq
can
be expressed as
__init__
(function, operator)[source]#creator
-A (Operator
) – operator
f (Function
) – function
L
#Lipschitz constant of the gradient of function f.
-L is positive real number, such that |f'(x) - f'(y)| <= L ||x-y||, assuming f: IG –> R
-__call__
(x)[source]#Returns \(F(Ax)\)
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+convex_conjugate
(x)#Returns the convex conjugate of function \(F\) at \(x^{*}\),
+proximal
(x, tau, out=None)#Returns the proximal operator of function \(\tau F\) at x
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+__new__
(cls, lower=None, upper=None, accelerated=True)[source]#Create and return a new object. See help(type) for accurate signature.
-set_suppress_evaluation
(value)[source]#__call__
(x)[source]#Evaluates IndicatorBox at x
-x (DataContainer) –
-Evaluates the IndicatorBox at x. If suppress_evaluation
is True
, returns 0.
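A sketch (assuming a CIL install): the proximal map of an indicator function is the projection onto the box, independent of tau:
>>> from cil.framework import ImageGeometry
>>> from cil.optimisation.functions import IndicatorBox
>>> ig = ImageGeometry(voxel_num_x=4, voxel_num_y=4)
>>> x = ig.allocate('random')
>>> F = IndicatorBox(lower=0.25, upper=0.75)
>>> y = F.proximal(x, tau=1.0)   # clips x into [0.25, 0.75]
>>> float(y.max()) <= 0.75
True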
proximal
(x, tau, out=None)[source]#L
#Lipschitz constant of the gradient of function f.
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+convex_conjugate
(x)#Returns the convex conjugate of function \(F\) at \(x^{*}\),
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+__new__
(cls, b, eta=None, mask=None, backend='numba')[source]#Create and return a new object. See help(type) for accurate signature.
+L
#Lipschitz constant of the gradient of function f.
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
__init__
(b, eta=None, mask=None, backend='numba')[source]#Initialize self. See help(type(self)) for accurate signature.
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+convex_conjugate
(x)#Returns the convex conjugate of function \(F\) at \(x^{*}\),
+gradient
(x, out=None)#Returns the value of the gradient of function \(F\) at \(x\), if it is differentiable
+proximal
(x, tau, out=None)#Returns the proximal operator of function \(\tau F\) at x
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+__init__
(**kwargs)[source]#creator
-Cases considered (with/without data):
-a) \(f(x) = ||x||_{1}\)
-b) \(f(x) = ||x - b||_{1}\)
-b (DataContainer
, optional) – translation of the function
__call__
(x)[source]#Returns the value of the L1Norm function at x.
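A sketch of the two cases (assuming a CIL install); the proximal map of the L1 norm is elementwise soft-thresholding:
>>> from cil.framework import ImageGeometry
>>> from cil.optimisation.functions import L1Norm
>>> ig = ImageGeometry(voxel_num_x=4, voxel_num_y=4)
>>> x, b = ig.allocate('random'), ig.allocate('random')
>>> F = L1Norm()                 # case a): f(x) = ||x||_1
>>> G = L1Norm(b=b)              # case b): f(x) = ||x - b||_1
>>> y = F.proximal(x, tau=0.1)   # soft-thresholding of x by 0.1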
-convex_conjugate
(x)[source]#L
#Lipschitz constant of the gradient of function f.
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+gradient
(x, out=None)#Returns the value of the gradient of function \(F\) at \(x\), if it is differentiable
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+\(F(x) = \|x - b\|^{2}_{2}\)
b (DataContainer, optional) – Translation of the function
+Note
-For case b) case we can use F = L2NormSquared().centered_at(b)
,
-see TranslateFunction.
For case b) we can use F = L2NormSquared().centered_at(b), see TranslateFunction.
>>> F = L2NormSquared()
+Example
+>>> F = L2NormSquared()
>>> F = L2NormSquared(b=b)
>>> F = L2NormSquared().centered_at(b)
-
-
-
__init__
(**kwargs)[source]#creator
-b (DataContainer
, optional) – translation of the function
__call__
(x)[source]#Returns the value of the L2NormSquared function at x.
-Following cases are considered:
-a) \(F(x) = \|x\|^{2}_{2}\)
-b) \(F(x) = \|x - b\|^{2}_{2}\)
\(x\)
-\(\underset{i}{\sum}x_{i}^{2}\)
-gradient
(x, out=None)[source]#L
#Lipschitz constant of the gradient of function f.
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
cil.optimisation.functions.
WeightedL2NormSquared
(**kwargs)[source]#WeightedL2NormSquared function: \(F(x) = \|x\|_{w,2}^{2} = \underset{i}{\sum}w_{i}x_{i}^{2} = \langle x, wx \rangle = x^{T}wx\)
__init__
(**kwargs)[source]#Initialize self. See help(type(self)) for accurate signature.
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
__call__
(x)[source]#Returns the value of the function F at x: \(F(x)\)
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+cil.optimisation.functions.
WeightedL2NormSquared
(**kwargs)[source]#WeightedL2NormSquared function: \(F(x) = \|x\|_{W,2}^2 = \Sigma_iw_ix_i^2 = \langle x, Wx\rangle = x^TWx\), where \(W=\text{diag}(weight)\) if weight is a DataContainer or \(W=\text{weight} I\) if weight is a scalar.
+**kwargs –
weight (a scalar or a DataContainer with the same shape as the intended domain of this WeightedL2NormSquared function) –
b (a DataContainer with the same shape as the intended domain of this WeightedL2NormSquared function) – A shift so that the function becomes \(F(x) = \| x-b\|_{W,2}^2 = \Sigma_iw_i(x_i-b_i)^2 = \langle x-b, W(x-b) \rangle = (x-b)^TW(x-b)\)
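A sketch using the weight keyword (assuming a CIL install; the geometry and values are illustrative):
>>> from cil.framework import ImageGeometry
>>> from cil.optimisation.functions import WeightedL2NormSquared
>>> ig = ImageGeometry(voxel_num_x=4, voxel_num_y=4)
>>> w = ig.allocate(2.0)                 # W = diag(w)
>>> F = WeightedL2NormSquared(weight=w)  # F(x) = <x, Wx>
>>> x = ig.allocate(1.0)
>>> float(F(x))                          # sum_i w_i x_i^2 = 2 * 16 pixels
32.0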
gradient
(x, out=None)[source]#Returns the value of the gradient of function F at x, if it is differentiable
-Returns the value of \(F'(x) = 2Wx\) or, if b is defined, \(F'(x) = 2W(x-b)\)
+where \(W=\text{diag}(weight)\) if weight is a DataContainer or \(\text{weight}I\) if weight is a scalar.
convex_conjugate
(x)[source]#Returns the convex conjugate of function \(F\) at \(x^{*}\),
-Returns the value of the convex conjugate of the WeightedL2NormSquared function at x.
+L
#Lipschitz constant of the gradient of function f.
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
proximal
(x, tau, out=None)[source]#Returns the proximal operator of function \(\tau F\) at x
-.. math:: mathrm{prox}_{tau F}(x) = underset{z}{mathrm{argmin}} frac{1}{2}|z - x|^{2} + tau F(z)
+Returns the value of the proximal operator of the WeightedL2NormSquared function at x.
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
-A : LinearOperator
-b : Data, DataContainer
-c : Scaling Constant, float, default 1.0
-weight: DataContainer with all positive elements of size of the range of operator A, default None
-
--L : Lipshitz Constant of the gradient of \(F\) which is \(2 c ||A||_2^2 = 2 c s1(A)^2\), or
-L : Lipshitz Constant of the gradient of \(F\) which is \(2 c ||weight|| ||A||_2^2 = 2s1(A)^2\),
-
where s1(A) is the largest singular value of A.
-__init__
(A, b, c=1.0, weight=None)[source]#Initialize self. See help(type(self)) for accurate signature.
-where \(W=\text{diag}(weight)\).
+A (LinearOperator) –
b (Data, DataContainer) –
c (Scaling Constant, float, default 1.0) –
weight (DataContainer with all positive elements of size of the range of operator A, default None) –
Note
+L is the Lipschitz constant of the gradient of \(F\), which is \(2 c ||A||_2^2 = 2 c \sigma_1(A)^2\), or \(2 c ||W|| ||A||_2^2 = 2c||W|| \sigma_1(A)^2\), where \(\sigma_1(A)\) is the largest singular value of \(A\) and \(W=\text{diag}(weight)\).
+where \(W=\text{diag}(weight)\).
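A sketch of this constant (assuming CIL's LeastSquares, the current name for Norm2Sq; the identity operator makes \(\sigma_1(A) = 1\)):
>>> from cil.framework import ImageGeometry
>>> from cil.optimisation.operators import IdentityOperator
>>> from cil.optimisation.functions import LeastSquares
>>> ig = ImageGeometry(voxel_num_x=4, voxel_num_y=4)
>>> A, b = IdentityOperator(ig), ig.allocate('random')
>>> F = LeastSquares(A, b, c=1.0)
>>> float(F.L)   # 2 * c * sigma_1(A)^2 = 2
2.0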
L
#Lipschitz constant of the gradient of function f.
-L is positive real number, such that |f'(x) - f'(y)| <= L ||x-y||, assuming f: IG –> R
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+convex_conjugate
(x)#Returns the convex conjugate of function \(F\) at \(x^{*}\),
+proximal
(x, tau, out=None)#Returns the proximal operator of function \(\tau F\) at x
+__rmul__
(other)[source]#defines the right multiplication with a number
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+cil.optimisation.functions.
MixedL21Norm
(**kwargs)[source]#
MixedL21Norm function: \(F(x) = ||x||_{2,1} = \sum |x|_{2} = \sum \sqrt{ (x^{1})^{2} + (x^{2})^{2} + \dots}\)
where x is a BlockDataContainer, i.e., \(x=(x^{1}, x^{2}, \dots)\)
__call__
(x)[source]#Returns the value of the MixedL21Norm function at x.
-x – BlockDataContainer
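A sketch on a two-component BlockDataContainer (assuming a CIL install; the constant values make the sum easy to check by hand):
>>> from cil.framework import ImageGeometry, BlockDataContainer
>>> from cil.optimisation.functions import MixedL21Norm
>>> ig = ImageGeometry(voxel_num_x=4, voxel_num_y=4)
>>> x = BlockDataContainer(ig.allocate(3.0), ig.allocate(4.0))
>>> F = MixedL21Norm()
>>> float(F(x))   # each pixel contributes sqrt(3**2 + 4**2) = 5; 16 pixels
80.0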
convex_conjugate
(x)[source]#L
#Lipschitz constant of the gradient of function f.
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+gradient
(x, out=None)#Returns the value of the gradient of function \(F\) at \(x\), if it is differentiable
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+Conjugate, proximal and proximal conjugate methods have no closed-form solution
__init__
(epsilon)[source]#epsilon – smoothing parameter making MixedL21Norm differentiable
-gradient
(x, out=None)[source]#Returns the value of the gradient of the SmoothMixedL21Norm function at x.
+\(\frac{x}{|x|}\)
__call__
(x)[source]#Returns the value of the SmoothMixedL21Norm function at x.
+L
#Lipschitz constant of the gradient of function f.
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
gradient
(x, out=None)[source]#Returns the value of the gradient of the SmoothMixedL21Norm function at x.
-frac{x}{|x|}
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+convex_conjugate
(x)#Returns the convex conjugate of function \(F\) at \(x^{*}\),
+proximal
(x, tau, out=None)#Returns the proximal operator of function \(\tau F\) at x
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+__call__
(x)[source]#Returns the value of the MixedL11Norm function at x.
-x – BlockDataContainer
proximal
(x, tau, out=None)[source]#L
#Lipschitz constant of the gradient of function f.
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+convex_conjugate
(x)#Returns the convex conjugate of function \(F\) at \(x^{*}\),
+gradient
(x, out=None)#Returns the value of the gradient of function \(F\) at \(x\), if it is differentiable
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+The algorithm used for the proximal operator of TV is the Fast Gradient Projection algorithm (or FISTA)
-applied to the _dual problem_ of the above problem, see [1], [2], [9].
+applied to the _dual problem_ of the above problem, see [1], [2], [9]. See also “Multicontrast MRI Reconstruction with Structure-Guided Total Variation”, Ehrhardt, Betcke, 2016.
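A sketch of calling this proximal (assuming CIL's TotalVariation; the inner FGP iteration count is set at construction, and alpha * TotalVariation() gives the scaled regulariser):
>>> from cil.framework import ImageGeometry
>>> from cil.optimisation.functions import TotalVariation
>>> ig = ImageGeometry(voxel_num_x=8, voxel_num_y=8)
>>> x = ig.allocate('random')
>>> TV = 0.1 * TotalVariation(max_iteration=10)   # prox solved by 10 FGP iterations
>>> y = TV.proximal(x, tau=1.0)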
__init__
(max_iteration=10, tolerance=None, correlation='Space', backend='c', lower=None, upper=None, isotropic=True, split=False, info=False, strong_convexity_constant=0, warm_start=True)[source]#Initialize self. See help(type(self)) for accurate signature.
-L
#Lipschitz constant of the gradient of function f.
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
+centered_at
(center)#Returns a translated function: for a function \(F(x)\) centered at the origin, TranslateFunction is \(F(x - b)\), centered at the point b.
+proximal_conjugate
(x, tau, out=None)#Returns the proximal operator of the convex conjugate of function \(\tau F\) at \(x^{*}\)
+Due to Moreau’s identity, we have an analytic formula to compute the proximal operator of the convex conjugate \(F^{*}\)
+gradient
#__rmul__
(scalar)[source]#Returns a function multiplied by a scalar.
-The block framework allows writing more advanced `optimisation problems`_. Consider the typical
The block framework allows writing more advanced `optimisation problems`_. Consider the typical Tikhonov regularisation:
BlockOperator
.
BlockDataContainer holds `DataContainer`_ as a column vector. It is possible to do basic algebra between BlockDataContainers and with numbers, lists or numpy arrays.
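A sketch of that algebra (assuming a CIL install):
>>> from cil.framework import ImageGeometry, BlockDataContainer
>>> ig = ImageGeometry(voxel_num_x=4, voxel_num_y=4)
>>> u = BlockDataContainer(ig.allocate(1.0), ig.allocate(2.0))
>>> v = 2 * u + 1   # numbers act blockwise on each DataContainer
>>> w = u + v       # algebra between BlockDataContainers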
L
#
Lipschitz constant of the gradient of function f.
-L is positive real number, such that |f'(x) - f'(y)| <= L ||x-y||, assuming f: IG –> R
+L is a positive real number, such that \(\|f'(x) - f'(y)\| \leq L\|x-y\|\), assuming \(f: IG \rightarrow \mathbb{R}\)
__call__
(x)[source]#Returns the value of the function F at x: \(F(x)\)
+Call self as a function.
Returns the convex conjugate of function \(F\) at \(x^{*}\),
__call__
(x)[source]#Returns the value of the function F at x: \(F(x)\)
+Call self as a function.
Returns the convex conjugate of function \(F\) at \(x^{*}\),
__call__
(x)[source]#Returns the value of the function F at x: \(F(x)\)
+Call self as a function.
Returns the convex conjugate of function \(F\) at \(x^{*}\),