JSO Compliance : LM #200
@@ -0,0 +1,57 @@
export LMModel

@doc raw"""
    LMModel(j_prod!, jt_prod!, F, σ, xk)

Given the unconstrained optimization problem:
```math
\min \tfrac{1}{2} \| F(x) \|^2,
```
this model represents the smooth LM subproblem:
```math
\min_s \ \tfrac{1}{2} \| F(x) + J(x)s \|^2 + \tfrac{1}{2} σ \|s\|^2,
```
where `J` is the Jacobian of `F` at `xk`, represented via matrix-free operations:
`j_prod!(xk, s, out)` computes `J(xk) * s` in place, and `jt_prod!(xk, r, out)` computes `J(xk)' * r` in place.

`σ > 0` is a regularization parameter. A vector `v` of the same size as `F(xk)` is
allocated by the constructor and used for intermediate computations.
"""
mutable struct LMModel{T <: Real, V <: AbstractVector{T}, J <: Function, Jt <: Function} <:
               AbstractNLPModel{T, V}
  j_prod!::J
  jt_prod!::Jt
  F::V
  v::V
  xk::V
  σ::T
  meta::NLPModelMeta{T, V}
  counters::Counters
end
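For reference, the objective and gradient that `obj` and `grad!` implement below follow from expanding the subproblem at a fixed `xk`, writing ``\varphi(s)`` for the model objective:
```math
\varphi(s) = \tfrac{1}{2} \| F + Js \|^2 + \tfrac{1}{2} σ \|s\|^2,
\qquad
\nabla \varphi(s) = J^T (F + Js) + σ s,
```
so both can be evaluated with one `j_prod!` (and, for the gradient, one `jt_prod!`) and no extra allocations beyond the buffer `v`.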
Review comment: Here too, would it be possible to reuse LLSModels?

Reply: I am not sure how, but this will be impractical in my opinion.
function LMModel(j_prod!::J, jt_prod!::Jt, F::V, σ::T, xk::V) where {T, V, J, Jt}
  meta = NLPModelMeta(
    length(xk),
    x0 = xk, # Perhaps we should add lvar and uvar as well here.
  )
  v = similar(F)
  return LMModel(j_prod!, jt_prod!, F, v, xk, σ, meta, Counters())
end
function NLPModels.obj(nlp::LMModel, x::AbstractVector{T}) where {T}
  @lencheck nlp.meta.nvar x
  increment!(nlp, :neval_obj)
  nlp.j_prod!(nlp.xk, x, nlp.v) # v = J(xk) * x
  nlp.v .+= nlp.F               # v = J(xk) * x + F(xk)
  return (dot(nlp.v, nlp.v) + nlp.σ * dot(x, x)) / 2
end
function NLPModels.grad!(nlp::LMModel, x::AbstractVector{T}, g::AbstractVector{T}) where {T}
  @lencheck nlp.meta.nvar x
  @lencheck nlp.meta.nvar g
  increment!(nlp, :neval_grad)
  nlp.j_prod!(nlp.xk, x, nlp.v)  # v = J(xk) * x
  nlp.v .+= nlp.F                # v = J(xk) * x + F(xk)
  nlp.jt_prod!(nlp.xk, nlp.v, g) # g = J(xk)' * v
  @. g += nlp.σ * x              # add the regularization term σ * x
  return g
end
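As a usage sketch (not part of this PR), the model can be exercised on a tiny problem where the Jacobian is the identity, so `j_prod!` and `jt_prod!` reduce to copies. The problem data below (`F`, `σ`, the two-variable `xk`) is illustrative and assumes the `LMModel` constructor defined above:

```julia
using LinearAlgebra, NLPModels

# Identity Jacobian: J(xk) * s = s and J(xk)' * r = r.
j_prod!(xk, s, out) = (out .= s)
jt_prod!(xk, r, out) = (out .= r)

xk = zeros(2)
F = [xk[1] - 1.0, xk[2]] # residual F(xk) = [-1, 0]
σ = 0.5
nlp = LMModel(j_prod!, jt_prod!, F, σ, xk)

s = [0.2, 0.0]
φ = obj(nlp, s)               # ½‖F + s‖² + ½σ‖s‖² = 0.32 + 0.01 = 0.33
g = grad!(nlp, s, similar(s)) # (F + s) + σs = [-0.7, 0.0]
```

With `J = I`, the gradient formula `J'(F + Js) + σs` collapses to `(F + s) + σs`, which makes the two expected values above easy to verify by hand.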