
Commit ef589d4

Author: Alexey Stukalov
Commit message: fixup SemEns: docs
Parent: 26c7c94

3 files changed: 46 additions & 34 deletions

src/implied/empty.jl

Lines changed: 2 additions & 2 deletions
```diff
@@ -13,8 +13,8 @@ Empty placeholder for models that don't need an implied part.
 - `specification`: either a `RAMMatrices` or `ParameterTable` object
 
 # Examples
-A multigroup model with ridge regularization could be specified as a `SemEnsemble` with one
-model per group and an additional model with `ImpliedEmpty` and `SemRidge` for the regularization part.
+A multigroup model with ridge regularization could be specified as a `Sem` with one
+SEM term (`SemLoss`) per group and an additional `SemRidge` regularization term.
 
 # Extended help
```

src/objective_gradient_hessian.jl

Lines changed: 22 additions & 25 deletions
```diff
@@ -24,26 +24,33 @@ is_hessian_required(::EvaluationTargets{<:Any, <:Any, H}) where {H} = H
 (targets::EvaluationTargets)(arg_tuple::Tuple) = targets(arg_tuple...)
 
 """
-    evaluate!(objective, gradient, hessian [, lossfun], model, params)
+    evaluate!(objective, gradient, hessian, loss::AbstractLoss, params)
+    evaluate!(objective, gradient, hessian, model::AbstractSem, params)
 
 Evaluates the objective, gradient, and/or Hessian at the given parameter vector.
-If a loss function is passed, only this specific loss function is evaluated, otherwise,
-the sum of all loss functions in the model is evaluated.
+
+If a single loss term (`loss`) is passed, only this specific term is evaluated;
+otherwise, if the entire SEM `model` is passed, the weighted sum of all loss terms
+in the model is evaluated.
 
 If objective, gradient or hessian are `nothing`, they are not evaluated.
 For example, since many numerical optimization algorithms don't require a Hessian,
-the computation will be turned off by setting `hessian` to `nothing`.
+its computation will be turned off by setting `hessian` to `nothing`.
+
+During the evaluation, the internal state of the loss term or of the model
+may be modified.
 
 # Arguments
 - `objective`: a Number if the objective should be evaluated, otherwise `nothing`
 - `gradient`: a pre-allocated vector the gradient should be written to, otherwise `nothing`
 - `hessian`: a pre-allocated matrix the Hessian should be written to, otherwise `nothing`
-- `lossfun::SemLossFunction`: loss function to evaluate
+- `loss::AbstractLoss`: loss term to evaluate
 - `model::AbstractSem`: model to evaluate
 - `params`: vector of parameters
 
 # Implementing a new loss function
-To implement a new loss function, a new method for `evaluate!` has to be defined.
+To implement a new loss (subtype of `SemLoss` for SEM terms, or of `AbstractLoss` for
+regularization terms), a new method for `evaluate!` has to be defined.
 This is explained in the online documentation on [Custom loss functions](@ref).
 """
 function evaluate! end
```
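The `nothing`-slot convention that the updated `evaluate!` docstring describes can be sketched with a standalone toy loss. This is mock code, not the package's implementation; `QuadraticLoss` and `my_evaluate!` are hypothetical names chosen for illustration:

```julia
# Toy illustration of the evaluate! calling convention: each of
# objective/gradient/hessian is computed only if its slot is not `nothing`.
struct QuadraticLoss end  # hypothetical loss term with f(p) = sum(p.^2)

function my_evaluate!(objective, gradient, hessian, loss::QuadraticLoss, params)
    if !isnothing(gradient)
        gradient .= 2 .* params           # ∇f(p) = 2p
    end
    if !isnothing(hessian)
        hessian .= 0
        for i in axes(hessian, 1)
            hessian[i, i] = 2.0           # ∇²f(p) = 2I
        end
    end
    return isnothing(objective) ? nothing : sum(abs2, params)
end

params = [1.0, 2.0]
grad = similar(params)
# Hessian computation is skipped by passing `nothing` for that slot:
obj = my_evaluate!(0.0, grad, nothing, QuadraticLoss(), params)
```

Optimizers that don't need second-order information simply pass `nothing` for `hessian`, which is the behaviour the docstring refers to.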
```diff
@@ -191,43 +198,33 @@ hessian!(model::AbstractSemOrLoss, par, model) =
 """
     objective!(model::AbstractSem, params)
 
-Returns the objective value at `params`.
-The model object can be modified.
+Calculates the objective value at `params`.
 
-# Implementation
-To implement a new `SemImplied` or `SemLossFunction` subtype, you need to add a method for
-    objective!(newtype::MyNewType, params, model::AbstractSemSingle)
+The model object may be modified during calculation.
 
-To implement a new `AbstractSem` subtype, you need to add a method for
-    objective!(model::MyNewType, params)
+See also [`evaluate!`](@ref).
 """
 function objective! end
 
 """
     gradient!(gradient, model::AbstractSem, params)
 
-Writes the gradient value at `params` to `gradient`.
+Calculates the model's gradient at `params` and writes it to `gradient`.
 
-# Implementation
-To implement a new `SemImplied` or `SemLossFunction` type, you can add a method for
-    gradient!(newtype::MyNewType, params, model::AbstractSemSingle)
+The model object may be modified during calculation.
 
-To implement a new `AbstractSem` subtype, you can add a method for
-    gradient!(gradient, model::MyNewType, params)
+See also [`evaluate!`](@ref).
 """
 function gradient! end
 
 """
     hessian!(hessian, model::AbstractSem, params)
 
-Writes the hessian value at `params` to `hessian`.
+Calculates the model's Hessian at `params` and writes it to `hessian`.
 
-# Implementation
-To implement a new `SemImplied` or `SemLossFunction` type, you can add a method for
-    hessian!(newtype::MyNewType, params, model::AbstractSemSingle)
+The model object may be modified during calculation.
 
-To implement a new `AbstractSem` subtype, you can add a method for
-    hessian!(hessian, model::MyNewType, params)
+See also [`evaluate!`](@ref).
 """
 function hessian! end
```

src/types.jl

Lines changed: 22 additions & 7 deletions
```diff
@@ -37,6 +37,8 @@ abstract type SemOptimizer{E} end
 abstract type SemOptimizerResult{O <: SemOptimizer} end
 
 """
+    abstract type SemObserved
+
 Supertype of all objects that can serve as the observed field of a SEM.
 Pre-processes data and computes sufficient statistics for example.
 If you have a special kind of data, e.g. ordinal data, you should implement a subtype of SemObserved.
```
```diff
@@ -55,7 +57,7 @@ abstract type SemImplied end
 abstract type SemImpliedSymbolic <: SemImplied end
 
 """
-    Sem(;observed = SemObservedData, implied = RAM, loss = SemML, kwargs...)
+    abstract type SemLoss{O <: SemObserved, I <: SemImplied} <: AbstractLoss
 
 The base type for calculating the loss of the implied SEM model when explaining the observed data.
 
```
```diff
@@ -65,7 +67,11 @@ All subtypes of `SemLoss` should have the following fields:
 """
 abstract type SemLoss{O <: SemObserved, I <: SemImplied} <: AbstractLoss end
 
-"Most abstract supertype for all SEMs"
+"""
+    abstract type AbstractSem
+
+The base type for all SEMs.
+"""
 abstract type AbstractSem end
 
 """
```
```diff
@@ -83,9 +89,12 @@ end
 """
     Sem(loss_terms...; [params], kwargs...)
 
-SEM model (including model ensembles) that combines all the data, implied SEM structure
-and regularization terms and implements the calculation of their weighted sum, as well as its
-gradient and (optionally) Hessian.
+SEM model (including multi-group SEMs) that combines all the data, implied SEM structure
+and regularization terms.
+
+All terms of the `Sem` object share the same set of parameters.
+`Sem` implements the calculation of the weighted sum of its terms (the *objective*
+function), as well as the gradient and Hessian of this sum.
 
 # Arguments
 - `loss_terms...`: [`AbstractLoss`](@ref) objects, including SEM losses ([`SemLoss`](@ref)),
```
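The weighted-sum behaviour described in the new `Sem` docstring can be illustrated with a standalone sketch. The closures below are toy stand-ins, not the package's loss types; the point is that all terms are evaluated at the same shared parameter vector:

```julia
# Toy sketch: a Sem-like objective is the weighted sum of its loss terms,
# all sharing one parameter vector. (Illustrative, not the package code.)
loss_terms = [p -> sum(abs2, p), p -> sum(abs, p)]  # two hypothetical terms
weights = [1.0, 0.5]

sem_objective(p) = sum(w * f(p) for (w, f) in zip(weights, loss_terms))

sem_objective([1.0, -2.0])  # 1.0*(1+4) + 0.5*(1+2) = 6.5
```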
```diff
@@ -107,8 +116,12 @@ end
 """
     SemFiniteDiff(model::AbstractSem)
 
-A wrapper around [`Sem`](@ref) that substitutes dedicated evaluation of gradient and hessian with
-finite difference approximation.
+A wrapper around [`AbstractSem`](@ref) that substitutes dedicated evaluation of the gradient
+and Hessian with a finite difference approximation.
+
+`SemFiniteDiff` can be used to enable gradient-based optimization of SEM models
+when a dedicated calculation of the gradient and Hessian is not available.
+For the approximation, it uses the *FiniteDiff.jl* package.
 
 # Arguments
 - `model::Sem`: the SEM model to wrap
```
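The finite-difference idea behind `SemFiniteDiff` can be sketched without the package: a hand-rolled central-difference gradient over an arbitrary objective. The package itself delegates this to FiniteDiff.jl; `fd_gradient` below is a hypothetical simplified stand-in:

```julia
# Central-difference gradient approximation, the technique SemFiniteDiff
# substitutes for a dedicated analytic gradient. (Illustrative only.)
function fd_gradient(f, p::AbstractVector; h = 1e-6)
    g = similar(p)
    for i in eachindex(p)
        pp = copy(p); pm = copy(p)
        pp[i] += h
        pm[i] -= h
        g[i] = (f(pp) - f(pm)) / (2h)   # central difference in coordinate i
    end
    return g
end

f(p) = sum(abs2, p)                 # toy objective with exact gradient 2p
g = fd_gradient(f, [1.0, 2.0])      # ≈ [2.0, 4.0]
```

This is why the wrapper makes gradient-based optimizers usable even when only the objective is implemented.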
```diff
@@ -126,6 +139,8 @@ struct SemLossFiniteDiff{O, I, L <: SemLoss{O, I}} <: SemLoss{O, I}
 end
 
 """
+    abstract type SemSpecification
+
 Base type for all SEM specifications.
 """
 abstract type SemSpecification end
```
