Commit 97cb8d6

Maximilian-Stefan-Ernst authored and alyst committed
start fixing docs
1 parent 2e6ba32 commit 97cb8d6

15 files changed

Lines changed: 61 additions & 50 deletions

File tree

docs/src/developer/loss.md

Lines changed: 5 additions & 4 deletions

````diff
@@ -60,11 +60,12 @@ graph = @StenoGraph begin
 end
 
 partable = ParameterTable(
+    graph,
     latent_vars = latent_vars,
-    observed_vars = observed_vars,
-    graph = graph)
+    observed_vars = observed_vars
+)
 
-parameter_indices = get_identifier_indices([:a, :b, :c], partable)
+parameter_indices = param_indices([:a, :b, :c], partable)
 myridge = Ridge(0.01, parameter_indices)
 
 model = SemFiniteDiff(
@@ -269,4 +270,4 @@ model_ml = SemFiniteDiff(
 model_fit = sem_fit(model_ml)
 ```
 
-If you want to differentiate your own loss functions via automatic differentiation, check out the [AutoDiffSEM](https://github.com/StructuralEquationModels/AutoDiffSEM) package (spoiler allert: it's really easy).
+If you want to differentiate your own loss functions via automatic differentiation, check out the [AutoDiffSEM](https://github.com/StructuralEquationModels/AutoDiffSEM) package.
````
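For context, the renamed lookup and the `Ridge` loss from this file fit together as follows. This is a sketch assuming the `param_indices`/`Ridge` API introduced by this commit and the `partable` built in the preceding diff, not a verbatim excerpt of the docs page:

```julia
using StructuralEquationModels

# look up the positions of the labeled parameters :a, :b and :c
# in the parameter table (renamed from get_identifier_indices)
parameter_indices = param_indices([:a, :b, :c], partable)

# penalize exactly those parameters with an L2 (ridge) penalty of strength 0.01
myridge = Ridge(0.01, parameter_indices)
```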

docs/src/developer/sem.md

Lines changed: 2 additions & 1 deletion

````diff
@@ -11,7 +11,8 @@ struct SemFiniteDiff{
     observed::O
     imply::I
     loss::L
-    optimizer::Dend
+    optimizer::D
+end
 ```
 
 Additionally, we need to define a method to compute at least the objective value, and if you want to use gradient based optimizers (which you most probably will), we also need to define a method to compute the gradient. For example, the respective fallback methods for all `AbstractSemSingle` models are defined as
````
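The context line above says a model type needs at least an objective method, plus a gradient method for gradient-based optimizers. A hypothetical sketch of that shape — the type and the names `objective!`/`gradient!` and their signatures are assumptions for illustration, not the package's confirmed API:

```julia
# hypothetical model type, for illustration only
struct MySem end

# assumed method name/signature: compute and return the objective value
function objective!(model::MySem, parameters)
    return sum(abs2, parameters)  # placeholder objective
end

# assumed method name/signature: fill `grad` in place with
# the derivative of the objective w.r.t. `parameters`
function gradient!(grad, model::MySem, parameters)
    grad .= 2 .* parameters
end
```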

docs/src/performance/simulation.md

Lines changed: 3 additions & 2 deletions

````diff
@@ -43,9 +43,10 @@ graph = @StenoGraph begin
 end
 
 partable = ParameterTable(
+    graph,
     latent_vars = latent_vars,
-    observed_vars = observed_vars,
-    graph = graph)
+    observed_vars = observed_vars
+)
 ```
 
 ```@example swap_observed
````

docs/src/performance/sorting.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -13,7 +13,7 @@ To automatically reorder your variables in a way that makes this optimization po
 We use it as
 
 ```julia
-sort!(parameter_table)
+sort_vars!(parameter_table)
 
 model = Sem(
     specification = parameter_table,
````

docs/src/tutorials/collection/multigroup.md

Lines changed: 3 additions & 3 deletions

````diff
@@ -61,8 +61,8 @@ You can then use the resulting graph to specify an `EnsembleParameterTable`
 ```@example mg; ansicolor = true
 groups = [:Pasteur, :Grant_White]
 
-partable = EnsembleParameterTable(;
-    graph = graph,
+partable = EnsembleParameterTable(
+    graph,
     observed_vars = observed_vars,
     latent_vars = latent_vars,
     groups = groups)
@@ -71,7 +71,7 @@ partable = EnsembleParameterTable(;
 The parameter table can be used to create a `Dict` of RAMMatrices with keys equal to the group names and parameter tables as values:
 
 ```@example mg; ansicolor = true
-specification = RAMMatrices(partable)
+specification = convert(Dict{Symbol, RAMMatrices}, partable)
 ```
 
 That is, you can access the group-specific `RAMMatrices` as `specification[:group_name]`.
````
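The new conversion returns group-keyed matrices; a short sketch of the access pattern described above, assuming the `convert` method from this diff and the `partable` and group names (`:Pasteur`, `:Grant_White`) defined there:

```julia
# convert the ensemble table into a Dict of group-specific RAMMatrices
specification = convert(Dict{Symbol, RAMMatrices}, partable)

# access one group's matrices by its group name
ram_pasteur = specification[:Pasteur]
```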

docs/src/tutorials/constraints/constraints.md

Lines changed: 21 additions & 14 deletions

````diff
@@ -16,13 +16,13 @@ graph = @StenoGraph begin
 
     # loadings
     ind60 → fixed(1)*x1 + x2 + x3
-    dem60 → fixed(1)*y1 + y2 + y3 + y4
+    dem60 → fixed(1)*y1 + label(:λ₂)*y2 + label(:λ₃)*y3 + y4
     dem65 → fixed(1)*y5 + y6 + y7 + y8
 
     # latent regressions
     ind60 → dem60
     dem60 → dem65
-    ind60 → dem65
+    ind60 → label(:λₗ)*dem65
 
     # variances
     _(observed_vars) ↔ _(observed_vars)
@@ -31,15 +31,15 @@ graph = @StenoGraph begin
     # covariances
     y1 ↔ y5
     y2 ↔ y4 + y6
-    y3 ↔ y7
-    y8 ↔ y4 + y6
+    y3 ↔ label(:y3y7)*y7
+    y8 ↔ label(:y8y4)*y4 + y6
 
 end
 
 partable = ParameterTable(
+    graph,
     latent_vars = latent_vars,
-    observed_vars = observed_vars,
-    graph = graph)
+    observed_vars = observed_vars)
 
 data = example_data("political_democracy")
@@ -64,17 +64,19 @@ Let's introduce some constraints:
 
 (Of course those constraints only serve an illustrative purpose.)
 
-We first need to get the indices of the respective parameters that are invoved in the constraints. We can look up their labels in the output above, and retrieve their indices as
+We first need to get the indices of the respective parameters that are involved in the constraints.
+We can look up their labels in the output above, and retrieve their indices as
 
 ```@example constraints
-parameter_indices = get_identifier_indices([:θ_29, :θ_30, :θ_3, :θ_4, :θ_11], model)
+parind = param_indices(model)
+parind[:y3y7] # 29
 ```
 
-The bound constraint is easy to specify: Just give a vector of upper or lower bounds that contains the bound for each parameter. In our example, only parameter number 11 has an upper bound, and the number of total parameters is `n_par(model) = 31`, so we define
+The bound constraint is easy to specify: Just give a vector of upper or lower bounds that contains the bound for each parameter. In our example, only the parameter labeled `:λₗ` has an upper bound, and the number of total parameters is `n_par(model) = 31`, so we define
 
 ```@example constraints
 upper_bounds = fill(Inf, 31)
-upper_bounds[11] = 0.5
+upper_bounds[parind[:λₗ]] = 0.5
 ```
 
 The equality and inequality constraints have to be reformulated to be of the form `x = 0` or `x ≤ 0`:
@@ -84,6 +86,8 @@ The equality and inequality constraints have to be reformulated to be of the for
 Now they can be defined as functions of the parameter vector:
 
 ```@example constraints
+parind[:y3y7] # 29
+parind[:y8y4] # 30
 # θ[29] + θ[30] - 1 = 0.0
 function eq_constraint(θ, gradient)
     if length(gradient) > 0
@@ -94,6 +98,8 @@ function eq_constraint(θ, gradient)
     return θ[29] + θ[30] - 1
 end
 
+parind[:λ₂] # 3
+parind[:λ₃] # 4
 # θ[3] - θ[4] - 0.1 ≤ 0
 function ineq_constraint(θ, gradient)
     if length(gradient) > 0
@@ -109,7 +115,7 @@ If the algorithm needs gradients at an iteration, it will pass the vector `gradi
 With `if length(gradient) > 0` we check if the algorithm needs gradients, and if it does, we fill the `gradient` vector with the gradients
 of the constraint w.r.t. the parameters.
 
-In NLopt, vector-valued constraints are also possible, but we refer to the documentation fot that.
+In NLopt, vector-valued constraints are also possible, but we refer to the documentation for that.
 
 ### Fit the model
 
@@ -153,10 +159,11 @@ As you can see, the optimizer converged (`:XTOL_REACHED`) and investigating the
 
 ```@example constraints
 update_partable!(
-    partable,
-    model_fit_constrained,
+    partable,
+    :estimate_constr,
+    params(model_fit_constrained),
     solution(model_fit_constrained),
-    :estimate_constr)
+)
 
 sem_summary(partable)
 ```
````
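The two NLopt-style constraint functions are truncated by the hunks above; a completed sketch consistent with the comments in the diff (`θ[29] + θ[30] - 1 = 0` and `θ[3] - θ[4] - 0.1 ≤ 0`). The gradient-filling lines are reconstructed from the surrounding description, so treat them as illustrative:

```julia
# equality constraint: θ[29] + θ[30] - 1 = 0
function eq_constraint(θ, gradient)
    if length(gradient) > 0
        gradient .= 0.0
        gradient[29] = 1.0   # ∂/∂θ[29]
        gradient[30] = 1.0   # ∂/∂θ[30]
    end
    return θ[29] + θ[30] - 1
end

# inequality constraint: θ[3] - θ[4] - 0.1 ≤ 0
function ineq_constraint(θ, gradient)
    if length(gradient) > 0
        gradient .= 0.0
        gradient[3] = 1.0    # ∂/∂θ[3]
        gradient[4] = -1.0   # ∂/∂θ[4]
    end
    return θ[3] - θ[4] - 0.1
end
```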

docs/src/tutorials/construction/build_by_parts.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -39,9 +39,9 @@ graph = @StenoGraph begin
 end
 
 partable = ParameterTable(
+    graph,
     latent_vars = latent_vars,
-    observed_vars = observed_vars,
-    graph = graph)
+    observed_vars = observed_vars)
 ```
 
 Now, we construct the different parts:
````

docs/src/tutorials/construction/outer_constructor.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -74,7 +74,7 @@ model = Sem(
     specification = partable,
     data = data,
     imply = RAMSymbolic,
-    loss = SemWLS
+    loss = SemWLS,
     wls_weight_matrix = W
 )
 
````

docs/src/tutorials/first_model.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -83,9 +83,9 @@ We then use this graph to define a `ParameterTable` object
 
 ```@example high_level; ansicolor = true
 partable = ParameterTable(
+    graph,
     latent_vars = latent_vars,
-    observed_vars = observed_vars,
-    graph = graph)
+    observed_vars = observed_vars)
 ```
 
 load the example data
````

docs/src/tutorials/inspection/inspection.md

Lines changed: 5 additions & 5 deletions

````diff
@@ -31,9 +31,9 @@ graph = @StenoGraph begin
 end
 
 partable = ParameterTable(
+    graph,
     latent_vars = latent_vars,
-    observed_vars = observed_vars,
-    graph = graph)
+    observed_vars = observed_vars)
 
 data = example_data("political_democracy")
@@ -87,8 +87,8 @@ We can also update the `ParameterTable` object with other information via [`upda
 se_bs = se_bootstrap(model_fit; n_boot = 20)
 se_he = se_hessian(model_fit)
 
-update_partable!(partable, model_fit, se_he, :se_hessian)
-update_partable!(partable, model_fit, se_bs, :se_bootstrap)
+update_partable!(partable, :se_hessian, params(model_fit), se_he)
+update_partable!(partable, :se_bootstrap, params(model_fit), se_bs)
 
 sem_summary(partable)
 ```
@@ -130,7 +130,7 @@ df
 minus2ll
 n_man
 n_obs
-n_par
+nparams
 p_value
 RMSEA
 ```
````
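The reordered `update_partable!` call shown in this file takes the column name second, followed by explicit parameter ids and values. A sketch of storing a custom column under that new signature — the column name `:my_se` is made up for illustration, the function names come from this diff:

```julia
# compute standard errors for the fitted model
se_he = se_hessian(model_fit)

# new argument order: table, column name, parameter ids, values
update_partable!(partable, :my_se, params(model_fit), se_he)

# the custom column then shows up in the summary
sem_summary(partable)
```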
