Commit 89e588f

Add Documenter docs and docstrings (#300)
1 parent 4e59ab2 commit 89e588f

37 files changed

Lines changed: 1270 additions & 328 deletions

Project.toml

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 name = "ITensorNetworks"
 uuid = "2919e153-833c-4bdc-8836-1ea460a35fc7"
-version = "0.15.23"
+version = "0.15.24"
 authors = ["Matthew Fishman <mfishman@flatironinstitute.org>, Joseph Tindall <jtindall@flatironinstitute.org> and contributors"]

 [workspace]

docs/Project.toml

Lines changed: 9 additions & 0 deletions
@@ -1,8 +1,13 @@
 [deps]
 Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
+Graphs = "86223c79-3864-5bf0-83f7-82e725a168b6"
 ITensorFormatter = "b6bf39f1-c9d3-4bad-aad8-593d802f65fd"
 ITensorNetworks = "2919e153-833c-4bdc-8836-1ea460a35fc7"
+ITensors = "9136182c-28ba-11e9-034c-db9fb085ebd5"
 Literate = "98b081ad-f1c9-55d3-8b20-4c87d4299306"
+NamedGraphs = "678767b0-92e7-4007-89e4-4527a8725b19"
+OMEinsumContractionOrders = "6f22d1fd-8eed-4bb7-9776-e7d684900715"
+TensorOperations = "6aa20fa7-93e2-5fca-9bc0-fbd0db3c71a2"

 [sources.ITensorNetworks]
 path = ".."
@@ -11,4 +16,8 @@ path = ".."
 Documenter = "1.10"
 ITensorFormatter = "0.2.27"
 ITensorNetworks = "0.15"
+ITensors = "0.9"
 Literate = "2.20.1"
+NamedGraphs = "0.8.2"
+OMEinsumContractionOrders = "1.2.2"
+TensorOperations = "5.5"

docs/make.jl

Lines changed: 28 additions & 3 deletions
@@ -1,9 +1,26 @@
 using Documenter: Documenter, DocMeta, deploydocs, makedocs
+using Graphs: Graphs
 using ITensorFormatter: ITensorFormatter
 using ITensorNetworks: ITensorNetworks
+using ITensors: ITensors
+using LinearAlgebra: LinearAlgebra
+using OMEinsumContractionOrders
+using TensorOperations

 DocMeta.setdocmeta!(
-    ITensorNetworks, :DocTestSetup, :(using ITensorNetworks); recursive = true
+    ITensorNetworks,
+    :DocTestSetup,
+    quote
+        using Graphs: dst, edges, src, vertices
+        using ITensorNetworks
+        using ITensorNetworks: TreeTensorNetwork, expect, loginner, mps, orthogonalize,
+            random_mps, random_ttn, siteinds, truncate, ttn
+        using ITensors: inner
+        using LinearAlgebra: norm, normalize
+        using OMEinsumContractionOrders
+        using TensorOperations
+    end;
+    recursive = true
 )

 ITensorFormatter.make_index!(pkgdir(ITensorNetworks))
@@ -17,8 +34,16 @@ makedocs(;
         edit_link = "main",
         assets = ["assets/favicon.ico", "assets/extras.css"]
     ),
-    pages = ["Home" => "index.md", "Reference" => "reference.md"],
-    warnonly = true
+    pages = [
+        "Home" => "index.md",
+        "Manual" => [
+            "ITensor Networks" => "itensor_networks.md",
+            "Tree Tensor Networks" => "tree_tensor_networks.md",
+            "Computing Properties" => "computing_properties.md",
+            "Solvers" => "solvers.md",
+        ],
+        "API Reference" => "reference.md",
+    ]
 )

 deploydocs(;

docs/src/computing_properties.md

Lines changed: 108 additions & 0 deletions
@@ -0,0 +1,108 @@
# Computing Properties

```@setup main
using Graphs: vertices
using ITensorNetworks: expect, inner, loginner, normalize, random_ttn, siteinds
using LinearAlgebra: norm
using NamedGraphs.NamedGraphGenerators: named_grid

g = named_grid((4,))
s = siteinds("S=1/2", g)
phi = normalize(random_ttn(s; link_space = 2))
psi = normalize(random_ttn(s; link_space = 2))
x = normalize(random_ttn(s; link_space = 2))
y = normalize(random_ttn(s; link_space = 2))
v = first(vertices(psi))
```

## Inner Products and Norms

For general `ITensorNetwork` states, inner products are computed by constructing and
contracting the combined bra–ket network. The default algorithm is **belief propagation**
(`alg="bp"`), which is efficient for large and loopy networks. Use `alg="exact"` for
exact contraction (only practical for small networks or trees).

```@example main
z = inner(phi, psi) # ⟨ϕ|ψ⟩
n = norm(psi)       # √⟨ψ|ψ⟩
```

For large tensor networks whose inner product would overflow or underflow, use the
logarithmic variant:

```@example main
logz = loginner(phi, psi) # log(⟨ϕ|ψ⟩) (numerically stable)
```

For `TreeTensorNetwork`, specialized exact methods exploit the tree structure directly,
without belief propagation:

```@example main
z = inner(x, y) # ⟨x|y⟩ via DFS contraction
n = norm(psi)   # uses the ortho_region, if available, for efficiency
```

```@docs; canonical=false
ITensors.inner(::ITensorNetworks.AbstractITensorNetwork, ::ITensorNetworks.AbstractITensorNetwork)
ITensors.inner(::ITensorNetworks.AbstractITensorNetwork, ::ITensorNetworks.AbstractITensorNetwork, ::ITensorNetworks.AbstractITensorNetwork)
ITensorNetworks.loginner
ITensors.inner(::ITensorNetworks.AbstractTreeTensorNetwork, ::ITensorNetworks.AbstractTreeTensorNetwork)
ITensors.inner(::ITensorNetworks.AbstractTreeTensorNetwork, ::ITensorNetworks.AbstractTreeTensorNetwork, ::ITensorNetworks.AbstractTreeTensorNetwork)
```

## Normalization

`normalize` rescales all tensors in the network by the same factor so that `norm(ψ) ≈ 1`.
For `TreeTensorNetwork`, the normalization is applied directly at the orthogonality center.

```@example main
psi = normalize(psi)                # exact (default)
psi_bp = normalize(psi; alg = "bp") # belief propagation (for large loopy networks)
```

```@docs; canonical=false
LinearAlgebra.normalize(::ITensorNetworks.AbstractITensorNetwork)
```

## Expectation Values

### General `ITensorNetwork`

For arbitrary (possibly loopy) tensor networks, expectation values are computed via
**belief propagation** by default. This is approximate for loopy networks but can be made
exact with `alg="exact"` (at exponential cost).

```@example main
# Expectation of "Sz" at every vertex
sz = expect(psi, "Sz")

# Selected vertices only
sz = expect(psi, "Sz", [(1,), (3,)])

# Exact contraction
sz = expect(psi, "Sz"; alg = "exact")
```

```@docs; canonical=false
ITensorNetworks.expect(::ITensorNetworks.AbstractITensorNetwork, ::String)
ITensorNetworks.expect(::ITensorNetworks.AbstractITensorNetwork, ::String, ::Any)
ITensorNetworks.expect(::ITensorNetworks.AbstractITensorNetwork, ::ITensors.Ops.Op)
```

### `TreeTensorNetwork`

For TTN/MPS states, a specialized exact method that exploits successive orthogonalizations
is available. The operator name is passed as the **first** argument (note the different
argument order from the general form above):

```@example main
sz = expect("Sz", psi)                          # all sites
sz = expect("Sz", psi; vertices = [(1,), (3,)]) # selected sites
```

This is more efficient than the belief-propagation approach for tree-structured networks
because it reuses the orthogonal gauge.

```@docs; canonical=false
ITensorNetworks.expect(::String, ::ITensorNetworks.AbstractTreeTensorNetwork)
```

docs/src/itensor_networks.md

Lines changed: 111 additions & 0 deletions
@@ -0,0 +1,111 @@
# ITensor Networks

## The `ITensorNetwork` Type

An `ITensorNetwork` is the central data structure of this package. It represents a
collection of [`ITensor`](https://itensor.github.io/ITensors.jl/stable/)s arranged on a
graph, where each edge encodes a shared (contracted) index between the neighboring tensors.

Key facts:

- The underlying graph is a [`NamedGraph`](https://github.com/ITensor/NamedGraphs.jl), so
  vertices can be any hashable Julia value: integers, tuples, strings, etc.
- Each vertex holds exactly one `ITensor`.
- Edges and link indices are either inferred from shared `Index` objects (when constructing
  from a collection of `ITensor`s) or inserted automatically (when constructing from an
  `IndsNetwork`).

## Construction

The most common entry point is an `IndsNetwork` — a graph whose vertices and edges carry
`Index` objects. Generate site indices with the `siteinds` function, which takes a site
type string (such as `"S=1/2"` or `"Electron"`) and a `NamedGraph`. The `NamedGraph` can be
built with functions such as `named_grid`, `named_comb_tree`, etc. from the
`NamedGraphGenerators` module of NamedGraphs.jl:

```@example main
using Graphs: edges, ne, neighbors, nv, vertices
using ITensorNetworks: ITensorNetwork, add, linkinds, siteinds
using ITensors: Index, ITensor
using NamedGraphs.NamedGraphGenerators: named_grid

# 3×3 square-lattice tensor network
g = named_grid((3, 3))
s = siteinds("S=1/2", g) # one spin-½ Index per vertex

# Zero-initialized, bond dimension 2
ψ = ITensorNetwork(s; link_space = 2)

# Product state — every site in the |↑⟩ state
ψ = ITensorNetwork("Up", s)

# Staggered initialization with a vertex-dependent function
ψ = ITensorNetwork(v -> isodd(sum(v)) ? "Up" : "Dn", s)
```

When you already have `ITensor`s in hand, edges are inferred automatically from shared
indices:

```@example main
i, j, k = Index(2, "i"), Index(2, "j"), Index(2, "k")
A, B, C = ITensor(i, j), ITensor(j, k), ITensor(k)

tn = ITensorNetwork([A, B, C])                      # integer vertices 1, 2, 3
tn = ITensorNetwork(["A", "B", "C"], [A, B, C])     # named vertices
tn = ITensorNetwork(["A" => A, "B" => B, "C" => C]) # from pairs
```

```@docs; canonical=false
ITensorNetworks.ITensorNetwork
```

## Accessing Data

```@example main
v = (1, 2)
T = ψ[v]        # ITensor at vertex (1,2)
ψ[v] = T        # replace the tensor at a vertex
vertices(ψ)     # all vertex labels
edges(ψ)        # all edges
neighbors(ψ, v) # neighboring vertices of v
nv(ψ), ne(ψ)    # vertex / edge counts
siteinds(ψ)     # IndsNetwork of site (physical) indices
linkinds(ψ)     # IndsNetwork of bond (virtual) indices
```

## Adding Two `ITensorNetwork`s

Two networks with the same graph and site indices can be added. The result represents the
tensor network `ψ₁ + ψ₂` and has bond dimension equal to the **sum** of the two input bond
dimensions. Individual bonds of the result can be recompressed with `truncate(tn, edge)`.
For `TreeTensorNetwork`, the no-argument form `truncate(ttn; kwargs...)` sweeps and
recompresses all bonds at once.

```@example main
ψ1, ψ2 = ψ, ψ
ψ12 = add(ψ1, ψ2)
ψ12 = ψ1 + ψ2
```

```@docs; canonical=false
ITensorNetworks.add(::ITensorNetworks.AbstractITensorNetwork, ::ITensorNetworks.AbstractITensorNetwork)
```

## Bond Truncation

A single bond (edge) of any `ITensorNetwork` can be truncated by SVD:

```@example main
edge = (1, 2) => (1, 3)
ψ12 = truncate(ψ12, (1, 2) => (1, 3)) # truncate the bond between vertices (1,2) and (1,3)
ψ12 = truncate(ψ12, edge)             # the edge can also be passed as a variable or an AbstractEdge
```

Truncation parameters (`cutoff`, `maxdim`, `mindim`, …) are forwarded to `ITensors.svd`.
For a `TreeTensorNetwork`, the sweep-based `truncate(ttn; kwargs...)` is usually more
convenient because it recompresses the entire network at once with controlled errors;
see the [Tree Tensor Networks](@ref) page.

```@docs; canonical=false
Base.truncate(::ITensorNetworks.AbstractITensorNetwork, ::Graphs.AbstractEdge)
```
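The truncation parameters mentioned on this page can be sketched as follows. This is a hedged example, not part of the commit: `cutoff` and `maxdim` are the `ITensors.svd` keyword names the page cites, and the small grid network here is purely illustrative.

```julia
using ITensorNetworks: ITensorNetwork, add, siteinds
using NamedGraphs.NamedGraphGenerators: named_grid

# Small illustrative network: 2×2 grid of spin-1/2 sites in a product state
g = named_grid((2, 2))
s = siteinds("S=1/2", g)
psi = ITensorNetwork(v -> "Up", s)

# Adding a network to itself sums the bond dimensions on every edge;
# recompress a single bond by SVD, discarding singular values below
# the cutoff and capping the kept rank at maxdim.
psi2 = add(psi, psi)
psi2 = truncate(psi2, (1, 1) => (1, 2); cutoff = 1e-12, maxdim = 4)
```

Passing no truncation keywords keeps the bond exact, which is why the examples above omit them.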

docs/src/reference.md

Lines changed: 4 additions & 1 deletion
@@ -1,4 +1,7 @@
-# Reference
+# API Reference
+
+Complete listing of all documented public functions and types in ITensorNetworks.jl,
+ITensorNetworks.ModelNetworks, and ITensorNetworks.ModelHamiltonians.

 ```@autodocs
 Modules = [ITensorNetworks, ITensorNetworks.ModelNetworks, ITensorNetworks.ModelHamiltonians]

docs/src/solvers.md

Lines changed: 56 additions & 0 deletions
@@ -0,0 +1,56 @@
# Solvers

ITensorNetworks.jl provides sweep-based solvers for variational problems on tree tensor
networks. All solvers follow the same high-level pattern:

1. Start from an initial `ITensorNetwork` guess.
2. Sweep over the network, solving a small local problem at each site or pair of sites.
3. After each local solve, truncate the updated bond to control bond dimension growth.
4. Repeat for `nsweeps` sweeps.

## Eigenvalue Problems — `eigsolve` / `dmrg`

[`eigsolve`](@ref ITensorNetworks.eigsolve) finds the lowest eigenvalue and corresponding
eigenvector of an operator (e.g. a Hamiltonian) using a DMRG-like variational sweep
algorithm. [`dmrg`](@ref ITensorNetworks.dmrg) is an alias for `eigsolve`.

```@example main
using ITensorNetworks: dmrg, dst, edges, normalize, random_ttn, siteinds, src, ttn
using ITensors: OpSum
using NamedGraphs.NamedGraphGenerators: named_comb_tree

# Build a Heisenberg Hamiltonian on a comb tree
g = named_comb_tree((3, 2))
s = siteinds("S=1/2", g)
H = let h = OpSum()
    for e in edges(g)
        h += 0.5, "S+", src(e), "S-", dst(e)
        h += 0.5, "S-", src(e), "S+", dst(e)
        h += "Sz", src(e), "Sz", dst(e)
    end
    ttn(h, s)
end

# Random initial state (normalize first!)
psi0 = normalize(random_ttn(s; link_space = 2))

# Run DMRG
energy, psi = dmrg(H, psi0;
    nsweeps = 2,
    nsites = 2,
    factorize_kwargs = (; cutoff = 1e-10, maxdim = 10),
    outputlevel = 1,
)
```

```@docs
ITensorNetworks.eigsolve
ITensorNetworks.dmrg
```

## Time Evolution — `time_evolve`

```@docs
ITensorNetworks.time_evolve
```
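The `time_evolve` section documents only the docstring. As a hedged sketch — the argument order and keyword names below are assumptions modeled on the `dmrg` example above and are not confirmed by this commit — a call might look like:

```julia
# Hypothetical usage sketch: the `time_evolve` signature shown here is an
# assumption; consult the rendered docstring for the actual interface.
using ITensorNetworks: dst, edges, normalize, random_ttn, siteinds, src, time_evolve, ttn
using ITensors: OpSum
using NamedGraphs.NamedGraphGenerators: named_comb_tree

# Ising-like Hamiltonian on the same comb tree used in the DMRG example
g = named_comb_tree((3, 2))
s = siteinds("S=1/2", g)
H = let h = OpSum()
    for e in edges(g)
        h += "Sz", src(e), "Sz", dst(e)
    end
    ttn(h, s)
end
psi0 = normalize(random_ttn(s; link_space = 2))

# Assumed argument order (operator, total time, initial state), sweeping
# two sites at a time as in the DMRG example
psi_t = time_evolve(H, 1.0, psi0; nsweeps = 10, nsites = 2)
```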
