Merged
Changes from 15 commits
2 changes: 1 addition & 1 deletion Project.toml
Original file line number Diff line number Diff line change
@@ -1,6 +1,6 @@
name = "ITensorNetworks"
uuid = "2919e153-833c-4bdc-8836-1ea460a35fc7"
version = "0.15.23"
version = "0.15.24"
authors = ["Matthew Fishman <mfishman@flatironinstitute.org>, Joseph Tindall <jtindall@flatironinstitute.org> and contributors"]

[workspace]
9 changes: 9 additions & 0 deletions docs/Project.toml
@@ -1,8 +1,13 @@
[deps]
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
Graphs = "86223c79-3864-5bf0-83f7-82e725a168b6"
ITensorFormatter = "b6bf39f1-c9d3-4bad-aad8-593d802f65fd"
ITensorNetworks = "2919e153-833c-4bdc-8836-1ea460a35fc7"
ITensors = "9136182c-28ba-11e9-034c-db9fb085ebd5"
Literate = "98b081ad-f1c9-55d3-8b20-4c87d4299306"
NamedGraphs = "678767b0-92e7-4007-89e4-4527a8725b19"
OMEinsumContractionOrders = "6f22d1fd-8eed-4bb7-9776-e7d684900715"
TensorOperations = "6aa20fa7-93e2-5fca-9bc0-fbd0db3c71a2"

[sources.ITensorNetworks]
path = ".."
@@ -11,4 +16,8 @@ path = ".."
Documenter = "1.10"
ITensorFormatter = "0.2.27"
ITensorNetworks = "0.15"
ITensors = "0.9"
Literate = "2.20.1"
NamedGraphs = "0.8.2"
OMEinsumContractionOrders = "1.2.2"
TensorOperations = "5.5"
31 changes: 28 additions & 3 deletions docs/make.jl
@@ -1,9 +1,26 @@
using Documenter: Documenter, DocMeta, deploydocs, makedocs
using Graphs: Graphs
using ITensorFormatter: ITensorFormatter
using ITensorNetworks: ITensorNetworks
using ITensors: ITensors
using LinearAlgebra: LinearAlgebra
using OMEinsumContractionOrders
using TensorOperations

DocMeta.setdocmeta!(
ITensorNetworks, :DocTestSetup, :(using ITensorNetworks); recursive = true
ITensorNetworks,
:DocTestSetup,
quote
using Graphs: dst, edges, src, vertices
using ITensorNetworks
using ITensorNetworks: TreeTensorNetwork, expect, loginner, mps, orthogonalize,
random_mps, random_ttn, siteinds, truncate, ttn
using ITensors: inner
using LinearAlgebra: norm, normalize
using OMEinsumContractionOrders
using TensorOperations
end;
recursive = true
)

ITensorFormatter.make_index!(pkgdir(ITensorNetworks))
@@ -17,8 +34,16 @@ makedocs(;
edit_link = "main",
assets = ["assets/favicon.ico", "assets/extras.css"]
),
pages = ["Home" => "index.md", "Reference" => "reference.md"],
warnonly = true
pages = [
"Home" => "index.md",
"Manual" => [
"ITensor Networks" => "itensor_networks.md",
"Tree Tensor Networks" => "tree_tensor_networks.md",
"Computing Properties" => "computing_properties.md",
"Solvers" => "solvers.md",
],
"API Reference" => "reference.md",
]
)

deploydocs(;
109 changes: 109 additions & 0 deletions docs/src/computing_properties.md
@@ -0,0 +1,109 @@
# Computing Properties

```@setup main
using NamedGraphs.NamedGraphGenerators: named_grid
using ITensorNetworks: siteinds, random_ttn, normalize, loginner, expect, inner
using ITensors: Op
using Graphs: vertices
using LinearAlgebra: norm

g = named_grid((4,))
s = siteinds("S=1/2", g)
phi = normalize(random_ttn(s; link_space = 2))
psi = normalize(random_ttn(s; link_space = 2))
x = normalize(random_ttn(s; link_space = 2))
y = normalize(random_ttn(s; link_space = 2))
v = first(vertices(psi))
```

## Inner Products and Norms

For general `ITensorNetwork` states, inner products are computed by constructing and
contracting the combined bra–ket network. The default algorithm is **belief propagation**
(`alg="bp"`), which scales to large networks but is approximate when the graph contains
loops. Use `alg="exact"` for exact contraction (only practical for small networks or trees).

```@example main
z = inner(phi, psi) # ⟨ϕ|ψ⟩
n = norm(psi) # √⟨ψ|ψ⟩
```

For tensor networks whose inner product is so large or small that it would overflow or
underflow floating-point arithmetic, use the logarithmic variant:

```@example main
logz = loginner(phi, psi) # log(⟨ϕ|ψ⟩) (numerically stable)
```

For `TreeTensorNetwork`, specialized exact methods exploit the tree structure directly,
without belief propagation:

```@example main
z = inner(x, y) # ⟨x|y⟩ via DFS contraction
n = norm(psi) # uses ortho_region if available for efficiency
```

```@docs; canonical=false
ITensors.inner(::ITensorNetworks.AbstractITensorNetwork, ::ITensorNetworks.AbstractITensorNetwork)
ITensors.inner(::ITensorNetworks.AbstractITensorNetwork, ::ITensorNetworks.AbstractITensorNetwork, ::ITensorNetworks.AbstractITensorNetwork)
ITensorNetworks.loginner
ITensors.inner(::ITensorNetworks.AbstractTreeTensorNetwork, ::ITensorNetworks.AbstractTreeTensorNetwork)
ITensors.inner(::ITensorNetworks.AbstractTreeTensorNetwork, ::ITensorNetworks.AbstractTreeTensorNetwork, ::ITensorNetworks.AbstractTreeTensorNetwork)
```

## Normalization

`normalize` rescales every tensor in the network by a common factor so that `norm(ψ) ≈ 1`.
For `TreeTensorNetwork`, the normalization is applied directly at the orthogonality center.

```@example main
psi = normalize(psi) # exact (default)
psi_bp = normalize(psi; alg = "bp") # belief-propagation (for large loopy networks)
```

```@docs; canonical=false
LinearAlgebra.normalize(::ITensorNetworks.AbstractITensorNetwork)
```

## Expectation Values

### General `ITensorNetwork`

For arbitrary (possibly loopy) tensor networks, expectation values are computed via
**belief propagation** by default. This is approximate for loopy networks but can be made
exact with `alg="exact"` (at exponential cost).

```@example main
# Expectation of "Sz" at every vertex
sz = expect(psi, "Sz")

# Selected vertices only
sz = expect(psi, "Sz", [(1,), (3,)])

# Exact contraction
sz = expect(psi, "Sz"; alg = "exact")
```

```@docs; canonical=false
ITensorNetworks.expect(::ITensorNetworks.AbstractITensorNetwork, ::String)
ITensorNetworks.expect(::ITensorNetworks.AbstractITensorNetwork, ::String, ::Any)
ITensorNetworks.expect(::ITensorNetworks.AbstractITensorNetwork, ::ITensors.Ops.Op)
```
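The `Op`-based method listed above evaluates a single operator at a specific vertex. A
minimal sketch, assuming `Op("Sz", v)` constructs the operator acting at vertex `v` (as
defined in the setup block); treat the call as illustrative rather than definitive:

```julia
# `Op` and `v` come from the setup block above; this evaluates ⟨ψ|Sz(v)|ψ⟩
# using the Op-based method documented in the docstrings listed above.
sz_v = expect(psi, Op("Sz", v))
```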

### `TreeTensorNetwork`

For TTN/MPS states, a specialized exact method that exploits successive orthogonalizations
is available. The operator name is passed as the **first** argument (note that the
argument order differs from the general form above):

```@example main
sz = expect("Sz", psi) # all sites
sz = expect("Sz", psi; vertices = [(1,), (3,)]) # selected sites
```

This is more efficient than the belief propagation approach for tree-structured networks
because it reuses the orthogonal gauge.

```@docs; canonical=false
ITensorNetworks.expect(::String, ::ITensorNetworks.AbstractTreeTensorNetwork)
```
111 changes: 111 additions & 0 deletions docs/src/itensor_networks.md
@@ -0,0 +1,111 @@
# ITensor Networks

## The `ITensorNetwork` Type

An `ITensorNetwork` is the central data structure of this package. It represents a
collection of [`ITensor`](https://itensor.github.io/ITensors.jl/stable/)s arranged on a
graph, where each edge encodes a shared (contracted) index between the neighboring tensors.

Key facts:

- The underlying graph is a [`NamedGraph`](https://github.com/ITensor/NamedGraphs.jl), so
vertices can be any hashable Julia value: integers, tuples, strings, etc.
- Each vertex holds exactly one `ITensor`.
- Edges and link indices are either inferred from shared `Index` objects (when constructing
from a collection of `ITensor`s) or inserted automatically (when constructing from an
`IndsNetwork`).

## Construction

The most common entry point is an `IndsNetwork` — a graph whose vertices and edges carry
`Index` objects. Generate site indices with `siteinds`, which takes a site-type string
(such as `"S=1/2"` or `"Electron"`) and a `NamedGraph`. Graphs can be generated with
functions such as `named_grid` and `named_comb_tree` from the NamedGraphs.jl
`NamedGraphGenerators` module:

```@example main
using NamedGraphs.NamedGraphGenerators: named_grid
using ITensorNetworks: ITensorNetwork, siteinds, linkinds, add
using ITensors: ITensor, Index
using Graphs: vertices, edges, neighbors, nv, ne

# 3×3 square-lattice tensor network
g = named_grid((3, 3))
s = siteinds("S=1/2", g) # one spin-½ Index per vertex

# Zero-initialized, bond dimension 2
ψ = ITensorNetwork(s; link_space = 2)

# Product state — every site in the |↑⟩ state
ψ = ITensorNetwork("Up", s)

# Staggered initialization with a vertex-dependent function
ψ = ITensorNetwork(v -> isodd(sum(v)) ? "Up" : "Dn", s)
```

When you already have `ITensor`s in hand, edges are inferred automatically from shared
indices:

```@example main
i, j, k = Index(2, "i"), Index(2, "j"), Index(2, "k")
A, B, C = ITensor(i, j), ITensor(j, k), ITensor(k)

tn = ITensorNetwork([A, B, C]) # integer vertices 1, 2, 3
tn = ITensorNetwork(["A","B","C"], [A, B, C]) # named vertices
tn = ITensorNetwork(["A"=>A, "B"=>B, "C"=>C]) # from pairs
```

```@docs; canonical=false
ITensorNetworks.ITensorNetwork
```

## Accessing Data

```@example main
v = (1, 2)
T = ψ[v] # ITensor at vertex (1,2)
ψ[v] = T # replace tensor at a vertex
vertices(ψ) # all vertex labels
edges(ψ) # all edges
neighbors(ψ, v) # neighboring vertices of v
nv(ψ), ne(ψ) # vertex / edge counts
siteinds(ψ) # IndsNetwork of site (physical) indices
linkinds(ψ) # IndsNetwork of bond (virtual) indices
```

## Adding Two `ITensorNetwork`s

Two networks with the same graph and site indices can be added. The result represents the
tensor network `ψ₁ + ψ₂` and has bond dimension equal to the **sum** of the two input bond
dimensions. Individual bonds of the result can be recompressed with `truncate(tn, edge)`.
For `TreeTensorNetwork`, the no-argument form `truncate(ttn; kwargs...)` sweeps and
recompresses all bonds at once.

```@example main
ψ1, ψ2 = ψ, ψ
ψ12 = add(ψ1, ψ2)
ψ12 = ψ1 + ψ2
```
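The bond-dimension claim can be checked directly by inspecting a link index of the sum.
A sketch, assuming `linkinds(tn, e)` returns the link indices on edge `e` and that `dim`
from ITensors gives an index's dimension (both inputs above have bond dimension 2):

```julia
using ITensors: dim

# each input bond has dimension 2, so each bond of the sum has dimension 2 + 2
e = first(edges(ψ12))
dim(only(linkinds(ψ12, e)))
```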

```@docs; canonical=false
ITensorNetworks.add(::ITensorNetworks.AbstractITensorNetwork, ::ITensorNetworks.AbstractITensorNetwork)
```

## Bond Truncation

A single bond (edge) of any `ITensorNetwork` can be truncated by SVD:

```@example main
edge = (1, 2) => (1, 3) # a Pair of vertices; an `AbstractEdge` also works
ψ12 = truncate(ψ12, edge) # truncate the bond between vertices (1,2) and (1,3)
```

Truncation parameters (`cutoff`, `maxdim`, `mindim`, …) are forwarded to `ITensors.svd`.
For a `TreeTensorNetwork`, the sweep-based `truncate(ttn; kwargs...)` is usually more
convenient because it recompresses the entire network at once with controlled errors;
see the [Tree Tensor Networks](@ref) page.
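For example, to compress a single bond with both a truncation-error cutoff and a cap on
the bond dimension (keyword names as forwarded to `ITensors.svd`):

```julia
# keep singular values above the cutoff, but never more than maxdim of them
ψ12 = truncate(ψ12, (1, 2) => (1, 3); cutoff = 1e-12, maxdim = 4)
```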

```@docs; canonical=false
Base.truncate(::ITensorNetworks.AbstractITensorNetwork, ::Graphs.AbstractEdge)
```
5 changes: 4 additions & 1 deletion docs/src/reference.md
@@ -1,4 +1,7 @@
# Reference
# API Reference

Complete listing of all documented public functions and types in ITensorNetworks.jl,
ITensorNetworks.ModelNetworks, and ITensorNetworks.ModelHamiltonians.

```@autodocs
Modules = [ITensorNetworks, ITensorNetworks.ModelNetworks, ITensorNetworks.ModelHamiltonians]
56 changes: 56 additions & 0 deletions docs/src/solvers.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,56 @@
# Solvers

ITensorNetworks.jl provides sweep-based solvers for variational problems on tree tensor
networks. All solvers follow the same high-level pattern:

1. Start from an initial `ITensorNetwork` guess.
2. Sweep over the network, solving a small local problem at each site or pair of sites.
3. After each local solve, truncate the updated bond to control bond dimension growth.
4. Repeat for `nsweeps` sweeps.

## Eigenvalue Problems — `eigsolve` / `dmrg`

[`eigsolve`](@ref ITensorNetworks.eigsolve) finds the lowest eigenvalue and corresponding
eigenvector of an operator (e.g. a Hamiltonian) using a DMRG-like
variational sweep algorithm.
[`dmrg`](@ref ITensorNetworks.dmrg) is an alias for `eigsolve`.

```@example main
using NamedGraphs.NamedGraphGenerators: named_comb_tree
using ITensors: OpSum
using ITensorNetworks: dmrg, dst, edges, normalize, random_ttn, siteinds, src, ttn

# Build a Heisenberg Hamiltonian on a comb tree
g = named_comb_tree((3, 2))
s = siteinds("S=1/2", g)
H = let h = OpSum()
for e in edges(g)
h += 0.5, "S+", src(e), "S-", dst(e)
h += 0.5, "S-", src(e), "S+", dst(e)
h += "Sz", src(e), "Sz", dst(e)
end
ttn(h, s)
end

# Random initial state (normalize first!)
psi0 = normalize(random_ttn(s; link_space = 2))

# Run DMRG
energy, psi = dmrg(H, psi0;
nsweeps = 2,
nsites = 2,
factorize_kwargs = (; cutoff = 1e-10, maxdim = 10),
outputlevel = 1,
)
```

```@docs
ITensorNetworks.eigsolve
ITensorNetworks.dmrg
```

## Time Evolution — `time_evolve`

```@docs
ITensorNetworks.time_evolve
```
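No worked example is given above for `time_evolve`. The following is a hypothetical
sketch only, assuming a TDVP-style interface analogous to `dmrg` above; the positional
order and the keyword names `time_step`, `nsweeps`, and `nsites` are assumptions, not the
documented API — consult the docstring for the actual signature:

```julia
# Hypothetical usage sketch; argument and keyword names are assumptions.
# H and psi0 are as constructed in the eigenvalue example above.
psi_t = time_evolve(H, psi0;
    time_step = -0.1im, # evolve under exp(-iHt) in steps of t = 0.1
    nsweeps = 10,
    nsites = 2,
)
```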