Add reverse Index map to ITensorNetwork; drop preserve_graph and fix_edges!#365

Merged
mtfishman merged 10 commits into main from mf/itensornetwork-redesign on May 12, 2026

Conversation

@mtfishman (Member) commented May 12, 2026

Summary

  • Replaces ITensorNetwork's DataGraph storage with explicit fields
    (graph, vertex_data, and a new Index → Set{vertex} reverse map)
    so plain tn[v] = ... reconciles edges in
    O(deg(v) + |inds(value)|) instead of the old O(n) fix_edges!
    sweep.
  • Drops the entire preserve_graph machinery (macro,
    setindex_preserve_graph!, fix_edges!, etc.) now that vertex writes
    are cheap.
  • Inlines TreeTensorNetwork storage to mirror ITensorNetwork instead
    of wrapping one, and narrows the constructor surface on both types to
    () / {V}() / (tensors).
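
The reverse-map bookkeeping behind those bullets can be sketched in plain base Julia. Everything below (`ToyNetwork`, `set_vertex!`, `neighbors_of`, with symbols standing in for `Index` and index sets standing in for `ITensor`s) is a hypothetical illustration, not the ITensorNetworks.jl implementation: it only shows the data-structure idea that each index maps back to the set of vertices carrying it, so a vertex write touches just that vertex's indices.

```julia
# Hypothetical sketch, not the ITensorNetworks.jl implementation.
# An "index" is a Symbol and a "tensor" is just its set of index symbols.
struct ToyNetwork
    vertex_inds::Dict{Int,Set{Symbol}}      # vertex => indices on its tensor
    ind_to_vertices::Dict{Symbol,Set{Int}}  # reverse map: index => vertices
end
ToyNetwork() = ToyNetwork(Dict{Int,Set{Symbol}}(), Dict{Symbol,Set{Int}}())

function set_vertex!(tn::ToyNetwork, v::Int, inds::Set{Symbol})
    # Unregister v's old indices from the reverse map: O(|old inds|).
    for i in get(tn.vertex_inds, v, Set{Symbol}())
        vs = tn.ind_to_vertices[i]
        delete!(vs, v)
        isempty(vs) && delete!(tn.ind_to_vertices, i)
    end
    # Register the new indices: O(|inds|), independent of network size.
    tn.vertex_inds[v] = inds
    for i in inds
        push!(get!(tn.ind_to_vertices, i, Set{Int}()), v)
    end
    return tn
end

# Edges are recoverable locally: the neighbors of v are the other vertices
# sharing one of v's indices, found without scanning the whole network.
neighbors_of(tn::ToyNetwork, v::Int) =
    Set(w for i in tn.vertex_inds[v] for w in tn.ind_to_vertices[i] if w != v)

tn = ToyNetwork()
set_vertex!(tn, 1, Set([:a, :b]))
set_vertex!(tn, 2, Set([:b, :c]))
@assert neighbors_of(tn, 1) == Set([2])
set_vertex!(tn, 2, Set([:c]))         # overwrite drops the shared index :b
@assert isempty(neighbors_of(tn, 1))  # and the edge disappears with it
```

In this toy version, overwriting a vertex both registers new shared indices and retires stale ones in one local pass, which is the cheap-write property the PR relies on to drop `fix_edges!`.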

🤖 Generated with Claude Code

…ges!

Adds an `Index → Set{vertex}` reverse map as a field on `ITensorNetwork`
and uses it to maintain the graph-edge ↔ shared-`Index` invariant on
every vertex write in O(deg(v) + |inds(value)|), eliminating the need
for the `@preserve_graph` bypass and the O(n) `fix_edges!` sweep.

Removed:
  - `@preserve_graph` macro, `setindex_preserve_graph!`, `fix_edges!`
  - `map_vertex_data_preserve_graph`, `map_vertices_preserve_graph!`
  - `map_vertex_data` (replaced by `Base.map` / `Base.map!`)
  - `scale_tensors` / `scale_tensors!` (inlined at the three callsites
    in `normalize.jl` and `caches/abstractbeliefpropagationcache.jl`)
  - `Base.setindex!(::AbstractBeliefPropagationCache, ::ITensor, vertex)
    = not_implemented()` stubs (BPC writes now go through the underlying
    `ITensorNetwork`'s `set_vertex_data!`)

Each wrapper type (`TreeTensorNetwork`, `BeliefPropagationCache`,
`BilinearFormNetwork`, `LinearFormNetwork`, `QuadraticFormNetwork`)
defines its own `set_vertex_data!` that forwards to its wrapped network;
there is no abstract `tensornetwork(::AbstractITensorNetwork)` interface
method.

Simplified `Graphs.add_edge!(::AbstractITensorNetwork, edge)`: under the
invariant the "fill missing half" branch collapses to a single `qr!`.
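
The `qr!` point can be illustrated with a toy stand-in. The sketch below is hypothetical and uses plain matrices with stdlib `LinearAlgebra.qr` rather than `ITensor`s: factoring one endpoint splits off a second factor, and the new shared dimension between `Q` and `R` plays the role of the fresh bond index while the overall product is unchanged.

```julia
using LinearAlgebra

# Hypothetical matrix stand-in for creating a missing bond via a QR split.
A = [1.0 2.0; 3.0 4.0; 5.0 6.0]   # "tensor" at one endpoint, 3x2
F = qr(A)
Q = Matrix(F.Q)                   # 3x2: stays at the original vertex
R = F.R                           # 2x2: absorbed into the neighboring vertex
@assert Q * R ≈ A                 # the product (the "contraction") is unchanged
@assert size(Q, 2) == size(R, 1)  # the new shared dimension: the "bond"
```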

codecov Bot commented May 12, 2026

Codecov Report

❌ Patch coverage is 87.34940% with 21 lines in your changes missing coverage. Please review.
✅ Project coverage is 76.46%. Comparing base (43bf3fd) to head (27fa3b2).
⚠️ Report is 1 commit behind head on main.

Files with missing lines                               Patch %   Lines
src/itensornetwork.jl                                   89.28%   6 Missing ⚠️
src/treetensornetworks/treetensornetwork.jl             71.42%   6 Missing ⚠️
src/caches/abstractbeliefpropagationcache.jl            84.61%   2 Missing ⚠️
src/formnetworks/linearformnetwork.jl                   50.00%   2 Missing ⚠️
...rc/treetensornetworks/abstracttreetensornetwork.jl   80.00%   2 Missing ⚠️
src/abstractitensornetwork.jl                           96.77%   1 Missing ⚠️
src/apply.jl                                            66.66%   1 Missing ⚠️
src/solvers/insert.jl                                   75.00%   1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #365      +/-   ##
==========================================
+ Coverage   76.05%   76.46%   +0.41%     
==========================================
  Files          57       57              
  Lines        2710     2639      -71     
==========================================
- Hits         2061     2018      -43     
+ Misses        649      621      -28     
Flag Coverage Δ
docs 52.19% <72.90%> (+0.09%) ⬆️


mtfishman and others added 9 commits May 12, 2026 11:49
Storage is now `graph::NamedGraph{V}`, `vertex_data::Dict{V, ITensor}`, and
`ind_to_vertices::Dict{Index{S}, Set{V}}`, with the constructor surface
narrowed to `ITensorNetwork(tensors)` / `{V}` / `{V, S}` plus a graph-only
empty-network form for `similar_graph`-style scaffolding. The 2-arg
`(tensors, graph)` ctor is gone.

Drop `data_graph` / `data_graph_type` from the `AbstractITensorNetwork`
interface; subtypes (TreeTensorNetwork, AbstractBeliefPropagationCache,
AbstractFormNetwork, QuadraticFormNetwork) now forward via
`underlying_graph` + `vertex_data` directly. Add `Base.values(tn)` so
callers don't depend on whether `vertex_data` is `Dict`- or
`Dictionary`-shaped. Inline the old reverse-map / edge-reconciliation
helpers into `set_vertex_data!` and `rem_vertex!`.

Refactor `opsum_to_ttn` and the TTN `directsum` paths that previously
relied on the 2-arg ctor to build an empty TTN-shaped scaffold: both now
buffer into a plain `Dict{V, ITensor}` and wrap as
`TreeTensorNetwork(ITensorNetwork(...))` once the tensor data is filled,
so the tree invariant only has to hold on a fully-populated network.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
- TreeTensorNetwork now holds graph, vertex_data, ind_to_vertices, and
  ortho_region directly, matching the ITensorNetwork field layout
  instead of wrapping an inner ITensorNetwork.
- Extract _set_vertex_data! and _rem_vertex! helpers shared by both
  ITensorNetwork and TreeTensorNetwork.
- Drop similar_graph and the empty-then-fill construction paths from
  the form networks and belief-propagation cache; networks are now
  always constructed with their full vertex data.
- Rewrite induced_subgraph_from_vertices to feed tensors through the
  standard ITensorNetwork(tensors) constructor.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
- TreeTensorNetwork now uses the auto-generated all-fields constructor
  plus a single outer ctor that performs the is_tree check, matching
  the ITensorNetwork constructor design.
- Add ITensorNetwork{V}() empty ctor used as the seed for the
  tensor-collection constructor, replacing the explicit three-field
  call.
- Extract _unregister_inds! so _set_vertex_data! and _rem_vertex!
  share their reverse-map cleanup, without altering the in-place
  vertex_data update path.
- Build dictionaries via map(Indices(...)) in test/utils.jl.
- Fix the TreeTensorNetwork jldoctest to use a properly connected
  two-vertex example.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
- Drop ITensorNetwork{V}() empty-arg ctor; inline the three-field call
  at the single seed site inside ITensorNetwork{V}(tensors). External
  callers that want an empty network can pass an empty tensor
  collection (Dict{V, ITensor}() etc.).
- Drop the TreeTensorNetwork(::ITensorNetwork; ortho_region=...) and
  TreeTensorNetwork{V}(::ITensorNetwork) overloads. All construction
  now routes through TreeTensorNetwork(tensors; ortho_region=nothing),
  which builds an ITensorNetwork first and then performs the is_tree
  check.
- Define Base.keytype on AbstractITensorNetwork so an AbstractITN can
  be passed as a tensor collection to ITensorNetwork(tensors).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Mirror the `Vector()` / `Dictionary()` convention by providing
parameterless and `{V}`-only constructors that yield an empty network:

- `ITensorNetwork()` / `ITensorNetwork{V}()`
- `TreeTensorNetwork()` / `TreeTensorNetwork{V}()`

The default vertex type is `Any`. The body of `ITensorNetwork{V}(tensors)`
now seeds from `ITensorNetwork{V}()` and writes each tensor through
`setindex!`, removing the explicit field call and the prior
`set_vertex_data!` call at this layer.

Also drop the redundant `ITensorNetwork(::TreeTensorNetwork)` and the
`ITensorNetwork(::AbstractTTN)` "not implemented" placeholder — the
generic `ITensorNetwork(tensors)` now handles a `TreeTensorNetwork`
via `keytype` and `setindex!`. Update the docs to drop the dangling
references.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
`Dict{Any, ITensor}` has `keytype = Any`, so `TreeTensorNetwork(tensors)`
already produces a TTN with `V = Any` — the explicit
`ITensorNetwork{Any}(...)` wrap is redundant.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
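
The `keytype`-driven vertex-type inference described above can be checked directly in base Julia, with `Int` standing in for `ITensor` so the snippet needs no packages:

```julia
# Int stands in for ITensor; only the key type of the collection matters here.
@assert keytype(Dict{Any,Int}()) == Any         # Dict{Any,...} reports keytype Any
@assert keytype(Dict{String,Int}()) == String   # a typed Dict would yield V = String
```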
The intermediate `ITensorNetwork` wrap was redundant — `TreeTensorNetwork`
accepts a tensor collection directly. Use a literal vertex label (`[1]`)
in `ortho_region` instead of looking it up via `vertices(itn)`.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
…data!

The `ortho_region` keyword on `TreeTensorNetwork(tensors)` was not used
by any caller and let users assert a gauge without performing the
associated transformation. Removing it nudges callers toward
`orthogonalize` (which does the QR sweep) and `set_ortho_region` (the
low-level metadata update used internally).

Also drop our `set_vertices_data!` override on `AbstractITensorNetwork`
— `DataGraphs` already provides exactly the same loop as a default.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
The two-argument `ITensorNetwork(tensors, graph)` ctor was removed
earlier in this PR but lingered in three docs/src/ pages and one
prose reference. Migrate them to the single-arg form, which infers
edges from shared `Index`es.

Also fix `truncate(tn::AbstractTTN, edge)` to call the generic
`TreeTensorNetwork(...)` ctor instead of the now-removed
`TreeTensorNetwork{V}(::ITensorNetwork)` specialization, and simplify
the empty `TreeTensorNetwork{V}()` to build the all-fields struct
directly instead of indirecting through `ITensorNetwork{V}()`.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@mtfishman mtfishman enabled auto-merge (squash) May 12, 2026 19:12
@mtfishman mtfishman merged commit 22d5d62 into main May 12, 2026
18 checks passed
@mtfishman mtfishman deleted the mf/itensornetwork-redesign branch May 12, 2026 19:32
mtfishman added a commit that referenced this pull request May 13, 2026
…tindex!(bpc, ...)

`data_graph_type` was an accessor in the pre-#365 DataGraph-based
storage layout. After #365 inlined storage into NamedGraph +
Dictionary fields, no implementation of `data_graph_type` remains
in src/, but the docs still listed it for `AbstractFormNetwork`,
`QuadraticFormNetwork`, `AbstractBeliefPropagationCache`,
`AbstractTTN` (as a prose label), and `TreeTensorNetwork`. Drop
all of those entries.

`edge_data_eltype(::Type{<:AbstractIndsNetwork})` was a misnamed
reference — the actual function is `DataGraphs.edge_data_type`.
Fix the name.

`setindex!(bpc, factor, vertex)` is documented for both
`AbstractBeliefPropagationCache` and `BeliefPropagationCache`,
but neither type has a corresponding method in src/. Drop both
entries.
mtfishman added a commit that referenced this pull request May 13, 2026
## Summary

Doc-only cleanup: remove references in
`docs/src/experimental_methods.md` and `docs/src/developer_methods.md`
to methods that no longer exist in src/.

- **`visualize(is::IndsNetwork, args...; kwargs...)`** — removed from
`src/indsnetwork.jl` in #356 along with the `IndsNetwork`-based
`ITensorNetwork` constructor family. The doc entry was missed.
- **`data_graph_type`** — was an accessor for the pre-#365
`DataGraph`-based storage layout. After #365 inlined storage into
`NamedGraph + Dictionary` fields, no implementation remains in src/, but
the docs still listed it for `AbstractFormNetwork`,
`QuadraticFormNetwork`, `AbstractBeliefPropagationCache`, `AbstractTTN`
(as a prose label), and `TreeTensorNetwork`. Drop all of those entries.
- **`edge_data_eltype(::Type{<:AbstractIndsNetwork})`** — misnamed; the
actual function is `DataGraphs.edge_data_type`. Fix the name.
- **`setindex!(bpc, factor, vertex)`** — documented for both
`AbstractBeliefPropagationCache` and `BeliefPropagationCache`, but
neither type has a corresponding method in src/. Drop both entries.

No version bump — substantive change accumulates against the existing
`0.22.0-DEV` prerelease.