Merged

92 commits — changes from all commits
b4ae4d7
Working BP Commit
JoeyT1994 Oct 2, 2025
d77d063
BP Code
JoeyT1994 Oct 23, 2025
b80e36e
Express BP in terms of `SweepIterator` interface
jack-dunham Oct 28, 2025
fe44b80
Add method for `setmessages!` that allows messages from one cache to …
jack-dunham Oct 31, 2025
3ce0898
Network is now passed to `forest_cover_edge_sequence` directly.
jack-dunham Nov 10, 2025
f6e4fd0
test file formatting
jack-dunham Nov 25, 2025
63840a9
Add `DataGraphsPartitionedGraphsExt` glue for `TensorNetwork` type
jack-dunham Nov 25, 2025
ba22ab5
Make abstract tensor network interface more generic.
jack-dunham Nov 25, 2025
49b0870
BP Caching overhauls
jack-dunham Nov 25, 2025
db46c04
Remove dead deps
jack-dunham Nov 25, 2025
400e373
Fix merge
jack-dunham Nov 25, 2025
b9aafe8
Fix type inference in TensorNetwork construction
jack-dunham Nov 25, 2025
4090e61
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Nov 25, 2025
be0750e
Remove `ITensorBase` dep
jack-dunham Nov 25, 2025
b971b89
`forest_cover_edge_sequence` now constructs a temporary `NamedGraph` …
jack-dunham Dec 1, 2025
9ebf031
[LazyNamedDimsArrays] Fix `parenttype` method
jack-dunham Jan 6, 2026
16fe303
BP Cache now uses new `DataGraphs` interface
jack-dunham Jan 6, 2026
24a4335
Adjust `default_message` to take a `message` type as its first argument
jack-dunham Jan 6, 2026
c43884e
Remove unnecessary code and fix ambiguities in `AbstractTensorNetwork`
jack-dunham Jan 6, 2026
dd6f645
`TensorNetwork` type now uses new DataGraphs interface
jack-dunham Jan 6, 2026
7bb579c
Sweeping algorithms based on AlgorithmsInterface.jl (#30)
mtfishman Dec 19, 2025
032447a
Upgrade to NamedDimsArrays.jl v0.11 (#38)
mtfishman Dec 23, 2025
b256d79
[LazyNamedDimsArrays] New `symnameddims` method that pulls out indice…
jack-dunham Jan 9, 2026
b2da9d8
The function `region_scalar` should now return a scalar, rather than …
jack-dunham Jan 9, 2026
8506e26
Fix double counting in `edge_scalars` function
jack-dunham Jan 9, 2026
938180a
Minor code formatting
jack-dunham Jan 9, 2026
4461967
Expressed belief propagation in terms of AlgorithmsInterface
jack-dunham Jan 9, 2026
d68860a
Fixes to TensorNetwork construction from tensor list
jack-dunham Jan 9, 2026
2f5c783
Minor simplifications to `contract_network` interface.
jack-dunham Jan 9, 2026
9a45a5b
Merge branch 'main' into bp
jack-dunham Feb 13, 2026
4eec9b6
Upgrade DataGraphs and NamedGraphs dependencies
jack-dunham Feb 10, 2026
202724c
[AlgorithmsInterfaceExtensions] Allowing mapping over a generic itera…
jack-dunham Feb 10, 2026
69542e3
Upgrade serial BP to use own `<:Algorithm` structs.
jack-dunham Feb 11, 2026
9925069
Simplify BP cache to only store factors
jack-dunham Feb 13, 2026
292f2fa
Upgrade to DataGraphs v0.3.1 and NamedGraphs v0.10
jack-dunham Feb 13, 2026
9d937aa
Fix compat
jack-dunham Feb 13, 2026
5432fe2
Fix broken merge
jack-dunham Feb 13, 2026
c916c84
Bug fix; upgrade tests
jack-dunham Feb 19, 2026
4a511a1
Add 2D TN test
jack-dunham Feb 24, 2026
5b97af3
Formatting
jack-dunham Feb 24, 2026
fef588d
Merge branch 'main' into bp
jack-dunham Feb 24, 2026
62dae14
Merge branch 'bp' of https://github.com/jack-dunham/ITensorNetworksNe…
jack-dunham Mar 24, 2026
951cee6
Simplify BP code
jack-dunham Mar 12, 2026
1f1920c
Add spin ice test
jack-dunham Mar 13, 2026
5f3be98
Version Bump
jack-dunham Mar 13, 2026
487683a
Use `abs2` in message diff function.
jack-dunham Mar 13, 2026
aa24243
Add method for setting initial messages; improve spin ice tests.
jack-dunham Mar 16, 2026
9248686
Remove redundant `default_message_diff_function` function.
jack-dunham Mar 16, 2026
9d7abea
Upgrade DataGraphs and NamedGraphs to 0.4 and 0.11
jack-dunham Mar 24, 2026
a7b0986
Merge branch 'main' into bp
jack-dunham Apr 14, 2026
0b65bfb
Formatting
jack-dunham Apr 14, 2026
f08f022
Upgrade to simplified `similar_graph`
jack-dunham Apr 17, 2026
3aec516
Remove edge arg in `similar_graph`.
jack-dunham Apr 20, 2026
4d4bc5a
Inline message computation into `solve!`; use type instead of alg str…
jack-dunham Apr 20, 2026
1f23ab8
Add in `PartitionedGraphs` interface methods for `TensorNetwork` and …
jack-dunham Apr 20, 2026
9ca7275
Use `map` instead of comprehension when returning messages.
jack-dunham Apr 22, 2026
005ccf0
Test BP with differing precisions; remove `atol` test criteria.
jack-dunham Apr 21, 2026
19d1256
Fix `nested_algorithm` methods on iterables.
jack-dunham Apr 22, 2026
5a0ee82
Merge branch 'bp' of https://github.com/jack-dunham/ITensorNetworksNe…
jack-dunham Apr 22, 2026
0674767
Cleanup `AbstractBeliefPropagationCache` interface.
jack-dunham Apr 22, 2026
3720391
Remove `Graphs.connected_components` method for `TensorNetwork`
jack-dunham Apr 27, 2026
ccdcb74
Remove unnecessary `symnameddims` method.
jack-dunham Apr 27, 2026
d59c3e1
Remove confusing code comment.
jack-dunham Apr 27, 2026
9a2a88e
Remove `beliefpropagation_sweep` in favour of constructor call.
jack-dunham Apr 27, 2026
2ae7100
Fix message type initialization failing when only factors are provided.
jack-dunham Apr 28, 2026
bf4d0fe
Formatting.
jack-dunham Apr 28, 2026
d33a58b
Remove `edge_data_type` method for `AbstractTensorNetwork`
jack-dunham Apr 28, 2026
e5619be
Add some tests for `TensorNetwork` type.
jack-dunham Apr 28, 2026
19588ce
Bug fixes; more tests
jack-dunham Apr 28, 2026
b520afd
Using `Inf` instead of `NaN` for delta initialization in `StopWhenCon…
jack-dunham Apr 28, 2026
397733a
Add some basic tests for `PartitionedGraphs` interactions with `Tenso…
jack-dunham Apr 28, 2026
44f063a
Add tests via Claude.
jack-dunham Apr 29, 2026
7197bcf
Refine and redistribute generated tests
jack-dunham Apr 29, 2026
44805cc
Further BP test improvements
jack-dunham Apr 29, 2026
1950a80
Fix incomplete `sitenames` and `siteaxes` definitions.
jack-dunham Apr 29, 2026
20dca72
Remove `default_message` and other fixes.
jack-dunham Apr 29, 2026
8e18614
Fix test imports
jack-dunham Apr 29, 2026
ef7e659
Formatting.
jack-dunham Apr 29, 2026
fbd0331
Fix and test tensor network graph manipulation functions.
jack-dunham Apr 29, 2026
e7d69c6
Simplify `factors` and `messages` methods on `AbstractGraph`
jack-dunham May 1, 2026
38c391a
Refactor `BeliefPropagationCache` -> `MessageCache`, remove abstract …
jack-dunham Apr 30, 2026
2c8b450
Allow a custom stopping criterion input into `beliefpropagation` using…
jack-dunham May 1, 2026
bc22b67
Hard code edge type in `MessageCache`.
jack-dunham May 1, 2026
ed2621c
Merge branch 'bp' of https://github.com/jack-dunham/ITensorNetworksNe…
jack-dunham May 1, 2026
115dcff
Remove `MessageCache` undef initializer.
jack-dunham May 1, 2026
1deba51
Rename argument names to be more consistent.
jack-dunham May 1, 2026
80bbc99
Simplify `MessageCache` interface.
jack-dunham May 4, 2026
8c52788
Rename `environment_messages` to `incoming_edge_data`.
jack-dunham May 6, 2026
cdfa57c
Rename `logscalar` to `bethe_free_energy`; remove `scalar`.
jack-dunham May 6, 2026
f40500a
Add `messagecache(f, edge)` method.
jack-dunham May 6, 2026
a6809bb
Inline belief propagation algorithm construction.
jack-dunham May 6, 2026
00a23f6
Upgrade to registered `AlgorithmsInterface` version; remove `solve!` …
jack-dunham May 6, 2026
Files changed
10 changes: 5 additions & 5 deletions Project.toml

@@ -1,6 +1,6 @@
 name = "ITensorNetworksNext"
 uuid = "302f2e75-49f0-4526-aef7-d8ba550cb06c"
-version = "0.3.24"
+version = "0.4.0"
 authors = ["ITensor developers <support@itensor.org> and contributors"]

 [workspace]
@@ -39,15 +39,15 @@ Adapt = "4.3"
 AlgorithmsInterface = "0.1"
 BackendSelection = "0.1.6"
 Combinatorics = "1"
-DataGraphs = "0.2.7"
+DataGraphs = "0.4"
 DiagonalArrays = "0.3.31"
 Dictionaries = "0.4.5"
-FunctionImplementations = "0.4"
+FunctionImplementations = "0.4.1"
 Graphs = "1.13.1"
 LinearAlgebra = "1.10"
 MacroTools = "0.5.16"
-NamedDimsArrays = "0.14.2"
-NamedGraphs = "0.6.9, 0.7, 0.8"
+NamedDimsArrays = "0.14.3"
+NamedGraphs = "0.11"
 SimpleTraits = "0.9.5"
 SplitApplyCombine = "1.2.3"
 TensorOperations = "5.3.1"
2 changes: 1 addition & 1 deletion docs/Project.toml

@@ -10,5 +10,5 @@ path = ".."
 [compat]
 Documenter = "1"
 ITensorFormatter = "0.2.27"
-ITensorNetworksNext = "0.3"
+ITensorNetworksNext = "0.4"
 Literate = "2"
2 changes: 1 addition & 1 deletion examples/Project.toml

@@ -5,4 +5,4 @@ ITensorNetworksNext = "302f2e75-49f0-4526-aef7-d8ba550cb06c"
 path = ".."

 [compat]
-ITensorNetworksNext = "0.3"
+ITensorNetworksNext = "0.4"
98 changes: 22 additions & 76 deletions src/AlgorithmsInterfaceExtensions/AlgorithmsInterfaceExtensions.jl

@@ -22,12 +22,12 @@ function AI.initialize_state!(
 end

 function AI.initialize_state(
-    problem::Problem, algorithm::Algorithm; kwargs...
+    problem::Problem, algorithm::Algorithm; iterate, kwargs...
 )
     stopping_criterion_state = AI.initialize_state(
-        problem, algorithm, algorithm.stopping_criterion
+        problem, algorithm, algorithm.stopping_criterion; iterate
     )
-    return DefaultState(; stopping_criterion_state, kwargs...)
+    return DefaultState(; iterate, stopping_criterion_state, kwargs...)
 end

 # ============================ DefaultState ================================================
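The hunk above threads a new `iterate` keyword through state construction: it is forwarded into the stopping criterion's state and stored explicitly on the returned `DefaultState` instead of riding along in `kwargs...`. A standalone sketch of the pattern with stand-in types (not the package's actual `DefaultState`):

# Stand-in state type; the real `DefaultState` lives in this package.
Base.@kwdef mutable struct SketchState{I, S}
    iterate::I
    stopping_criterion_state::S
    iteration::Int = 0
end

function sketch_initialize_state(stopping_criterion; iterate, kwargs...)
    # `iterate` is now passed through to the stopping-criterion state...
    stopping_criterion_state = (; criterion = stopping_criterion, iterate)
    # ...and stored on the state explicitly rather than via `kwargs...`.
    return SketchState(; iterate, stopping_criterion_state, kwargs...)
end

state = sketch_initialize_state(:stop_after_10; iterate = rand(4))
state.iteration  # 0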
@@ -46,61 +46,6 @@ end
 function AI.increment!(problem::Problem, algorithm::Algorithm, state::State)
     return AI.increment!(state)
 end
-# ============================ solve! ======================================================
-
-# Custom version of `solve!` that allows specifying the logger and also overloads
-# `increment!` on the problem and algorithm.
-function basetypenameof(x)
-    return Symbol(last(split(String(Symbol(Base.typename(typeof(x)).wrapper)), ".")))
-end
-default_logging_context_prefix(x) = Symbol(basetypenameof(x), :_)
-function default_logging_context_prefix(problem::Problem, algorithm::Algorithm)
-    return Symbol(
-        default_logging_context_prefix(problem),
-        default_logging_context_prefix(algorithm)
-    )
-end
-function AI.solve!(
-    problem::Problem, algorithm::Algorithm, state::State;
-    logging_context_prefix = default_logging_context_prefix(problem, algorithm),
-    kwargs...
-)
-    logger = AI.algorithm_logger()
-
-    context_suffixes = [:Start, :PreStep, :PostStep, :Stop]
-    contexts = Dict(context_suffixes .=> Symbol.(logging_context_prefix, context_suffixes))
-
-    # initialize the state and emit message
-    AI.initialize_state!(problem, algorithm, state; kwargs...)
-    AI.emit_message(logger, problem, algorithm, state, contexts[:Start])
-
-    # main body of the algorithm
-    while !AI.is_finished!(problem, algorithm, state)
-        AI.increment!(problem, algorithm, state)
-
-        # logging event between convergence check and algorithm step
-        AI.emit_message(logger, problem, algorithm, state, contexts[:PreStep])
-
-        # algorithm step
-        AI.step!(problem, algorithm, state; logging_context_prefix)
-
-        # logging event between algorithm step and convergence check
-        AI.emit_message(logger, problem, algorithm, state, contexts[:PostStep])
-    end
-
-    # emit message about finished state
-    AI.emit_message(logger, problem, algorithm, state, contexts[:Stop])
-    return state
-end
-
-function AI.solve(
-    problem::Problem, algorithm::Algorithm;
-    logging_context_prefix = default_logging_context_prefix(problem, algorithm),
-    kwargs...
-)
-    state = AI.initialize_state(problem, algorithm; kwargs...)
-    return AI.solve!(problem, algorithm, state; logging_context_prefix, kwargs...)
-end

 # ============================ AlgorithmIterator ===========================================
@@ -151,8 +96,9 @@ end

 abstract type NestedAlgorithm <: Algorithm end

-function nested_algorithm(f::Function, nalgorithms::Int; kwargs...)
-    return DefaultNestedAlgorithm(f, nalgorithms; kwargs...)
+nested_algorithm(f::Function, int::Int; kwargs...) = nested_algorithm(f, 1:int; kwargs...)
+function nested_algorithm(f::Function, iterable; kwargs...)
+    return DefaultNestedAlgorithm(f, iterable; kwargs...)
 end

 max_iterations(algorithm::NestedAlgorithm) = length(algorithm.algorithms)
@@ -173,18 +119,12 @@ function set_substate!(
     return state
 end

-function AI.step!(
-    problem::AI.Problem, algorithm::NestedAlgorithm, state::AI.State;
-    logging_context_prefix = Symbol()
-)
+function AI.step!(problem::AI.Problem, algorithm::NestedAlgorithm, state::AI.State)
     # Get the subproblem, subalgorithm, and substate.
     subproblem, subalgorithm, substate = get_subproblem(problem, algorithm, state)

     # Solve the subproblem with the subalgorithm.
-    logging_context_prefix = Symbol(
-        logging_context_prefix, default_logging_context_prefix(subalgorithm)
-    )
-    AI.solve!(subproblem, subalgorithm, substate; logging_context_prefix)
+    AI.solve!(subproblem, subalgorithm, substate)

     # Update the state with the substate.
     set_substate!(problem, algorithm, state, substate)
@@ -206,8 +146,8 @@ from a list of stored algorithms.
     algorithms::Algorithms
     stopping_criterion::StoppingCriterion = AI.StopAfterIteration(length(algorithms))
 end
-function DefaultNestedAlgorithm(f::Function, nalgorithms::Int; kwargs...)
-    return DefaultNestedAlgorithm(; algorithms = f.(1:nalgorithms), kwargs...)
+function DefaultNestedAlgorithm(f::Function, iterable; kwargs...)
+    return DefaultNestedAlgorithm(; algorithms = f.(iterable), kwargs...)
 end

 # ============================ FlattenedAlgorithm ==========================================
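Taken together, the two hunks above generalize `nested_algorithm`/`DefaultNestedAlgorithm` from an integer sweep count to an arbitrary iterable, keeping the `Int` method as a convenience that forwards `1:n`. A hedged usage sketch, assuming the surrounding module is loaded (`make_sweep` is a hypothetical stand-in for whatever per-iteration constructor `f` is passed):

# Hypothetical per-iteration algorithm constructor.
make_sweep(x) = (; sweep = x)

# The `Int` method forwards to the iterable method via `1:n`,
# so these two calls construct the same nested algorithm:
nested_algorithm(make_sweep, 3)
nested_algorithm(make_sweep, 1:3)

# Any iterable now works, e.g. one truncation tolerance per sweep:
nested_algorithm(make_sweep, [1e-4, 1e-8, 1e-12])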
@@ -246,15 +186,14 @@ function AI.increment!(
     return state
 end
 function AI.step!(
-    problem::AI.Problem, algorithm::FlattenedAlgorithm, state::FlattenedAlgorithmState;
-    logging_context_prefix = Symbol()
+    problem::AI.Problem, algorithm::FlattenedAlgorithm, state::FlattenedAlgorithmState
 )
     algorithm_sweep = algorithm.algorithms[state.parent_iteration]
     state_sweep = AI.initialize_state(
         problem, algorithm_sweep;
         state.iterate, iteration = state.child_iteration
     )
-    AI.step!(problem, algorithm_sweep, state_sweep; logging_context_prefix)
+    AI.step!(problem, algorithm_sweep, state_sweep)
     state.iterate = state_sweep.iterate
     return state
 end
@@ -291,10 +230,17 @@ abstract type NonIterativeAlgorithmState <: State end
 function AI.initialize_state(problem::Problem, algorithm::NonIterativeAlgorithm; kwargs...)
     return DefaultNonIterativeAlgorithmState(; kwargs...)
 end
-function AI.solve!(
-    problem::Problem, algorithm::NonIterativeAlgorithm, state::State; kwargs...
+
+function AI.initialize_state!(
+    problem::Problem,
+    algorithm::NonIterativeAlgorithm,
+    state::NonIterativeAlgorithmState
 )
-    return throw(MethodError(AI.solve!, (problem, algorithm, state)))
+    return state
 end
+
+function AI.solve_loop!(problem::Problem, algorithm::NonIterativeAlgorithm, state::State)
+    return throw(MethodError(AI.solve_loop!, (problem, algorithm, state)))
+end

 @kwdef mutable struct DefaultNonIterativeAlgorithmState{Iterate} <: NonIterativeAlgorithmState
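The last hunk replaces the old `solve!` override, which threw for any non-iterative algorithm, with finer-grained hooks: `initialize_state!` becomes a no-op that returns the state, and the iteration loop itself is forbidden by throwing from `solve_loop!`. A self-contained sketch of the pattern with stand-in types (not the package's own):

abstract type SketchAlgorithm end
abstract type SketchNonIterative <: SketchAlgorithm end

struct OneShotContraction <: SketchNonIterative end

# State initialization is a harmless no-op for non-iterative algorithms.
initialize_state!(::SketchNonIterative, state) = state

# Entering the iteration loop is explicitly unsupported.
function solve_loop!(algorithm::SketchNonIterative, state)
    return throw(MethodError(solve_loop!, (algorithm, state)))
end

state = Dict{Symbol, Any}()
initialize_state!(OneShotContraction(), state)  # returns `state` unchanged
# solve_loop!(OneShotContraction(), state)      # throws a MethodError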
3 changes: 3 additions & 0 deletions src/ITensorNetworksNext.jl

@@ -9,4 +9,7 @@ include("contract_network.jl")
 include("sweeping/utils.jl")
 include("sweeping/eigenproblem.jl")

+include("beliefpropagation/messagecache.jl")
+include("beliefpropagation/beliefpropagationproblem.jl")
+
 end
2 changes: 1 addition & 1 deletion src/LazyNamedDimsArrays/lazynameddimsarray.jl

@@ -7,7 +7,7 @@ using WrappedUnions: @wrapped
     union::Union{A, Mul{LazyNamedDimsArray{T, A}}}
 end

-parenttype(::Type{LazyNamedDimsArray{<:Any, A}}) where {A} = A
+parenttype(::Type{LazyNamedDimsArray{T, A}}) where {T, A} = A
 parenttype(::Type{LazyNamedDimsArray{T}}) where {T} = AbstractNamedDimsArray{T}
 parenttype(::Type{LazyNamedDimsArray}) = AbstractNamedDimsArray
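The one-line `parenttype` fix is a classic Julia dispatch subtlety: because `Type{...}` is invariant, `::Type{LazyNamedDimsArray{<:Any, A}}` matches only the partially-applied `UnionAll` type itself, never a fully concrete two-parameter type. A standalone illustration with a stand-in wrapper type:

struct Wrap{T, A} end  # stand-in for LazyNamedDimsArray

# Broken: only matches the UnionAll `Wrap{<:Any, A}`, not concrete types.
parenttype_old(::Type{Wrap{<:Any, A}}) where {A} = A

# Fixed: binds both parameters, so concrete types dispatch as intended.
parenttype_new(::Type{Wrap{T, A}}) where {T, A} = A

parenttype_new(Wrap{Float64, Matrix{Float64}})    # Matrix{Float64}
# parenttype_old(Wrap{Float64, Matrix{Float64}})  # MethodError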