Upgrade-OMEinsum, File IO #101

Merged 3 commits · Jul 28, 2025
8 changes: 2 additions & 6 deletions Makefile
Original file line number Diff line number Diff line change
@@ -3,14 +3,10 @@ JL = julia --project
default: init test

init:
$(JL) -e 'using Pkg; Pkg.precompile()'
init-docs:
$(JL) -e 'using Pkg; Pkg.activate("docs"); Pkg.develop(path="."), Pkg.precompile()'
$(JL) -e 'using Pkg; Pkg.precompile(); Pkg.activate("docs"); Pkg.develop(path=".")'

update:
$(JL) -e 'using Pkg; Pkg.update(); Pkg.precompile()'
update-docs:
$(JL) -e 'using Pkg; Pkg.activate("docs"); Pkg.update(); Pkg.precompile()'
$(JL) -e 'using Pkg; Pkg.update(); Pkg.activate("docs"); Pkg.update()'

test:
$(JL) -e 'using Pkg; Pkg.test("GenericTensorNetworks")'
4 changes: 2 additions & 2 deletions Project.toml
@@ -1,7 +1,7 @@
name = "GenericTensorNetworks"
uuid = "3521c873-ad32-4bb4-b63d-f4f178f42b49"
authors = ["GiggleLiu <cacate0129@gmail.com> and contributors"]
version = "4.0.1"
version = "4.1.0"

[deps]
AbstractTrees = "1520ce14-60c1-5f80-bbc7-55ef81b5835c"
@@ -38,7 +38,7 @@ FFTW = "1.4"
Graphs = "1.7"
LinearAlgebra = "1"
LuxorGraphPlot = "0.5"
OMEinsum = "0.8"
OMEinsum = "0.9.1"
Polynomials = "4"
Primes = "0.5"
ProblemReductions = "0.3"
9 changes: 4 additions & 5 deletions docs/src/performancetips.md
@@ -20,7 +20,7 @@ using GenericTensorNetworks, Graphs, Random
graph = random_regular_graph(120, 3)
iset = IndependentSet(graph)
problem = GenericTensorNetwork(iset; optimizer=TreeSA(
sc_target=20, sc_weight=1.0, rw_weight=3.0, ntrials=10, βs=0.01:0.1:15.0, niters=20))
score=ScoreFunction(sc_target=20, sc_weight=1.0, rw_weight=3.0), ntrials=10, βs=0.01:0.1:15.0, niters=20))
```

The `GenericTensorNetwork` constructor maps a problem to a tensor network with an optimized contraction order. The `optimizer` parameter specifies the algorithm to use:
@@ -75,20 +75,19 @@ The finite field approach requires only 298 KB, while using the `Polynomial` typ
## 2. Slicing Technique for Large Problems

For large-scale applications, you can slice over certain degrees of freedom to reduce space complexity. This approach loops and accumulates over selected degrees of freedom, resulting in smaller tensor networks inside the loop.

In the `TreeSA` optimizer, set `nslices` to a value greater than zero:
This can be achieved by setting the `slicer` parameter of the `GenericTensorNetwork` constructor.

```julia
# Without slicing
problem = GenericTensorNetwork(iset; optimizer=TreeSA(βs=0.01:0.1:25.0, ntrials=10, niters=10))
contraction_complexity(problem)

# With slicing over 5 degrees of freedom
problem = GenericTensorNetwork(iset; optimizer=TreeSA(βs=0.01:0.1:25.0, ntrials=10, niters=10, nslices=5))
problem = GenericTensorNetwork(iset; optimizer=TreeSA(βs=0.01:0.1:25.0, ntrials=10, niters=10), slicer=TreeSASlicer(score=ScoreFunction(sc_target=10)))
contraction_complexity(problem)
```

In this example, slicing over 5 degrees of freedom reduces space complexity by a factor of 32 (2^5), while increasing computation time by less than a factor of 2.
In this example, slicing with the `TreeSASlicer` reduces the space complexity to 2^10, at the cost of increased time complexity.

## 3. Accelerating Tropical Number Operations

8 changes: 8 additions & 0 deletions docs/src/ref.md
@@ -144,6 +144,14 @@ SABipartite
KaHyParBipartite
MergeVectors
MergeGreedy
TreeSASlicer
ScoreFunction
```

## FileIO
```@docs
save_tensor_network
load_tensor_network
```

## Others
6 changes: 5 additions & 1 deletion src/GenericTensorNetworks.jl
@@ -4,6 +4,7 @@ using Core: Argument
using TropicalNumbers
using OMEinsum
using OMEinsum: contraction_complexity, timespace_complexity, timespacereadwrite_complexity, getixsv, NestedEinsum, getixs, getiy, DynamicEinCode
using OMEinsum.OMEinsumContractionOrders.JSON
using Graphs, Random
using DelimitedFiles, Serialization
using LuxorGraphPlot
@@ -26,7 +27,7 @@ import StatsBase

# OMEinsum
export timespace_complexity, timespacereadwrite_complexity, contraction_complexity, @ein_str, getixsv, getiyv
export GreedyMethod, TreeSA, SABipartite, KaHyParBipartite, MergeVectors, MergeGreedy
export GreedyMethod, TreeSA, SABipartite, KaHyParBipartite, MergeVectors, MergeGreedy, TreeSASlicer, ScoreFunction

# estimate memory
export estimate_memory
@@ -80,6 +81,9 @@ export read_size, read_count, read_config, read_size_count, read_size_config
export show_graph, show_configs, show_einsum, GraphDisplayConfig, render_locs, show_landscape
export AbstractLayout, SpringLayout, StressLayout, SpectralLayout, Layered, LayeredSpringLayout, LayeredStressLayout

# FileIO
export save_tensor_network, load_tensor_network

project_relative_path(xs...) = normpath(joinpath(dirname(dirname(pathof(@__MODULE__))), xs...))

# Mods.jl fixed to v1.3.4
66 changes: 66 additions & 0 deletions src/fileio.jl
@@ -123,3 +123,69 @@ function dict_deserialize_tree(id::UInt, d::Dict)
end
end

"""
save_tensor_network(tn::GenericTensorNetwork; folder::String)

Serialize a tensor network to disk for storage/reloading. Creates three structured files:
- `code.json`: OMEinsum contraction code (tree structure and contraction order)
- `fixedvertices.json`: JSON-serialized Dict of pinned vertex configurations
- `problem.json`: Problem specification using ProblemReductions serialization

The target folder will be created recursively if it doesn't exist. Files are overwritten
if they already exist. Uses JSON for human-readable serialization with type preservation.

The saved files can be loaded using [`load_tensor_network`](@ref).

# Arguments
- `tn::GenericTensorNetwork`: a [`GenericTensorNetwork`](@ref) instance to serialize. Must contain valid code, problem, and fixedvertices fields.
- `folder::String`: Destination directory path. Parent directories will be created as needed.
"""
function save_tensor_network(tn::GenericTensorNetwork; folder::String)
!isdir(folder) && mkpath(folder)

OMEinsum.writejson(joinpath(folder, "code.json"), tn.code)

open(joinpath(folder, "fixedvertices.json"), "w") do io
JSON.print(io, tn.fixedvertices, 2)
end

ProblemReductions.writejson(joinpath(folder, "problem.json"), tn.problem)
return nothing
end

"""
load_tensor_network(folder::String) -> GenericTensorNetwork

Load a tensor network from disk that was previously saved using [`save_tensor_network`](@ref).
Reconstructs the network from three required files: contraction code, fixed vertices mapping, and problem specification.

# Arguments
- `folder::String`: Path to directory containing saved network files. Must contain:
- `code.json`: Contraction order/structure from OMEinsum
- `fixedvertices.json`: Dictionary of pinned vertex states
- `problem.json`: Problem specification and parameters

# Returns
- `GenericTensorNetwork`: Reconstructed tensor network.
"""
function load_tensor_network(folder::String)
!isdir(folder) && throw(SystemError("Folder not found: $folder"))

code_path = joinpath(folder, "code.json")
fixed_path = joinpath(folder, "fixedvertices.json")
problem_path = joinpath(folder, "problem.json")

!isfile(code_path) && throw(SystemError("Code file not found: $code_path"))
!isfile(fixed_path) && throw(SystemError("Fixedvertices file not found: $fixed_path"))
!isfile(problem_path) && throw(SystemError("Problem file not found: $problem_path"))

code = OMEinsum.readjson(code_path)

fixed_dict = JSON.parsefile(fixed_path)
fixedvertices = Dict{labeltype(code),Int}(parse(Int, k) => v for (k, v) in fixed_dict)

problem = ProblemReductions.readjson(problem_path)

return GenericTensorNetwork(problem, code, fixedvertices)
end
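The two functions above round-trip through the same folder; a minimal usage sketch (the graph choice and temporary folder are illustrative, mirroring the test suite):

```julia
using GenericTensorNetworks, Graphs

# Build a small network with one pinned vertex.
problem = IndependentSet(smallgraph(:petersen))
tn = GenericTensorNetwork(problem; fixedvertices=Dict(1 => 0))

# Write code.json, fixedvertices.json, and problem.json, then reload.
folder = mktempdir()
save_tensor_network(tn; folder=folder)
tn2 = load_tensor_network(folder)

@assert tn.fixedvertices == tn2.fixedvertices
```

Note that `load_tensor_network` throws a `SystemError` when the folder or any of the three files is missing, so callers can fail fast on incomplete saves.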

23 changes: 15 additions & 8 deletions src/networks.jl
@@ -16,29 +16,36 @@ end

"""
$TYPEDEF
GenericTensorNetwork(problem::ConstraintSatisfactionProblem; openvertices=(), fixedvertices=Dict(), optimizer=GreedyMethod())
GenericTensorNetwork(problem::ConstraintSatisfactionProblem; openvertices=(), fixedvertices=Dict(), optimizer=GreedyMethod(), slicer=nothing)

The generic tensor network generated from a [`ConstraintSatisfactionProblem`](@ref).

Positional arguments
-------------------------------
* `problem` is the graph problem.
* `code` is the tensor network contraction code.
* `fixedvertices` is a dictionary specifying the fixed dimensions.
- `problem` is the constraint satisfaction problem.

Keyword arguments
-------------------------------
- `openvertices` is a vector of open indices, i.e. the degrees of freedom that appear in the output tensor.
- `fixedvertices` is a dictionary specifying the fixed degrees of freedom. For example, to fix variable `5` to `0`, set `fixedvertices = Dict(5 => 0)`.
- `optimizer` is the contraction order optimizer for the generated tensor network.
- `slicer` is the slicer for the tensor network; it reduces memory usage at the cost of computing time by slicing the tensor network.

For more information about contraction order optimization and slicing, please refer to the [OMEinsumContractionOrders documentation](https://tensorbfs.github.io/OMEinsumContractionOrders.jl/dev/).
"""
struct GenericTensorNetwork{CFG, CT, LT}
problem::CFG
code::CT
fixedvertices::Dict{LT,Int}
end
function GenericTensorNetwork(problem::ConstraintSatisfactionProblem; openvertices=(), fixedvertices=Dict(), optimizer=GreedyMethod())
function GenericTensorNetwork(problem::ConstraintSatisfactionProblem; openvertices=(), fixedvertices=Dict(), optimizer=GreedyMethod(), slicer=nothing)
rcode = rawcode(problem; openvertices)
code = _optimize_code(rcode, uniformsize_fix(rcode, num_flavors(problem), fixedvertices), optimizer, MergeVectors())
code = _optimize_code(rcode, uniformsize_fix(rcode, num_flavors(problem), fixedvertices), optimizer, MergeVectors(), slicer)
return GenericTensorNetwork(problem, code, Dict{labeltype(code),Int}(fixedvertices))
end
# a unified interface to optimize the contraction code
_optimize_code(code, size_dict, optimizer::Nothing, simplifier) = code
_optimize_code(code, size_dict, optimizer, simplifier) = optimize_code(code, size_dict, optimizer, simplifier)
_optimize_code(code, size_dict, optimizer::Nothing, simplifier, slicer) = code
_optimize_code(code, size_dict, optimizer, simplifier, slicer) = optimize_code(code, size_dict, optimizer; simplifier, slicer)
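The new `slicer` keyword flows through `_optimize_code` into OMEinsum's `optimize_code`; a short usage sketch, with parameter values borrowed from the test suite rather than recommended defaults:

```julia
using GenericTensorNetworks, Graphs

g = smallgraph(:petersen)
# Optimize the contraction order with TreeSA, then slice until the
# space complexity target (2^2 here) is reached.
tn = GenericTensorNetwork(IndependentSet(g);
    optimizer=TreeSA(ntrials=1),
    slicer=TreeSASlicer(score=ScoreFunction(sc_target=2)))
contraction_complexity(tn)
```

Passing `optimizer=nothing` skips optimization entirely and returns the raw code, as the first `_optimize_code` method shows.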

function Base.show(io::IO, tn::GenericTensorNetwork)
println(io, "$(typeof(tn))")
27 changes: 27 additions & 0 deletions test/fileio.jl
@@ -51,3 +51,30 @@ end
@test ma == tree
end

@testset "save load GenericTensorNetwork" begin
g = smallgraph(:petersen)
problem = IndependentSet(g, UnitWeight(10))
tn = GenericTensorNetwork(problem; fixedvertices=Dict(1=>0, 2=>1))
folder = tempname()
save_tensor_network(tn; folder=folder)
tn2 = load_tensor_network(folder)
@test tn.problem == tn2.problem
@test tn.code == tn2.code
@test tn.fixedvertices == tn2.fixedvertices
@test solve(tn, SizeMax()) == solve(tn2, SizeMax())

# test with empty fixedvertices
tn3 = GenericTensorNetwork(problem)
folder2 = tempname()
save_tensor_network(tn3; folder=folder2)
tn4 = load_tensor_network(folder2)
@test tn3.problem == tn4.problem
@test tn3.code == tn4.code
@test tn3.fixedvertices == tn4.fixedvertices

# test error cases
empty_folder = tempname()
mkpath(empty_folder)
@test_throws SystemError load_tensor_network(empty_folder)
end

4 changes: 2 additions & 2 deletions test/interfaces.jl
@@ -86,7 +86,7 @@ end

@testset "slicing" begin
g = Graphs.smallgraph("petersen")
gp = GenericTensorNetwork(IndependentSet(g), optimizer=TreeSA(nslices=5, ntrials=1))
gp = GenericTensorNetwork(IndependentSet(g), optimizer=TreeSA(ntrials=1), slicer=TreeSASlicer(score=ScoreFunction(sc_target=2)))
res1 = solve(gp, SizeMax())[]
res2 = solve(gp, CountingAll())[]
res3 = solve(gp, CountingMax(Single))[]
@@ -278,4 +278,4 @@ end
graph = UnitDiskGraph(fullerene(), sqrt(5))
spin_glass = SpinGlass(graph, UnitWeight(ne(graph)), zeros(Int, nv(graph)))
@test log(solve(spin_glass, PartitionFunction(1.0))[])/nv(graph) ≈ 1.3073684577607942
end
end