
Conversation

@DhairyaLGandhi (Member) commented May 25, 2025

In several downstream SciML packages (see SciML/ModelingToolkit.jl#3585, SciML/SciMLSensitivity.jl#1212, and others), accumulation into the tangent of a mutable struct throws errors of the form:

julia> gs = gradient(new_p) do new_p
           new_params = SciMLStructures.replace(SciMLStructures.Tunable(), mtkparams, new_p)
           new_prob = remake(prob, p = new_params)
           new_sol = solve(new_prob, Tsit5())
           sum(new_sol)
       end
ERROR: MethodError: no method matching +(::Base.RefValue{…}, ::@NamedTuple{})
The function `+` exists, but no method is defined for this combination of argument types.

Closest candidates are:
  +(::Any, ::Any, ::Any, ::Any...)
   @ Base operators.jl:596
  +(::MutableArithmetics.Zero, ::Any)
   @ MutableArithmetics ~/.julia/packages/MutableArithmetics/tNSBd/src/rewrite.jl:64
  +(::Any, ::MutableArithmetics.Zero)
   @ MutableArithmetics ~/.julia/packages/MutableArithmetics/tNSBd/src/rewrite.jl:65
  ...

Stacktrace:
  [1] accum(x::Base.RefValue{…}, y::@NamedTuple{})
    @ Zygote ~/.julia/packages/Zygote/wfLOG/src/lib/lib.jl:9
  [2] accum(::Base.RefValue{…}, ::@NamedTuple{}, ::Nothing, ::Vararg{…})
    @ Zygote ~/.julia/packages/Zygote/wfLOG/src/lib/lib.jl:14
  [3] maybe_eager_initialize_problem
    @ ~/.julia/packages/SciMLBase/z4OYD/src/remake.jl:1211 [inlined]
  [4] (::Zygote.Pullback{Tuple{…}, Any})(Δ::Tuple{Thunk{…}, @NamedTuple{…}})
    @ Zygote ~/.julia/packages/Zygote/wfLOG/src/compiler/interface2.jl:0
  [5] #remake#753
    @ ~/.julia/packages/SciMLBase/z4OYD/src/remake.jl:266 [inlined]
  [6] (::Zygote.Pullback{Tuple{…}, Any})(Δ::Base.RefValue{Any})
    @ Zygote ~/.julia/packages/Zygote/wfLOG/src/compiler/interface2.jl:0
  [7] remake
    @ ~/.julia/packages/SciMLBase/z4OYD/src/remake.jl:214 [inlined]
  [8] (::Zygote.Pullback{Tuple{…}, Any})(Δ::Base.RefValue{Any})
    @ Zygote ~/.julia/packages/Zygote/wfLOG/src/compiler/interface2.jl:0
  [9] #12
    @ ./REPL[28]:3 [inlined]
 [10] (::Zygote.Pullback{Tuple{…}, Tuple{…}})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/wfLOG/src/compiler/interface2.jl:0
 [11] (::Zygote.var"#88#89"{Zygote.Pullback{Tuple{}, Tuple{}}})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/wfLOG/src/compiler/interface.jl:97
 [12] gradient(f::Function, args::Vector{Float64})
    @ Zygote ~/.julia/packages/Zygote/wfLOG/src/compiler/interface.jl:154
 [13] top-level scope
    @ REPL[28]:1
Some type information was truncated. Use `show(err)` to see complete types.

The reason is that the literal_getproperty adjoint returns a Ref instead of the Tangent itself for mutable structs. This creates an issue at the accumulation stage when only a partial graph has been accumulated.
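
As a minimal illustration (a reproduction of the mismatch, not code from this PR): the stacktrace above ends in accum at src/lib/lib.jl:9, where accum falls back to x + y, and there is no + method that combines a cached RefValue with a plain NamedTuple tangent.

using Zygote

# On a Zygote build without this change, this reproduces the failure mode:
# a RefValue (the cached tangent of a mutable struct) meets a NamedTuple
# (the tangent coming from another branch of the graph).
Zygote.accum(Ref{Any}((f = ones(3),)), (f = ones(3),))
# ERROR: MethodError: no method matching +(::Base.RefValue{Any}, ::@NamedTuple{f::Vector{Float64}})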

Motivating case:

using Zygote

mutable struct MWEM{F}
    f::F
end
ctx = Zygote.Context()
y,b = Zygote.pullback(ctx, MWEM(rand(3))) do iprob
    sum(iprob.f)
end
b(y) # ((f = [1.3008144940009554, 1.3008144940009554, 1.3008144940009554],),)
ctx.cache 
# IdDict{Any, Any} with 1 entry:
#   MWEM{Vector{Float64}}([0.178069, 0.96279, 0.159955]) => RefValue{Any}((f = [1.30081, 1.30081, 1.30081],))

Notice how the Ref is stripped in the returned output, while the context cache still holds a RefValue.

Since the accumulation for mutables happens inside the pullback (and has several other issues with introducing statefulness), this change unifies the return types from both branches, and thus fixes the errors seen downstream.
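
For reference, a sketch of what unifying the two branches could look like (illustrative only; not necessarily the merged diff): accumulate a plain tangent into the cached Ref, so the RefValue remains the single container for the mutable struct's gradient.

import Zygote: accum

# Sketch only: combine a cached RefValue with a plain NamedTuple tangent by
# accumulating into the Ref's contents instead of calling + on the pair.
accum(x::Base.RefValue, y::NamedTuple) = (x[] = accum(x[], y); x)
accum(x::NamedTuple, y::Base.RefValue) = (y[] = accum(x, y[]); y)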

PR Checklist

  • Tests are added
  • Documentation, if applicable

@DhairyaLGandhi changed the title from "Accumulate mutable structs properly" to "Accumulate mutable structs with Ref" on May 26, 2025
@DhairyaLGandhi (Member, Author) commented May 26, 2025

Not sure why the CUDA test fails; with Metal, I see:

julia> a_gpu = a |> mtl
9-element MtlVector{Float32, Metal.PrivateStorage}:
 1.0
 2.0
 3.0
 4.0
 5.0
 6.0
 7.0
 8.0
 9.0

julia> gradient(x -> sum(x .^ 3) / count(x .> 3), a_gpu)[1]
9-element MtlVector{Float32, Metal.PrivateStorage}:
  0.5
  2.0
  4.5
  8.0
 12.5
 18.0
 24.5
 32.0
 40.5

@DhairyaLGandhi (Member, Author) commented May 26, 2025

Interestingly, I see the same issue as Buildkite with the current released tags; a quick analytic sanity check follows the transcript below.

(jl_h6hRNW) pkg> st
Status `/tmp/jl_h6hRNW/Project.toml`
  [052768ef] CUDA v5.8.1
  [e88e6eb3] Zygote v0.7.7

julia> a = Float32.(1:9)
9-element Vector{Float32}:
 1.0
 2.0
 3.0
 4.0
 5.0
 6.0
 7.0
 8.0
 9.0

julia> a_gpu = a |> cu
9-element CuArray{Float32, 1, CUDA.DeviceMemory}:
 1.0
 2.0
 3.0
 4.0
 5.0
 6.0
 7.0
 8.0
 9.0

julia> g3 = gradient(x -> sum(x .^ 3) / count(x .> 3), a)[1]
9-element Vector{Float32}:
  0.5
  2.0
  4.5
  8.0
 12.5
 18.0
 24.5
 32.0
 40.5

julia> gradient(x -> sum(x .^ 3) / count(x .> 3), a_gpu)[1]
9-element CuArray{Float32, 1, CUDA.DeviceMemory}:
  0.42857146
  1.7142859
  3.8571432
  6.8571434
 10.714287
 15.428573
 21.000002
 27.428574
 34.714287
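
For comparison, a quick analytic sanity check (assumption: the count factor is treated as a constant by the gradient, so the expected result is 3 .* x .^ 2 ./ count(x .> 3)):

# For a = 1:9 the count of elements greater than 3 is 6.
3 .* a .^ 2 ./ 6   # 0.5, 2.0, 4.5, ... -- matches the CPU and Metal results
3 .* a .^ 2 ./ 7   # ~0.4286, ~1.714, ... -- matches the CUDA output, as if the count came out as 7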

@DhairyaLGandhi (Member, Author) commented May 27, 2025

using Zygote

mutable struct MWEGetter{G, U, P, T}
    idxs::G
    u::U
    p::P
    t::T
end

u = ones(3)
p = ones(3)
t_ = 1.5
idxs = [1, 2]
mwe = MWEGetter(idxs, u, p, t_)

function fn(mwe)
    map(i -> mwe.u[i], mwe.idxs)
end

gs = Zygote.gradient(mwe) do mwe
    sum(fn(mwe))
end

produces

ERROR: MethodError: no method matching +(::Base.RefValue{Any}, ::@NamedTuple{getter::Nothing, u::Vector{Float64}, p::Nothing, t::Nothing})

Closest candidates are:
  +(::Any, ::Any, ::Any, ::Any...)
   @ Base operators.jl:587
  +(::ZeroTangent, ::Any)
   @ ChainRulesCore ~/.julia/packages/ChainRulesCore/U6wNx/src/tangent_arithmetic.jl:99
  +(::Any, ::Symbolics.SemiMonomial)
   @ Symbolics ~/.julia/packages/Symbolics/kQzvO/src/semipoly.jl:31
  ...

Stacktrace:
 [1] accum(x::Base.RefValue{Any}, y::@NamedTuple{getter::Nothing, u::Vector{Float64}, p::Nothing, t::Nothing})
   @ Zygote ~/Downloads/arpa/jsmo/t2/Zygote.jl/src/lib/lib.jl:9
 [2] fn2
   @ ~/Downloads/arpa/jsmo/t2/JuliaSimExampleComponents/base_err.jl:455 [inlined]
 [3] (::Zygote.Pullback{Tuple{…}, Tuple{…}})(Δ::Tuple{Float64, Float64})
   @ Zygote ~/Downloads/arpa/jsmo/t2/Zygote.jl/src/compiler/interface2.jl:0
 [4] #739
   @ ~/Downloads/arpa/jsmo/t2/JuliaSimExampleComponents/base_err.jl:459 [inlined]
 [5] (::Zygote.var"#88#89"{Zygote.Pullback{Tuple{}, Tuple{}}})(Δ::Float64)
   @ Zygote ~/Downloads/arpa/jsmo/t2/Zygote.jl/src/compiler/interface.jl:97
 [6] gradient(f::Function, args::MWEGetter{Tuple{Int64, Int64}, Vector{Float64}, Vector{Float64}, Float64})
   @ Zygote ~/Downloads/arpa/jsmo/t2/Zygote.jl/src/compiler/interface.jl:154
 [7] top-level scope
   @ ~/Downloads/arpa/jsmo/t2/JuliaSimExampleComponents/base_err.jl:458
Some type information was truncated. Use `show(err)` to see complete types.

cc @AayushSabharwal

@DhairyaLGandhi marked this pull request as ready for review on May 27, 2025 at 13:26
@DhairyaLGandhi (Member, Author) commented:

bump cc @ChrisRackauckas

@ChrisRackauckas (Member) commented:

I don't have review or merge rights here. @mcabbott could you please take a look? This is pretty urgent for SciML since we have a lot of regressions on v0.7 without this.

@ChrisRackauckas (Member) commented:

Bump

@DhairyaLGandhi (Member, Author) commented:

bump @mcabbott @ToucheSir

@ChrisRackauckas (Member) commented:

@oxinabox can you take a look? You seem to have committer status and it seems the others are gone?

@oxinabox (Member) commented Jun 5, 2025

I am incredibly busy for the next few weeks

@oxinabox (Member) left a review comment:

ok, but this is quite small and fine

CarloLucibello and others added 2 commits on June 5, 2025 at 09:30 (co-authored by Frames White)
@CarloLucibello merged commit 48647d9 into FluxML:master on Jun 5, 2025 (9 of 11 checks passed)
@ChrisRackauckas (Member) commented:

Thanks! @AayushSabharwal
