Problem description and MWE
The attached script is a reduced and modified variant of simple_hybrid_ME.ipynb.
Anything that is not necessary to reproduce the error has been removed.
The modifications are:
- inclusion of recordEigenvaluesSensitivity=:ForwardDiff, recordEigenvalues=true in the loss function (lossSum)
- renaming of train! to _train! *, only one iteration, removal of some arguments, and addition of gradient=:ForwardDiff
# imports
using FMI
using FMIFlux
using FMIFlux.Flux
using FMIZoo
using DifferentialEquations: Tsit5
import Plots
# set seed
import Random
Random.seed!(42);
tStart = 0.0
tStep = 0.01
tStop = 5.0
tSave = collect(tStart:tStep:tStop)
realFMU = fmiLoad("SpringFrictionPendulum1D", "Dymola", "2022x")
fmiInfo(realFMU)
initStates = ["s0", "v0"]
x₀ = [0.5, 0.0]
params = Dict(zip(initStates, x₀))
vrs = ["mass.s", "mass.v", "mass.a", "mass.f"]
realSimData = fmiSimulate(realFMU, (tStart, tStop); parameters=params, recordValues=vrs, saveat=tSave)
posReal = fmi2GetSolutionValue(realSimData, "mass.s")
fmiUnload(realFMU)
simpleFMU = fmiLoad("SpringPendulum1D", "Dymola", "2022x")
# loss function for training
function lossSum(p)
    global posReal
    solution = neuralFMU(x₀; p=p, recordEigenvaluesSensitivity=:ForwardDiff, recordEigenvalues=true)
    posNet = fmi2GetSolutionState(solution, 1; isIndex=true)
    FMIFlux.Losses.mse(posReal, posNet)
end
# NeuralFMU setup
numStates = fmiGetNumberOfStates(simpleFMU)
net = Chain(x -> simpleFMU(x=x, dx_refs=:all),
            Dense(numStates, 16, tanh),
            Dense(16, 16, tanh),
            Dense(16, numStates))
neuralFMU = ME_NeuralFMU(simpleFMU, net, (tStart, tStop), Tsit5(); saveat=tSave);
# train
paramsNet = FMIFlux.params(neuralFMU)
optim = Adam()
FMIFlux._train!(lossSum, paramsNet, Iterators.repeated((), 1), optim; gradient=:ForwardDiff)
Reported error
MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{typeof(lossSum), Float64}, Float64, 32})
Closest candidates are:
(::Type{T})(::Real, ::RoundingMode) where T<:AbstractFloat
@ Base rounding.jl:207
(::Type{T})(::T) where T<:Number
@ Core boot.jl:792
Float64(::IrrationalConstants.Fourπ)
@ IrrationalConstants C:\Users\JUR\.julia\packages\IrrationalConstants\vp5v4\src\macro.jl:112
...
Remarks
The same error happens if you use recordEigenvaluesSensitivity=:none in lossSum (while keeping gradient=:ForwardDiff in _train!).
You can replace the gradient with :ReverseDiff (in lossSum and in _train!) and end up with another error, so that doesn't work either.
The combination :none (in lossSum) and :ReverseDiff (in _train!) works; however, if one wants to include the eigenvalues in the sensitivity calculation, this is not an option, is it? A minimal sketch of this working combination follows.
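For reference, this is a minimal sketch of the combination that runs for me, assuming the same setup as the MWE above (the name lossSumWorking is hypothetical; the body is identical to lossSum except for the sensitivity option):

# sketch of the working combination: eigenvalues are recorded,
# but their sensitivities are not propagated through AD
function lossSumWorking(p)  # hypothetical name
    global posReal
    solution = neuralFMU(x₀; p=p, recordEigenvaluesSensitivity=:none, recordEigenvalues=true)
    posNet = fmi2GetSolutionState(solution, 1; isIndex=true)
    FMIFlux.Losses.mse(posReal, posNet)
end
FMIFlux._train!(lossSumWorking, paramsNet, Iterators.repeated((), 1), optim; gradient=:ReverseDiff)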
I haven't tried it with Zygote, I don't care about Zygote ;-)
* It is actually a bug that this has not been updated, but _train! is probably not the preferred resolution; rather train! with the neuralFMU instead of params as the second argument, as sketched below.
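If that is indeed the intended entry point, the call would presumably look something like the following (a sketch only, based on the footnote above; I have not verified the exact signature):

# sketch of the presumably preferred call: train! with the neuralFMU
# (not the parameter vector) as the second argument
FMIFlux.train!(lossSum, neuralFMU, Iterators.repeated((), 1), optim; gradient=:ReverseDiff)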