NEWS.md (7 additions, 5 deletions)

@@ -4,11 +4,13 @@ See also [github's page](https://github.com/FluxML/Flux.jl/releases) for a complete list of changes.
## v0.15.0

* Recurrent layers have undergone a complete redesign in [PR 2500](https://github.com/FluxML/Flux.jl/pull/2500).
- * `RNN`, `LSTM`, and `GRU` no longer store the hidden state internally. Instead, they now take the previous state as input and return the updated state as output.
- * These layers (`RNN`, `LSTM`, `GRU`) now process entire sequences at once, rather than one element at a time.
- * The `Recur` wrapper has been deprecated and removed.
- * The `reset!` function has also been removed; state management is now entirely up to the user.
- * `RNNCell`, `LSTMCell`, and `GRUCell` are now exported and provide functionality for single time-step processing.
+ * `RNNCell`, `LSTMCell`, and `GRUCell` are now exported and provide functionality for single time-step processing: `rnncell(x_t, h_t) -> h_{t+1}`.
+ * `RNN`, `LSTM`, and `GRU` no longer store the hidden state internally; it has to be explicitly passed to the layer. Moreover, they now process entire sequences at once, rather than one element at a time: `rnn(x, h) -> h′`.
+ * The `Recur` wrapper has been deprecated and removed.
+ * The `reset!` function has also been removed; state management is now entirely up to the user.
+ * The `Flux.Optimise` module has been deprecated in favor of the Optimisers.jl package.
+   Now Flux re-exports the optimisers from Optimisers.jl. Most users will be unaffected by this change.
+   The module is still available for now, but will be removed in a future release.

## v0.14.22
* Data movement between devices is now provided by [MLDataDevices.jl](https://github.com/LuxDL/MLDataDevices.jl).
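
The recurrence entries above replace implicitly stored state with explicit state passing. Below is a minimal sketch of both call styles. Only the signatures `rnncell(x_t, h_t) -> h_{t+1}` and `rnn(x, h) -> h′` come from the changelog itself; the array layout (features × len × batch), the zero-initialized state, and the dimensions are illustrative assumptions.

```julia
using Flux

d_in, d_out, len, batch = 3, 5, 10, 16

# Single time-step processing with an exported cell: the previous state
# goes in, the updated state comes out. No state is stored in the cell.
cell = RNNCell(d_in => d_out)
h = zeros(Float32, d_out, batch)      # user-managed hidden state
x_t = rand(Float32, d_in, batch)      # one sequence element
h = cell(x_t, h)                      # h_{t+1}

# Whole-sequence processing with the layer: the full sequence is passed
# at once together with an explicit initial state (no Recur, no reset!).
rnn = RNN(d_in => d_out)
x = rand(Float32, d_in, len, batch)   # entire sequence
h0 = zeros(Float32, d_out, batch)
h′ = rnn(x, h0)                       # updated state for all steps
```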

src/outputsize.jl (0 additions, 2 deletions)

@@ -302,8 +302,6 @@ function ChainRulesCore.rrule(::typeof(striplazy), m)
striplazy(m), _ -> error("striplazy should never be used within a gradient")
end
- params!(p::Params, x::LazyLayer, seen = IdSet()) = error("LazyLayer should never be used within params(m). Call striplazy(m) first.")
-
Functors.functor(::Type{<:LazyLayer}, x) = error("LazyLayer should not be walked with Functors.jl, as the arrays which Flux.gpu wants to move may not exist yet.")
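
On the `Flux.Optimise` deprecation noted in the NEWS.md diff above: optimiser state is now held explicitly by the user through the Optimisers.jl interface that Flux re-exports. A minimal sketch of one training step under that interface; the model, data, and learning rate are placeholders chosen for illustration.

```julia
using Flux

model = Dense(2 => 1)
opt_state = Flux.setup(Adam(0.01), model)   # explicit, user-held optimiser state

x = rand(Float32, 2, 32)
y = rand(Float32, 1, 32)

grads = Flux.gradient(m -> Flux.mse(m(x), y), model)
Flux.update!(opt_state, model, grads[1])    # updates model and opt_state
```

Compared with the old `Flux.Optimise` flow, the key change is that `opt_state` is created once with `Flux.setup` and threaded through `Flux.update!`, rather than being stored inside the optimiser object.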