
Releases: LuxDL/Lux.jl

Lux v0.4.17 (22 Aug 02:45, commit 4e394f7)

Diff since v0.4.16

Closed issues:

  • Inconsistent description of PairwiseFusion (#130)
  • No method matching with argument IRTools.Inner.Undefined in gradient computation. (#134)

Merged pull requests:

  • LSTM docs: don't go over first element in sequence twice (#132) (@visr)
  • fix PairwiseFusion docs (#133) (@MilkshakeForReal)
  • Generic recurrent cells (#136) (@jumerckx)
  • relu tests with finite diff is too unreliable (#137) (@avik-pal)

Lux v0.4.16 (09 Aug 04:29, commit 605aa14)

Diff since v0.4.15

Merged pull requests:

Lux v0.4.15 (07 Aug 17:05, commit e7e64c1)

Diff since v0.4.14

Merged pull requests:

Lux v0.4.14 (07 Aug 15:45, commit 6041cdb)

Diff since v0.4.13

Merged pull requests:

Lux v0.4.13 (06 Aug 21:17, commit 3559802)

Diff since v0.4.12

Closed issues:

  • Make it easier to pass empty state st = (;) (#118)
  • is there transposed convolution (#122)
  • Support for multidimensional data? (#123)

Merged pull requests:

Lux v0.4.12 (30 Jul 21:57, commit 33d1eb1)

Diff since v0.4.11

Closed issues:

  • optimising parameters with Optimization.jl (#108)
  • add OrdinaryDiffEq downstream test (#110)

Merged pull requests:

Lux v0.4.11 (26 Jul 06:06, commit 6babea5)

Diff since v0.4.10

Closed issues:

  • WeightNorm causes NaN for Conv layer gradients (#95)

Merged pull requests:

  • [LuxTraining] Wrappers for less clunky training loops (#104) (@avik-pal)
  • Fixes WeightNorm with zero Parameter bug (#106) (@avik-pal)

Lux v0.4.10 (26 Jul 04:27, commit c97a83a)

Diff since v0.4.9

Closed issues:

  • Lighter syntax for stateless networks? (#83)
  • Scalar indexing problem for the NeuralODE example (#92)
  • Basic example from Migrating from Flux to Lux is broken || normalization issue (#94)
  • [Feature request] Another type of Chain that sequentially passing x and st (#96)
  • RNN and LSTM break when using GPU (#100)
  • Can one compose lux layers with graph neural network (#102)

Merged pull requests:

  • CompatHelper: bump compat for FluxMPI to 0.6 for package examples, (keep existing compat) (#86) (@github-actions[bot])
  • Update comparison section in overview.md (#88) (@ToucheSir)
  • Fix typos (#89) (@claforte)
  • Fix minor typos in the docs (#93) (@gabrevaya)
  • making x Float32 in migrate from Flux example (#97) (@gabrevaya)
  • add init_hidden_state function (#101) (@gabrevaya)
  • JLArray is now registered (#103) (@MilkshakeForReal)
  • Use OneHotArrays (#105) (@MilkshakeForReal)

Lux v0.4.9 (10 Jul 05:09)

Diff since v0.4.8

Merged pull requests:

  • Update rrules so that we can support Yota (#85) (@avik-pal)

Lux v0.4.8 (08 Jul 03:12, commit f7ba291)

Diff since v0.4.7

Merged pull requests:

  • More Testing + Deprecate Nonsensical Functions + Better Naming for Kwargs (#80) (@avik-pal)
  • CompatHelper: add new compat entry for Optimisers at version 0.2, (keep existing compat) (#82) (@github-actions[bot])