
> If flux chose a buggy package as a dependency, that's on them, and users are well justified in steering clear of Flux if it's authors are not in habit of auditing the dependencies they pull in. As of today, the Project.toml for both Flux and DiffEqFlux still lists Zygote as a dependency. Neither list Enzyme.

For DiffEqFlux, it's just there for backwards compatibility with Flux. DiffEqFlux is a weird library because it's been "eradicated" over time. There was a point in time when it defined things that were necessary for the workflows in its documentation; at this point, most of its tutorials don't even require the DiffEqFlux library. That makes it a rather odd library haha. The transition is:

- GalacticOptim.jl has become a fully-fledged optimization package, formalizing the heuristics and multi-package support that DiffEqFlux was using internally. https://galacticoptim.sciml.ai/dev/
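For reference, a minimal sketch of the GalacticOptim.jl problem/solver interface, where the AD backend and the optimizer are both swappable choices (names follow its docs; the objective and values here are just an arbitrary example):

```julia
using GalacticOptim, Optim

# Rosenbrock with parameters p; GalacticOptim standardizes on f(x, p)
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

x0 = zeros(2)
p  = [1.0, 100.0]

# Choose the AD backend used for gradients; several are supported
optf = OptimizationFunction(rosenbrock, GalacticOptim.AutoForwardDiff())
prob = OptimizationProblem(optf, x0, p)

# Any supported optimizer package can solve the same problem object
sol = solve(prob, Optim.BFGS())
```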

- The adjoint overloads are all handled by DiffEqSensitivity.jl at this point. If you try to use them without that library installed, you get an error asking you to install it; DiffEqFlux.jl just re-exports it. This is why you don't see the Enzyme.jl dependency.
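Concretely, this is how the adjoint overloads surface to users, assuming DiffEqSensitivity.jl is installed (`sensealg` and `InterpolatingAdjoint` are its documented API; the ODE and loss are just an arbitrary example):

```julia
using OrdinaryDiffEq, DiffEqSensitivity, Zygote

# Lotka-Volterra dynamics, mutating form
function lotka!(du, u, p, t)
    du[1] =  p[1] * u[1] - p[2] * u[1] * u[2]
    du[2] = -p[3] * u[2] + p[4] * u[1] * u[2]
end

u0 = [1.0, 1.0]
p  = [1.5, 1.0, 3.0, 1.0]
prob = ODEProblem(lotka!, u0, (0.0, 10.0), p)

# DiffEqSensitivity provides the adjoint rule that Zygote hits here;
# without it installed, this call errors and asks you to install it.
loss(p) = sum(solve(prob, Tsit5(); p = p, saveat = 0.1,
                    sensealg = InterpolatingAdjoint()))
grad, = Zygote.gradient(loss, p)
```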

- Enzyme.jl has gotten better and better over time and has become the go-to for DiffEq. In fact, we use a polyalgorithm under the hood that tends to prefer Enzyme.jl and ReverseDiff.jl over Zygote.jl for the autodiff, so it's odd that in 2022 comparisons still feature Zygote.jl given that, internally, it's almost certainly not using Zygote. Zygote is what the user sees and interacts with, but it's not the core. (Even then, that will be changing soon with the coming Enzyme.jl overloads.)
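That internal choice is also user-overridable: the `autojacvec` option of the adjoint methods selects which AD computes the vector-Jacobian products inside the solve, independently of whatever AD the user calls at the top level. A sketch, with an arbitrary toy problem:

```julia
using OrdinaryDiffEq, DiffEqSensitivity

# Mutating, scalarized f!: the kind of code where Enzyme/ReverseDiff
# VJPs work well and Zygote-based VJPs struggle.
function f!(du, u, p, t)
    du[1] = p[1] * u[1]
end
prob = ODEProblem(f!, [1.0], (0.0, 1.0), [0.5])

# Internal VJPs via Enzyme...
sense_enzyme = InterpolatingAdjoint(autojacvec = EnzymeVJP())
# ...or via compiled ReverseDiff tapes:
sense_rdiff  = InterpolatingAdjoint(autojacvec = ReverseDiffVJP(true))

# Either is passed through the loss's solve call when differentiating,
# e.g. solve(prob, Tsit5(); sensealg = sense_enzyme)
```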

- FastChain was a workaround for issues with Flux, but it has grown into a full-fledged library of its own, Lux.jl, which isn't completely done yet but will be how those remaining pieces get deleted.
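For a sense of the direction, Lux.jl's explicit-parameter style looks like this (its documented core API; the layer sizes and input are arbitrary):

```julia
using Lux, Random

# Like FastChain, parameters live outside the layers: `setup` returns
# them explicitly instead of storing them inside the model object.
model = Chain(Dense(2 => 16, tanh), Dense(16 => 2))

rng = Random.default_rng()
ps, st = Lux.setup(rng, model)   # parameters and (here trivial) state

x = randn(rng, Float32, 2, 8)    # batch of 8 inputs
y, st = model(x, ps, st)         # pure call: (input, params, state)
```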

So yes, the state of DiffEqFlux.jl is that it has been a fast-moving library to the point of its own destruction: it basically held what would become "hacks" to make the high end work, which were then slowly absorbed into other packages and became things that "just work" when using Julia. What the library is pivoting towards now is being a high-level interface for common machine learning use cases of differential equations, like defining FFJORD or continuous normalizing flow architectures, which you could do from scratch but it's nice to have one place where these are all defined. What to do with those tutorials, who knows; probably move them to the DiffEqSensitivity.jl docs, and then we need some kind of inter-module documentation so that people can easily see the 25+ docs of SciML (or whatever the number is) in one website.
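As an example of that high-level interface, constructing a continuous normalizing flow with the `FFJORD` layer looks roughly like this (a sketch; check the DiffEqFlux docs for the current constructor arguments):

```julia
using DiffEqFlux, OrdinaryDiffEq

# A small neural network defines the dynamics of the flow
nn = Chain(Dense(1, 8, tanh), Dense(8, 1))

# FFJORD wraps it into a continuous normalizing flow over tspan,
# integrated with the chosen ODE solver
ffjord = FFJORD(nn, (0.0f0, 1.0f0), Tsit5())
```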

But honestly, while eradicating a higher-level library like this makes a documentation mess, this has been our dream with Julia over the last few years. Needing no special place to describe the differentiability of solvers means differentiable programming is truly working. With Enzyme's improvements and such, we're supporting even things like mutation, which is progressively making almost any Julia code you write "just work" with automatic differentiation. No hacks are required to make it work with some underdocumented sublanguage (cough Jax). As everything becomes automated, it becomes harder to document it as a feature, because it's instead just a property of being a Julia library.
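For instance, differentiating a mutating, fully in-place function, something Zygote rejects but Enzyme.jl handles (a sketch; activity annotations vary a bit across Enzyme versions):

```julia
using Enzyme

# In-place squaring: mutates `out`, returns nothing
function square!(out, x)
    @inbounds for i in eachindex(x)
        out[i] = x[i]^2
    end
    return nothing
end

x    = [1.0, 2.0, 3.0]
dx   = zeros(3)          # gradient accumulator for x
out  = zeros(3)
dout = ones(3)           # seed: d(sum(out))/d(out)

# Reverse mode through mutation: shadow buffers travel with the data
Enzyme.autodiff(Reverse, square!, Duplicated(out, dout), Duplicated(x, dx))
# dx should now hold 2 .* x
```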


