
I said something similar in another thread, but for me it doesn't have to be better than Python, since that's largely subjective; the package ecosystem just has to grow and offer something, at all, for the things that I do.

https://fluxml.ai/Flux.jl/stable/

is still very bare-bones compared to Torch/TF/Flax, and I would be hamstringing myself by switching to Julia even if I find the language otherwise attractive.



Maybe keep an eye on this issue:

https://github.com/FluxML/Flux.jl/issues/1431

They are going for feature parity with PyTorch, hopefully in the near term.


Thanks for this, I will definitely follow along there. Yeah, if they can check a few of those boxes, I'm much more likely to at least try working with Julia more regularly.


But can Torch/TF/Flax do autodifferentiation on ordinary functions? No, they cannot!
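For context, "ordinary functions" here means plain code that was never written with an AD framework in mind. The core idea of differentiating such code directly can be sketched in a few lines with forward-mode dual numbers; this is a toy illustration only, not how Zygote or any real library is implemented:

```python
# Toy forward-mode autodiff via dual numbers: f(x + eps) = f(x) + f'(x)*eps,
# where eps^2 = 0. Illustrative sketch only -- real AD systems are far more
# general (reverse mode, arrays, control flow, mutation, ...).
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u v' + u' v
        return Dual(self.val * other.val,
                    self.val * other.der + self.der * other.val)
    __rmul__ = __mul__

def derivative(f, x):
    # Seed the derivative slot with 1.0 and read it back out.
    return f(Dual(x, 1.0)).der

# An "ordinary" function, written with no AD framework in mind:
def poly(x):
    return 3 * x * x + 2 * x + 1

print(derivative(poly, 2.0))  # d/dx (3x^2 + 2x + 1) = 6x + 2 -> 14.0 at x=2
```

The point being argued is that source-level or operator-overloading AD can differentiate code like `poly` as-is, with no framework-specific rewrite.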


Flax/jax can :)


Only if it's a pure function. Oh, and if you have state-dependent control flow, you have to turn off the JIT. Etc. If you take a standard library, say some thermodynamics simulator, and throw JAX at it, do you expect it to work without modification? Most of the time it will fail right at the start because the code uses plain NumPy rather than jax.numpy. So no, those are not "ordinary functions": they are functions where people consciously put in the effort to port years of work to JAX, which is very different.
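To make the control-flow point concrete: an operator-overloading AD (sketched again below with toy dual numbers; this is not JAX or Zygote internals) follows whichever branch actually executes on concrete values, whereas a tracing JIT like `jax.jit` sees an abstract tracer and would raise on a value-dependent `if`, forcing a rewrite with `jax.lax.cond` or disabling the JIT:

```python
# Sketch: value-dependent control flow "just works" under operator-overloading
# AD, because branches are taken on concrete values. Toy code, illustration only.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.der + self.der * other.val)
    __rmul__ = __mul__

    def __lt__(self, other):
        # Comparisons consult the concrete primal value -- this is exactly
        # what an abstract tracer cannot do.
        return self.val < (other.val if isinstance(other, Dual) else other)

def derivative(f, x):
    return f(Dual(x, 1.0)).der

def piecewise(x):
    # Branch depends on the *value* of x. Fine here; a tracing JIT would
    # need this rewritten (e.g. as a lax.cond) before it could compile it.
    if x < 0:
        return -2 * x
    return 3 * x

print(derivative(piecewise, -1.0))  # x < 0 branch taken -> -2.0
print(derivative(piecewise, 1.0))   # x >= 0 branch taken -> 3.0
```

This is the crux of the "ordinary functions" disagreement: code full of value-dependent branches, in-place updates, and direct `numpy` imports runs unmodified under one model but needs a deliberate port under the other.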



