Numba is great (as is Jax, which is even better), but its limitations are drastic compared to Julia. So much of what is great about Python (e.g., composability, introspection, simplicity) is lost with these libraries, while Julia retains them.
The main thing missing from Numba is user-defined structures; jitclasses usually carry a performance penalty.
That said, in terms of composability you can jit over a closure to achieve a lot of what you might want, e.g.

    from numba import jit

    def make_loop(f):
        @jit
        def fn(x):
            for i in range(x.shape[0]):
                x[i] = f(x[i])
        return fn
For any jit'd function f, fn will be just as fast as if you had inlined the body of f.
On introspection and simplicity: in high-performance computing, I think you simply have to pick two of fast, simple, and generic. Julia clearly chooses fast and generic.
Could you elaborate on why you say Julia is not as simple? Certainly, there are performance tips one needs to be aware of, but that is the case with Python too. I have used Python for relatively high-performance numerics in research work for the last decade, and now that I am exploring Julia, it certainly addresses all of the concerns I mentioned above, with code that is at least as legible/simple and much more introspectable. It has its warts, but those warts are on the roadmap, and I have seen significant improvement between v1.3 and v1.6 (e.g. compilation latency and debuggability).
Compiler errors can be on par with heavily templated C++, with similarly difficult-to-read code bases. It's of course easier to work with than C++, but not the panacea that seems to be implied in a lot of the comments.
In terms of deployment, you essentially have to have Julia installed wherever you want to run, which is frequently OK; if not, there's always PackageCompiler, though it's not that easy to get a working shared library out of it.
There are other things I find complex, but that is probably because I have used Python for so long that Julia simply feels different. I think Julia is a great choice for HPC in general, except for challenging SIMD codes where it's hard to vectorize without an explicit ILP model.
E.g., jax does not autodifferentiate anything that is not jax-native (scipy ODE solvers, special functions, image processing libraries, special number types (mpmath), domain-specific libraries). Compare that to Zygote.jl.
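A minimal sketch of that boundary (assuming jax and scipy are installed; `uses_scipy` is a hypothetical example function):

```python
import jax
import jax.numpy as jnp
import scipy.special

# Pure-jax code differentiates fine:
grad_sq = jax.grad(lambda x: jnp.sum(x ** 2))
print(grad_sq(jnp.array([3.0])))  # -> [6.]

# But routing a traced value through a non-jax library fails during
# tracing, because jax's abstract tracer cannot be converted into a
# concrete numpy array for scipy to operate on:
def uses_scipy(x):
    return float(scipy.special.erf(x))  # opaque to jax's tracer

try:
    jax.grad(uses_scipy)(0.5)
except Exception as e:  # jax raises a tracer/concretization error here
    print("cannot differentiate through scipy:", type(e).__name__)
```

The only way around this in jax is to reimplement the foreign routine in jax.numpy or hand-register a custom derivative rule for it.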
Are you talking about "Differentiating with respect to nested lists, tuples, and dicts" from that page? The comment to which you are responding covers quite a bit more. The jax documentation specifically says "standard Python containers". Zygote.jl and other less stable Julia auto-diff libraries go far beyond the built-ins and can work with structures defined by packages never designed to be used with automatic differentiation. Of course, there are limitations, but they are quite a bit less severe than the ones in jax (and again, I say this as a big jax fan).
I understand you are frustrated; however, please remember:
> Please don't comment on whether someone read an article. "Did you even read the article? It mentions that" can be shortened to "The article mentions that."
> Please don't comment about the voting on comments. It never does any good, and it makes boring reading.
> Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.
I am confused why you assume I am a "Julia booster" or use such combative language. I love Python and Jax and use them for much of my research work; I just also like learning about other approaches. Please try to honestly address the sibling comments. We have repeatedly claimed that tools like Zygote.jl can efficiently autodifferentiate things that Jax cannot (without a lot of extra special code and hand-defined backprop methods), e.g., an array of structs with scalar and vector properties over which a scalar cost is defined. Just give examples, so that we can both learn something new about these wonderful tools, instead of using such offensive language. It is hard not to take your own comments as the dismissive ones.
Also, look at where this conversation started. My claim was that jax does not work with "(scipy ode solvers, special functions, image processing libraries, special number types (mpmath), domain-specific libraries)". A Julia library does not need to know about Zygote.jl to be autodifferentiable; a Python library needs to be a pure-Python, numpy-based library to work with jax.
To try to contribute to the discussion: I think this paper describes quite well what is so special about the Julia autodiff tools: https://arxiv.org/abs/1810.07951
Indeed, but "python code written to accept numpy" is a pretty restrictive subset (comparatively; I do still enjoy using Python). It does not even cover most of scipy, let alone the domain-specific libraries, which frequently end up using Cython or C for their tightest loops.