
met0xff wrote

Think the problem is that we don't want a language that can only do X. Imho that's one of the big issues with the adoption of Julia: it just doesn't offer an ecosystem anywhere near as big as Python's. That's why Torch went from Lua to Python, and why Swift for TensorFlow never became more popular.

Because deep learning code generally doesn't stand on its own. I once ported our inference engine to Rust to some degree, using tch-rs to call TorchScript-exported models. But that's only half the game: before, between, and after the networks there are lots of processing steps that were a pain without all the Python libs. Just finding something as solid as spaCy is pretty tough in almost any other language.
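For reference, the tch-rs side looked roughly like this (a minimal sketch; the model path and input shape are placeholders, and it assumes a libtorch install matching your tch version):

```rust
use tch::{CModule, Device, Kind, Tensor};

fn main() -> Result<(), tch::TchError> {
    // Load a TorchScript module exported on the Python side, e.g. with
    // torch.jit.trace(model, example_input).save("model.pt").
    let model = CModule::load("model.pt")?;

    // Dummy input; the shape is a placeholder for whatever the model expects.
    let input = Tensor::randn(&[1, 3, 224, 224], (Kind::Float, Device::Cpu));

    // Run the exported forward pass and print the result.
    let output = model.forward_ts(&[input])?;
    output.print();
    Ok(())
}
```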

I think Swift for TF looked awesome. But if nobody builds good platform support, tooling, plotting libraries, SDK integrations (like AWS), experiment tracking, configuration management, blah blah around it, it doesn't help much. If I look at my current work project, PyTorch is the one dependency directly related to DL, and then there are some 100 others that are not ;).

OK, so the other option is to use an embedded language, like we had with Lua. Suddenly you have to deal with the main language, the embedded language, and probably the lower C++ and CUDA layers as well. Also, what exactly does the embedded language cover? Just the differentiable programming code? Then you've got to interface with data loader code that might have to read specific point cloud formats, extract Mel spectrograms or pitch contours, run complex text analysis pipelines, or pull data from HDF5, some exotic DB, Azure, or whatever.

Since Jeremy Howard has been mentioned: yeah, he had high hopes for Swift and then Julia, but now it's back to "well, seems we're stuck with Python after all" (check the Julia vs. Python part here: https://wandb.ai/wandb_fc/gradient-dissent/reports/Jeremy-Howard-The-Simple-but-Profound-Insight-Behind-Diffusion--VmlldzozMjMxODEw).
