AsIAm t1_j88u9rb wrote

Me too, but it was a Google project, and we all know what Google does to its projects…

I think relying on TF would have been a mistake. This deep integration approach will be more fruitful in the long run. Also, if anybody wants to do ML in Swift on Apple platforms, there is the awesome MPS Graph.

3

AsIAm t1_izx39lx wrote

His take on hardware for neural nets is pretty forward(-forward) thinking. Neural nets started out analog (Rosenblatt's Perceptron), and only later did we start simulating them in software on digital computers. Some recent research (1, 2) suggests that a physical implementation of learnable neural nets is possible and far more efficient in analog circuits. This means we could run extremely large nets on a tiny chip — one that could live in your toaster, or your skull.

15