Submitted by Balance- t3_11ksa12 in MachineLearning
etesian_dusk t1_jb8yzec wrote
Why would I start using this today?
nucLeaRStarcraft t1_jb9289f wrote
They claim it's fast on Apple M1 and some embedded ARM devices, but I have no idea how easy it is to use out of the box.
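For reference, a minimal forward/backward pass in tinygrad looks roughly like the autograd example from the project's README; the exact import path and method names may vary between versions:

```python
from tinygrad.tensor import Tensor

x = Tensor.eye(3, requires_grad=True)
y = Tensor([[2.0, 0.0, -2.0]], requires_grad=True)

# Small computation graph: z = sum(y @ x)
z = y.matmul(x).sum()
z.backward()

print(x.grad.numpy())  # dz/dx
print(y.grad.numpy())  # dz/dy
```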
etesian_dusk t1_jb94rak wrote
OK, that doesn't sound like much. I don't understand why I should abandon standard, well-verified tools for this.
On top of that, the whole "George Hotz Twitter internship" thing was just embarrassing. I trust him to jailbreak PlayStations, but that's the end of it.
chris_myzel t1_jb9bbqz wrote
PyTorch installations typically run into the gigabytes, while tinygrad keeps its core at <1000 lines.
SuddenMinimum t1_jbdwwaa wrote
Apparently it's an upsetting 2223 lines of code now.
etesian_dusk t1_jbioscf wrote
Comparing package size to "core source code" size is kind of misleading. The PyTorch codebase by itself isn't 1 GB.
Also, in most use cases, I'd rather have PyTorch's versatility than be able to brag about <1000 lines.
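If you want to see why those two numbers aren't comparable, here's a rough sketch that measures both: installed package footprint (what "gigabytes" refers to) versus Python source line count (what "<1000 lines" refers to). It assumes torch and tinygrad are both pip-installed in the current environment; the helpers dir_size_mb and count_py_lines are just for this sketch, and results will vary by version and platform.

```python
import os

def dir_size_mb(path: str) -> float:
    # Total size of all files under an installed package directory, in MB.
    total = 0
    for root, _, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total / 1e6

def count_py_lines(path: str) -> int:
    # Count lines across all .py files under a package directory.
    lines = 0
    for root, _, files in os.walk(path):
        for name in files:
            if name.endswith(".py"):
                with open(os.path.join(root, name), errors="ignore") as fh:
                    lines += sum(1 for _ in fh)
    return lines

import torch
import tinygrad

torch_dir = os.path.dirname(torch.__file__)
tg_dir = os.path.dirname(tinygrad.__file__)

print(f"torch install size:    ~{dir_size_mb(torch_dir):.0f} MB")
print(f"tinygrad install size: ~{dir_size_mb(tg_dir):.0f} MB")
print(f"tinygrad .py lines:    {count_py_lines(tg_dir)}")
```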