Submitted by GraciousReformer t3_118pof6 in MachineLearning
DigThatData t1_j9k17rr wrote
it's not. tree ensembles scale gloriously, as does approximate nearest neighbor search. there are certain (and growing) classes of problems for which deep learning produces seemingly magical results, but that doesn't mean it's the only path to a functional solution. it'll probably give you the best solution, but it's not the only workable one.
in any event, if you want to better understand scaling properties of DL algorithms, a good place to start is the "double descent" literature.
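To make the approximate-nearest-neighbors point above concrete, here is a minimal sketch of locality-sensitive hashing (LSH) in plain Python. It is an illustration of why ANN methods scale, not a production index: a query only scans the points that hash to the same bucket, rather than the whole dataset. All names and parameters here are invented for the example.

```python
import random

random.seed(0)

DIM = 8       # dimensionality of the toy vectors
N_PLANES = 6  # hash bits per point; more bits -> smaller buckets

# Random hyperplanes: each hash bit is the sign of a dot product,
# so nearby vectors tend to land in the same bucket.
planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(N_PLANES)]

def lsh_hash(vec):
    """Map a vector to a tuple of sign bits, one per hyperplane."""
    return tuple(
        1 if sum(p * v for p, v in zip(plane, vec)) >= 0 else 0
        for plane in planes
    )

# Build the index: bucket every point by its hash.
points = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(5000)]
index = {}
for i, p in enumerate(points):
    index.setdefault(lsh_hash(p), []).append(i)

def query(vec):
    """Return the closest candidate from the query's bucket (approximate:
    the true nearest neighbor may live in another bucket)."""
    candidates = index.get(lsh_hash(vec), [])
    if not candidates:
        return None
    return min(
        candidates,
        key=lambda i: sum((a - b) ** 2 for a, b in zip(points[i], vec)),
    )

# Querying with an indexed point finds it by scanning only its own
# bucket, a small fraction of the 5000 points.
hit = query(points[0])
```

The sublinear query cost is the whole trick: exact search is O(n) per query, while each LSH bucket holds only a sliver of the data, which is why ANN libraries handle billion-point indexes.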
JackBlemming t1_j9n4bp2 wrote
This is true. Netflix famously didn't use a complex neural net for choosing shows you'd like, precisely because it didn't scale. Neural nets are expensive, and if you can sacrifice a few percentage points of accuracy to save hundreds of millions in server fees, that's probably a good trade.
DigThatData t1_j9nlrii wrote
just to be clear: i'm not saying neural networks don't scale, i'm saying they're not the only class of learning algorithm that scales.