Submitted by GraciousReformer t3_118pof6 in MachineLearning
randomoneusername t1_j9jkzs7 wrote
Reply to comment by [deleted] in [D] "Deep learning is the only thing that currently works at scale" by GraciousReformer
That statement on its own is very vague. Can I assume it is talking about NLP or CV projects?
On tabular data, even with non-linear relationships, standard boosting and ensemble algorithms can scale and stay at the top of the game.
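A minimal sketch of the kind of boosting baseline this refers to, assuming scikit-learn and a synthetic tabular dataset (the data and hyperparameters here are illustrative, not from the comment):

```python
# Sketch: histogram-based gradient boosting on a synthetic tabular dataset
# with non-linear feature interactions. Dataset and settings are assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic tabular data: 20 features, 8 informative, non-linear class boundary.
X, y = make_classification(n_samples=50_000, n_features=20, n_informative=8,
                           n_redundant=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)

# Histogram-based gradient boosting scales well to large tabular datasets.
model = HistGradientBoostingClassifier(max_iter=200, learning_rate=0.1)
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```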
bloodmummy t1_j9jzvnr wrote
It strikes me that people who tout DL as a hammer for all nails have never touched tabular data in their lives. Go try a couple of Kaggle tabular competitions and you'll soon realise that DL can be very dumb, cumbersome, and data-hungry. Ensemble models, decision-tree models, and even feature-engineered linear regression models still rule there and curb-stomp DL all day long (in most cases).
Tabular data is also still the type of data most commonly used with ML. I'm not a "DL-hater", if there is such a thing; in fact my own research uses DL exclusively. But it isn't a magical wrench, and it won't be.
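A rough sketch of what such a baseline comparison might look like, assuming scikit-learn and the California housing dataset as a stand-in tabular task (which model family wins will depend heavily on the data and on tuning):

```python
# Sketch: gradient-boosted trees vs. a small neural net on the same tabular
# task. The dataset, models, and settings are illustrative assumptions.
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

X, y = fetch_california_housing(return_X_y=True)

models = {
    "gradient boosting": HistGradientBoostingRegressor(random_state=0),
    # Neural nets typically need feature scaling and more tuning on tabular data.
    "mlp": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(64, 64),
                                      max_iter=500, random_state=0)),
}

# 3-fold cross-validated R^2 for each model on identical folds.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=3, scoring="r2")
    print(f"{name}: mean R^2 = {scores.mean():.3f}")
```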