lightofaman t1_j3dccb2 wrote
PhD candidate in AI here. Gradient boosting is the real deal where tabular data is concerned (for both regression and classification in ML). However, thanks to the universal approximation theorem, neural nets are excellent approximators of really complex functions and are therefore the way to go for complex tasks, like the ones posed by scientific machine learning, for example. LeCun (not so) recently said that deep learning is dead and differentiable programming (another way to describe SciML) is the new kid on the block.
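To illustrate the point about gradient boosting on tabular data, here is a minimal sketch using scikit-learn (the dataset and hyperparameters are just illustrative, not a tuned setup):

```python
# Minimal sketch: gradient boosting on a small tabular dataset.
# Assumes scikit-learn is installed; hyperparameters are illustrative defaults.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Load a small tabular classification dataset
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit a gradient-boosted tree ensemble and report held-out accuracy
clf = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, random_state=0)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
```

Out of the box this tends to beat an untuned neural net on data like this, which is the usual argument for tree ensembles on tabular problems.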
lightofaman t1_j4y2y0o wrote
Reply to [R] Researchers out there: which are current research directions for tree-based models? by BenXavier
The direction of the gradient