should_go_work
should_go_work t1_j6kooe8 wrote
Reply to [D] Sparse Ridge Regression by antodima
If your goal is to do linear regression and enforce hard sparsity constraints on W, then there are several algorithms that do this directly (though recovery of the true sparse W is only guaranteed under certain conditions on the problem). A simple starting point might be orthogonal matching pursuit: https://scikit-learn.org/stable/auto_examples/linear_model/plot_omp.html.
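As a minimal sketch of what that looks like in practice (synthetic data and the sparsity level are illustrative), scikit-learn's `OrthogonalMatchingPursuit` takes the hard sparsity constraint directly via `n_nonzero_coefs`:

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n_samples, n_features, n_nonzero = 100, 50, 5

# Synthetic problem with a known 5-sparse weight vector W.
X = rng.standard_normal((n_samples, n_features))
w_true = np.zeros(n_features)
support = rng.choice(n_features, size=n_nonzero, replace=False)
w_true[support] = rng.standard_normal(n_nonzero)
y = X @ w_true + 0.01 * rng.standard_normal(n_samples)

# Hard sparsity constraint: at most n_nonzero coefficients are nonzero.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero)
omp.fit(X, y)

print(np.count_nonzero(omp.coef_))  # never exceeds n_nonzero
```

Whether `omp.coef_` actually matches `w_true` depends on those recovery conditions (e.g., properties of X); the sparsity bound itself always holds by construction.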
should_go_work t1_ix06vav wrote
Reply to comment by LosTheRed in [D] Simple Questions Thread by AutoModerator
Depending on the model, "tracing" the output is certainly possible - for example, in decision trees. As far as confidence is concerned, you might find the recent work in conformal prediction interesting (basically predicting ranges of outputs at a specified confidence level). A really nice tutorial can be found here: https://people.eecs.berkeley.edu/~angelopoulos/publications/downloads/gentle_intro_conformal_dfuq.pdf.
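To make the conformal prediction idea concrete, here's a sketch of split conformal regression in the spirit of that tutorial (the base model, data, and `alpha` are all illustrative choices, not part of the tutorial itself):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 600
X = rng.uniform(-3, 3, size=(n, 1))
y = X[:, 0] + 0.3 * rng.standard_normal(n)

# Split the data: fit the model on one part, calibrate on another.
X_tr, y_tr = X[:300], y[:300]
X_cal, y_cal = X[300:500], y[300:500]
X_test, y_test = X[500:], y[500:]

model = LinearRegression().fit(X_tr, y_tr)

# Conformity scores: absolute residuals on the held-out calibration set.
scores = np.abs(y_cal - model.predict(X_cal))
alpha = 0.1  # target 90% coverage
n_cal = len(scores)
# Finite-sample-corrected quantile of the calibration scores.
q = np.quantile(scores, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal)

# Prediction interval at each test point: point prediction +/- q.
preds = model.predict(X_test)
lower, upper = preds - q, preds + q
coverage = np.mean((y_test >= lower) & (y_test <= upper))
print(f"empirical coverage: {coverage:.2f}")
```

The appeal is that the marginal coverage guarantee (at least 1 - alpha on average) holds for any base model, as long as the calibration and test data are exchangeable.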
should_go_work t1_j9zclbn wrote
Reply to comment by TinkerAndThinker in [D] Simple Questions Thread by AutoModerator
Pattern Recognition and Machine Learning (PRML) and The Elements of Statistical Learning (ESL) are two of the standard references that will give you what you're looking for with regard to the more classical topics you allude to (linear models, kernels, boosting, etc.).