Submitted by olmec-akeru t3_z6p4yv in MachineLearning
imyourzer0 t1_iy35ydo wrote
I don’t know why people worry so much about the state of the art. Sometimes the right tool has simply existed for a while. In a lot of cases, PCA works just fine, or at least well enough that something much more current won’t give you a meaningfully better answer. Like another commenter has already said, depending on the assumptions you can make (or are willing to make), the best choice needn’t be at the bleeding edge.
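For what it’s worth, that baseline is only a few lines, e.g. with scikit-learn. A minimal sketch; the data here is a random stand-in for whatever matrix you actually have:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))  # synthetic stand-in for your feature matrix

# PCA is sensitive to feature scale, so standardize first.
X_scaled = StandardScaler().fit_transform(X)

# Project onto the top 2 principal components.
pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X_scaled)

# Fraction of total variance each retained component explains.
print(pca.explained_variance_ratio_)
```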
olmec-akeru OP t1_iy362iv wrote
Cool, I get this, but I think it’s important not to keep one’s head in the sand. There are new techniques, and it’s important to grok them.
imyourzer0 t1_iy5unj3 wrote
I certainly wouldn’t advise anyone to ignore new methods. That’s a point well taken. I’m only saying that when you have a working method, and its assumptions about your data can be validated (either logically or with tests), you don’t need to start looking for SOTA methods.
Really, the goal is solving a problem, and to do that, you want to find the most appropriate method—not just the newest method. It’s one thing to “keep up with the Joneses” so that when problems arise you know as many of the available tools as possible, but picking an algorithm usually doesn’t depend on whether that algorithm is new.
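To make “validated with tests” concrete: before trading PCA for something nonlinear, you can at least check whether a low-rank linear projection captures most of the variance. A minimal sketch, assuming scikit-learn; the helper name and the 0.90 threshold are just illustrative choices, not a standard API:

```python
import numpy as np
from sklearn.decomposition import PCA

def low_rank_captures_variance(X, n_components=2, threshold=0.90):
    """True if `n_components` principal components explain at least
    `threshold` of the total variance, i.e. a linear projection suffices."""
    pca = PCA(n_components=n_components).fit(X)
    return pca.explained_variance_ratio_.sum() >= threshold

# Demo on data with genuine low-rank linear structure plus noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 2))  # 2 true underlying factors
X = latent @ rng.normal(size=(2, 20)) + 0.05 * rng.normal(size=(500, 20))
print(low_rank_captures_variance(X))  # True: plain PCA is a reasonable choice here
```

If a check like this fails, that’s evidence the assumptions don’t hold and a fancier method might actually be warranted; if it passes, the newer tool has little room to improve on the old one.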
olmec-akeru OP t1_iy7apiy wrote
Beautifully said.
ktpr t1_iy3v2i1 wrote
Unfortunately, deeply understanding your problem and its relationship to prior algorithms is a lot more work than just telling someone that you applied a SOTA algorithm and got decent results.
edit - an apostrophe
olmec-akeru OP t1_iy7aszg wrote
Completely correct.
The corollary remains true, though: applying the correct algorithm is a function of knowing the set of available algorithms. An algorithm’s newness isn’t a ranking feature.