ML4Bratwurst t1_j5z42ky wrote
Call me picky, but I wouldn't use an ML library that isn't GPU accelerated. That should be the default
ReginaldIII t1_j5zvqal wrote
Okay, you're picky :p
Try deploying a model for realtime online learning of streaming sensor data that needs to run on battery power, and then insist it needs to run on GPUs.
Plenty of legitimate use cases for non-GPU ML.
ML4Bratwurst t1_j5zxikl wrote
Can you give me one example of this? And even if you can, my point still stands, because I never said you should drop CPU support lol
ReginaldIII t1_j5zzhj1 wrote
Pick the tools that work for the problems you have. If you're online training a model on an embedded device, you need something optimized for that hardware.
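Just as a toy illustration (the stream, the model, and every number here are made up, not from any real deployment), the whole training loop can be a few lines of plain numpy that would happily run on a low-power CPU:

```python
# Minimal sketch of CPU-only online learning on a sensor stream.
# The simulated stream, the linear model, and the learning rate
# are all illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)
w = np.zeros(4)   # weights of a tiny linear model
b = 0.0
lr = 0.01         # learning rate

def sensor_stream(n=1000):
    """Simulate (features, target) pairs arriving one at a time."""
    true_w = np.array([0.5, -1.2, 0.3, 0.8])
    for _ in range(n):
        x = rng.normal(size=4)
        y = true_w @ x + rng.normal(scale=0.1)
        yield x, y

for x, y in sensor_stream():
    pred = w @ x + b
    err = pred - y
    # One SGD step per sample: no batching, no GPU, tiny memory footprint.
    w -= lr * err * x
    b -= lr * err

print("learned weights:", np.round(w, 2))
```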
I gave you a generic example of a problem domain where this applies. You can search for online training on embedded devices if you are interested but I can't talk about specific applications because they are not public.
All I'm saying is that drawing a line in the sand and saying you'd never use X if it doesn't have Y is silly, because what if you end up working on something in the future where the constraints are different?
fernandocamargoti t1_j5z4qpc wrote
Evolutionary algorithms are not ML.
new_name_who_dis_ t1_j5zoc0t wrote
They are not gradient-descent based (so they don't need GPU acceleration as much, though sometimes they still do depending on the problem), but they are definitely ML.
fernandocamargoti t1_j5zs45e wrote
They're not about learning from data, they're about optimization. They come from the broader field of AI, but I wouldn't say they are ML; they serve a different purpose. There is some research on using them to optimize models (instead of gradient descent), but that's not their main use case.
ReginaldIII t1_j5zv9gz wrote
That's such a tenuous distinction, and you're wrong anyway, because you can pose any learning-from-data problem as a generic optimization problem.
They're very useful when your loss function is not differentiable but you still want to fit a model to input+output data pairs.
They're also useful when your model parameters have domain-specific meaning and you can derive rules for how two parameter sets can be meaningfully combined with one another.
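To make that concrete, here's a toy genetic algorithm fitting a linear model under a loss that gives you no gradient (everything here, hyperparameters included, is purely illustrative):

```python
# Toy genetic algorithm fitting a linear model under a non-differentiable
# loss (fraction of predictions outside a tolerance). All names and
# hyperparameters are illustrative, not from any particular library.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w

def loss(w):
    # Non-differentiable: counts misses, so there is no useful gradient.
    return np.mean(np.abs(X @ w - y) > 0.1)

pop = rng.normal(size=(50, 3))  # initial population of weight vectors
for gen in range(200):
    fitness = np.array([loss(w) for w in pop])
    parents = pop[np.argsort(fitness)[:10]]   # keep the 10 best
    children = []
    for _ in range(40):
        a, b = parents[rng.integers(10, size=2)]
        mask = rng.random(3) < 0.5
        child = np.where(mask, a, b)              # crossover: mix coordinates
        child += rng.normal(scale=0.05, size=3)   # mutation
        children.append(child)
    pop = np.vstack([parents, np.array(children)])

best = pop[np.argmin([loss(w) for w in pop])]
print("best weights:", np.round(best, 2), "loss:", loss(best))
```

The crossover step is where the "meaningfully combined" part comes in: each coordinate of the parameter vector means something on its own, so mixing coordinates from two decent parents tends to produce a decent child.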
Decision trees and random forests are ML too. What you probably mean is Deep Learning. But even that has a fuzzy boundary with surrounding methods.
Being a prescriptivist with these definitions is a waste of time because the research community as a whole cannot draw clear lines in the sand.
fernandocamargoti t1_j60xagg wrote
Well, what you're talking about is ways of using evolutionary algorithms to optimize the parameters of an ML model, but in my eyes that doesn't make them ML. They share a lot, but they aren't the same. To me, evolutionary algorithms are part of metaheuristics, which is part of AI (which ML is also part of). Different areas and subareas of research do interact with each other; I just mean that the "is part of" relation is a stretch in this case.
ReginaldIII t1_j61nlno wrote
Trying to force these things into a pure hierarchy sounds nothing short of an exercise in pedantry.
And to what end? You make up your own distinctions that no one else agrees with and you lose your ability to communicate ideas to people because you're talking a different language to them.
If you're so caught up on the "is a" part: have you studied any programming languages that support multiple inheritance?
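Because that's basically the situation here. A rough analogy in Python (obviously not a real taxonomy, just to make the point):

```python
# Hedged analogy, not a real taxonomy: with multiple inheritance,
# one thing can legitimately "be" two kinds of thing at once.
class Metaheuristic:
    def optimize(self, objective):
        ...

class MLMethod:
    def fit(self, X, y):
        ...

class EvolutionaryAlgorithm(Metaheuristic, MLMethod):
    """'Is a' metaheuristic *and* 'is a' ML method; no contradiction."""

ea = EvolutionaryAlgorithm()
print(isinstance(ea, Metaheuristic), isinstance(ea, MLMethod))  # True True
```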
new_name_who_dis_ t1_j601m4q wrote
Gradient descent is also about optimization... You can optimize even neural networks with a bunch of methods other than gradient descent (including evolutionary methods). They don't work as well, but you can still do it.
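For example, here's a rough sketch of training a tiny neural net with a simple (1+λ) evolution strategy instead of backprop (the architecture and hyperparameters are arbitrary choices for illustration):

```python
# Rough sketch: fit a tiny neural net with a (1+lambda) evolution strategy
# instead of gradient descent. Architecture and hyperparameters are
# arbitrary illustrations, not recommendations.
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(100, 1))
y = np.sin(3 * X)                      # target function

def unpack(theta):
    # Flat parameter vector -> weights of a 1-8-1 MLP with tanh hidden layer.
    W1 = theta[:8].reshape(1, 8); b1 = theta[8:16]
    W2 = theta[16:24].reshape(8, 1); b2 = theta[24]
    return W1, b1, W2, b2

def mse(theta):
    W1, b1, W2, b2 = unpack(theta)
    return np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)

theta = rng.normal(scale=0.5, size=25)
for step in range(500):
    # Keep the parent unless one of 20 mutants beats it (elitist selection).
    mutants = theta + rng.normal(scale=0.1, size=(20, 25))
    scores = np.array([mse(m) for m in mutants])
    if scores.min() < mse(theta):
        theta = mutants[scores.argmin()]

print("final MSE:", round(mse(theta), 4))
```

It needs far more function evaluations than gradient descent would, which is the "they don't work as well" part, but it gets there.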