Submitted by PleasantBase6967 t3_10nbrss in MachineLearning
[removed]
Thanks for the exhaustive answer.
Google Colab
Here is what I do: for laptops I always want good battery life, and I don't need my laptop to be particularly powerful, so I have a normal business laptop running Linux. I personally like Lenovo laptops because they play nice with Linux, and I just love the keyboard.
I work remotely. We have our own compute server. But you could just as easily work on an AWS instance.
To me it makes more sense to use a laptop for what it is good at (mobility) and a stationary server for what it is good at (power). In the past I've tried to split the difference, and it always ends up a lousy compromise.
It depends on the scope of your projects. If you're only training small models (like GANs, CNNs, etc.), then a decent modern laptop with 8+ GB RAM and an Intel i7 or Ryzen 7 processor should suffice. GPUs are nice to have, but with a CPU in that class you can do most of the work without them. As for the OS, Windows and Linux are both fine, but I'd recommend Linux for ML projects for maximum compatibility. Hope this helps!
Thanks
What is your opinion on macOS and the M1 chip when using PyTorch?
It looks like you don't care about throughput, so it doesn't matter.
Throughput matters for me. How is it affected?
Then you shouldn't be training on a laptop, and should train on a remote server, in which case your laptop doesn't matter.
Don’t bother. MPS support is terrible; TensorFlow's GPU support is better in comparison.
However, the MBA is good for fast and efficient CPU prototyping, which you can then ship off to a Linux workstation or cloud instance with discrete NVIDIA GPUs.
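For what it's worth, a minimal device-agnostic setup along those lines (a sketch, assuming a PyTorch build recent enough to expose `torch.backends.mps`) lets the same script prototype on the MacBook and run unchanged on a CUDA box:

```python
import torch

# Pick the best available backend: CUDA on a workstation/cloud GPU,
# MPS on Apple Silicon (where supported), otherwise plain CPU.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

# The same code then runs unchanged on the laptop and the server.
model = torch.nn.Linear(128, 10).to(device)
x = torch.randn(32, 128, device=device)
print(device, model(x).shape)
```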
Can we make a bot to autodelete these kinds of posts?
Perhaps answer with a pointer to a better subreddit, but autodeleting without any kind of message is rude and not helpful in the long run.
chatterbox272 t1_j67x55u wrote
Don't train on a laptop. Being portable and using lots of computational power are essentially opposite goals. You're not going to be able to train anything more than toys on battery, and if you're going to be tethered to a power cord you might as well be tethered to a desktop. You're also going to be limited in performance, due to a combination of efficiency-focused laptop hardware and the thermal constraints imposed by a laptop form factor. You're far better off getting a highly portable, long-battery-life but low-power machine, and using cloud resources (even free ones like Colab or Paperspace) to do the heavier lifting.
If you absolutely must use a laptop because you're living out of your car or something and have nowhere to set up a desktop, then the rest depends on what you're doing:
If you're doing "deep learning" (anything involving neural networks more than a layer or two deep) you'll need a discrete GPU, from NVIDIA specifically. AMD and Apple Silicon support exist but are far from mature or feature-complete in most frameworks. The CPU only needs to be powerful enough to keep the GPU fed; a modern i5 or equivalent AMD will do the job, although you may find that configurations with a suitable GPU aren't offered with less than an i7 / R7.
If you're not doing deep learning, you probably don't need a GPU. In that case, stick with integrated graphics and look for a higher end i7 or i9 (or equivalent AMD).
As a rule, you'll get better support on Linux than Windows or macOS. You can skirt this on Windows via WSL.
Finally, this post reads like you haven't even started doing whatever it is you're trying to do. I'm guessing you're a beginner just starting out, and I'd strongly advise anyone at that stage to delay purchasing hardware as long as humanly possible. Everything I noted is a generalisation, none of it specific to what you're doing because you haven't been (and likely can't be) specific to what you're doing. If you get started first using free or cheap cloud resources, you'll get a much better idea of which things you need, and which things you don't.
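If you do go the Colab/Paperspace route, a quick sanity check (a sketch, assuming PyTorch) before committing to a long run might look like:

```python
import torch

# Sanity check on a Colab/Paperspace instance: confirm a CUDA GPU
# is actually visible before kicking off a long training run.
if torch.cuda.is_available():
    idx = torch.cuda.current_device()
    props = torch.cuda.get_device_properties(idx)
    print("GPU:", props.name)
    print("VRAM (GB):", round(props.total_memory / 1e9, 1))
else:
    print("No CUDA GPU visible -- check the runtime/instance type.")
```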