
colugo t1_j1y5k71 wrote

I kind of feel like the answer is, if you are doing the kind of work that needs more RAM, you'd know.

In deep learning in particular, RAM would affect your maximum batch size which could limit how you train models. I'm not sure which particular hard limits you'd come up against in other machine learning. More RAM is helpful, sure, but you can usually use less with more efficient code.
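To make the batch size/RAM relationship concrete, here is a minimal back-of-the-envelope sketch. The numbers (image size, 4-byte floats, a rough 3x multiplier for activations and gradients) are illustrative assumptions, not from the thread:

```python
# Rough sketch: memory needed per training batch grows linearly with batch size.
# All constants here are assumptions for illustration only.

def batch_memory_gb(batch_size, floats_per_sample,
                    bytes_per_float=4, activation_multiplier=3):
    """Estimate memory (GB) for one batch: raw inputs times a rough
    multiplier covering activations/gradients held during training."""
    return (batch_size * floats_per_sample
            * bytes_per_float * activation_multiplier) / 1e9

# e.g. 224x224 RGB images (a common input size, assumed here)
per_sample = 224 * 224 * 3
for bs in (32, 128, 512):
    print(bs, round(batch_memory_gb(bs, per_sample), 3))
```

Doubling the batch size doubles the estimate, which is why available RAM caps the maximum batch size you can train with.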

9

peno8 OP t1_j1y5vrf wrote

Hey, thanks for the reply.

I know using a MacBook for DL is kind of unusual, so for DL I'll use Google Colab or buy a desktop. I'll mainly use my laptop for feature calculation, so batch size won't be a problem for me.

3

barvazduck t1_j1yl9s5 wrote

I work on LLMs and that's my setup (I have an M1 with 32 GB, though it has little influence since everything is done via Colab/server jobs).

The local device matters more if you plan to use the laptop for testing/inference with a Colab-trained model.
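The train-remotely/infer-locally workflow can be sketched roughly as follows. This is an assumed illustration using Python's standard `pickle` module and a made-up `TinyLinearModel` class, not the commenter's actual setup:

```python
# Minimal sketch (assumed workflow): train on Colab/a server, serialize the
# model, then deserialize it on the laptop for local inference.
import pickle

class TinyLinearModel:
    """Stand-in for whatever model was trained remotely."""
    def __init__(self, weights, bias):
        self.weights = weights
        self.bias = bias

    def predict(self, x):
        # Simple dot product plus bias
        return sum(w * xi for w, xi in zip(self.weights, x)) + self.bias

# "Remote" side: train and serialize.
remote_model = TinyLinearModel(weights=[0.5, -1.0], bias=2.0)
blob = pickle.dumps(remote_model)

# "Local" side: deserialize and run inference on the laptop.
local_model = pickle.loads(blob)
print(local_model.predict([2.0, 1.0]))  # 0.5*2 - 1.0*1 + 2 = 2.0
```

In a real deep-learning setup you would use the framework's own serialization (e.g. saving weights to a file on Colab and loading them locally), but the division of labor is the same.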

1