Comments


t1_jbs27ei wrote

> I am new to Deep Learning

I feel it's probably better to focus on the what and why of DL rather than the how at your level. That is, Colab should be plenty.

17

t1_jbs61yn wrote

I second this. Cloud is pretty much great for ALL your learning purposes, and once you get decent, you won't be doing it alone anyway! Don't rush.

4

t1_jbrehr8 wrote

Get a 3090 or a 4090 for the VRAM. Short term, cloud is always better because it's cheaper. Long term, having a GPU is incredibly handy, and cloud expenses build up, so a GPU becomes more economical in that case.

Imo the 3080 isn't worth it because 10 GB is too low.

5

OP t1_jbrezlx wrote

I value your input and will apply this. Re: costs building up, that's definitely my concern too. Is it best to buy from somewhere like Best Buy, or are there better options?

1

OP t1_jbrfnji wrote

Oh, and does it matter if I get Intel or AMD (assuming it's an RTX 3090)?

1

t1_jbrlatc wrote

I use a 5800X, but for deep learning I would recommend Intel for max stability.

1

OP t1_jbrlthy wrote

Got it. Could you clarify what you meant by stability?

1

t1_jbrn733 wrote

Personally I haven’t faced an issue, but a more experienced person told me he has run into some issues with AMD chips on some frameworks. That’s all. Nothing major.

2

OP t1_jbrv1te wrote

Appreciate the clarification. I'm getting a good deal on an RTX 4080 with an AMD Ryzen 7.

1

t1_jbsrgdc wrote

I agree with most of the people here: for learning, Colab is more than enough at the start (but keep in mind you have to install libraries every time; see the sketch below).

If you already have basic knowledge of DL, then it is better to go for a desktop with the best GPU you can afford (install once, use always).
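To give a sense of what "install libraries every time" looks like in practice, here's a minimal setup cell you might rerun at the top of each Colab session (the package list is just an illustration; swap in whatever your notebook actually needs):

```python
# Minimal Colab setup cell, rerun each time a fresh runtime starts.
# The packages below are only examples of what you might reinstall.
!pip install --quiet transformers datasets

import torch

# Confirm the free GPU runtime is actually attached before training.
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```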

2

t1_jbtbhcy wrote

I’ve been getting into deep learning for about 6 months and have access to some pretty intense graphics cards (RTX 8000), but I haven’t even used them for it. It’s widely accepted that they're not worth the trouble for beginners, and you can focus on the more important lessons by using platforms like Kaggle and Colab, which are free and plenty powerful.

2

t1_jbubl55 wrote

You can learn with lower cards. I still use a GTX 1050 Ti for training on some models. Works great. The important thing is learning how to use resources efficiently. If you can use Colab or a lower card well, then you’ll know the moment you really, actually need a better card (or cards).
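To illustrate what "using resources efficiently" can mean on a small card, here's a rough PyTorch sketch combining mixed precision with gradient accumulation to fit a larger effective batch into limited VRAM (the model and data are throwaway stand-ins, not a recommended architecture):

```python
import torch
from torch import nn

# Stand-in model and fake data; substitute your own.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

accum_steps = 4  # simulate a 4x larger batch without 4x the VRAM

for step in range(100):
    x = torch.randn(32, 512, device=device)
    y = torch.randint(0, 10, (32,), device=device)

    # Mixed precision reduces activation memory on most GPUs.
    with torch.autocast(device_type=device, enabled=(device == "cuda")):
        loss = loss_fn(model(x), y) / accum_steps

    scaler.scale(loss).backward()

    # Only step the optimizer every accum_steps mini-batches.
    if (step + 1) % accum_steps == 0:
        scaler.step(optimizer)
        scaler.update()
        optimizer.zero_grad(set_to_none=True)
```

Dividing the loss by `accum_steps` keeps the accumulated gradient roughly equivalent to one large batch.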

2

t1_jbsau01 wrote

I have an RTX 2080 and it’s barely faster than the Google Colab free GPU for training a pretty large CNN. Use that as you will.
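If you want to run that comparison yourself, one quick-and-dirty approach is to time a fixed number of training steps with the same script in both environments (the tiny model and synthetic batch below are hypothetical placeholders; use your real workload for a meaningful number):

```python
import time
import torch
from torch import nn

# Hypothetical small CNN and fake batch; substitute your real model/data.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.Flatten(), nn.LazyLinear(10),
).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 3, 32, 32, device=device)
y = torch.randint(0, 10, (64,), device=device)

def train_steps(n):
    for _ in range(n):
        loss_fn(model(x), y).backward()
        optimizer.step()
        optimizer.zero_grad()
    if device == "cuda":
        torch.cuda.synchronize()  # wait for queued GPU work before timing

train_steps(5)  # warm-up (also initializes the LazyLinear layer)
start = time.time()
train_steps(50)
print(f"{50 / (time.time() - start):.1f} steps/sec on {device}")
```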

1

t1_jbt2v1y wrote

Unless you plan to train models consistently, several hours every day for more than a year, the cloud will be more cost-effective.
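A back-of-the-envelope break-even calculation makes the trade-off concrete; all the numbers below are made-up placeholders, so plug in the actual card price and cloud rate you're comparing:

```python
# Rough break-even estimate: hours of training at which owning the GPU
# becomes cheaper than renting a comparable cloud instance.
# All numbers are placeholder assumptions, not real quotes.
gpu_price = 1600.00         # e.g. a 4090-class card, USD
extra_psu_and_power = 300   # rough allowance for PSU upgrade + electricity
cloud_rate_per_hour = 0.60  # comparable cloud GPU instance, USD/hour

break_even_hours = (gpu_price + extra_psu_and_power) / cloud_rate_per_hour
print(f"Break-even at ~{break_even_hours:.0f} GPU-hours")
print(f"That's ~{break_even_hours / 365:.1f} hours of training per day for a year")
```

At these placeholder rates the break-even lands somewhere above 3,000 GPU-hours, i.e. many hours of training per day for about a year, which lines up with the rule of thumb above.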

1