
agentfuzzy999 t1_j5qy82t wrote

“Should I just buy a 4090”

Ok Jeff Bezos

A 4090's clocks are going to be faster than the T4s in similar cloud instances, plus it has wayyyyyyy more CUDA cores. Training will be significantly faster, if you can fit the model on the 4090. If you can "business expense" a 4090 for your own machine, good lord do that.
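
To gauge the "fit the model" part before buying anything, here's a rough back-of-the-envelope sketch (assuming PyTorch and a plain fp32 Adam setup; the 1B-parameter figure is just an illustration, not something from this thread):

```python
import torch

def rough_training_footprint_gb(num_params: float, bytes_per_param: int = 4) -> float:
    # weights + grads + two Adam moment buffers ~= 4x the raw parameter memory in fp32,
    # ignoring activations, which can add a lot more depending on batch size
    return num_params * bytes_per_param * 4 / 1e9

# e.g. a hypothetical 1B-parameter model in fp32 with Adam: ~16 GB before activations,
# which already leaves little headroom on a 24 GB card
print(f"~{rough_training_footprint_gb(1e9):.0f} GB estimated training footprint")

if torch.cuda.is_available():
    vram = torch.cuda.get_device_properties(0).total_memory / 1e9
    print(f"GPU 0 reports {vram:.1f} GB of VRAM")
```

Mixed precision, gradient checkpointing, etc. change the numbers a lot, so treat this as a sanity check, not a guarantee.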

14

incrediblediy t1_j5rokou wrote

Or even try to get a used 3090. If OP can afford a 4090, just go with that.

2

Zealousideal-Copy463 OP t1_j5tj30h wrote

My first idea was a 3090, but I'm not based in the US, and getting a used GPU here is risky; it's easy to get scammed. A 4080 is around $2000 here, a new 3090 is $1800, and a 4090 is $2500. So I figured that if I decide to get a desktop, I should "just" go for the 4090: it's only $500-700 more, but I'd get roughly double the speed of a 3090 and 8 GB more VRAM than the 4080.

1

incrediblediy t1_j61evor wrote

Ah! I am also not from the USA. I got my used 3090 for ~US$900; it could be cheaper now. The 3090 and 4090 have the same VRAM (24 GB).

1

Zealousideal-Copy463 OP t1_j61j6n2 wrote

I was checking Marketplace and couldn't find any used one below $1500. Also, I just discovered that the 3090 is $2.2k here now lol (that would be the cheapest option)... meanwhile at Best Buy it costs $1k, so I was just thinking about traveling to the US with the other k lol

1

Zealousideal-Copy463 OP t1_j5tih5j wrote

Sorry, I wrote it in a hurry and now I realize it came out wrong.

What I meant is that, in my experience, training models in the cloud comes with annoyances that are hard to avoid and that make the whole experience miserable: moving data between buckets/VMs, uploading data, logging into a terminal over SSH, or dealing with notebooks that crash from time to time (SageMaker is a bit buggy). So maybe I should "just buy a good GPU" (a 4090 is a "good" deal where I live) and stop messing around with the cloud.

1