Submitted by ronaldxd2 t3_xwfaft in deeplearning

I have 3 GPUs lying around here: 1 RTX 3080 Ti and 2 RTX 3070 Tis, for image learning tasks. Is it worth putting all 3 GPUs to work together on a single task, or is it better to stick with the 3080 Ti alone?

Comments

Aromatic-Ad-2497 t1_ir6mvvw wrote

It depends heavily on the task, and especially on how fast or accurate it needs to be.

Former miner trying to make use of some spare GPUs, hehe?

Knurpel t1_ir90nkx wrote

Assuming that your deep learning stack uses CUDA: multi-GPU CUDA is not for the faint of heart, and it will most likely require intense code wrangling on your part. It's not as easy as sticking in another GPU.

Your GPUs support the outgoing NVLink, and using it would make things easier on you.

https://medium.com/gpgpu/multi-gpu-programming-6768eeb42e2c
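
To give a sense of what that wrangling looks like in practice, here is a minimal sketch assuming a PyTorch/CUDA stack (PyTorch itself, the toy nn.Linear model, and the batch size are my own illustrative assumptions, nothing from the thread): one replica per visible card, an even slice of the batch each, and the outputs gathered back onto the first device.

import torch
import torch.nn as nn

# Enumerate whatever CUDA devices are visible; fall back to CPU so the sketch still runs.
devices = [torch.device(f"cuda:{i}") for i in range(torch.cuda.device_count())]
if not devices:
    devices = [torch.device("cpu")]

reference = nn.Linear(512, 10)            # one set of weights as the source of truth
replicas = []
for d in devices:
    r = nn.Linear(512, 10).to(d)
    r.load_state_dict(reference.state_dict())   # copy the reference weights onto each card
    replicas.append(r)

batch = torch.randn(96, 512)
chunks = batch.chunk(len(devices))        # even split, so the slowest card gates each step

# Forward pass on every card, then gather the results back onto the first device.
outputs = [r(c.to(d)) for r, c, d in zip(replicas, chunks, devices)]
result = torch.cat([o.to(devices[0]) for o in outputs])
print(result.shape)                       # torch.Size([96, 10])

Libraries like DataParallel, DistributedDataParallel, or Horovod hide most of this, but keeping the replicas' gradients in sync and feeding mismatched cards (a 12 GB 3080 Ti next to 8 GB 3070 Tis) is where the real wrangling starts.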

GPUaccelerated t1_ireeqet wrote

It's definitely worth testing! Have some fun and play around with TensorFlow. Once you have the 3 cards set up to work on the same job, test it, and compare your results to running individual jobs. I personally think they'll do better alone, but you should check it out for yourself. :) Have fun!
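
For anyone wanting to run that comparison, here is a minimal sketch using TensorFlow's MirroredStrategy (the tiny convnet, the dummy image data, and the batch size are placeholders I'm assuming, not anything from the thread): the strategy mirrors the model onto every visible GPU and splits each batch across them.

import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()    # picks up every visible GPU
print("Replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():                         # variables created under the scope are mirrored per GPU
    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(64, 64, 3)),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

# Dummy data stands in for the real image task.
x = tf.random.normal((512, 64, 64, 3))
y = tf.random.uniform((512,), maxval=10, dtype=tf.int32)
model.fit(x, y, batch_size=96, epochs=1)       # the global batch is split across the replicas

Running the same script once as-is and once with CUDA_VISIBLE_DEVICES=0 (assuming the 3080 Ti enumerates as device 0) gives the head-to-head timing. Keep in mind that MirroredStrategy synchronizes every step, so the slower 3070 Tis set the pace of the three-GPU run.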
