
chatterbox272 t1_iudez8h wrote

Without a doubt. You get more than double the VRAM (11GB -> 24GB), you get tensor cores which are significantly faster, and half-precision tricks effectively double your usable VRAM again compared to FP32. A single 3090 (possibly a Ti) can train the kinds of models you used to train on 4x 1080Ti.
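
To illustrate the half-precision point: here's a minimal sketch of a mixed-precision training loop using PyTorch's automatic mixed precision (AMP). The model, batch shapes, and hyperparameters are placeholders made up for illustration, not anything from this thread.

```python
# Minimal mixed-precision (AMP) training sketch.
# Placeholder model/data -- swap in your own.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

model = nn.Linear(512, 10).to(device)                      # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))  # rescales loss to avoid FP16 underflow

for step in range(100):
    x = torch.randn(64, 512, device=device)                # dummy batch
    y = torch.randint(0, 10, (64,), device=device)

    optimizer.zero_grad(set_to_none=True)
    with torch.autocast(device_type=device):               # run the forward pass in reduced precision where safe
        loss = nn.functional.cross_entropy(model(x), y)

    scaler.scale(loss).backward()                          # backward on the scaled loss
    scaler.step(optimizer)                                 # unscales grads, then takes the optimizer step
    scaler.update()                                        # adjusts the scale factor for the next iteration
```

Activations and gradients stored in FP16 take half the memory of FP32, which is where the "effectively double VRAM" rough estimate comes from; actual savings depend on the model.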


jackhhchan OP t1_iugnxpa wrote

Yes, the VRAM jump is one of the biggest reasons I'm considering the upgrade!

4x 1080Ti in a single card - that is certainly too tempting.


Thanks!
