Submitted by moekou t3_z4lpry in deeplearning

I'm building a brand new RTX 4090 PC and came across a number of posts from several years ago saying Linux was vastly preferred over Windows for the field. Is that still the case, or has Windows caught up / become good enough for casual, non-professional projects? Is a dual-boot setup the optimal choice if I want to do both deep learning and be able to run standard Windows applications/games on the machine?

37

Comments


notgettingfined t1_ixrnwuu wrote

I dual boot. I’m sure you could get things to work on Windows but it would be a horrible experience.

22

Cholojuanito t1_ixrsm65 wrote

Most DL hardware-accelerator libraries are built and tested to run on servers, which are more than likely Linux machines, so yeah, Linux would be the way to go.

14

stillworkin t1_ixs09pm wrote

I'll extend this: for the majority of real computer science/engineering work (especially back-end stuff), *nix is the way to go. It's a huge reason Macs are the standard for CS folks (ever since the terminal became part of macOS). At every university I attended and taught at, and at every company I've worked for, Linux/Macs have been used by essentially 100% of the people. Roughly once a year I'll hear of or see someone using Windows.

8

RichardBJ1 t1_ixs8s3c wrote

In my dept it's a mix of Windows/Linux: CUDA GPU-fitted machines, mostly TensorFlow/Python. No significant issues with either. But that is, in CS terms, quite a limited remit.

2

onlymagik t1_ixsi37l wrote

I got TensorFlow 2 working first try on my personal Windows 11 machine a year ago. I think it has come a long way from before. I just downloaded the latest version of TensorFlow, which was 2.8 at the time, plus the CUDA drivers that went with that version, and it all worked nicely.
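For anyone following the same route, a minimal sketch of how you might confirm the install actually sees the GPU (assumes TensorFlow 2.x with a matching CUDA/cuDNN install; the matrix multiply is just an illustrative sanity check):

```python
import tensorflow as tf

# Report the installed version and any GPUs TensorFlow can see.
print(tf.__version__)
gpus = tf.config.list_physical_devices("GPU")
print(gpus)  # should list the RTX card if CUDA/cuDNN match the TF version

# Run a small op on the GPU as a sanity check.
if gpus:
    with tf.device("/GPU:0"):
        x = tf.random.normal((1024, 1024))
        print(tf.reduce_sum(x @ x).numpy())
```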

4

z3n777 t1_ixsidqn wrote

For research and small projects Windows might be bearable; for anything more serious, it's a big no.

Running VMs/WSL on Windows is also a big no: it's a terrible workaround and completely inefficient, not counting the amount of time it takes to set up.

0

obsoletelearner t1_ixsqjua wrote

Does anyone actually use Windows on their servers? How's the experience so far?

1

chatterbox272 t1_ixsu1u4 wrote

Yes, but WSL is now good enough most of the time, so long as you don't need every last megabyte of VRAM. I'd start with Windows and WSL, and then move to dual boot if you find you need it.
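A rough sketch of the kind of check worth running inside WSL before committing to it (assumes a CUDA-enabled PyTorch install in the WSL distro; the exact VRAM headroom you see will vary):

```python
import torch

# True if the Windows NVIDIA driver is exposed to the WSL distro.
print(torch.cuda.is_available())

if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
    # WSL typically leaves a bit less usable VRAM than bare-metal Linux.
    free, total = torch.cuda.mem_get_info(0)
    print(f"free {free / 1e9:.1f} GB of {total / 1e9:.1f} GB")
```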

4

someone383726 t1_ixswhp3 wrote

I have not had any problems getting TensorFlow to run on Windows and use my Nvidia GPU. I watched some YT video to get everything set up, and the big thing was making sure the versions of all the components were compatible. I've also converted my models to ONNX and run inference on both Windows and Linux (VPS).
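A hedged sketch of that PyTorch -> ONNX -> onnxruntime flow (the resnet18 model and input shape are placeholders; assumes torch, torchvision, onnxruntime, and numpy are installed):

```python
import numpy as np
import torch
import torchvision
import onnxruntime as ort

# Export a placeholder model to ONNX.
model = torchvision.models.resnet18(weights=None).eval()
dummy = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy, "model.onnx",
                  input_names=["input"], output_names=["output"])

# The same model.onnx file runs unchanged on Windows or a Linux VPS.
sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
out = sess.run(None, {"input": np.random.rand(1, 3, 224, 224).astype(np.float32)})
print(out[0].shape)
```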

3

Syzygianinfern0 t1_ixsyeat wrote

You would want to try out Pop OS over Ubuntu for the Linux distro on your PC. It makes installation of the GPU drivers super easy. In fact, you don't even need to do anything: just select the Nvidia ISO and you should be good to go right after installation!

9

Syzygianinfern0 t1_ixt12wk wrote

The installation is super intuitive and short; you just need to select the desired drive and keep pressing Next. You can check installation instructions on YouTube too.

1

Zeratas t1_ixt1lr7 wrote

You don't; it's essentially access to a Linux 'server' OS that lives inside your Windows installation.

You can SSH into it and do almost everything you normally would, except GUIs (Qt, KDE, GNOME, etc.).

5

dwl2234 t1_ixtau51 wrote

DL work usually happens in Anaconda environments. It shouldn't matter much which OS you're using as long as you have CUDA installed.
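To illustrate the point, the same quick environment check works unchanged on either OS once the CUDA-enabled packages are in the conda env (PyTorch is assumed here purely as an example; TensorFlow would be analogous):

```python
import platform
import torch

print(platform.system())   # "Windows" or "Linux"
print(torch.__version__)   # version installed in the conda environment
print(torch.version.cuda)  # CUDA version the build targets, e.g. "11.7"
```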

4

ChunkyHabeneroSalsa t1_ixtdpgh wrote

I've used Windows for a few years. I've had bugs and issues that would never get resolved, but all in all it's not a big deal. TensorFlow + C++ + Windows was a nightmare, but now I export PyTorch models to ONNX and run onnxruntime in C++, and I have no problems.

Haven't tried WSL for that, though I have it set up and use it to test builds, because we are switching to Linux at least on the production side, so our code compiles for both Windows and Linux.

1

lamhintai t1_ixtiq1i wrote

Yeah. Rather than a no for running on Windows, it would more likely be a no for an AMD/Intel GPU. I wonder if there's any actual working data science shop running on AMD?

3

Appropriate_Ant_4629 t1_ixtlk37 wrote

> Windows + WSL with Ubuntu is great

I wouldn't call it "great".

It's a very fragile environment compared to running Linux as the host OS. At work our Linux machines have uptimes measured in years, running many jobs 24x7.

My co-worker's GPU-accelerated WSL environments are lucky to survive a few days.

I much prefer running Linux as the host and Windows in a VM for the rare occasions I need it.

4

RichardBJ1 t1_ixtsdc5 wrote

Thanks, I hadn't seen that implementation. I ran some deep learning models on TensorFlow adapted for the iMac GPU, but it was less stable and significantly slower than the Nvidia Linux/Windows equivalents. Things have moved on, I guess.

1

x11ry0 t1_ixtxx15 wrote

Yes. You can go with Windows, but if you don't want to be bothered by your environment and want things to go smoothly, take a mainstream Linux distro or something based on it.

1

Pd_jungle t1_ixu2scb wrote

Is it OK to use Windows now? I remember 5 years ago I had to install NumPy on Windows from an unofficial repo.

1

Mooks79 t1_ixud2qj wrote

Even more recently, MS has provided their own solution for Windows 10 (provided you're running a sufficiently updated version) and 11, which supports both X11 and Wayland.

See here, albeit this document is slightly out of date because, as of just a few days ago, they now support Windows 10 from build 19044 (21H2) onwards, IIRC. Edit: as per here.

5

Diasimos t1_ixuj5p4 wrote

Yes. I'd say go for an Arch-based distribution like Garuda Linux (perfect for gaming and regular Linux use). Remember: never use the nouveau drivers for your GPU. You can run a slim Windows 10/11 ISO under QEMU/KVM or just dual boot, but since it's a Linux machine it's better to just run Windows in KVM.

1

Diasimos t1_ixujhsk wrote

Sorry, I should also mention that Arch packages are bleeding edge and Garuda isn't aimed at expert Linux users. Better this than Debian distributions, even in stability.

1

corneliusJackson1 t1_ixvxvpg wrote

I think it's great from a local-workstation perspective and see it as no more fragile than a standard Windows, Mac, or Linux workstation environment.

I do agree it doesn't compare to a professional server cluster. I use WSL at work as a local workstation (small builds/jobs and debugging) but farm out big jobs to a server cluster.

Based on the described use case, I think WSL would be a great option. I use it on my personal machine, and transitioning from a code/ML project to playing a game of League is so easy compared to switching to my dual-boot partition. Running WSL does share system resources with a second OS, but the overhead of Windows is not that large. I personally never run into a case where native Linux provides a benefit over WSL for jobs small enough that I wouldn't just farm out to a cloud-based server (very rare with my personal projects).

2

corneliusJackson1 t1_ixvzffk wrote

Not exactly answering your question, but a 4090 is probably quite a bit of overkill for personal ML projects. On the rare occasions when you're training models that require those resources, you can probably leverage cloud-based resources, and it will be cheaper in the long run.

With that said, you mentioned gaming being a primary use case for your system. Depending on what you play, I assume you will want a Windows OS, and if that is the case I will echo what others have said and say WSL2 is great.

I personally used a 2070 with WSL 2 for graduate work in computer vision and deep learning.

2