ParanoidTire

ParanoidTire t1_j9hdztb wrote

No idea what NMF is, but normalization is usually a critical step for any ML algorithm. Min-max normalization is common, as is z-normalization (standardization). If your data needs to be positive, shifting it by its minimum (subtracting min(x) so the smallest value becomes zero) does indeed guarantee this.
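A minimal NumPy sketch of the options mentioned (function names and the example array are just illustrative):

```python
import numpy as np

def min_max_normalize(x: np.ndarray) -> np.ndarray:
    """Scale values into [0, 1]."""
    return (x - x.min()) / (x.max() - x.min())

def z_normalize(x: np.ndarray) -> np.ndarray:
    """Standardize to zero mean, unit variance."""
    return (x - x.mean()) / x.std()

def shift_nonnegative(x: np.ndarray) -> np.ndarray:
    """Shift so the smallest value becomes 0, guaranteeing non-negative data."""
    return x - x.min()

x = np.array([-3.0, 0.0, 2.0, 7.0])
print(min_max_normalize(x))   # [0.  0.3 0.5 1. ]
print(shift_nonnegative(x))   # [ 0.  3.  5. 10.]
```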

1

ParanoidTire t1_j9hd4r5 wrote

Welcome to the world of research. You can find all that stuff in so-called "papers", i.e. publications. To get started I would suggest having a look at one of the most influential architectures: ResNet. Just Google "resnet paper" and you're good to go (too lazy to fetch the citation, but it's by He et al.).

1

ParanoidTire t1_j9hc4uq wrote

My journey started years ago with wanting to understand the DQN paper. Hinton's Coursera course was a nice start, and after that it was just going down the rabbit hole that is citations. It takes a lot of effort in the beginning because every single sentence you read will introduce topics you've never heard of before. But after a while these become second nature and you won't give them a second thought. It just takes perseverance and will, imo.

1

ParanoidTire t1_j9hbd77 wrote

I have years of grievances with I/O. It's really difficult to build something that is flexible, performant, and scales to terabytes of data with complex structure. As soon as you leave the cozy CV or NLP domains, you're on your own. Raw C-style arrays loaded manually from disk on a separate CUDA stream can sometimes really be your best shot.
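As a rough sketch of that last pattern with NumPy and PyTorch (the file name, shapes, and batch size are all made up here): memory-map the raw float32 file, stage batches through a pinned buffer, and issue the host-to-device copy on a dedicated CUDA stream so it can overlap compute.

```python
import numpy as np
import torch

# Hypothetical raw file: N x C float32 samples written contiguously to disk.
N, C, BATCH = 1_000_000, 128, 4096
data = np.memmap("samples.f32", dtype=np.float32, mode="r", shape=(N, C))

copy_stream = torch.cuda.Stream()                 # dedicated stream for H2D copies
staging = torch.empty(BATCH, C, pin_memory=True)  # pinned memory enables async copies

def load_batch(start: int) -> torch.Tensor:
    # Read a contiguous slice straight off the memmap: no parsing, raw C layout.
    staging.copy_(torch.from_numpy(np.asarray(data[start:start + BATCH])))
    with torch.cuda.stream(copy_stream):
        # non_blocking=True only truly overlaps because the source is pinned.
        gpu_batch = staging.to("cuda", non_blocking=True)
    # Caller must synchronize with copy_stream (e.g., via an event) before the
    # default stream consumes gpu_batch, and before staging is reused.
    return gpu_batch
```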

1