tripple13 t1_ixgqyyi wrote

To be fair, I understand your motivation, I've had similar reservations.

However, the amount of boilerplate code I used to write (DDP setup, train/eval loops, metric tracking, etc.) has shrunk dramatically since switching to PyTorch Lightning.

When you are measured by your efficiency in terms of hours spent, I'd definitely argue for simplifying things rather than not.

3

tripple13 t1_it1vs7l wrote

Being pedantic is at least not a prerequisite.

Normalization is just centering and standardizing the data, which these researchers are fully aware of.

Does that mean you suddenly transform Poisson distributed data into Gaussian? No.
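A quick sketch of this point with NumPy (hypothetical example, not from the original comment): z-scoring Poisson-distributed counts gives you zero mean and unit variance, but the distribution stays skewed and discrete rather than becoming Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.poisson(lam=2.0, size=100_000).astype(float)

# "Normalize": center and scale to unit variance (z-score).
z = (x - x.mean()) / x.std()

print(z.mean())          # ~0 after centering
print(z.std())           # ~1 after scaling
# Still skewed: Poisson(2) has skewness 1/sqrt(2) ~ 0.71,
# and standardization does not change the shape.
print(np.mean(z**3))
```

So normalization changes location and scale, not the shape of the distribution.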

Is it a big mistake to name it as such? Ahh, I don't know. Is it a measure of their mathematical ability? No, definitely not.

Does it tell you something about the pedanticity (I don't even know if that's a word) of the person? Maybe.

I'd argue you can become successful in this field in many ways: some are very specific and T-shaped (measure theory, for instance), others are more rounded and broad-based. Whatever works for you.

5