chengstark
chengstark t1_j9gd72x wrote
How about we don’t do the useless Olympics? Absolutely pointless show.
chengstark t1_j79j1rw wrote
Off topic, that’s a nice suit!
chengstark t1_j5k99hm wrote
Reply to Tensorflow or Pytorch by ContributionWild5778
Pytorch
chengstark t1_j3dr7nw wrote
Reply to Review Request: MS in AI Grad Student with 3+ years of relevant experience trying to apply for Summer Internships '23 (posting here because I need domain-specific feedback) by animikhaich
You’ve got two “performed” in the H2X lab item. Other than that, it’s great.
chengstark t1_j26evbj wrote
Reply to Laptop for Machine Learning by sifarsafar
Don’t bother; ask your department for cluster resources. None of these will be enough for actual, fast DL work.
chengstark t1_j1gqy3z wrote
Reply to Student & Need Help by alla_n_barakat
Look up some model compression techniques, use smaller batch sizes, etc. Sorry about your situation; it is very hard to do proper work without the proper tools.
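A minimal sketch (my addition, not part of the original comment) of two memory-saving tricks in PyTorch: post-training dynamic quantization and gradient accumulation with small micro-batches. The model, loader, and hyperparameters are hypothetical placeholders.

```python
import torch
import torch.nn as nn

# Hypothetical small model standing in for whatever you are training.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

# 1) Dynamic quantization: shrink Linear layers to int8 weights for inference.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

# 2) Gradient accumulation: train with small micro-batches and step every k of them,
#    so the effective batch size stays large while peak memory stays low.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()
accum_steps = 4  # effective batch = micro-batch size * accum_steps

def train_epoch(loader):
    model.train()
    optimizer.zero_grad()
    for i, (x, y) in enumerate(loader):
        loss = criterion(model(x), y) / accum_steps
        loss.backward()
        if (i + 1) % accum_steps == 0:
            optimizer.step()
            optimizer.zero_grad()
```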
chengstark t1_izygwqn wrote
Reply to comment by Sixo60 in Priority of data in deep learning? by Sixo60
In academia we usually have the data already labeled, but I did work on one unfortunate project where the annotation was absolutely garbage (too many mistakes). Ensuring the correctness of the labeling should be one of the top priorities. From my limited experience, you would want collaborators with domain knowledge of the data to make sure the processing is absolutely correct.
Recent developments in self-supervised learning and big generalized pretrained models may lower the number of labeled samples needed. Not sure how that would affect your product, but it seems related.
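As an illustration (mine, not the commenter's), this is the usual way a pretrained model lowers the labeled-data requirement: freeze the pretrained backbone and train only a small new head on your few labels. The class count here is a hypothetical placeholder.

```python
import torch
import torch.nn as nn
from torchvision import models

# Pretrained backbone with its feature extractor frozen.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in backbone.parameters():
    p.requires_grad = False

num_classes = 5  # hypothetical small labeled task
backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)  # new trainable head

# Only the head's parameters are updated, so far fewer labeled samples are needed.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
```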
chengstark t1_izy6m4n wrote
Reply to Priority of data in deep learning? by Sixo60
Very.
chengstark t1_iztu0do wrote
Reply to comment by digital-bolkonsky in What’s different between developing deep learning product and typical ML product? by digital-bolkonsky
Sorry for being blunt, but wtf is “productization” in this context? What does this word include? This is way too broad a question; there are many nuances in ML/DL development, and too many variables change based on the specific use case.
Simple models can be served with just the trained model and some API calls; this is the same for DL and ML. Non-computationally-intensive tasks don’t even need GPUs/TPUs; most can even run on embedded hardware. However, they differ in the amount of data required for training. Data formats/types also matter: typical ML algorithms work better with tabular data, but you wouldn’t use them for images. I mean, what kind of garbage question is this lol. You can write a whole book on this. (A rough sketch of the “trained model + API call” pattern is below.)
If I got asked this question I’d ask back for a more concrete example; throwing out such a generalized question only indicates the interviewer does not have the know-how in ML/DL operations.
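A rough sketch (assumptions mine, including the file name and feature size) of the “trained model + API call” deployment pattern mentioned above: export once with TorchScript, then serve predictions from a tiny handler with no training code on the serving side.

```python
import torch
import torch.nn as nn

# Hypothetical trained model; in practice this is whatever you finished training.
model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 2))
model.eval()

# Export once: the saved artifact is all the serving side needs.
scripted = torch.jit.trace(model, torch.randn(1, 16))
scripted.save("model.pt")

def predict(features):
    """Hypothetical API handler: load the artifact (cache it in practice) and run inference."""
    m = torch.jit.load("model.pt")
    with torch.no_grad():
        x = torch.tensor(features).float().unsqueeze(0)
        return m(x).argmax(dim=1).item()
```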
chengstark t1_iz18t86 wrote
Reply to Since AI is poised to disrupt/aid in/replace many technical and creative jobs. Is it logical to assume that studying the field of AI/machine learning/DL is a way to future proof your employment for a while? by pawnh4
No, you’d be surprised how capable models will be and how stubborn humans are.
chengstark t1_iw0xw0w wrote
Some trial and error and some common techniques. Warm-up and LR scheduling are not hard to think of.
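For concreteness, a minimal sketch (not the commenter's exact setup) of linear warm-up followed by cosine decay using PyTorch's built-in schedulers; the model and epoch counts are placeholders.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # hypothetical model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

warmup_epochs, total_epochs = 5, 50
warmup = torch.optim.lr_scheduler.LinearLR(optimizer, start_factor=0.1, total_iters=warmup_epochs)
cosine = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=total_epochs - warmup_epochs)
scheduler = torch.optim.lr_scheduler.SequentialLR(
    optimizer, schedulers=[warmup, cosine], milestones=[warmup_epochs]
)

for epoch in range(total_epochs):
    # ... training loop for one epoch would go here ...
    optimizer.step()   # step the optimizer before the scheduler
    scheduler.step()   # advance the warm-up/cosine schedule once per epoch
```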
chengstark t1_ivzd2on wrote
Reply to The Ultimate Subway-Safety Plan by King-of-New-York
Hmmm, might as well just invent the teleportation machine /s
chengstark t1_ivruyad wrote
MATLAB is absolutely garbage for any production use. Maybe good for prototyping, but I have zero idea why you would use that over PyTorch.
chengstark t1_jcrfptp wrote
Reply to Seeking Career Advice to go from general CS background to a career in AI/Machine Learning by brown_ja
Don’t, it’s a waste of your time.