Mogady OP t1_isvkmba wrote
Reply to comment by Mogady in [D] How frustrating are the ML interviews these days!!! TOP 3% interview joke by Mogady
and I was able to do many things in the interview (handling categorical, string, and numerical features, organizing the features as an array, applying the models, testing them, and getting a score). I could have done more, but you simply can't recall everything. I use HuggingFace literally every day and I have hacked it multiple times to suit my needs, yet I still can't remember how to import the LM head or how to access the attention layers without searching.
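Those two lookups can be sketched offline with a tiny randomly initialized GPT-2, so nothing is downloaded; the config sizes here are arbitrary and only for illustration:

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# Tiny random config so the sketch runs without fetching pretrained weights.
config = GPT2Config(vocab_size=50, n_positions=16, n_embd=8, n_layer=2, n_head=2)
model = GPT2LMHeadModel(config)

# The LM head: a Linear layer projecting hidden states to the vocabulary.
lm_head = model.get_output_embeddings()

# Attention weights come back as a tuple (one tensor per layer) when
# output_attentions=True; each is (batch, heads, seq, seq).
input_ids = torch.tensor([[1, 2, 3]])
with torch.no_grad():
    out = model(input_ids, output_attentions=True)
attn = out.attentions
```

This is exactly the kind of attribute path that is easy to forget and quick to re-search.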
Mogady OP t1_isvk1jz wrote
Reply to comment by RockyMcNuts in [D] How frustrating are the ML interviews these days!!! TOP 3% interview joke by Mogady
thanks for the resource, but maybe I didn't convey that properly in the post: I can do all of that and I know about it :D It's not that I'm rusty; it's that when I do all this EDA, plotting, and experimenting, I don't pay attention to every line I write so that I can recall it later without searching. Even if I had been working on these kinds of problems recently, I would still search for how to remove NaN rows from a NumPy array for the millionth time and copy the same one-line snippet. This is simply how I work: I understand NumPy and I know which functions I need; I just don't spend time memorizing all the details.
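For the record, the one-liner in question looks like this (the array values are made up for illustration):

```python
import numpy as np

a = np.array([[1.0, 2.0],
              [np.nan, 3.0],
              [4.0, 5.0]])

# Keep only the rows that contain no NaN anywhere.
clean = a[~np.isnan(a).any(axis=1)]
```

Easy to understand, and just as easy to re-search every single time.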
Mogady OP t1_isti0cl wrote
Reply to comment by Azmisov in [D] How frustrating are the ML interviews these days!!! TOP 3% interview joke by Mogady
Who said I only specialize in NLP? Yes, that's mainly my experience, but I didn't fail to show how to apply a classifier to a traditional dataset. There is a difference between failing to show the ability to do something and failing to do it 100% correctly within the given "time frame". Also, they could have simply ignored my resume.
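"Applying a classifier to a traditional dataset" is roughly the following; the column names and values here are invented for the sketch:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Toy mixed-type tabular data.
df = pd.DataFrame({
    "color": ["red", "blue", "red", "green", "blue", "red"],
    "size": [1.0, 2.5, 0.7, 3.1, 2.2, 1.4],
    "label": [0, 1, 0, 1, 1, 0],
})

# One-hot the categorical column, scale the numerical one, then classify.
pre = ColumnTransformer([
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["color"]),
    ("num", StandardScaler(), ["size"]),
])
clf = Pipeline([("pre", pre), ("model", LogisticRegression())])
clf.fit(df[["color", "size"]], df["label"])
score = clf.score(df[["color", "size"]], df["label"])  # training accuracy
```

The shape of the workflow is what matters; the exact encoder arguments are the sort of detail one looks up.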
Mogady OP t1_isthauc wrote
Reply to comment by Azmisov in [D] How frustrating are the ML interviews these days!!! TOP 3% interview joke by Mogady
No man, it doesn't work like that. Yes, you might be worried if this were the only thing they asked about, but the NaNs part came late, when I had already used almost all the features; only the last two had them, and I had 5 minutes left. I can do that easily with Pandas, but the NumPy version is a little more involved: a[~np.isnan(a).any(axis=1), :]. Also, when you say "97% like you", who is this "us"? Say this month you work with tabular data and the next month on a CV project: are you expected to remember all the syntax of OpenCV, Pandas, NumPy, and Sklearn at that point?
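The Pandas vs NumPy contrast is the point here; both of these drop NaN rows, but one is a single obvious method name and the other is the boolean-mask incantation from the comment (data is invented):

```python
import numpy as np
import pandas as pd

a = np.array([[1.0, np.nan],
              [2.0, 3.0]])

# Pandas: one memorable method.
df_clean = pd.DataFrame(a).dropna()

# NumPy: the mask-based one-liner.
np_clean = a[~np.isnan(a).any(axis=1), :]
```

Both give the same rows; only one is easy to type from memory under a 5-minute clock.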
Mogady OP t1_istb0r4 wrote
Reply to comment by Brudaks in [D] How frustrating are the ML interviews these days!!! TOP 3% interview joke by Mogady
I understand this, but at that point they simply reply "we found more suitable candidates", not "you failed to one-line Pandas".
Mogady OP t1_istab29 wrote
Reply to comment by CommunismDoesntWork in [D] How frustrating are the ML interviews these days!!! TOP 3% interview joke by Mogady
They are a recruitment platform; that's actually the point. They never asked me a specific question related to my experience, just random questions all over the place.
Mogady OP t1_ist9vhe wrote
Reply to comment by cyancynic in [D] How frustrating are the ML interviews these days!!! TOP 3% interview joke by Mogady
I wish they had asked me PS; I was ready for that. But anyway, I still can't see what I did wrong, and this is the second time I've gotten this irritating email from them telling me I'm not good enough for the top 3%.
Mogady OP t1_issupod wrote
Reply to comment by [deleted] in [D] How frustrating are the ML interviews these days!!! TOP 3% interview joke by Mogady
Why? What is the problem with focusing more on the problem than on the tools? There are tons of tools out there for ML, but some people still insist that it's all about Pandas and Sklearn, so you should excel at them.
Submitted by Mogady t3_y7708w in MachineLearning
Mogady OP t1_isxaoym wrote
Reply to comment by give_me_the_truth in [D] How frustrating are the ML interviews these days!!! TOP 3% interview joke by Mogady
The data itself was linear and the relation was obvious (at least for one of the binary classes, either the 0 class or the 1), so I simply kept adding more features to the classifier and the score kept getting higher. Of course, this doesn't give any info about the precision and recall for each class (0, 1).
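That caveat can be shown on synthetic data: with an imbalanced binary problem, overall accuracy looks fine while the minority class's recall is much worse, which only per-class precision/recall reveals. Everything below is made up for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

rng = np.random.default_rng(0)

# Imbalanced, overlapping classes: class 1 is rare.
X0 = rng.normal(0.0, 1.0, size=(900, 2))
X1 = rng.normal(1.0, 1.0, size=(100, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 900 + [1] * 100)

clf = LogisticRegression().fit(X, y)
pred = clf.predict(X)

acc = accuracy_score(y, pred)
prec, rec, _, _ = precision_recall_fscore_support(
    y, pred, labels=[0, 1], zero_division=0
)
# acc is high because the majority class dominates;
# rec[1] (minority-class recall) lags far behind rec[0].
```

A single rising score, like the one in the interview, hides exactly this per-class gap.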