cruddybanana1102 t1_j6wkzwb wrote
Reply to comment by AbCi16 in Using Jupyter via GPU by AbCi16
If you have an Nvidia GPU with the CUDA toolkit installed, run nvidia-smi and you'll have your answer
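If you're on PyTorch, a quick sanity check from inside a Jupyter cell (a minimal sketch, assuming torch is installed) looks something like:

```python
import torch

# True if a CUDA-capable GPU and a matching driver/toolkit are visible to PyTorch
print(torch.cuda.is_available())

# Name of the first visible device, if any
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```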
cruddybanana1102 t1_j6n46op wrote
Reply to comment by pfm11231 in [D] deepmind's ai vision by [deleted]
I don't really understand the question. What do you mean by "looking at a screen"? Or "looking at numbers and finding a pattern"?
The model takes a multidimensional array as input. That array is all the RGB values at a given instant. Take that to mean whatever suits you.
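To make that concrete, here's a minimal sketch (using NumPy, with a made-up 84x84 resolution) of what a single screen frame looks like as model input:

```python
import numpy as np

# A single 84x84 RGB frame: one uint8 value per channel per pixel
frame = np.zeros((84, 84, 3), dtype=np.uint8)
frame[10, 20] = [255, 0, 0]  # the pixel at row 10, col 20 is pure red

print(frame.shape)  # (84, 84, 3) -- this array is all the model "sees"
```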
cruddybanana1102 t1_j601vly wrote
Someone has already mentioned Neural Ordinary Differential Equations, which is also the first thing that came to my mind. There are also extensions where one can use PDEs (Neural Hamiltonian Flows) or even stochastic DEs (Score-Based Generative Models) in the model. All of them cover different but overlapping use cases.
There are also techniques that use numerical solvers as black boxes to perform model-order reduction of a complicated system of equations, identify slow modes, do timescale decomposition, etc.
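If you want to play with Neural ODEs concretely, a rough sketch (assuming PyTorch plus the torchdiffeq package; the state dimension and layer sizes are arbitrary) looks roughly like:

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint  # pip install torchdiffeq

class ODEFunc(nn.Module):
    """Parameterizes dy/dt = f(t, y) with a small neural network."""
    def __init__(self, dim=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))

    def forward(self, t, y):
        return self.net(y)

func = ODEFunc()
y0 = torch.randn(16, 2)                  # batch of initial states
t = torch.linspace(0.0, 1.0, steps=10)   # integration times
trajectory = odeint(func, y0, t)         # shape (10, 16, 2), differentiable end-to-end
```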
cruddybanana1102 t1_j4asit0 wrote
Reply to comment by Skirlaxx in [D] Has ML become synonymous with AI? by Valachio
I mean, Generative Adversarial Networks do engage in minimax optimization, and they produce deepfakes. I don't think anybody would agree that GANs have nothing to do with machine learning.
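For reference, the minimax objective from the original GAN paper, with G the generator and D the discriminator:

```latex
\min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}}\left[\log D(x)\right]
  + \mathbb{E}_{z \sim p_z}\left[\log\left(1 - D(G(z))\right)\right]
```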
cruddybanana1102 OP t1_j262t6l wrote
Reply to comment by TheNovicePhilomath in [D] Nesterov as a special case of PID control? by cruddybanana1102
Ikr! It blew my mind to see optimal-control-inspired design of new optimizers! It shouldn't be surprising really, but I can't not appreciate it. Also love the Kalman filter paper! And thanks for digging out that paper for me. Haven't gone through it fully yet, but it looks promising.
Submitted by cruddybanana1102 t3_zyclre in MachineLearning
cruddybanana1102 t1_j1yj895 wrote
Reply to comment by T4KKKK in [D] ANN for sine wave prediction by T4KKKK
Neural networks with any non-linear activation should do the job; periodic activations are not necessary.
Also, if all you have to predict is a sine wave, don't use neural networks. Try simpler learning algorithms; neural networks are mostly overkill here. Imho kernel regression or something similar should be an easier way to go, but as always, can't guarantee without trying
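As a rough illustration of the kernel-regression route (a sketch using scikit-learn's KernelRidge; the hyperparameters here are just guesses and would need tuning):

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# Noisy samples of a sine wave
X = np.linspace(0, 4 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * np.random.randn(200)

# RBF kernel ridge regression
model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=1.0)
model.fit(X, y)

# Predict the wave on a fresh grid of inputs
X_test = np.linspace(0, 4 * np.pi, 50).reshape(-1, 1)
y_pred = model.predict(X_test)
```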
cruddybanana1102 t1_iznado4 wrote
You should check out what's known in the field as "uncertainty-aware learning". Definitely not the same as "getting NNs to estimate their own uncertainty", but certainly helpful for what you're trying to do
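One common pattern from that literature (a hedged sketch, not the specific method discussed in the thread; the input dimension and layer sizes are placeholders) is a regression net with a second head that predicts its own variance, trained with a Gaussian negative log-likelihood:

```python
import torch
import torch.nn as nn

class HeteroscedasticRegressor(nn.Module):
    """Predicts both a mean and a per-sample variance for the target."""
    def __init__(self, in_dim=8, hidden=64):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.mean_head = nn.Linear(hidden, 1)
        self.logvar_head = nn.Linear(hidden, 1)

    def forward(self, x):
        h = self.body(x)
        return self.mean_head(h), self.logvar_head(h).exp()  # mean, variance > 0

model = HeteroscedasticRegressor()
loss_fn = nn.GaussianNLLLoss()  # penalizes bad means and miscalibrated variances

x = torch.randn(32, 8)
target = torch.randn(32, 1)
mean, var = model(x)
loss = loss_fn(mean, target, var)
loss.backward()
```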
cruddybanana1102 t1_izn9uza wrote
Reply to comment by shawarma_bees in [D] Making a regression NN estimate its own regression error by Alex-S-S
Wait, you're one of the authors here?
cruddybanana1102 t1_iw6hi7a wrote
Reply to comment by DigThatData in [D] When was the last time you wrote a custom neural net? by cautioushedonist
From what I have learnt, if the paper doesn't come with a GitHub repo containing an implementation, you'll probably never be able to reproduce the results.
cruddybanana1102 t1_j76eb6v wrote
Reply to Information Retrieval book recommendations? [D] by Ggronne
Manning, Raghavan, and Schütze's Introduction to Information Retrieval is your best guide.