Pikalima
Pikalima t1_janc14v wrote
Reply to comment by [deleted] in [D] OpenAI introduces ChatGPT and Whisper APIs (ChatGPT API is 1/10th the cost of GPT-3 API) by minimaxir
I’d say we need an /r/VXJunkies equivalent for statistical learning theory, but the real deal is close enough.
Pikalima t1_j0az5t9 wrote
Reply to comment by economy_programmer_ in [R] Talking About Large Language Models - Murray Shanahan 2022 by Singularian2501
I don’t know who first used the bird-flight analogy, but it’s a somewhat common refutation in the philosophy of AI. That’s just to say, it’s been used before.
Pikalima t1_iylah8s wrote
Reply to comment by csreid in [R] Statistical vs Deep Learning forecasting methods by fedegarzar
Sometimes I consider retracting my very first paper because of this.
Pikalima t1_itv5q8m wrote
Reply to comment by B10H4Z4RD7777 in [D] Simple Questions Thread by AutoModerator
I would start with The Illustrated Stable Diffusion for a high-level overview. Then I’d suggest reading The Annotated Diffusion Model, which walks through implementing the original diffusion paper by Ho et al.
Pikalima t1_isg6vb3 wrote
Reply to comment by ChebyshevsBeard in [P] neograd - A deep learning framework created from scratch using Python and NumPy by pranftw
For those curious, like I was, see section 4.8 "Convolution as a matrix operation" in A guide to convolution arithmetic for deep learning.
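The gist of that section, if anyone wants to see it concretely: a discrete convolution can be written as a single matrix-vector product by building a matrix whose rows are the kernel placed at each output position. Here's a minimal NumPy sketch of my own (not code from the guide), assuming a single-channel input and "valid" padding:

```python
import numpy as np

def conv2d_direct(x, k):
    # Valid 2D cross-correlation (deep-learning "convolution", no kernel flip).
    H, W = x.shape
    kh, kw = k.shape
    oh, ow = H - kh + 1, W - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def conv2d_as_matmul(x, k):
    # Build an (oh*ow, H*W) matrix: each row is the kernel embedded at one
    # output position, so the whole convolution collapses to M @ x.ravel().
    H, W = x.shape
    kh, kw = k.shape
    oh, ow = H - kh + 1, W - kw + 1
    M = np.zeros((oh * ow, H * W))
    for i in range(oh):
        for j in range(ow):
            row = np.zeros((H, W))
            row[i:i + kh, j:j + kw] = k
            M[i * ow + j] = row.ravel()
    return (M @ x.ravel()).reshape(oh, ow)

x = np.arange(16.0).reshape(4, 4)
k = np.array([[1.0, 0.0], [0.0, -1.0]])
assert np.allclose(conv2d_direct(x, k), conv2d_as_matmul(x, k))
```

This is horribly memory-inefficient (the matrix is mostly zeros), but it makes clear why transposed convolution is just multiplying by the transpose of that same matrix.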
Pikalima t1_jau7ujc wrote
Reply to Meta’s LLaMa weights leaked on torrent... and the best thing about it is someone put up a PR to replace the google form in the repo with it 😂 by RandomForests92
King.