SlowThePath
SlowThePath t1_je7xmaz wrote
Reply to comment by antonivs in [D] FOMO on the rapid pace of LLMs by 00001746
Definitely not denying that it was trained on a massive amount of data, because it was, but calling it internet-sized is not accurate. I guess you were speaking in hyperbole and I just didn't read it that way. I know what you mean.
SlowThePath t1_je2d9oi wrote
Reply to comment by visarga in [D] FOMO on the rapid pace of LLMs by 00001746
Yeah, I don't see any startup being able to acquire the resources and time to catch up, let alone compete or surpass. Unless they come up with some very novel magic secret sauce, which seems extremely unlikely.
SlowThePath t1_je2cknq wrote
Reply to comment by deepneuralnetwork in [D] FOMO on the rapid pace of LLMs by 00001746
The thing about magic is that it is only magic in the beginning. Eventually it becomes commonplace and is no longer "magic." Right now it feels like magic to me, though, too.
SlowThePath t1_je2buak wrote
Reply to comment by antonivs in [D] FOMO on the rapid pace of LLMs by 00001746
No models are trained on internet-sized corpora. That would take an infinite amount of time, I would think.
SlowThePath t1_je28tce wrote
This is both very cool and very unsettling.
SlowThePath t1_j9d52fz wrote
Reply to comment by JeffMorse2016 in Samsung's next-gen display to add blood pressure and sugar level monitoring by xcalibre
Tech's simply not there yet. We have other cool solutions, though. Between my pump and my Dexcom, which talk to each other and give me insulin or stop it when needed, I'm cyborged out over here. I think it's cool.
SlowThePath t1_jecqrfz wrote
Reply to Christina's World, Andrew Wyeth, Tempera on panel,1948 by Breezilyfloat673
Whenever I see this painting, I can't help but think of Terrence Malick's Days of Heaven. He must have drawn some influence from it.