Submitted by EducationalCicada t3_10vgrff in MachineLearning
emerging-tech-reader t1_j7kptn9 wrote
Reply to comment by WokeAssBaller in [N] Google: An Important Next Step On Our AI Journey by EducationalCicada
I got a demo of some of the stuff happening.
The most impressive one has a GPT-style model watching a meeting, taking minutes, and even crafting action items, emails, etc., all ready for you when you leave the meeting.
It will also offer suggestions to follow up on while the meeting is ongoing.
Google has become the AltaVista.
WokeAssBaller t1_j7kqhgl wrote
Yeah right, OpenAI is built on Google research, and cool, you worked a half-functioning chatbot into the worst messaging and search app, congrats
emerging-tech-reader t1_j7ksup6 wrote
> OpenAI is built on google research
To my knowledge that is not remotely true. Can you cite where you got that claim?
OpenAI does take funding from, and share research with, a number of AI-related companies. I don't know if Google is on that list.
WokeAssBaller t1_j7ktznh wrote
https://arxiv.org/pdf/1706.03762.pdf the paper that made all this possible.
Google has also been leading research around transformers and NLP for some time. Not that they don't share with each other in various ways.
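(For anyone following along: the linked paper's core contribution is scaled dot-product attention. A minimal NumPy sketch of that one equation, softmax(QK^T / sqrt(d_k))V, with made-up random inputs:)

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # softmax over keys
    return weights @ V                                    # weighted sum of values

# Toy example: 3 tokens, head dimension d_k = 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

The full model in the paper stacks many of these heads plus feed-forward layers, but this is the mechanism everything since (BERT, GPT) scales up.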
emerging-tech-reader t1_j7kzd3i wrote
> https://arxiv.org/pdf/1706.03762.pdf the paper that made all this possible.
That's reaching, IMHO. The original transformer was only a few million parameters in size; it's nowhere near the scale of ChatGPT.
You may as well say that MIT invented it, since Google's paper is based on methods created there.
WokeAssBaller t1_j7ladne wrote
Please. Without the transformer we would never have been able to scale, not to mention all of this being built on BERT as well. Then a bunch of companies scaled it further, including Google.
emerging-tech-reader t1_j7p3gn4 wrote
> Please without the transformer we would never be able to scale,
Without backpropagation we wouldn't have transformers. 🤷♂️
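(And backpropagation itself is just the chain rule applied through the network. A minimal hand-rolled sketch on a single sigmoid neuron, all names hypothetical:)

```python
import math

def step(w, b, x, t, lr=0.5):
    """One gradient-descent step on loss = (y - t)^2 for y = sigmoid(w*x + b)."""
    z = w * x + b
    y = 1.0 / (1.0 + math.exp(-z))   # forward pass
    dL_dy = 2.0 * (y - t)            # dL/dy
    dy_dz = y * (1.0 - y)            # sigmoid derivative
    dL_dz = dL_dy * dy_dz            # chain rule: dL/dz
    # dz/dw = x, dz/db = 1  ->  update parameters against the gradient
    return w - lr * dL_dz * x, b - lr * dL_dz

w, b = 0.5, 0.0
for _ in range(100):
    w, b = step(w, b, x=1.0, t=1.0)  # nudge the neuron's output toward t = 1
```

Frameworks automate exactly this over millions of parameters; the transformer is one architecture that the same machinery trains.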