IDefendWaffles
IDefendWaffles t1_j96kvsk wrote
Reply to [D] Is Google a language transformer like ChatGPT except without the G (Generative) part? by Lets_Gooo_123
So I am trying to learn more about electricity, and the more I read, the less impressed I am by this lightbulb. To put it as briefly as possible, it is a glass thing that shines.
So I'm trying to sort of prove this to myself.
Furthermore, the only thing that is going to make the lightbulb the thing everybody says it will be (replacing everybody's jobs) is MORE wires. BUT to have more wires you need more power, and the lightbulb already uses quantum computing as far as I know, and QC progress is pretty much stalling.
IDefendWaffles t1_ivhc3ck wrote
Reply to comment by new_name_who_dis_ in [D] At what tasks are models better than humans given the same amount of data? by billjames1685
Fair enough.
IDefendWaffles t1_ivg12m2 wrote
Reply to [D] At what tasks are models better than humans given the same amount of data? by billjames1685
AlphaZero was trained by self-play, starting from no data.
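To illustrate the point only: a minimal sketch (nothing like DeepMind's actual code) of why self-play needs no external dataset. The toy game, the random policy, and all names below are hypothetical stand-ins; the real system uses a neural network guided by Monte Carlo tree search.

```python
import random

def self_play_episode(policy):
    """Play one game of a toy 'race to 10' game against itself,
    returning (state, action, outcome) tuples as training data."""
    history, total, player = [], 0, 0
    while total < 10:
        action = policy(total)            # choose to add 1 or 2
        history.append((total, action, player))
        total += action
        player ^= 1
    winner = player ^ 1                   # the player who reached 10 wins
    return [(s, a, 1 if p == winner else -1) for s, a, p in history]

def random_policy(state):
    # Placeholder for the learned policy; AlphaZero would use its network here.
    return random.choice([1, 2])

# Every training example is generated by the agent playing against itself;
# no human games or external dataset are consumed.
replay_buffer = []
for _ in range(100):
    replay_buffer.extend(self_play_episode(random_policy))

print(f"collected {len(replay_buffer)} self-generated training examples")
```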
IDefendWaffles t1_jea2403 wrote
Reply to [D] Can large language models be applied to language translation? by matthkamis
You can talk to ChatGPT in English and then ask it a question in any other language, and it will answer you in that language. Or you can just tell it, "Let's talk in Finnish from now on," and it will.
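The comment describes the chat interface, but the same behavior can be sketched against the OpenAI Python client. This is an illustration only; the model name and library version are assumptions, not something the comment specifies.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Ask it to switch languages mid-conversation, exactly as the comment describes.
messages = [
    {"role": "user", "content": "Let's talk in Finnish from now on. How does a transformer handle translation?"},
]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model name; any chat model works the same way
    messages=messages,
)

# The reply should come back in Finnish, with no explicit "translate" instruction needed.
print(response.choices[0].message.content)
```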