Submitted by matthkamis t3_126kzb6 in MachineLearning
I’m just curious how well these models could be applied to translation? From my standpoint, a pretty good benchmark of how “intelligent” these things are would be how well they can translate between two languages. Anyone who is bilingual, or has a partner who speaks another language, knows that the current state of the art in translation is severely lacking.
ZestyData t1_je9ly2p wrote
.. Uh. I'm going to assume you're relatively new to the world of ML. Translation is one of the most common uses for SOTA LLMs.
It's how Google Translate works, to give just the most famous example.
What the SOTA translation tools don't yet use is instruct-tuning, which gives models conversational interfaces (i.e. the difference between GPT and ChatGPT). So they look different from ChatGPT in use. But it's largely the same generative (technically pretrained) Transformers under the hood.
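To make the point concrete, here's a minimal sketch of LLM-style translation using a pretrained seq2seq Transformer via the Hugging Face `transformers` library. The specific model name (`Helsinki-NLP/opus-mt-en-de`) is one commonly published checkpoint, chosen here as an assumption for illustration; any translation checkpoint would work the same way:

```python
# Sketch: translation with a pretrained Transformer, no instruct-tuning needed.
# The model name below is an assumption (a public MarianMT checkpoint), not
# the specific model Google Translate uses.
from transformers import pipeline

# Build an English-to-German translation pipeline from a pretrained checkpoint.
translator = pipeline("translation_en_to_de", model="Helsinki-NLP/opus-mt-en-de")

# The pipeline returns a list of dicts with a "translation_text" key.
result = translator("Machine translation is a classic benchmark for language models.")
print(result[0]["translation_text"])
```

Note there's no chat interface here: you feed in source text and get target text back, which is exactly the non-conversational usage described above.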