dat_cosmo_cat t1_izyz5hj wrote

Several of our internal teams have arrived at similar conclusions when comparing AWS models to pre-trained open-source models. Specifically: zero-shot CLIP and a fine-tuned ResNet (ImageNet) outperformed Rekognition on various classification tasks (both on internal data sourced from 9 e-commerce catalogs and on Google Open Images v6), and zero-shot DETIC outperforms it on image tagging. We even collaborated with a technical team at AWS to ensure these comparisons were as favorable as possible (truncating some classes from our data, combining others, etc.).
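
For anyone curious what "zero-shot" means here: CLIP scores an image against a text prompt for each candidate class, with no task-specific training. A rough sketch using the open-source `openai/clip-vit-base-patch32` checkpoint via Hugging Face transformers (the class names are illustrative stand-ins, not our actual taxonomy):

```python
# Minimal zero-shot CLIP classification sketch.
# Labels are hypothetical stand-ins for an e-commerce taxonomy.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

labels = ["shoes", "handbag", "wristwatch", "sunglasses"]
prompts = [f"a photo of a {label}" for label in labels]

image = Image.open("product.jpg")
inputs = processor(text=prompts, images=image, return_tensors="pt", padding=True)

with torch.no_grad():
    outputs = model(**inputs)

# logits_per_image holds the image's similarity to each text prompt
probs = outputs.logits_per_image.softmax(dim=-1).squeeze()
print(labels[int(probs.argmax())], float(probs.max()))
```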

38

dat_cosmo_cat t1_iwteguv wrote

You and I are literally saying the same things. These models have been in prod on every major software platform since BERT.

We don't even need to look at offline eval metrics anymore. If you're an actual MLE / data scientist, you likely have pipelines set up that directly measure the engagement / attributable-sales differences and report the real business impact across millions of users each time a new model is released.
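
The basic shape of those checks is nothing exotic; something like this (the numbers and metric are made up, and a two-proportion z-test is just one common choice, not a description of any specific system):

```python
# Hypothetical sketch: compare engagement between control and treatment
# model variants with a two-proportion z-test. All numbers are made up.
from statsmodels.stats.proportion import proportions_ztest

control = {"users": 1_000_000, "conversions": 31_500}    # old model
treatment = {"users": 1_000_000, "conversions": 32_400}  # new model

stat, p_value = proportions_ztest(
    count=[treatment["conversions"], control["conversions"]],
    nobs=[treatment["users"], control["users"]],
    alternative="larger",  # H1: the new model converts better
)

ctr_rate = control["conversions"] / control["users"]
trt_rate = treatment["conversions"] / treatment["users"]
lift = trt_rate / ctr_rate - 1
print(f"relative lift: {lift:.2%}, p-value: {p_value:.4f}")
```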

I work on a team that has made millions of dollars building applications on top of LLMs since 2018, so when I see the claim "LLMs finally got good this year" it's hard not to laugh. That is what I'm getting at.

Edit: did you read the article?

5

dat_cosmo_cat t1_iwqnbt1 wrote

The ubiquity of pretrained BERT + ResNet models in commercial software applications (and the measurable lift they deliver) is proof that they've been "good enough" for years. Sometimes these articles come off a bit naive about the impact the technology has already had, and about how widely it is used beyond the specific applications most observable / accessible to the author.
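
Part of why they're everywhere is how low the barrier is: a pretrained encoder drops into a pipeline in a few lines. A minimal sketch with `bert-base-uncased` as an off-the-shelf text encoder (the model choice, example strings, and mean-pooling are just illustrative):

```python
# Minimal sketch: pretrained BERT as an off-the-shelf text encoder.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

texts = ["wireless noise-cancelling headphones", "leather crossbody bag"]
inputs = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token embeddings into one vector per text; these can feed
# downstream rankers / classifiers with no task-specific training.
mask = inputs["attention_mask"].unsqueeze(-1)
embeddings = (outputs.last_hidden_state * mask).sum(1) / mask.sum(1)
print(embeddings.shape)  # torch.Size([2, 768])
```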

10

dat_cosmo_cat t1_iuz49g9 wrote

It is easy to read it like an ad for NFTs; we've seen so much bullshit out of that community that I don't blame anyone for getting triggered. The implication behind this seems different, though: it is advertising an opportunity to profit off of free use, rather than scarcity.

1