
Desperate-Whereas50 t1_iye5kfo wrote

>I doubt the typical human hears more than a million words of english in their childhood, but they know the language much better than GPT-3 does after reading billions of pages of it.

But is this a fair comparison? I am far from being an expert in evolution, but I assume we have some evolutionarily encoded bias that makes language easier to learn, whereas ML systems have to start from zero.

1

currentscurrents t1_iye68b8 wrote

Well, fair or not, it's a real challenge for ML since large datasets are hard to collect and expensive to train on.

It would be really nice to be able to learn generalizable ideas from small datasets.
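For example, one common route to that is transfer learning: reuse a pretrained model as a prior (loosely analogous to the evolutionary prior mentioned above) so that only a small labeled set is needed for the new task. A minimal sketch, assuming PyTorch and a recent torchvision; the random tensors are just placeholders standing in for a small labeled dataset:

```python
import torch
import torch.nn as nn
from torchvision import models

# Pretrained backbone as a "prior"; freeze it so only the new head trains.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)  # new 2-class head

# A "small dataset": 16 labeled examples (placeholders here).
x = torch.randn(16, 3, 224, 224)
y = torch.randint(0, 2, (16,))

opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for _ in range(5):  # a few epochs suffice when only the head is trained
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
```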

1

Desperate-Whereas50 t1_iye7hf3 wrote

That's correct. But to define the bare minimum, you need a baseline, and my point is that humans are a bad baseline because we have "training data" encoded in our DNA. Moreover, on tabular data, ML systems often outperform humans even with modest amounts of training data (see the sketch below).

But of course, needing less data while still getting good results is always better. I would not argue with that.
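To illustrate the tabular point, a minimal sketch assuming scikit-learn; the dataset choice is just illustrative, a few hundred rows is exactly the regime where boosted trees tend to do well:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Small tabular dataset: 569 rows, 30 numeric features.
X, y = load_breast_cancer(return_X_y=True)

clf = HistGradientBoostingClassifier()
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())  # typically around 0.97 accuracy on this dataset
```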

Edit: Typos

1