Patarokun t1_ix5ydv2 wrote

This certainly makes sense for why sleep is such a universal and critical trait. Without it you essentially run out of "RAM" and forget stuff you need to survive. Sleep helps move what's in that RAM into long-term memory without overwriting the important stuff.

*I know the computer analogy has major issues; I'm just trying to parse what the study said in the best framework I understand.

131

[deleted] t1_ix6x5gl wrote

Naw, it's fine haha. Modern neuro and cognitive researchers often develop ML models for hypothesis generation and testing... because it works. ML draws extremely heavily from psych/neuro. I lean a lot on my psych background, and more recent training, to understand machine learning concepts as I push to transition into data science.

I think the more accurate way to think about it, though, is as compression and refinement. The brain tries to train itself to find a model that best fits what you experienced. It wants to extract the essence. Compression looks for equations that can reconstruct something without storing every detail. Same idea.
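
To make "compression" concrete, here's a toy sketch in Python. Nothing here is from the study; it's just the fit-the-essence idea: keep a couple of fitted parameters instead of every data point.

```python
import numpy as np

# Toy "compression": fit a simple model instead of storing every observation.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = 3.0 * x + 2.0 + rng.normal(0, 1.0, size=x.shape)  # 200 noisy observations

# Instead of remembering all 200 (x, y) pairs, keep just two numbers.
slope, intercept = np.polyfit(x, y, deg=1)

# Reconstruct (approximately) any point from the compressed representation.
y_hat = slope * x + intercept
print(f"stored 2 numbers instead of 200; mean error ~ {np.mean(np.abs(y - y_hat)):.2f}")
```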

In fact, when you sleep, random noise is injected, just as it is when training machine learning models, so that you learn a more general idea rather than latching too much onto specifics that may not be significant. Before the brain has the abstraction/compression figured out, it doesn't know where the big idea ends and the details begin.
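
Here's a rough, purely illustrative sketch of what noise injection can look like during training (jittering the inputs each pass); the data and numbers are made up. For linear models, input noise acts a lot like L2 regularization, which is one reason it pushes toward a more general fit.

```python
import numpy as np

# Illustrative only: train a linear model while injecting input noise each pass.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
true_w = np.array([1.0, -2.0, 0.5, 0.0, 0.0])
y = X @ true_w + rng.normal(0, 0.1, size=100)

w = np.zeros(5)
lr = 0.01
for epoch in range(500):
    X_noisy = X + rng.normal(0, 0.3, size=X.shape)  # inject random noise each pass
    grad = X_noisy.T @ (X_noisy @ w - y) / len(y)   # gradient of squared error
    w -= lr * grad

print("learned weights:", np.round(w, 2))  # pulled toward a smoother, more general fit
```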

RAM is more like not being able to recite a poem after the first time you hear it, or to do complex math without pen and paper.

24

Patarokun t1_ix6z0dd wrote

Interesting. How does random noise keep ML from becoming too narrowly focused? I'm trying to think of examples, like how they'll kick the Boston Dynamics robots to test their reactions in unstable environments, but that's not quite right.

4

[deleted] t1_ix72jka wrote

Great question! It's a philosophically beautiful answer (I think).

Essentially everything in the universe exists in some state which resulted from some finite set of possibilities.

Your height, for example, is a combination of genetic, nutritional, and possibly behavioral influences. Each of those influences has a different magnitude of effect. Perhaps you personally might have been a few inches taller or shorter if some of those variables had different values. However, on the whole, people are more or less the same height, with some degree of variation.

That variation is driven by those differences in genes and so on, but there's only so much those variables can differ. They have a finite set of possibilities. So it turns out that most people will be close to some average height, with a predictable degree of variation to either side. A bell curve.

However you need to be sure you have a representative sample of people to determine what that average is and how much variation to expect. If you were to say look at NBA players you might get a very different idea what the average height was, and how much it might vary.

If you were then to try to use that expectation of how tall people are to inform other ideas like the size of cars, and the size of garages, and how much gas people might use, and so on, you could end up with a less than ideal model of the world.

So the best thing to do would be to get a really big sample of people in different situations, so that you can figure out the true average. You want to account for problems with how you selected people to measure (the basketball team) by averaging them out.
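
A quick toy illustration of the basketball-team problem (all numbers invented): a biased sample skews your estimate of the average, while a big, broad sample averages it out.

```python
import numpy as np

# Made-up heights in cm, just to show how sample selection skews the average.
rng = np.random.default_rng(2)
population = rng.normal(170, 10, size=100_000)
nba_like = population[population > 190][:50]   # only sampling very tall people
broad = rng.choice(population, size=5_000)     # large, representative sample

print("biased sample mean:  ", round(nba_like.mean(), 1))
print("broad sample mean:   ", round(broad.mean(), 1))
print("true population mean:", round(population.mean(), 1))
```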

That isn't always possible or practical, so what we and our brains do is expect that some variation exists and simulate it: take an average, then add randomness distributed around that average, because that's how pretty much everything works anyway. We generate a bigger dataset from a smaller one, assuming some degree of randomness that tends toward a central limit.
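
As a sketch of "generate a bigger dataset from a smaller one": take the small sample's average and spread, then simulate new data around them (roughly a parametric bootstrap). This is just my reading of the idea, not anything from the study.

```python
import numpy as np

# A handful of made-up heights.
rng = np.random.default_rng(3)
small_sample = np.array([168.0, 172.5, 181.0, 165.5, 175.0])

# Assume variation clusters around the average, then simulate a much bigger dataset.
mu, sigma = small_sample.mean(), small_sample.std(ddof=1)
simulated = rng.normal(mu, sigma, size=10_000)

print(f"simulated mean ~ {simulated.mean():.1f}, spread ~ {simulated.std():.1f}")
```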

Randomness is useful in other ways as well. Basically, if you want a better estimate of the influence basketball has on height, delete that variable at random: pretend it doesn't exist, run the simulation again, and get a better idea of how the other variables play out.
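
And a sketch of the "delete that variable at random" idea: scramble one feature and see how much worse the model's predictions get, which is similar in spirit to dropout and to permutation importance. The variable names and numbers here are invented for illustration.

```python
import numpy as np

# Made-up data: height driven mostly by genetics and nutrition, barely by basketball.
rng = np.random.default_rng(4)
n = 2_000
genetics = rng.normal(0, 1, n)
nutrition = rng.normal(0, 1, n)
basketball = rng.normal(0, 1, n)
height = 170 + 6 * genetics + 3 * nutrition + 0.5 * basketball + rng.normal(0, 2, n)

# Fit a simple linear model, then knock out one variable at a time by shuffling it.
X = np.column_stack([np.ones(n), genetics, nutrition, basketball])
w, *_ = np.linalg.lstsq(X, height, rcond=None)
baseline_err = np.mean((X @ w - height) ** 2)

for i, name in enumerate(["genetics", "nutrition", "basketball"], start=1):
    X_dropped = X.copy()
    X_dropped[:, i] = rng.permutation(X_dropped[:, i])  # scramble the variable
    err = np.mean((X_dropped @ w - height) ** 2)
    print(f"{name:10s} -> error rises by {err - baseline_err:.2f}")
```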

Statistics is beautiful

9

Patarokun t1_ix863nv wrote

Ok, so the randomness gives the brain/AI a better sense of the normal distribution. That makes sense.

And in this case sleep almost helps lump things into the different sigmas, so it's easier to make decisions without getting lost in data points.

2

elanalion t1_ix63vz2 wrote

Thank you! That really helps me understand what they meant.

6

ToShrt t1_ix8lzlg wrote

I thought your analogy worked great. I've always tried to find a good way to compare computers to the human body, and your analogy has helped me make a great connection.

2