
a4mula t1_j64k1pr wrote

One bit is a single register of representation. You can have a million 0s or a million 1s, but you cannot combine them in any way.

That's just a point, incapable of possessing information.

A single bit can never be information. Information is defined as a change of states, not a state itself, and a single bit cannot change.

0000000000000000000000000

has no meaning.

111111111111111111111111111111111111

has no meaning.

00 = 0

01 = 1

That's information.
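
A rough sketch of that contrast in Shannon's terms (the `empirical_entropy` helper is just illustrative): an all-0s or all-1s string has zero bits of entropy per symbol, while a string that mixes the two states carries a positive amount.

```python
import math
from collections import Counter

def empirical_entropy(s: str) -> float:
    """Shannon entropy, in bits per symbol, of a string's symbol distribution."""
    counts = Counter(s)
    n = len(s)
    # H = sum of p * log2(1/p) over the observed symbols
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(empirical_entropy("0" * 25))   # 0.0 -- a single repeated state distinguishes nothing
print(empirical_entropy("1" * 36))   # 0.0 -- same story
print(empirical_entropy("0110"))     # 1.0 -- two equally likely states: one bit per symbol
```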

edit: Hey dumdums... I get this is space and not philosophy of computation, but it's not a hard concept to grasp.

If you've got a single light switch, I can represent it as on or off. But by itself, it cannot represent information. On and off are not information; they're data.

It's only a combination of ons and offs that qualifies. And the moment you introduce something like iteration, flipping the light switch on and off over time?

You introduce a new register, a new bit. It's no longer 1D data; now it's 1D data over time. That's two-dimensional: one bit of data, one bit of iteration. Two bits, minimum, for passing information.
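
A minimal sketch of that switch-over-time picture, which is roughly how a one-wire serial line works (the `transmit`/`receive` names are illustrative):

```python
from typing import Iterable, Iterator

def transmit(message: str) -> Iterator[int]:
    """One 'switch': emit the message one bit per time step."""
    for ch in message:
        for i in range(7, -1, -1):           # 8 bits per character, MSB first
            yield (ord(ch) >> i) & 1

def receive(bits: Iterable[int]) -> str:
    """Reassemble characters from the timed stream of single-bit states."""
    bits = list(bits)
    chars = []
    for k in range(0, len(bits), 8):
        byte = 0
        for b in bits[k:k + 8]:
            byte = (byte << 1) | b
        chars.append(chr(byte))
    return "".join(chars)

print(receive(transmit("hi")))   # "hi" -- one switch, meaning carried by change over time
```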

−5

rdwulfe t1_j64l9yl wrote

A bit, by definition, is the possibility of a 0 or a 1.

Now, the problem is you're sending a 0 or 1 with no context.
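
A quick sketch of what context buys you (the `codebook` mapping here is made up for illustration): with a shared codebook, a single received bit resolves a yes/no question; without it, the receiver just holds a bare state.

```python
# Sender and receiver agree on a context (a codebook) ahead of time.
codebook = {0: "launch delayed", 1: "launch on schedule"}

received_bit = 1

# With shared context, one bit answers a yes/no question:
print(codebook[received_bit])   # "launch on schedule"

# Without the codebook, it's just a 0 or 1 with no context:
print(received_bit)             # 1
```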

6

a4mula t1_j64lxmx wrote

A bit is a storage space for a representation. By the nature of dimensions, a single dimension, be it of physical space or of data, can never represent change. It's an isolated spot. In order to represent change (information), you have to have a second dimension: an x and a y.

Two bits is the minimum state for information.

−1

KamikazeArchon t1_j64ltj9 wrote

Information is not "defined as the change of states." That's simply not true.

Maybe that's what you understand it as, but it's not a standard scientific definition in any field.

2

a4mula t1_j64mklw wrote

Information theory? I don't know about you, but that's the field I'd go to first if I were looking.

−1

KamikazeArchon t1_j64ms9j wrote

That is not how it's defined in information theory.

2

a4mula t1_j64n039 wrote

>In information theory, the information content, self-information, surprisal, or Shannon information is a basic quantity derived from the probability of a particular event occurring from a random variable.

That's the definition. Want me to show you where change is a required part of it?
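
For reference, that quantity is conventionally written I(x) = -log2 p(x); a minimal sketch of computing it (the helper name is illustrative):

```python
import math

def self_information(p: float) -> float:
    """Shannon information content, in bits, of an event with probability p."""
    return math.log2(1 / p)

print(self_information(0.5))    # 1.0 -- a fair coin flip resolves one bit
print(self_information(1.0))    # 0.0 -- a certain event carries no surprise
print(self_information(0.25))   # 2.0 -- rarer events carry more bits
```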

−1

KamikazeArchon t1_j64n8jo wrote

Change is not a requirement of that. Are you under the impression that "event" means "change"?

1

a4mula t1_j64noee wrote

>Derive - base a concept on a logical extension or modification of (another concept).

In order to derive, you must alter the original data. It must change. That is the fundamental aspect of information.

It can change. If it cannot change, it's just data, not information.

1