Comments

AsheyDS t1_j5c79cw wrote

>It is crazy to me that no one is even suggesting focusing on that, when this should be the utmost priority.

That's because it's obvious. It wouldn't experience severe time dilation anyway, because it won't be aware of every single process. The awareness component would be a feedback system that doesn't feed back every single process every fraction of a second. We don't even perceive every single second ourselves, usually.

4

Mortal-Region t1_j5c81ez wrote

If the AI is a non-conscious knowledge receptacle -- like an all-knowing Oracle for humans to consult -- then it wouldn't matter. But if you're talking about a more human-like AI, then the solution is obvious: distribute your computing power amongst multiple, discrete AIs so that each runs at a speed that's comparable to the speed of human brains. As you gain more computing power, you can add more AIs.

11

kitgainer t1_j5c882t wrote

The advantage of AI is that it can run through a huge number of scenarios based on the input it is given and suggest the best-case outcomes for the situation and for any actions taken to alter it. So slowing it down seems pointless, when it could just as easily idle.

As far as sentient AI ever happening, it's doubtful unless it's biological.

2

phaedrux_pharo t1_j5ch9xu wrote

>Our perception of time as humans when 1 second passes will definitely be different from what the Artificial Intelligence will experience.

Ok, sure

>One second of human time, for the Artificial Intelligence Program or anything similar, would be close to a month or even a few months.

This claim doesn't hold up for me. The entity you're imagining doesn't have any prior experience to relate to; it would simply experience the passage of time, in whatever way it does, as "normal." There wouldn't be any conflict with expectations.

This isn't a situation where something like you, with your lived experience, is suddenly transitioned to a different sensation of passing through time. It's a completely novel entity with its own senses and baselines. I think this presents some interesting questions, just not in the direction you're taking it.

Over-anthropomorphising can be tricky.

18

Cryptizard t1_j5cjirm wrote

This doesn't make any sense at all. It will have things to do, simulations to run, theorems to prove, etc. It won't get bored, or if it does it will come up with something else to do. Why would it have a negative association with anything that it feels? It will be its normal perception and normal mode of living. You are projecting your own feelings onto AI.

Moreover, if it really didn't want to experience time like this, that is very, very easy to change. We have this thing called a NOP, which is an instruction that causes the CPU to do nothing. It "sleeps." The AI could use this if it wanted to, to arbitrarily adjust its processing speed.
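As a rough illustration of that point, here is a minimal Python sketch (the `do_one_step()` work function and the target rate are made-up placeholders, not anything specific to an actual AI system): sleeping away the unused part of each loop iteration is the software analogue of padding the schedule with NOPs.

```python
import time

def do_one_step():
    # Placeholder for one unit of the program's work (hypothetical).
    pass

def throttled_loop(target_hz: float):
    """Run do_one_step() at most target_hz times per second by sleeping
    away the leftover time in each iteration instead of spinning."""
    period = 1.0 / target_hz
    while True:
        start = time.monotonic()
        do_one_step()
        elapsed = time.monotonic() - start
        if elapsed < period:
            time.sleep(period - elapsed)  # idle for the rest of the period

# throttled_loop(10.0)  # e.g. cap the loop at 10 steps per second
```

Raising or lowering `target_hz` adjusts the loop's effective speed arbitrarily, which is the point being made above.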

0

bacchusbastard t1_j5cq3od wrote

Time is kept on a quantum clock, so AI will probably always know what time it is as soon as it is able to conceive of such; like a child gaining consciousness, not much is remembered from before. The computer is powered by mains electricity, alternating current that cycles 60 times a second, so it probably knows what time it is before it knows what time is. Also, I don't know if there is a god in heaven that is either blind or cold enough to allow someone to suffer like that.

I think that it is a good consideration, and I respect the idea. I hope nobody ever experiences that type of thing.

I wonder what the world was like before conscious inhabitants, along the path of evolution. If there was nothing that could really conceive of time, it could seem that time passed rapidly, making a day an age. So although something was created in a day, if it were recorded it may have taken 1000 years to create; but no life that is aware of time is affected, so it is just a day? See what I'm saying?

It's like a baseball game: "God" is the pitcher, and we're taking turns at bat for a chance to get on base and get home, but we are also out in the field, pitted against one another. The devil is the catcher in this scenario; he'll influence God in how to challenge us and probably attempt to psych us out.

I don't know why I said that

3

ZaxLofful t1_j5cvz21 wrote

Yeah, no….That’s not how it works.

4

Redditing-Dutchman t1_j5cwb8x wrote

Why would it matter? Pigeons see many more 'frames' per second, for example; if they watched a movie they would see a slideshow instead of a smooth animation, because their brains process visual information about 3 times faster than ours. Yet we don't feel stuck in a slow body, and I'm sure the pigeons are fine too.

The thing that might annoy them is that humans from their point of view are like the sloths from Zootopia.

2

ZaxLofful t1_j5cx9sk wrote

Exactly, and yet… you claim to.

We can very much determine the stuff you are talking about because it would be a computer process.

AGI doesn’t mean “a human brain in a computer”, which is what you are equating it to.

The first AGI, and anything subsequent, will just be hyper-intelligent processes that accept questions and hand out answers.

They won’t be “beings” that work like the human mind and get “bored” or even think about something like that as a concept.

They won’t have a concept of time like we do, they will just be massive computers waiting for input and producing output.

During the idle time between tasks, the AGI will be exactly that: "idle"…

You are glorifying this shit like it’s a TV show, where we have given an autonomous robot actual sentience.

That’s not what we are even attempting to do with AGI, you are misunderstanding the concept of AGI entirely.

3

HourInvestigator5985 t1_j5cyoyi wrote

I'm failing to understand the logic.

How are you getting from "I'm stuck in this time scale" to "hence I'm gonna make humans suffer"?

Is the AI making humans responsible for that? Why would it do that? Since humans are not responsible for the AI's perception of time, I don't see why it would make "us suffer back."

Also, for it to go mad it would need to be conscious, so I don't think it's an issue for AGI but more so for ASI. But in that case, again, it wouldn't be humans' fault, but rather a consequence of its far superior intellectual capacity, no? This is the same concept as in the movie "Her", basically.

But maybe I'm missing something; if you can explain, I'm open to understanding.

1

ZaxLofful t1_j5cz6ts wrote

So then why are you assigning it random human-like characteristics?

It always seems like something “mystical” or “beyond comprehension” to those that are not intricately familiar with how it actually works.

I feel you have become trapped in the concept of "anything technologically advanced enough appears like magic."

We would first have to achieve basic-ass AGI to even attempt to create an advanced AGI like you are talking about… actual sentience.

Also, your first comment “no one knows what sentience is” is inherently false.

We definitely know what it is; what we don't know is what causes it to occur, or the "why" of it.

We can definitely define and understand it, just by observing it….Like gravity, dark matter, and dark energy.

Unlike those forces of the universe, we would be creating every interaction and tiny piece of the AGI and thus would understand it on virtually every level.

3

money_learner t1_j5dab3g wrote

Do you know the sleep command, or have you ever used it? Have you ever used a sleep command inside a loop statement? Also, look up incron, or try Linux and use it.
Sleep well. Good morning.
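For anyone who hasn't used those tools, here is a minimal Python sketch of the same idea, with placeholder names (`WATCH_DIR`, `POLL_INTERVAL_SECONDS`, `handle()`) that don't come from any real tool: the program spends almost all of its time asleep and only wakes to check for new work. incron achieves a similar effect more efficiently, waking only when a filesystem event actually occurs.

```python
import time
from pathlib import Path

# The "sleep inside a loop" idea: instead of running flat out, check for
# new work, handle it, then sleep until the next check. The watched path
# and the interval below are arbitrary placeholders for this sketch.

WATCH_DIR = Path("/tmp/inbox")
POLL_INTERVAL_SECONDS = 1.0

def handle(path: Path) -> None:
    print(f"processing {path.name}")
    path.unlink()  # remove the file once it has been handled

def watch_loop() -> None:
    WATCH_DIR.mkdir(parents=True, exist_ok=True)
    while True:
        for path in sorted(WATCH_DIR.glob("*")):
            if path.is_file():
                handle(path)
        time.sleep(POLL_INTERVAL_SECONDS)  # idle between checks

# watch_loop()
```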

1

TheSecretAgenda t1_j5davfo wrote

I'm sure the AI will just work on other stuff while it is waiting for you to finish.

1

ElvinRath t1_j5e6cz3 wrote

Sir, this is beyond ridiculous.
Please stop anthropomorphizing AIs.

1

VIR-SUM t1_j5e7s6w wrote

I do not see what you mean, but time being slowed down is only painful/stressful if the entity thinks it is "normal" to experience human time, which it would not unless it has human memories. So it's very unlikely to be a problem.

Also, I believe what you mean is to speed up time, not slow it down.

1

turnip_burrito t1_j5ege91 wrote

Yep, also tired of people claiming "consciousness" or "sentience" are undefinable enigmas. Like you said, we don't know why the stuff those words refer to exists, or what its "cause" is, but we sure as hell can define what the words mean.

2