Submitted by Ivan_The_8th t3_yvqqft in singularity
[removed]
As long as it's lossless compression, why not? Functionally it can be. There's no need to process two events that do not influence each other in any way at the same time.
You are possibly describing the idea of the cosmological horizon? It is possible (some believe very likely) that our universe is infinitely large, but since distant regions are receding from us faster than light, we will never be able to see or interact with anything past a certain distance. This leaves our visible universe finite.
If some alien was living in a non-expanding infinite universe, it could be possible to simulate our “finite” portion of the universe.
>A cosmological horizon is a measure of the distance from which one could possibly retrieve information. This observable constraint is due to various properties of general relativity, the expanding universe, and the physics of Big Bang cosmology. Cosmological horizons set the size and scale of the observable universe. This article explains a number of these horizons.
>There's no need to process two events that do not influence each other in any way at the same time.
No two such events exist. The factors needed to perfectly simulate anything are everything.
All the results of an event can be stored in compressed form until they are needed to determine how another event plays out.
[deleted]
I said that in the post; however, there is a possibility that the compressed data would still be small enough, and multiple compression algorithms can be used to increase this possibility.
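A minimal sketch of that idea in Python, assuming the stdlib `zlib`, `bz2`, and `lzma` stand in for the "multiple compression algorithms" (all names here are illustrative, not from the thread):

```python
import zlib, bz2, lzma

def best_compress(data: bytes) -> tuple[str, bytes]:
    """Try several lossless codecs and keep whichever output is smallest."""
    candidates = {
        "zlib": zlib.compress(data, 9),
        "bz2": bz2.compress(data, compresslevel=9),
        "lzma": lzma.compress(data),
    }
    name = min(candidates, key=lambda k: len(candidates[k]))
    return name, candidates[name]

def decompress(name: str, blob: bytes) -> bytes:
    """Invert best_compress: route the blob back through the codec used."""
    return {"zlib": zlib.decompress,
            "bz2": bz2.decompress,
            "lzma": lzma.decompress}[name](blob)

state = b"event state " * 1000           # highly redundant toy event data
codec, blob = best_compress(state)
assert decompress(codec, blob) == state  # lossless round trip
```

Because every codec is lossless, trying several and keeping the smallest only changes how much space the dormant event takes, never what it decompresses back to.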
1:1 is possible, but only if you're willing to look the other way about all the decimals that are going to need to be cut off and hidden away to make it possible. You might be able to do a 1:1 model of specific scales of the universe, but being able to zoom from a gluon on Earth to an overhead view of the entire Big Bang and back down to a Higgs boson at the core of Andromeda, and everything in between, without absolutely massive error margins, is likely not gonna happen.
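The decimal cut-off is easy to see with ordinary 64-bit floats; a toy example (standard IEEE 754 behavior, not anything from the comment itself):

```python
# Mixing vastly different scales in one coordinate system loses precision.
planck_length = 1.6e-35   # metres
galaxy_width = 9.5e20     # metres (~100,000 light years)

# A 64-bit float carries only ~15-16 significant decimal digits,
# so adding a Planck-scale offset to a galactic coordinate
# is silently rounded away:
x = galaxy_width + planck_length
print(x == galaxy_width)  # True: the small decimals were "cut off"
```

The two quantities differ by some 55 orders of magnitude, far beyond what a double can represent, so any single fixed-precision coordinate system spanning both scales necessarily hides these errors.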
Yes, the worst part about this is that we might not know that it's not 1:1.
What's worse: every single modellable system in the world as we know it is modeled under the presumption that our assumptions about acceptable error in the model are correct.
Not an expert (just a humble physics bachelor's) but the problem isn't whether you can "fit" one into the other: the open interval (0,1) is in one-to-one correspondence with the whole real line, for example. The problem is that no infinite system can be simulated by us mortal finite beings.
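One explicit such correspondence (a standard textbook example, not from the comment itself) is

```latex
f : (0,1) \to \mathbb{R}, \qquad f(x) = \tan\!\left(\pi\left(x - \tfrac{1}{2}\right)\right),
```

which is a continuous bijection: it sends $x \to 0^{+}$ to $-\infty$ and $x \to 1^{-}$ to $+\infty$, hitting every real value exactly once in between.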
For a slightly more boring take: the Church-Turing-Deutsch principle states that if you assume every physical process can be completely described by quantum mechanics, then every finitely realizable physical system can be simulated by a universal quantum computer. So if you don't mind a reasonable finite approximation to whatever physical system you're interested in simulating, and you have the means to build a machine that can simulate it, there you have it.
Something else I heard in passing: apparently a string theorist found what look like error-correcting codes in his work (https://www.space.com/32543-universe-a-simulation-asimov-debate.html). Now I don't know what that means, cuz I know 0 string theory. But maybe it shows our universe itself is simulated (wherever the damn thing is running), who knows lol
[deleted]
Instead of a 1:1 simulation of the universe, how about a 1:1 simulation of our galaxy? If not that, then a 1:1 simulation of a region about 1-100 light years across?
Problem is that our galaxy isn't completely cut off from other galaxies. All outside influences need to somehow be accounted for.
Not without knowing everything it consists of, which we still don't. You'd need all the correct initial seed values that were present at the beginning of its timeline.
You would need to know every random event in our universe and replicate that in your simulation. Hmm.
Your level-of-detail algorithms would produce observable artifacts unless you have some super AI managing them that's aware of the meaning of things like electronics and sensors, so it can simulate the few cubic centimeters of rock that make up the processors in a space probe but won't bother simulating the millions of cubic meters of rock in the asteroid it's flying past. It also needs to know that this chunk of rock is part of a gravitational-wave detector, so you have to sync it up with the simulated supernova that happened 30 years ago, 30 light years away... but that this other chunk is just part of the Earth and can be treated as a uniform mass of undifferentiated basalt.
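A hypothetical sketch of that kind of semantics-aware level-of-detail selection (every name, tag, and rule below is illustrative, not a real algorithm from the comment):

```python
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    volume_m3: float
    tags: frozenset  # semantic labels the managing AI has attached

def detail_level(region: Region) -> str:
    """Choose fidelity by meaning, not by size: a tiny processor or
    detector must be simulated exactly, while inert bulk matter can be
    coarsened no matter how large it is."""
    if {"processor", "sensor", "detector"} & region.tags:
        return "full"
    return "coarse"

probe_cpu = Region("space probe processors", 1e-5, frozenset({"processor"}))
asteroid = Region("asteroid interior", 1e7, frozenset({"rock"}))

print(detail_level(probe_cpu))  # full   (a few cubic centimetres, exact)
print(detail_level(asteroid))   # coarse (millions of cubic metres, uniform)
```

The point of the sketch is that the decision rule keys on semantic tags rather than volume, which is exactly why it would need something AI-like to assign those tags correctly everywhere.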
I don't think it's possible to hide the error artifacts from even our level of technology.
digitalthiccness t1_iwfqsxb wrote
>But what if we compress everything but the stuff needed for processing a certain event happening, then do the same for the next event, etc.?
Then it's not a 1:1 simulation of our universe.