Submitted by Gortanian2 t3_123zgc1 in singularity
CertainMiddle2382 t1_jdz4ra4 wrote
Recursive self-improvement is trivially exponential (though with no « singularity », i.e. no actual limit being reached).
Singularity comes from our need to extrapolate the past to anticipate the future.
And if the future comes too fast, the hypothesis is that we won’t be able to do that anymore.
Those exponentials often appear in recursive things, like population biology.
« Singularities » don’t really happen because, at some point, things outside the exponential mechanism start to « push » against it. It could be food, it could be space, it could be the speed of light…
The fight between an exponential process and its environment leads to the ominous « logistic curve », better known as the S curve.
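The exponential-vs-environment dynamic above can be sketched numerically. Here is a minimal toy model comparing unchecked exponential growth with logistic growth; the growth rate `r`, carrying capacity `K`, and starting value are illustrative assumptions, not figures from the comment:

```python
def simulate(r=0.5, K=1000.0, x0=1.0, steps=40):
    """Step two models forward in discrete time:
    - pure exponential growth
    - logistic growth, i.e. the same growth damped by a (1 - x/K)
      crowding term that represents the environment pushing back."""
    exp_x, log_x = x0, x0
    exp_path, log_path = [exp_x], [log_x]
    for _ in range(steps):
        exp_x += r * exp_x                    # unchecked exponential
        log_x += r * log_x * (1 - log_x / K)  # environment pushes back
        exp_path.append(exp_x)
        log_path.append(log_x)
    return exp_path, log_path

exp_path, log_path = simulate()
```

Both trajectories look identical at first (while `x` is tiny, `1 - x/K ≈ 1`), but the logistic one flattens out near `K` while the exponential keeps climbing, which is exactly the S-curve shape.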
Maybe something, somewhere will push against AI, limiting it as all other exponential stuff is limited.
For various reasons, I don’t think it will happen.
IMO, the Singularity is inevitable and will expand through the whole universe.