Submitted by ryusan8989 t3_yzgwz5 in singularity
was_der_Fall_ist t1_ix1ebvz wrote
Reply to comment by SoylentRox in 2023 predictions by ryusan8989
I agree that the past year has been “suspicious” and suggests that we may see even faster rates of progress in the coming years. If the singularity hypothesis, as you put it, is correct, 2023 should include even more profound advancements than we’ve seen so far. If it doesn’t, then we’ve got something wrong in our thinking.
SoylentRox t1_ix1f1g2 wrote
Agree mostly. One confounding variable is the coming recession may cut funding. I don't know how much gain we are getting from "singularity feedback". What this is as AGI gets closer, AGI subcomponents become advanced enough to speed up the progress towards AGI itself. As concrete examples, autoML is one and the transformer is another and mass ai accelerator compute boards is a third. Each of those is a component that a true AGI will have a more advanced version inside itself, and each speeds up progress.
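To make "singularity feedback" concrete, here's a toy model where each year's research output is amplified by the AI tools that already exist. Every number is made up purely for illustration, not a forecast:

```python
# Toy model of "singularity feedback": progress each year is amplified by a
# gain factor that grows with current AI capability (AutoML, better
# accelerators, etc.). All values are arbitrary placeholders, not forecasts.

capability = 1.0         # abstract capability index, normalized to 1.0 today
base_progress = 0.2      # yearly progress with no AI assistance
feedback_strength = 0.5  # how strongly existing AI tools amplify research

for year in range(2023, 2031):
    gain = 1 + feedback_strength * capability  # tools improve as capability rises
    capability += base_progress * gain         # this year's progress, amplified
    print(year, round(capability, 2))
```

The point is just that the curve bends upward: the better the subcomponents get, the faster the next round of improvements arrives.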
The other form of singularity feedback is that as it becomes increasingly obvious that AGI is very near in calendar time, more money will be spent on it because of a higher probability of making an ROI. You might have heard that Hugging Face, a startup known for open-source work rivaling OpenAI's (it hosts Stable Diffusion, for example), reached a paper valuation of a billion dollars basically overnight.
This is similar to how, as humanity got closer to a working nuke, multiple teams in multiple countries were trying to build one.
Anyway, if the singularity gain is, say, 2x and funding gets cut to a quarter, then in 2023 we will see half the previous rate of progress.
That's just an example; if the gain is 10x, the funding cut will be effectively meaningless, since 10 × 1/4 still leaves 2.5x the old rate.
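Spelling out that arithmetic (illustrative numbers only): effective progress is roughly the feedback gain times the fraction of funding that survives the cut.

```python
# Back-of-the-envelope version of the gain-vs-funding-cut argument above.
# Effective progress ~ (singularity gain) x (fraction of funding remaining).

def effective_progress(gain: float, funding_fraction: float) -> float:
    return gain * funding_fraction

print(effective_progress(2, 0.25))   # 0.5 -> half the old rate of progress
print(effective_progress(10, 0.25))  # 2.5 -> still faster than before the cut
```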
And the gain technically scales toward infinity, though since the singularity is a physical process it won't actually get that high as the singularity happens; presumably AGIs advance themselves and their technology in lockstep until we hit the laws of physics.
That last phase would I guess be limited by energy input and the speed of computers.
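If you want a picture of that last phase, a logistic curve captures it: growth looks exponential at first and then flattens as it approaches whatever ceiling physics imposes. The ceiling and rate here are arbitrary placeholders, not estimates:

```python
# Sketch of growth that compounds until it hits a hard physical ceiling
# (energy input, speed of computers). Numbers are arbitrary placeholders.

ceiling = 1000.0  # stand-in for whatever the laws of physics allow
level = 1.0       # current technology level, normalized
rate = 1.0

for step in range(20):
    level += rate * level * (1 - level / ceiling)  # logistic growth step
    print(step, round(level, 1))  # roughly doubles early, then flattens near 1000
```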
was_der_Fall_ist t1_ixgc64a wrote
> The other form of singularity feedback is that as it becomes increasingly obvious that AGI is very near in calendar time, more money will be spent on it because of a higher probability of making an ROI.
In my thinking, if we are as close to transformative AI as recent trends suggest, the inevitable increase in funding should nullify any effect of an economic recession, so the stagnation of critical research would likely require a more catastrophic intervention than a downturn.
The people in charge of funding AI research (that is, the CEOs and other leadership of all relevant companies) are, almost universally, extremely interested in spending a lot of money on AI research, and they have the funds to do it even in a recession.
SoylentRox t1_ixgnzr5 wrote
In theory. In practice, Intel had layoffs in its AI accelerator teams, Amazon let go a lot of Amazon Robotics and Alexa workers, and Argo AI shut down.
While, yeah, purer AI plays like Hugging Face raised at a unicorn valuation.
So the outcomes seem mixed.