FiFoFree t1_j753sx3 wrote
>The owning class has never let us in on the profits of increased productivity, and for the first time in human history we will be all but completely excluded from the means of production. What happens next?
- Somebody decides to let us in, and we live.
- Nobody decides to let us in, and we die.
End of options.
There's no classical dystopian middle ground of people being left to live poor, miserable lives in a world where a ruling class has zero need for them and no scruples about being rid of them. Either you believe that somebody in that ruling class has a heart and decides to let everyone in on what promises to be resource richness beyond human comprehension, or you believe they're all stone-cold and inhuman, capable of ridding the world of anyone not like them without so much as a second thought.
Because with the tools they'll have at their disposal, it's one or the other.
FiFoFree t1_j49ux9u wrote
Reply to comment by Down_The_Rabbithole in Don't add "moral bloatware" to GPT-4. by SpinRed
Hell, the routers and switches in-between might be running Linux as well.
FiFoFree t1_iy3w98s wrote
Reply to comment by Neurogence in Why is VR and AR developing so slowly? by Neurogence
Rapid iteration/advancement of hardware isn't quite here yet, but we do have reason to believe it's possible (cf. additive manufacturing/3D printing). We can pump out software quickly because we've removed a lot of the bottlenecks for doing so (e.g. using IDEs with keyboards and mice rather than punch cards or manually toggling switches) and made the ability to code widely available (allowing for massively collaborative software projects).
Hardware has a pipeline, and that pipeline is pretty constricted at the moment in comparison to software, but that doesn't mean it will be that way forever.
FiFoFree t1_itjq8le wrote
Reply to comment by Smoke-away in Large Language Models Can Self-Improve by xutw21
Might even be two more papers down the line.
FiFoFree t1_it2w094 wrote
Reply to comment by visarga in Talked to people minimizing/negating potential AI impact in their field? eg: artists, coders... by kmtrp
In theory all you need for an abundance of any element is other elements as stock, a particle accelerator, energy, and time.
In practice, we'd need the price and size of particle accelerators, the price of energy, and the time required to all drop dramatically before it would make a difference.
Then again... "Anything that is theoretically possible will be achieved in practice, no matter what the technical difficulties are, if it is desired greatly enough." -- Arthur C. Clarke.
FiFoFree t1_irjdq4j wrote
Reply to comment by FourthmasWish in Singularity, Protests and Authoritarianism by Lawjarp2
I agree on most of these points. It's like we're headed towards a fork in the road:
On the one hand, if AGI is expensive, then that empowers centralized bodies like governments and corporations. On the other, if AGI is inexpensive, then that empowers decentralized bodies, such as individuals and communities.
Plus, there's the question of agency and the diminishing returns of intelligence. If you have all the intelligence in the world but have limited ability to interact with the world, you only have so much agency. Nanotech enters the discussion here, but it's in such an early stage of development that we really have no idea what will be possible over the next decade or two, just like people in 2000-2010 had no idea what was coming in the 2020s for AI.
FiFoFree t1_j87q1pj wrote
Reply to comment by socialkaosx in Are you prepping just in case? by AvgAIbot
Honest question:
What keeps you going?