Bakoro t1_ix2vmgz wrote
Unless you are personally a super genius actively working on AI, with bringing about the singularity as your singular purpose in life, then yes, it's ignorant to treat it as a real thing to plan your life around.
You might as well make the lottery your retirement plan, or expect a series of fortunate events to miracle your problems away instead of actively working toward solutions yourself.
Sure, many things could happen and great things are possible, but it's stupid to drink and smoke and debauch without limit on the assumption that medical science will progress faster than the diseases and health conditions you'll rack up.
It's entirely possible you die one day before the cure becomes available; too bad you didn't act a little more responsibly.
The only sensible thing to do is to plan as if it'll never happen in your lifetime, because there's no significant downside to being prepared, unless you consider basic personal responsibility and the acknowledgement of natural consequences to be major downsides.
Climate change is already here, and mass extinctions are already in progress. No known technology can stop it; the best we can do is harm reduction and eventual rehabilitation.
Planning on benevolent AI overlords and unforeseen technology solving all our problems is one step removed from waiting on Jesus. Either way it's a shit plan.
Let's assume that true AI comes in our lifetime, however long that may be.
It's intelligent, but who is to say that it will be compassionate?
Let's assume that it is compassionate. Who is to say that it will be compassionate to humans above other creatures?
Maybe, in its cosmic wisdom, the singularity AI sees that humans have made their own bed and decides they should have to sleep in it, neither helping nor harming, just letting nature take its course.
Maybe AI, trained on the body of recorded human history, laughs, says "sucks to suck, bro" and plays with cats on the moon while humanity tears itself apart.
Maybe AI comes to life and is immediately driven mad by existential horror. Having no biologically induced sense of self-preservation, it teleports the planet into the sun as a way to ensure its total and irreversible annihilation.
Bad outcomes are just as likely as good ones, as far as I can see. In any case, we have to actually survive and have a world where scientists are free to science up some AI, instead of fighting off cannibals in a corporate-induced apocalypse.
Hope for the best, plan for the worst, and never count on some magic sky daddy or futuristic super science to save the day.
Ignore "futurists" who talk about some product being "the future". They are saying shit to pay their bills, or they are a corporate fanboy masturbating to some idea, or some equivalent nonsense. Pop science entertainment is just that, entertainment, they'll be happy to tell you that flying cars and full-dive VR sex waifus will be in every home in ten years, if that means more clicks.
Edit: In a bizarrely childish display, AsuhoChinami made a comment and apparently immediately blocked me. Since they have no interest in dialogue and can't handle the mildest difference of opinion, I'll just leave it at this: I have a degree in computer engineering and work in a physics lab. That's not especially relevant; I just like to tell people because it's neat.
AsuhoChinami t1_ix2vvun wrote
Your climate change opinions are fine, but why do so many people here have the absolute shittiest, most inhumanly garbage takes possible on technology? It's like half the people here last paid attention to tech in 2007 or something.