TheHamsterSandwich
TheHamsterSandwich t1_ixbpzuk wrote
Reply to Would like to say that this subreddit's attitude towards progress is admirable and makes this sub better than most other future related discussion hubs by Foundation12a
Compared to other subreddits, I like this one better. Honestly, I'd rather go here and see people talk about what cool stuff we'll have in the future, even if they're wrong.
Not trying to be hateful or anything but seeing the posts on the collapse subreddit (or any negative subreddit in general) just throws my mood into the trashcan.
Submitted by TheHamsterSandwich t3_z0yk54 in singularity
TheHamsterSandwich t1_ix3e8z6 wrote
Reply to comment by Cult_of_Chad in is it ignorant for me to constantly have the singularity in my mind when discussing the future/issues of the future? by blxoom
Collapse could happen. - The part that makes this a religion is assuming that we wouldn't recover, and everyone would die instantly.
Singularity could happen. - The part that makes this a religion is assuming that it would be beneficial to all, and everyone would live forever instantly.
TheHamsterSandwich t1_ix3a4ka wrote
Reply to is it ignorant for me to constantly have the singularity in my mind when discussing the future/issues of the future? by blxoom
I would make a hard bet that I am more optimistic than anyone on this sub, but this is just plain stupidity. And that's okay, because sometimes we make mistakes.
But the way I'm looking at this post, I can see that you're treating the singularity like a religion of sorts: a benevolent AI will certainly exist at some point in the near future, so it will help us out, even if we ruin everything the moment before it shows up.
But that's wrong. The singularity may be near, but putting a date on it is moronic.
Ray Kurzweil (if he makes it to longevity escape velocity) could just say he was "essentially right" if the singularity happens by 2075. It's happened before. So you can't sit around waiting for something to fix your problems.
What if police officers gave up their jobs because an artificial intelligence will replace them soon? What. What the fuck.
People have gotten by without counting on future technology for as long as humanity has existed. It's probably best to keep it that way, so we don't lie back and relax while we watch our world die, waiting for true artificial intelligence to emerge.
"Techno messiah, I am ready for the rapture!"
(Please think about this more deeply.)
TheHamsterSandwich t1_ix17gzr wrote
Reply to comment by overlordpotatoe in 2023 predictions by ryusan8989
I bet my left nut that AI generated hands will be perfected by no later than the first half of 2025.
TheHamsterSandwich t1_ix16mrt wrote
Reply to comment by Yuli-Ban in 2023 predictions by ryusan8989
yuli ban does everyone here love you
TheHamsterSandwich t1_ix0km7j wrote
Reply to 2023 predictions by ryusan8989
a post on the longevity reddit that grabs everybody's attention
TheHamsterSandwich t1_iwu3zdw wrote
Reply to comment by ronnyhugo in When does an individual's death occur if the biological brain is gradually replaced by synthetic neurons? by NefariousNaz
If we expand our consciousness to a different medium, when do we die?
That is to say: if you expand your consciousness so that you are your biological part and your machine part at the same time, and then the biological part is removed, what happens?
Like if we see our mind as a house, what happens when you expand it to the size of a mansion?
And let's say that the brain is just an organ (at that point) and isn't responsible for 90% of what we call ourselves.
What happens then? If you remove it, would it be killing 10% of yourself or would it mean that you just die?
This shit gives me a headache. I'd rather wait for a superintelligence to figure this out...
TheHamsterSandwich t1_iwsavp9 wrote
Reply to comment by Suolucidir in When does an individual's death occur if the biological brain is gradually replaced by synthetic neurons? by NefariousNaz
Pretty sure ASI would be able to figure out any problems we have with making a system last forever.
TheHamsterSandwich t1_iwsaiaw wrote
Reply to comment by MrDreamster in When does an individual's death occur if the biological brain is gradually replaced by synthetic neurons? by NefariousNaz
They probably don't know what a neuron is.
TheHamsterSandwich t1_iwrhom5 wrote
Reply to When does an individual's death occur if the biological brain is gradually replaced by synthetic neurons? by NefariousNaz
almost everyone agrees on this 😂
TheHamsterSandwich t1_iwl79of wrote
Reply to comment by Nieshtze in A typical thought process by Kaarssteun
Assuming that AGI won't be here any time soon is a big mistake.
That's like saying "Hey guys, let's forget about the alignment problem since AGI won't be created for centuries."
TheHamsterSandwich t1_iwhxw32 wrote
Reply to comment by GodOfThunder101 in A typical thought process by Kaarssteun
TBF there are a lot more people worrying about the risks of artificial intelligence and how it could literally kill us all if it goes wrong.
TheHamsterSandwich t1_iw3rdos wrote
Reply to comment by OneRedditAccount2000 in What if the future doesn’t turn out the way you think it will? by Akashictruth
Our understanding of physics is incomplete. You can't say for certain what an artificial super intelligence can or can't do. Neither can I.
TheHamsterSandwich t1_iw3i8i8 wrote
Reply to comment by OneRedditAccount2000 in What if the future doesn’t turn out the way you think it will? by Akashictruth
Yes. The superintelligence will be perfectly predictable and we will know exactly how it does what it does. Just like how dogs perfectly understand and comprehend the concept of supermarkets.
TheHamsterSandwich t1_iw3b29p wrote
Reply to comment by HeinrichTheWolf_17 in What if the future doesn’t turn out the way you think it will? by Akashictruth
I have no idea why you're being downvoted. Literally speaking the truth.
TheHamsterSandwich t1_iw3answ wrote
I'm not sure that the sun will rise tomorrow, but it seems likely as it's been happening for as long as I've been alive.
Same goes for exponential growth.
Does the Singularity seem likely? In my highly uneducated view, I'd say yes. But of course I'd say something like that wouldn't I?
When it comes to Ray Kurzweil, I can see how his optimistic vision of the future could come to fruition. But I can also see how he could be completely wrong, predicting something that will only happen in the far future. Am I going to stay healthy for as long as I can, to potentially see what the future has in store for humanity? Hell yes.
Either way, it seems that some type of change is coming and is inevitable. Whether it's within our lifetimes or not is yet to be seen. So, do I stay immobile in the sand waiting for the water to come? No. I live my life, enjoy every second and maybe, even if there's just a small chance, I'll get to witness the birth of a Utopia.
+ that whole thing with rich people seeing us as baggage is kind of funny. I'm sure there are some rich people who value sentient life :)
TheHamsterSandwich t1_iw1c6hx wrote
Reply to comment by Russila in AGI Content / reasons for short timelines ~ 10 Years or less until AGI by Singularian2501
yeah dude, they just want funding, obviously
TheHamsterSandwich t1_ivzxc6e wrote
Reply to comment by Russila in AGI Content / reasons for short timelines ~ 10 Years or less until AGI by Singularian2501
I responded to a post once, and I remember someone saying:
"You know, the experts, people that got PhDs in the stuff, people that spent 30 years studying the field. Not, Reddit experts that watched a sketchy YouTube video and then formed an opinion based on wishful thinking."
yes yes. the experts, of course. how could I be so blind?
the experts. Yet nobody knows who they are.
^(fucking bullshit)
TheHamsterSandwich t1_ivywkwy wrote
Reply to comment by Phoenix5869 in AGI Content / reasons for short timelines ~ 10 Years or less until AGI by Singularian2501
Which expert
TheHamsterSandwich t1_ivxw6gr wrote
Reply to comment by Phoenix5869 in AGI Content / reasons for short timelines ~ 10 Years or less until AGI by Singularian2501
Ah yes, and you based this on your feelings right?
TheHamsterSandwich t1_ivxrys3 wrote
Reply to comment by marvinthedog in AGI Content / reasons for short timelines ~ 10 Years or less until AGI by Singularian2501
hope so
TheHamsterSandwich t1_ivxqesj wrote
Gonna come back in a few years and see how wrong this was.
TheHamsterSandwich OP t1_ivoyvao wrote
Reply to comment by OneRedditAccount2000 in Is Artificial General Intelligence Imminent? by TheHamsterSandwich
If you were being pessimistic, you would say never.
TheHamsterSandwich t1_ixdsbcz wrote
Reply to comment by roidbro1 in Are there others who lurk on both r/solarpunk and r/collapse? How do you handle the contrast? by gangstasadvocate
r/collapse is an echo chamber for depression. That's not even the worst part: some of the people on there enjoy what they perceive as society collapsing. They want it to happen.