Comments

GayHitIer t1_j7k2hu0 wrote

And it's going faster and faster.

Just hope this sub doesn't turn into r/Futurology.

Let's remember what this sub originated from.

92

JackFisherBooks t1_j7k3q9y wrote

I think ChatGPT has definitely raised the profile of this sub. And as more and more AI products become available, there will be greater interest in the potential/dangers. I won't be too surprised if this sub has around 400k by the end of the year.

32

H4X00R- t1_j7k7ybv wrote

Just a question... do you have 0.5 KB/s?

3

dasnihil t1_j7kfjqo wrote

Yay, more depressed people looking for a coping mechanism that doesn't sound too absurd.

8

Evil_Patriarch t1_j7kfy3b wrote

Oh boy soon it will be another front page sub where 90% of the posts are political bullshit that isn't even related to the singularity!

46

Redditing-Dutchman t1_j7kjcsm wrote

Why are people so focused on subscriber counts anyway? I've read here for a long time, but I never use the subscription function of Reddit...

5

GayHitIer t1_j7kjs5b wrote

You may be right, but let us try our best to keep this sub from becoming doomer posting 24/7 :)

Life is already as depressing as it is, let us keep some hope for our future, even as bleak as it sometimes seems.

26

wildgurularry t1_j7kkgzt wrote

I don't really know how people survive without subscriptions on Reddit. It allows you to curate your feed, so you aren't inundated with the stupidity of the default subreddit list. Once you have set up your subscriptions (and most importantly your unsubscriptions), Reddit is a completely different experience.

11

crua9 t1_j7kl5dj wrote

How many of them are bots, dead accounts, or people who forgot they subbed here?

2

Snipgan t1_j7kn3wr wrote

Hopefully there isn't a mod takeover, and this sub doesn't become diluted and more politically driven like r/Futurology as it gains popularity.

Can we get some reassurance from the mods that they plan not to let this happen?

I actually like the more nuanced, if not always perfect, discussions I read and have on here.

13

LoasNo111 t1_j7kn8kf wrote

This isn't good. It's gonna ruin the sub.

Now the annoying people will bring their doom and gloom like with futurology.

Or they'll shove posts in that aren't even related to the future. Mostly to do with politics.

18

BlessedBobo t1_j7knl6b wrote

I for one hope this sub turns into r/science: fewer schizos posting, fewer low-effort screenshots of some random guy on Twitter, fewer people posting their ChatGPT results, and less shitty self-promotion of crappy websites.

For the first time in history, the singularity is no longer some science-fiction fantasy for outcasts and weirdos to cope with, and I hope that is reflected in the standards of the sub. This sub would be brilliant if we only allowed high-effort posts, discussions, and papers without the hype and sensationalized headlines, and banned all the schizos posting their hallucinations.

17

GayHitIer t1_j7ko9oy wrote

I agree we should exclude the schizo posting and the false hype that only turns into disappointment.

I'm realistic about the singularity as well.

We need to hold this sub to a new standard.

Maybe add a rule against low-effort posts without any scientific backing?

5

Yanutag t1_j7kq5zy wrote

Clearly we'll have 84 quadrillion subscribers by 2030!!!

3

MootFile t1_j7kqlbc wrote

With growing momentum, it should be put to good use.

Technological trends are going to be irreversible, so it's important to have a conversation about where we want to head as a society, in the most organized, rational manner.

Not just in this subreddit either. All the other related ones should get together in a sort of techno-fixer unity, promoting science and technical solutions in our ever-evolving society.

6

Accursed-Seer t1_j7kvrpw wrote

and then, the singularity; or at least we can hope for it sooner with more eyes watching and more minds working

2

drekmonger t1_j7lf6f0 wrote

It was originally postulated as a doomsday scenario. It's certainly an event that would mark the end of civilization as we know it, aka, a doomsday.

https://edoras.sdsu.edu/~vinge/misc/singularity.html

The abstract reads:

> Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.

>Is such progress avoidable? If not to be avoided, can events be guided so that we may survive? These questions are investigated. Some possible answers (and some further dangers) are presented.

(Interestingly, the essay was written in 1983. Vernor Vinge was off in his prediction by at least 10 years, probably 20.)

−3

EddgeLord666 t1_j7lfihu wrote

I guess the “end of human civilization” doesn’t really matter to me as long as my consciousness still exists in some form. Since I already think of myself as a prospective posthuman, I don’t really perceive any more loss in that scenario than the “loss” involved in going from a child to an adult.

2

drekmonger t1_j7lgpnf wrote

I imagine the notion of self will be eliminated. In the bad outcome, the robot overlords have no use for us. In the better outcome, your circumstances will be so grossly changed that whatever there is of "you" that's left over will be unrecognizable as such. I don't imagine a true continuity as plausible.

In the more neutral outcome, we become pets in a zoo, not ascended transhumanistic beings.

1

EddgeLord666 t1_j7lh7nj wrote

Well unlike most people on this sub, I think transhumanism should be prioritized over the creation of AGI. I’m more interested in AI serving us as tools or augmenting our capabilities than ruling over us. Furthermore, you absolutely could have continuity of consciousness as long as augmentation happened in a ship of Theseus way, say by gradually boosting your IQ by 20 points every year instead of all at once.

1

drekmonger t1_j7lia04 wrote

The Singularity, as it was originally imagined, included potential scenarios for transhumanism over a technological singularity. The original essay is still well worth the read, even 30 years later.

But the doomsday scenario the essay was ultimately warning against was that the Singularity would occur rapidly as a shocking cascade of events.

Perhaps in the "pet human" scenario, a benevolent ASI might slowly augment people as individuals.

Regardless, the problem is one of alignment, and I don't think you or I have much say in that. Even if a relatively benevolent organization like OpenAI develops the first AGI, their competitors (like, say, China's AI research efforts) won't be so benevolent.

As in capitalism, the most unethical strategy will tend to dominate ethical strategies. The "bad" AIs will win any race.

3

EddgeLord666 t1_j7livt6 wrote

So far we are not at the stage where the Singularity is likely to be imminent, contrary to what some people here say. That means we probably have anywhere from 1 to 3 decades for the “good” people to coordinate and plan ways for it to happen in a more beneficial way or stop it from happening at all if that is deemed more desirable. That is really what people should be using this sub for, not just idle speculation.

2

Shamwowz21 t1_j7lmmpo wrote

Yes, and the rate at which it grows is accelerating. How appropriate!

2

ccnmncc t1_j7ltcj8 wrote

It was authored in 1993.

He noted that he’d “be surprised if this event occurs before 2005 or after 2030.” So unless you’re accusing Vinge of “relative-time ambiguity” maybe you can cut him some slack?

4

drekmonger t1_j7luzv7 wrote

>It was authored in 1993.

ChatGPT did me dirty. Prior to that comment I asked it to remind me who wrote the essay and when. It said 1983, and then I failed to look at the date on the essay itself.

Good catch.

3

_SputnicK_ t1_j7m28yh wrote

As someone who has been here since 2016, I consider myself a "legacy user." I think this sub is focused too much on commercial AI and not enough on the theory of artificial general intelligence, exponential growth, and intelligence explosion. December was nothing but ChatGPT screenshots. The userbase has shifted from readers of Kurzweil and Bostrom to those who are only here because of the endless AI news coverage. Many of the comments are to the effect of "wow! AGI tomorrow," as this place functions like an endless hype train. I suppose to some extent this was inevitable, but it's still a regression in my view.

People in this sub assume that we can go from word prediction engines (ChatGPT) to artificial general intelligence while dismissing the numerous breakthroughs needed to reach that milestone. No one here understands the theory behind how AI actually works, so it's more based on sentiment than fact. I need to find a small community of people who actually enjoy reading AI papers.

Edit: This sub has done well by remaining apolitical and largely focused on topic, but I really fear that this sub could devolve into a kind of hype machine echo chamber, and I fear that we're already there. Take someone like u/ideasware who understood the development of AI as tragic and very possibly apocalyptic. No one here seems to want to get into the finer details of how things could go very wrong.

3

LoasNo111 t1_j7mf4lt wrote

Same. Holy shit does that subreddit drain me. I'm just happy I stumbled onto this sub instead. I can still feel somewhat optimistic about the future instead of wondering whether I should kill myself right now.

They go out of their way to make the future a dystopia, even if the situation they propose can never happen they'll still insist it will.

7

AsuhoChinami t1_j7mnwdm wrote

Most definitely. The self-proclaimed realists fucked up this sub months ago.

Techno-optimists: Make informed predictions about the future based on the present, the recent past, and the trajectory of change over time.

Self-proclaimed realists/skeptics/cynics/whatever: Endlessly call the other side stupid while making no actual credible arguments, all while acting like a victimized minority despite comprising at least half the sub.

Boy, I sure do wish I was a super smart techno-skeptic. They're the smartest people in the world and can never be wrong about anything.

9

AsuhoChinami t1_j7mpgt6 wrote

That is nothing to celebrate. I genuinely, from the bottom of my heart, hate the kind of people who join in whenever futurism boards have sudden explosions in membership (see: r/Futurology). I hate their argumentation style, I hate everything they stand for, I hate everything they believe in or don't believe in, I hate every last god damned thing about them, and I love the fact that the world will change faster and more profoundly than those stupid motherfuckers believe.

3

imlaggingsobad t1_j7mu8dw wrote

The quality of this sub will rapidly deteriorate over the coming months/years. It was fun while it lasted! But when we have actual powerful AI systems, I probably won't spend time on here anyway lol

3

FpRhGf t1_j7o1ncw wrote

Not the same guy, but I only use subs to pin communities I frequent for quick access. I've never used the feed nor the default subreddit list. I prefer to just check the subs directly and browse everything from there.

I haven't subbed to r/singularity or r/Futurology since I specifically made shortcuts for those two on my homepage and didn't need to pin them.

2

SurroundSwimming3494 t1_j7ohzgh wrote

My girlfriend (though obviously not flawless - no one is, of course) is 100% perfect for me at all times, even when we're not seeing eye to eye.

I'm sure countless other people feel the same way about their partners.

5

Floater1157 t1_j7p7s3y wrote

I walked away for two seconds! Where did you get all of these people?!

1