Submitted by 4e_65_6f t3_zw1lgy in singularity
4e_65_6f t1_j1rw0lg wrote
Reply to comment by chimmercritter in I created an AI to replace Fox and CNN by redditguyjustinp
>his team trained an algorithm to find the news in common between sources from many different biases and report the commonalities
The problem many news sources have nowadays is that, in the effort to be impartial, they end up elevating opinions that shouldn't even be in consideration, making it seem like everything is a 50/50 debate when only one of the sides has actual arguments.
Take climate change, for instance: instead of debating measures to prevent it (because it's already consensus that it's real), they keep bringing on people who deny climate change. Even though maybe 1 in 1000 scientists will do that, on the news it's a 1-vs-1 debate, so to the audience it looks like the issue isn't settled yet.
Any algorithm that seeks to find commonalities between all news sources will end up treating points of view that aren't valid as if they were, because the news sources themselves are like that.
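To make the point concrete, here's a toy sketch I made up (hypothetical Python, nothing to do with the actual project): a naive "report the commonalities" filter only checks how many outlets carry a claim, not whether the claim holds up, so false-balance framing passes right through.

```python
# Toy sketch of a naive "report the commonalities" filter (hypothetical,
# not the actual project): keep any claim that enough outlets carry.
articles = {
    "OutletA": {"emissions are rising", "scientists still debate whether warming is real"},
    "OutletB": {"emissions are rising", "scientists still debate whether warming is real"},
    "OutletC": {"emissions are rising"},
}

def common_claims(articles, min_sources=2):
    counts = {}
    for claims in articles.values():
        for claim in claims:
            counts[claim] = counts.get(claim, 0) + 1
    return {claim for claim, n in counts.items() if n >= min_sources}

# The false-balance framing appears in enough outlets, so it gets reported
# right alongside the factual claim: the filter inherits the bias of its inputs.
print(common_claims(articles))
```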
4e_65_6f t1_j1rudlt wrote
Reply to Genuine question, why wouldn’t AI, posthumanism, post-singularity benefits etc. become something reserved for the elites? by mocha_sweetheart
My (wildly speculative and somewhat pessimistic) thoughts on how this will go:
-Elites will very likely have exclusive access to the best models and smartest AI at first.
-A certain company will achieve a complete monopoly on the labor market by creating some AGI model that can replace any worker.
-A massive push for economic change will start (with the sides being UBI vs. an AI ban).
-The company (now holding a complete monopoly on the labor market) will realize there's no profit to be made from a market in which no buyer has a source of income.
After that moment, there's no reason to reserve the benefits of AI for yourself. There's no cost in production and no profit in selling products, and a whole bunch of people are angry at you for taking away their jobs. So what reason would anyone have to deny people access to your automatic production in that situation?
4e_65_6f t1_j1rqng7 wrote
Reply to comment by chimmercritter in I created an AI to replace Fox and CNN by redditguyjustinp
>There's already a Max Tegmark project that is trying to do something along these lines
IDK who Max Tegmark is but I bet my left nut his personal biases are included in whatever method he's using to sort through the news. Even if it's AI.
4e_65_6f t1_j1rfuvn wrote
I appreciate the effort but I think it's impossible to have news without bias.
To me that sounds like saying "I'm gonna build an algorithm to find the objectively best color".
4e_65_6f t1_j1j0nat wrote
Reply to This is how chatGPT sees itself. by Kindly-Customer-1312
give me a representation of an apple in ascii art:
> /\
> / \
> ||
Nice apple bro.
4e_65_6f t1_j1f5enn wrote
Reply to comment by raylolSW in Am I the only one on this sub that believes AI actually will bring more jobs (especially in tech)? by raylolSW
True, but why do you think there will be more jobs before singularity-level tech?
I think there will be new jobs but that's not the same as more jobs.
4e_65_6f t1_j1f4lkh wrote
Reply to Am I the only one on this sub that believes AI actually will bring more jobs (especially in tech)? by raylolSW
It's possible that it creates new jobs up until the point where you're not required for literally anything else anymore.
But I couldn't say whether that new-jobs period is going to have more jobs than we have now; it might be fewer, IMO.
4e_65_6f t1_j0rf0hq wrote
Reply to comment by oddonebetween in How far off is an AI like ChatGPT that is capable of being fed pdf textbooks and it being able to learn it all instantly. by budweiser431
I found that it works better if you keep it short, like telling it to write just a function or a small part of the code rather than the whole thing. Also, explain in obnoxious detail what is supposed to happen, and it often gets it right.
It's also really good at improving code that's already written; I used it to make my code shorter and more efficient.
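For example (made-up code, just to show the kind of before/after I mean): you paste in a small, verbose function along with an obnoxiously detailed description of what it should do, and it hands back a shorter equivalent.

```python
from collections import Counter

# Hypothetical "before": a small, verbose function you might paste in,
# described to the model in obnoxious detail.
def count_words(text):
    counts = {}
    for word in text.split():
        word = word.lower()
        if word in counts:
            counts[word] += 1
        else:
            counts[word] = 1
    return counts

# The kind of shorter, equivalent version it tends to hand back.
def count_words_short(text):
    return dict(Counter(word.lower() for word in text.split()))

print(count_words("The cat sat on the mat"))
print(count_words_short("The cat sat on the mat"))
```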
4e_65_6f t1_j0qjafm wrote
Reply to comment by coumineol in How far off is an AI like ChatGPT that is capable of being fed pdf textbooks and it being able to learn it all instantly. by budweiser431
I didn't know there was a limit. It worked for my code, which is 120+ lines, so I figured it would also work for books.
4e_65_6f t1_j0qcxxv wrote
Reply to How far off is an AI like ChatGPT that is capable of being fed pdf textbooks and it being able to learn it all instantly. by budweiser431
You can paste the text there and it will answer based on the prompt. And no, it's not AGI (yet).
4e_65_6f t1_iye7ffu wrote
I suggested changing the community banner to "we told you so" or something like that.
4e_65_6f t1_ixasy0t wrote
Reply to comment by IndependenceRound453 in How much time until it happens? by CookiesDeathCookies
What fact would have to change in order for you to think the singularity is near? Like what else do you think is missing?
If you can't really point to something you'd consider an "indication that the singularity is near" that hasn't already happened, then it's not really skepticism but cynicism.
4e_65_6f t1_ixae7yz wrote
Reply to comment by IndependenceRound453 in How much time until it happens? by CookiesDeathCookies
>It depends on what your definition of the singularity is.
It's just the point when AI surpasses human intelligence in general. That is what people mean by the singularity: when there's no task you would be able to perform better than the computer.
After that point, AI starts driving and assisting research itself, and the timescales shift drastically. That's why people expect a "burst" of technology.
4e_65_6f t1_ixa9fwh wrote
Reply to comment by IndependenceRound453 in How much time until it happens? by CookiesDeathCookies
Yeah, if you had asked an artist back in 2016 when they thought AI could make art, they probably would've said never. There have been naysayers all along.
This sub is the only one that has been saying it's possible, and look at that: now a bunch of artists post here worrying about their jobs.
If you don't think it will happen you're either very pessimistic or haven't been paying attention. Every other week now there's crazy stuff being created and improved further.
4e_65_6f t1_ix9ivyj wrote
Reply to comment by thehourglasses in How much time until it happens? by CookiesDeathCookies
So what? It's not just a grammar bot. It copies the data that you provide it.
It doesn't matter what language it's in; it matters what the text contains.
4e_65_6f t1_ix9gbqf wrote
Reply to comment by [deleted] in How much time until it happens? by CookiesDeathCookies
These text patterns are there for an intelligent reason: someone bothered to write those words in that particular order. It's not just random.
So when you copy someone's "word placing patterns", you are also indirectly copying the logic that produced the text in the first place.
4e_65_6f t1_ix9ezwr wrote
Reply to Would like to say that this subreddit's attitude towards progress is admirable and makes this sub better than most other future related discussion hubs by Foundation12a
True, a while ago there was a lot more "YOu GuYs aRe A cUlt" and doomer posting.
Now people post worrying about losing their jobs and whatnot.
This sub has been saying this stuff all along.
I feel like no one is prepared for things working out just fine, that happens sometimes too.
4e_65_6f t1_ix9arai wrote
Reply to How much time until it happens? by CookiesDeathCookies
I can't imagine it taking longer than 4 years from now.
4e_65_6f t1_iwzqa9m wrote
Reply to Are you a determinist? Why/why not? How does that impact your view of the singularity? by Kaarssteun
Is dualism really the opposite of determinism?
I know it's impossible to prove any metaphysical claim, but I believe that IF there are some meta-universal rules, they would also be deterministic. I think physicists call it "superdeterminism" or something like that.
4e_65_6f t1_iw4l7d3 wrote
Reply to comment by cjeam in What if the future doesn’t turn out the way you think it will? by Akashictruth
Can't you think of some solutions to those problems that don't involve killing humanity? I'm sure AGI will.
Unless the person who "owns" the AGI is literally Hitler, I don't see the worst case scenario happening.
4e_65_6f t1_iw4jirh wrote
Reply to comment by cjeam in What if the future doesn’t turn out the way you think it will? by Akashictruth
>Billionaires are very immoral people
While I don't disagree with that, I still don't think there's anything to be gained from letting everyone starve.
Like, Jeff Bezos can buy a billion hamburgers but he can't eat them all. With automatic production, what is the point of hoarding something that's free for you to produce and that nobody can buy? It would be like hoarding sand.
What resources do you think AGI couldn't produce automatically and therefore would be scarce still?
4e_65_6f t1_iw44ldr wrote
Reply to comment by TheDavidMichaels in What if the future doesn’t turn out the way you think it will? by Akashictruth
>to me that looks like more jobs not less.
Everything else you've said is right, but the whole point of automation is fewer jobs. AGI basically means the end of human labor (not even research would be left).
People won't be able to repair or work on post-singularity tech; it's like trying to read the data inside one of those large language models with 1B+ parameters. It's not feasible.
At most I can see humans making decisions, but even then those will be informed by AI-collected data too.
4e_65_6f t1_iw2vumj wrote
Reply to comment by OneRedditAccount2000 in What if the future doesn’t turn out the way you think it will? by Akashictruth
>You will have to satisfy their needs and police them forever if they reproduce.

You won't have to do anything, it's all automated, including the planning and execution.

>Why would you want to be tied to them forever?

You don't have to, you can just copy the AI for them and bail to outer space or whatever the hell else you'd do without humanity.

>It's like taking care of the needs of every wild animal in this world, you'd rather not

If I had to put in work, then no. If I could just say "do it", I would. And I don't even particularly like animals tbh.

>and humans occupy useful space on the planet,

There's plenty of space for machines on the moon, on Mars, in the ocean or underground. What could you possibly need the entire Earth, minus humans, for?

>they can rebel and be a nuissance to your establishment etc.

They can't though, the AI will be a thousand steps ahead.
4e_65_6f t1_j1s2b8m wrote
Reply to comment by AndromedaAnimated in I created an AI to replace Fox and CNN by redditguyjustinp
Yes, and besides bias there are genuinely greedy people with bad intentions throwing money where it shouldn't go behind the scenes. They also fund scientific research.
It's naive to think AI can sort through that mess with any reliability.