4e_65_6f

4e_65_6f t1_j1rw0lg wrote

>his team trained an algorithm to find the news in common between sources from many different biases and report the commonalities

The problem many news sources have nowadays is that, in the effort to be impartial, they end up elevating opinions that aren't even supposed to be considered, making it seem like everything is a 50/50 debate when only one of the sides has actual arguments.

Think, for instance, of climate change: instead of debating measures to prevent it (since it's already consensus that it's real), they keep bringing on people who deny climate change. Even though maybe 1 in 1000 scientists would do that, on the news it's a 1-vs-1 debate, so to the audience it looks like the issue isn't settled yet.

Any algorithm that seeks to find commonalities between all news sources will end up treating points of view that aren't valid as if they were, because the news sources themselves are like that.
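
To make that concrete, here's a toy sketch (the outlets and story labels are entirely made up) of how "report what the sources have in common" can inherit the sources' own false-balance framing:

```python
# Hypothetical outlets and the stories/framings they run.
sources = {
    "outlet_a": {"new emissions policy", "debate: is climate change real?"},
    "outlet_b": {"storm damage report", "debate: is climate change real?"},
    "outlet_c": {"carbon tax vote", "debate: is climate change real?"},
}

# "Report the commonalities" = keep only what every outlet carries.
common = set.intersection(*sources.values())
print(common)  # {'debate: is climate change real?'}
```

If the one thing every outlet shares is the 1-vs-1 framing, then that framing is exactly what the "common ground" output will be.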

3

4e_65_6f t1_j1rudlt wrote

My (wildly speculative and somewhat pessimistic) thoughts on how this will go:

- Elites will very likely have exclusive access to the best models and the smartest AI at first.
- A certain company will achieve a complete monopoly on the labor market by creating some AGI model that can replace any worker.
- A massive push for economic change will start (the sides being UBI vs. an AI ban).
- The company (now holding a complete monopoly on the labor market) will realize there's no profit to be made from a market in which no buyer has a source of income.

After that moment, there's no reason to reserve the benefits of AI for yourself. There's no cost to production and no profit in selling products, and a whole bunch of people are angry at you for taking away their jobs. So what reason would anyone have, in that situation, to deny people access to your automated production?

10

4e_65_6f t1_j0rf0hq wrote

I found that it works better if you keep it short: tell it to write just one function or a small part of the code rather than the whole thing. Also, explain in obnoxious detail what is supposed to happen, and it often gets it right.

It's also really good at improving code that's already written; I used it to make my code shorter and more efficient.
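
For example, a request along these lines (the function name and spec here are just hypothetical) usually goes better than asking for a whole program at once:

```python
# Hypothetical prompt, kept short and spelled out in obnoxious detail:
#   "Write a Python function `normalize_scores(scores)` that takes a list of
#    floats and returns a new list scaled so the values sum to 1.0. If the
#    list is empty or the values sum to zero, return an empty list."
#
# The kind of small, self-contained function that request tends to produce:
def normalize_scores(scores: list[float]) -> list[float]:
    total = sum(scores)
    if not scores or total == 0:
        return []
    return [s / total for s in scores]

print(normalize_scores([2.0, 3.0, 5.0]))  # [0.2, 0.3, 0.5]
```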

3

4e_65_6f t1_ixae7yz wrote

>It depends on what your definition of the singularity is.

It's just the point when AI surpasses human intelligence in general. That is what people mean by the singularity: there's no task you would be able to perform better than the computer.

After that point, AI starts driving (or at least helping with) research and the timescales shift drastically. This is why people expect a "burst" of technology.

0

4e_65_6f t1_ixa9fwh wrote

Yeah, if you had asked an artist back in 2016 when they thought AI could make art, they probably would've said never. There have been naysayers all along.

This sub is the only one that has been saying it's possible, and look at that: now a bunch of artists post here worrying about their jobs.

If you don't think it will happen, you're either very pessimistic or haven't been paying attention. Every other week now there's crazy stuff being created and improved further.

6

4e_65_6f t1_ix9gbqf wrote

These text patterns are there for an intelligent reason: someone bothered to write those words in that particular order. It's not just random.

So when you copy someone's "word-placing patterns", you are also indirectly copying the logic that produced the text in the first place.

2

4e_65_6f t1_ix9ezwr wrote

True, a while ago there was a lot more "YOu GuYs aRe A cUlt" and doomer posting.

Now people post worrying about losing their jobs and whatnot.

This sub has been saying this stuff all along.

I feel like no one is prepared for things working out just fine; that happens sometimes too.

6

4e_65_6f t1_iw4jirh wrote

>Billionaires are very immoral people

While I don't disagree with that, I still don't think there's anything to be gained from letting everyone starve.

Like, Jeff Bezos can buy a billion hamburgers, but he can't eat them all. With automated production, what is the point of hoarding something that is free for you to produce and that nobody can buy? It would be like hoarding sand.

What resources do you think AGI couldn't produce automatically, and that would therefore still be scarce?

1

4e_65_6f t1_iw44ldr wrote

>to me that looks like more jobs not less.

Everything else you've said is right, but the whole point of automation is fewer jobs. AGI basically means the end of human labor (not even research would be left).

People won't be able to repair or work with post-singularity tech; it's like trying to read the data inside one of those large language models with 1B+ parameters. It's not feasible.

At most I can see humans making decisions, but even then those decisions will be informed by AI-collected data too.

2

4e_65_6f t1_iw2vumj wrote

>You will have to satisfy their needs and police them forever if they reproduce.

You won't have to do anything; it's all automated, including the planning and execution.

>Why would you want to be tied to them forever?

You don't have to; you can just copy the AI for them and bail to outer space, or whatever the hell else you'd do without humanity.

>It's like taking care of the needs of every wild animal in this world, you'd rather not

If I had to put in work, then no. If I could just say "do it", I would. And I don't even particularly like animals tbh.

>and humans occupy useful space on the planet,

There's plenty of space for machines on the moon, on Mars, in the ocean, or underground. What could you possibly need the entire Earth without humans for?

>they can rebel and be a nuissance to your establishment etc.

They can't, though; the AI will be a thousand steps ahead.

2