Superduperbals t1_iuuggl8 wrote

AI will find new ways to discriminate between us in ways beyond even our own comprehension.

29

s1syphean t1_iuv36vk wrote

Yes - discriminate between us based on who should get how much to reach a more just distribution of wealth. I think that’s intentional, no?

(Maybe you meant it this way, hard to tell haha)

12

AIxoticArt t1_iuv9r8c wrote

Why would you deserve someone else's money? That they worked to get, not you. That's mental illness to believe something like that. It's jealousy. And disgusting.

−38

s1syphean t1_iuva0c7 wrote

Just want to help people, man.

Doesn’t your comment strike you as odd? At all?

21

WeeaboosDogma t1_iuwj2ug wrote

Just click his profile link man, he frequents Tim Pool. He doesn't understand welfare economics, nor how things can be fundamentally better.

4

AIxoticArt t1_iuvaalh wrote

No my comment is what normal people believe. I want to help people as well, but not by taking other people's stuff. Why would someone want the right to steal from other people who have worked hard?

−26

Felix_Dzerjinsky t1_iuvaiok wrote

>who have worked hard?

Did they though?

30

AIxoticArt t1_iuvaqw5 wrote

Whether they did or not, the money belongs to them. Not you or anyone else. Anyone who believes in wealth redistribution lives in a fantasy land. I believe we could close the gap between what CEOs and others make compared to their employees, but not through wealth redistribution.

−21

s1syphean t1_iuvb2jk wrote

Maybe this is new info to you, but what do you think taxes are?

21

AIxoticArt t1_iuvb8gr wrote

Honestly idc enough about this topic. I just think you and anyone who thinks it will ever happen is an idiot.

−10

s1syphean t1_iuvbh5p wrote

Well, it's already happening. The government redistributes our wealth when they tax us. You're saying the concept of taxes will never happen.

19

snowseth t1_iuvfj50 wrote

And when they pass tax cuts. Less money coming in from taxes means it comes from someplace else: either fees, fines, and whatever, or reduced services and support, which people pay to make up for.

I'd much much rather pay more taxes so the kid the next neighborhood over doesn't grow up in poverty then stab me in the face when they're 16, because poverty breeds violence.

7

BBASPN69 t1_iuvrme1 wrote

well as a taxpaying American, I say it's my right to continually bitch about solvable problems while repeatedly saying I don't want to allocate any resources to fix them.

4

Cr4zko t1_iuvzak2 wrote

Considering taxes are spent in the most stupid ways such as slush funds, financing coups in foreign countries, glowies, etc an AI managing it would be a massive improvement.

3

BitsyTipsy t1_iuw5jhr wrote

He asks you about taxes and suddenly you say "honestly idc". This is because you don't know this topic in depth. Perhaps you have a black-and-white view because the simplified version in your head isn't the real world, which shows your lack of awareness of the variables that surround it. Perhaps you should care enough to educate yourself on all the variables in a topic before you speak.

9

_ChestHair_ t1_iuz4kqs wrote

All his arguments throughout this thread can be summed as "honestly idc." He says rich people worked hard for their money. Someone says did they? He responds that it doesn't matter now look over here at this new goalpost (redistribution). Someone says taxes are already a thing. He says he doesn't care and the thing currently happening will never happen. I'm labeling him as "goalposts" with RES

3

s1syphean t1_iuvah3e wrote

Oh, your comment is what normal people believe - got it lmao. Thanks for commenting on their behalf and letting me know

18

theferalturtle t1_iuwlzka wrote

I just think people should be compensated according to their effort. You're telling me that billionaires work 10,000x harder than the average person? I guarantee Bezos and Musk don't work as hard as half the people I know and because they had an idea they get to rule the world and all the money in it?

4

turnip_burrito t1_iuyy4qb wrote

Yeah the absurdity of their "meritocratic power" mindset is astounding.

5

DedRuck t1_iuycjkh wrote

Because we as humans live in communities and it’s natural to want to help people in your communities?

3

leafhog t1_iuydhpf wrote

You give your work away for free because you don’t want to take other peoples’ money from them?

2

jsn12620 t1_iuvrtfh wrote

How about instead of shilling for millionaires and billionaires you maybe consider they should pay their fair share in taxes. That alone would redistribute wealth.

5

UnikittyGirlBella t1_iuyl11u wrote

That is literally exactly what billionaires under capitalism do. They intentionally underpay workers the value of their labor so they can make a profit.

5

TheDividendReport t1_iuw1esn wrote

Stable Diffusion and the coming AI were not created by one man or one company. All of our data is being used for the coming 4th Industrial Revolution. Denying that we should be compensated for the data displacing us is ignorant.

2

UnikittyGirlBella t1_iuz8u1v wrote

This is made worse by the fact that lots of users of these programs try to displace actual artists like me

1

EstablishmentLost252 t1_ivcfped wrote

> Why would you deserve someone else's money? That they worked to get, not you

Correct! This is exactly why we socialists believe the means of production should be owned by the working class, rather than by an elite few who appropriate the wealth that the workers create for themselves in the form of profit

1

boharat t1_iuvgo5n wrote

Spotted the American

0

TistedLogic t1_iuvqdl7 wrote

Spotted the conservative.

Lots of americans who think the whole system is bullshit.

9

boharat t1_iuxetyg wrote

Yeah you're right. I just had kind of a shoot from the hip response.

2

onyxengine t1_iuuz1s2 wrote

Mmmm, this is debatable; it can be done in an unbiased way. The programmers would have to be deliberately biased, depending on whether the dataset is indirectly influenced or objectively raw.

1

mynd_xero t1_iuv1cv9 wrote

I disagree. A repeat in data causes a pattern, and when a pattern is recognized, that forms a bias. The terminology used kinda muddies the water a bit, in that some people think biases are dishonest, or that a bias is simply a difference in opinion.

If a system is able to recognize and react to patterns, then it will form a bias. Might be safe to assume that an AI can't have an unfounded bias. I do not believe it's possible to be completely unbiased unless you are incapable of learning from the instant you exist.

8

Bakoro t1_iuv3r5r wrote

AI bias comes from the data being fed to it.
The data being fed to it doesn't have to be intentionally nefarious, the data can and usually does come from a world filled with human biases, and many of the human biases are nefarious.

For example, if you train AI on public figures, you very well may end up with AI that favors white people, because historically that's who the rich and powerful public figures have been. The current status quo is because of imperialism, racism, slavery, and, in recent history, forced sterilization of indigenous populations (Canada, not so nice to its first people).

Even if a tiny data set is in-house, based on the programmers themselves, it's likely going to be disproportionately trained on White, Chinese, and Indian men.
That doesn't mean they're racist or sexist and excluded black people or women; it's just that they used whoever was around, which is disproportionately that group.
That's a real, actual issue that has popped up in products: a lack of diversity in testing, even to the point of no testing outside the production team.

You can just scale that up a million times: a lot of little biases which reflect history. History which is generally horrifying. That's not any programmer's fault, but it is something they should be aware of.
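A toy sketch of that mechanism, with entirely made-up numbers: a "model" that does nothing but match historical hiring outcomes ends up encoding the group bias, even though no one ever wrote group membership into a rule.

```python
# Toy illustration (hypothetical data): a "model" trained on biased
# historical hiring decisions reproduces the bias. Group membership is
# never an explicit rule; it is simply what the data rewards.
from collections import defaultdict

# Historical records: (group, hired). Group A was favored historically.
history = [("A", 1)] * 80 + [("A", 0)] * 20 + [("B", 1)] * 30 + [("B", 0)] * 70

def train(records):
    """'Train' by memorizing the historical hire rate per group."""
    totals, hires = defaultdict(int), defaultdict(int)
    for group, hired in records:
        totals[group] += 1
        hires[group] += hired
    # Predict "hire" whenever the historical rate for that group exceeds 50%.
    return {g: hires[g] / totals[g] > 0.5 for g in totals}

model = train(history)
print(model)  # {'A': True, 'B': False} -- the bias in the data becomes the rule
```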

5

mynd_xero t1_iuv7okr wrote

Yeah, and we all know how well FORCED diversity is going. Minorities are a minority because they are the smaller group; nothing one way or the other about stating that fact. But that's another rant for another subreddit.

I'm simply saying that a system that identifies and reacts to patterns is going to form a bias, because that's what bias is. That doesn't make it inherently evil, right or wrong; it just IS.

−7

Bakoro t1_iuvh7qo wrote

You can't ignore institutional racism by using AI.
The AI just becomes part of institutional racism.

The AI can only reflect back on the data it's trained on and the data is often twisted. You can claim "it's just a tool" all you want, it's not magically immune to being functionally wrong in the way all systems and tools can become wrong.

4

mynd_xero t1_iuvjupw wrote

>institutional racism

Lost me here, this isn't a real thing.

No interest in going on this tangent further, nor was it my desire to laser-focus on one thing that is moot to my general argument: anything capable of identifying repeating data, i.e. patterns, and with the capacity to react/adapt/interact is going to form a bias; nothing capable of learning is capable of being unbiased; and bias itself isn't good or bad, it just IS.

−7

Bakoro t1_iuvqpgc wrote

Institutional racism is an indisputable historical fact. What you have demonstrated is not just willful ignorance, but outright denial of reality.

Your point is wrong, because you cannot ignore the context the tool is used in.
The data the AI is processing does not magically appear; the data itself is biased and created in an environment with biases.

The horse shit you are trying to push is like the assholes who look at areas where being black is a de facto crime, and then point to crime statistics as evidence against blacks. That is harmful.

You are simply wrong at a fundamental level.

5

justowen4 t1_iuw9c84 wrote

Perhaps your point could be further articulated by the idea that we are not maximizing economic capacity by using historical data directly; we need an AI that can factor bias into the equation. In other words, institutional racism is bad for predictive power because it will assume certain groups are simply unproductive, so we need an AI smart enough to recognize the dynamics of historical opportunity levels and virtuous cycles. I'm pretty sure this would not be hard for a decent AI to grasp. Interestingly, these AIs give tax breaks to the ultra-wealthy, which I am personally opposed to, but even with all the dynamics factored into maximum productivity, the truth might be that rich people are better at productivity.. (I'm poor btw)

2

freudianSLAP t1_iuvq9kt wrote

Just curious this thing called "institutional racism" that doesn't exist as you said (paraphrasing). How would you define this phenomenon that you don't believe matches reality?

4

TistedLogic t1_iuvr439 wrote

What makes you think institutional racism isn't a real thing?

2

onyxengine t1_iuv2bgi wrote

I both agree and disagree, but I'm too inebriated to flesh out my position. I think you raise a really good point, but stop short of the effect the people building the dataset have on the outcome of the results.

We can see our bias; we often don't admit to it. We can also build highly objective datasets; nothing is perfect, bias is a scale. My argument is effectively that the bias we code into systems as living participants is much worse than bias coded into an AI that was built from altruistic intention. Every day, a human making a decision can exercise wildly differing gradients of bias; an AI will be consistent.

1

monsieurpooh t1_iuxoei8 wrote

Further muddying the waters: sometimes the bias is correct and sometimes it isn't, and the way the terms are used doesn't make it easy to distinguish between those cases, so it easily becomes a sticking point for political arguments where people talk past each other.

A bias could be said to be objectively wrong if it leads to suboptimal performance in the real world.

A bias could be objectively correct and improve real-world performance but still be undesirable, e.g. leveraging the fact that some demographics are more likely to commit crimes than others. This is a proven fact, but if implemented it makes the innocent ones among those demographics feel like second-class citizens and can also lead to domino effects.

1

Down_The_Rabbithole t1_iuvbb2o wrote

AI will discriminate based on its training data. Since its training data will come from human history it will discriminate exactly how history has discriminated.

AI is going to favor old white men for job positions. AI law is going to favor young white women. AI policing is going to favor young black men.

At least in the US.

We don't have enough modern data that doesn't have a bias to train a proper AI that won't repeat this pattern.

1

turnip_burrito t1_iuy6ry5 wrote

That's not quite accurate as a description of AI in general; what you described is just today's dominant "curve fitting" AI, which doesn't generalize outside of the training distribution. This particular mathematical modeling style is, as you've said, problematic.

However, it is possible to build a different type of AI which runs simulations step by step starting with sounder and minimally biased assumptions, in order to make predictions that exist outside of the existing data distribution.

3