
AsheyDS t1_j1wo56q wrote

>the ones currently creating the AI make me very concerned about the future

Because of a vague fear of the future consequences of AI, or do you believe AI developers are somehow inherently nefarious?

>Even openAI is a for profit company.

I get the anti-capitalist bias, but there's nothing necessarily wrong with that. A for-profit company is easier to both start and maintain than a non-profit, and allows for more avenues for funding. If OpenAI didn't have Microsoft's deep pockets backing them, they'd probably have a bigger push to monetize what they've made. Even if they do have additional monetary goals, AI R&D costs money.

3

dracount OP t1_j1xyb94 wrote

>Because of a vague fear of the future consequences of AI, or do you believe AI developers are somehow inherently nefarious?

Because they have shareholders' best interests at heart. With such power, society should come first, not shareholders. Not just anyone is allowed to own nuclear weapons.

Soon it will be providing us with food, money, electricity, information, education... Services that cost them nothing extra to provide will be divided up and sold to maximize profit. Education? Sure, get our gold package with a personal AI tutor; silver gets you 10 tutorials on the questions you have difficulty with; bronze gets you 2 hours of assistance per week.

Is there a better way? I think so. It needs some thought and consideration though.

1

AsheyDS t1_j1zkqqd wrote

>Because they have shareholders best interests at heart. With such power, society should come first, not shareholders.

That's not always the case. It depends on the structure of the company. However, even if it isn't shareholders, say it were funded by crowdsourcing... AI devs are still beholden to those who donated, one way or another. Unfortunately, it can't be developed in a financial vacuum. That said, even if there are financial obligations, that doesn't mean AI devs are passively following orders either. Many are altruistic to varying degrees, and I doubt anyone is making an AGI just to make money or gain power. Shareholders perhaps, but not the people actually making it.

I guess if it's a big concern for you, you should try looking for AI/AGI startups that don't have shareholders, determine their motives, and if you agree with their goals then donate to them directly.

2

SteppenAxolotl t1_j208evq wrote

It doesn't matter who controls it; they're afraid the future will look like the present and the past.

The structure of any political economy tends to produce certain results. A system that wants to survive won't permit situations that allow people to opt out en masse. Most people on this sub want their own pet AGI that will give them the agency to materially survive without depending on anyone else. They want to free themselves of the one thing society exists to provide, and society evaporates when that dependency is broken.

0