SgathTriallair

SgathTriallair t1_j9s9rns wrote

It's funny how many people made fun of that Google engineer but aren't laughing now.

I don't think we can definitively say that we've entered the age of sentient AIs but we can no longer definitively say that we haven't.

It's really exciting.

14

SgathTriallair t1_j9ebojk wrote

Let's imagine that you have created an AI that is capable of producing realistic human behavior, but doing so inevitably leads to full consciousness. So there is no way to have it seem human without it being fully conscious.

In that case, you wouldn't have the individual NPCs each have their own AI. Rather, you would have a single GM AI that controls all the characters. It wouldn't feel pain any more than I feel pain when I write a story with realistic characters.

There is no circumstance in which it would be either desirable or moral to create sapient entities for the sole purpose of murdering them.

2

SgathTriallair t1_j94p7df wrote

No. They absolutely have special-use AIs, but they are not at the cutting edge of computer research. One big reason is that the large, creative tech workforce doesn't do well with rigid hierarchy and rules. For example, the FBI and DOD are hard up for coders because they refuse to hire anyone who has ever smoked weed.

1

SgathTriallair t1_j94oqxt wrote

The military is ahead of civilian tech in some areas but not all areas. For instance, they are working with Microsoft to create AR heads-up displays for soldiers. If they were ten years ahead they wouldn't need to contract with a private entity.

Additionally, no amount of government money can make up for the fact that there are far more civilians working in certain fields, like AI. The civilians will likely come out with the tech sooner because there are more of them.

The basics of the atom bomb were discovered by civilians. It was only after they went to the government and described what was possible that the military began engineering the bomb.

3

SgathTriallair t1_j8kaduu wrote

Why would you want it to prompt you? Are you wanting to make friends with an AI or have it be your boss?

I can't think of a scenario where I want an AI to talk to me unprompted. This excludes things like having it remind me of an appointment or give me a morning update on the news, as those would be on a schedule that I agreed to.

1

SgathTriallair t1_j87kdki wrote

Most of this is stupid. If a murderous AI takes over, then the very act of prepping will mark you for elimination.

For a lack of UBI, "prepping" means building up savings and learning skills that will keep you relevant. This is something everyone should do even if there are no AIs.

The rest of these are run-of-the-mill wars or terrorist attacks and have nothing to do with AI. They would or would not happen just the same regardless of whether the attackers are using AI tools.

Any prepper tendencies should not be affected by the existence of AI.

5

SgathTriallair t1_j75e2pg wrote

The impact of AI will depend on how powerful it becomes. At the low level, where we don't even get AGI, it'll probably land somewhere between the impact of the computer and that of the industrial revolution.

If we get AGI that is able to replace human work but is still basically under our control then expect it to be like agriculture or fire.

If we get the super intelligence then it will be equivalent to the invention of language or tool use.

1

SgathTriallair t1_j6jhpkz wrote

There are objective truths, but it is impossible to know or present them all. When only some information is presented it creates a narrative. We try to make our narratives as close to the truth as possible, but the limitations of our physiology make this impossible to fully achieve.

Objective moral good is a slippery one. One needs to first define good before one can determine what actions and habits lead to that. I support human flourishing but others argue for obedience to god or maximizing non-interference. You can't have an objective conversation until you agree on what The Good consists of.

1

SgathTriallair t1_j6j77vo wrote

Everything is politics. When something seems "apolitical" it just means that its biases conform to your own. Even the choice of which objective facts to present and which to omit is a political question. By political I don't mean "Republican or Democrat" but rather anything that deals with human society. A peanut butter and jelly sandwich with lettuce and tomato in the middle isn't objectively wrong, it just feels weird because all of us have agreed that this isn't how we build those sandwiches.

The problem happens when you see news that presents facts which challenge your biases, such as pro- or anti-cop stories. Both kinds of stories happen, but only one of them feels political while the other just seems like news we need to know.

A better option is to look at multiple sources, focus on places that have a reputation for presenting actual facts and not made up ones, and always think about why they are saying the things they are saying.

2

SgathTriallair t1_j6j5r4c wrote

These questions are way too specific and completely unanswerable. We don't know what the singularity will bring. The very definition of the singularity is the point at which our predictions no longer make sense.

We don't know what the capabilities of future AIs will be nor do we know the physical limits that the laws of reality will place on these ideas.

These are questions for sci-fi writers, and there are thousands of such answers, so go pick your favorite.

12

SgathTriallair t1_j6dhjuf wrote

We already have a good test case for this: hand-crafted goods. Factory-produced goods have been around for decades and the individual crafter has been ALMOST entirely driven from the market. However, there is still a thriving group of crafters who are able to sustain themselves selling their wares even though those wares are three to ten times more expensive. I imagine it will be the same thing after the AI revolution.

4

SgathTriallair t1_j3s1fxz wrote

There is some shift that way. It's a pretty small sub, so previously it was only those specifically interested in the topic, which skews very heavily towards enthusiasts.

As we get closer, more non-enthusiasts are finding the sub. That isn't a bad thing, and the poll is doing a good job of illustrating it.

r/futurism is the big sub for discussion about it and that one has a lot more anti-tech people.

1