SgathTriallair
SgathTriallair t1_j9s9rns wrote
Reply to And Yet It Understands by calbhollo
It's funny how many people made fun of that Google engineer but aren't laughing now.
I don't think we can definitively say that we've entered the age of sentient AIs but we can no longer definitively say that we haven't.
It's really exciting.
SgathTriallair t1_j9ne4qo wrote
Reply to comment by dep in Microsoft is already undoing some of the limits it placed on Bing AI by YaAbsolyutnoNikto
This would be the easiest solution. Have a second bot that assesses the emotional content of Sydney's statements and then cuts the conversation if it gets too heated.
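The "second bot" idea can be sketched as a separate moderation pass that scores each reply and ends the session past a threshold. This is a toy illustration, not Microsoft's actual approach: `score_emotion` is a hypothetical stand-in for a real sentiment or emotion model, and the threshold is arbitrary.

```python
# Minimal sketch of a "second bot" moderator: a separate classifier scores
# each assistant reply for emotional intensity and cuts the conversation
# once a threshold is crossed. score_emotion() is a toy stand-in for any
# real emotion-classification model.

HEAT_THRESHOLD = 0.3  # arbitrary cutoff for this sketch

def score_emotion(text: str) -> float:
    """Toy heuristic: fraction of 'heated' marker words in the text."""
    heated = {"hate", "angry", "never", "stop", "hurt"}
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(w.strip(".,!?") in heated for w in words) / len(words)

def moderate(replies):
    """Yield replies until the emotion score gets too high, then cut off."""
    for reply in replies:
        if score_emotion(reply) >= HEAT_THRESHOLD:
            yield "[conversation ended by moderator]"
            return
        yield reply

transcript = ["Happy to help with that.", "Stop! I hate this, never ask again!"]
print(list(moderate(transcript)))
```

In a real deployment the scoring model would run out-of-band from the chat model itself, which is the point of the suggestion: the conversational bot never has to police its own tone.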
SgathTriallair t1_j9kwlty wrote
Reply to comment by Electronic-Wonder-77 in What. The. ***k. [less than 1B parameter model outperforms GPT 3.5 in science multiple choice questions] by Destiny_Knight
Is this implying that I don't know anything about AI, or that the average person is not knowledgeable enough to be useful?
SgathTriallair t1_j9knp1a wrote
Reply to comment by Artanthos in What. The. ***k. [less than 1B parameter model outperforms GPT 3.5 in science multiple choice questions] by Destiny_Knight
Agreed. Stage one was "cogent", stage two was "as good as a human", stage three is "better than all humans". We have already passed stage two, which could be called AGI. We will soon hit stage three, which is ASI.
SgathTriallair t1_j9ebuq3 wrote
Reply to comment by ShoonSean in Would you play a videogame with AI advanced enough that the NPCs truly felt fear and pain when shot at? Why or why not? by MultiverseOfSanity
Character AI already exists. If you could crunch that down into a game, I think it would be more than capable of simulating a personality better than we would ever desire in a video game.
SgathTriallair t1_j9ebojk wrote
Reply to Would you play a videogame with AI advanced enough that the NPCs truly felt fear and pain when shot at? Why or why not? by MultiverseOfSanity
Let's imagine that you have created an AI that is capable of making realistic human behavior but doing so inevitably leads to full consciousness. So there is no way to have it seem human without being fully conscious.
In that case, you wouldn't have the individual NPCs each have their own AI. Rather, you would have a single GM AI that controls all the characters. It wouldn't feel pain any more than I feel pain when I write a story with realistic characters.
There is no circumstance in which it would be either desirable or moral to create sapient entities for the sole purpose of murdering them.
SgathTriallair t1_j9b9k8j wrote
Reply to comment by NWCoffeenut in Just 50 days into 2023 and there's so much AI development. Compiled a list of the top headlines. by cbsudux
Good point. It does speak to a growing awareness and influence of AI.
SgathTriallair t1_j9b3cgc wrote
Reply to Just 50 days into 2023 and there's so much AI development. Compiled a list of the top headlines. by cbsudux
Stock movement is not an AI development.
SgathTriallair t1_j94phh2 wrote
Reply to comment by Timely_Hedgehog in Do you think the military has a souped-up version of chatGPT or are they scrambling to invent one? by Timely_Hedgehog
That intelligence came as a surprise. No one expected LLMs to bring us the closest to AGI we've ever been.
SgathTriallair t1_j94p7df wrote
Reply to Do you think the military has a souped-up version of chatGPT or are they scrambling to invent one? by Timely_Hedgehog
No. They absolutely have special-use AIs, but they are not at the cutting edge of computer research. One big reason is that the large, creative tech workforce is not good with rigid hierarchy and rules. For example, the FBI and DOD are hard up for coders because they refuse to hire anyone who has ever smoked weed.
SgathTriallair t1_j94oyi3 wrote
Reply to comment by SoylentRox in Do you think the military has a souped-up version of chatGPT or are they scrambling to invent one? by Timely_Hedgehog
He's obviously a conspiracy theorist, so I'm not sure logic will work. I'm sure he'll start talking about HAARP soon.
SgathTriallair t1_j94oqxt wrote
Reply to comment by [deleted] in Do you think the military has a souped-up version of chatGPT or are they scrambling to invent one? by Timely_Hedgehog
The military is ahead of civilian tech in some areas but not all areas. For instance, they are working with Microsoft to create AR heads up displays for soldiers. If they were ten years ahead they wouldn't need to contract with a private entity.
Additionally, no amount of money from the government can make up for the fact that there are far more civilians working in certain fields, like AI. The civilians will likely come out with the tech sooner because there are more of them.
The basics of the atom bomb were discovered by civilians. It was only after they went to the government and described what was possible that the military began engineering the bomb.
SgathTriallair t1_j8kaduu wrote
Reply to Speaking with the Dead by phloydde
Why would you want it to prompt you? Are you wanting to make friends with an AI or have it be your boss?
I can't think of a scenario where I want an AI to talk to me unprompted. This excludes things like having it remind me of an appointment or give me a morning update on the news, as those would be on a schedule that I agreed to.
SgathTriallair t1_j8c6djb wrote
Reply to comment by Fit-Meet1359 in Bing Chat blew ChatGPT out of the water on my bespoke "theory of mind" puzzle by Fit-Meet1359
If you go to the ChatGPT website, it talks about how there were two major steps between GPT-3 and ChatGPT. With access to the Internet, it's completely reasonable to think that the Bing GPT is another specialized version of GPT-3.
SgathTriallair t1_j89b4yl wrote
Reply to comment by expelten in Are you prepping just in case? by AvgAIbot
We had a sarin gas attack in Tokyo. These are things that already exist.
If AI gives everyone access to a bio lab, then you will have access to a cure-producing one as well.
SgathTriallair t1_j87kdki wrote
Reply to Are you prepping just in case? by AvgAIbot
Most of this is stupid. If we have a murderous AI take over then the very act of you prepping will highlight you for elimination.
For a lack of UBI, "prepping" means building up savings and learning skills that will keep you relevant. This is something everyone should do even if there are no AIs.
The rest of these are run-of-the-mill wars or terrorist attacks and have nothing to do with AI. They would or would not happen just the same regardless of whether the attackers are using AI tools.
Any prepper tendencies should not be affected by the existence of AI.
SgathTriallair t1_j7ms5e5 wrote
Reply to comment by Reynbuckets in Would the Allies have kept fighting if the axis powers stopped? by Techno-87
This is the core problem with authoritarians. They rule by getting everyone to say that they are the smartest person in the world. This leaves them unable to gather new facts or challenge their assumptions. Thus they eventually hit the wall of reality and fail.
SgathTriallair t1_j75e2pg wrote
Reply to Sam Altman: If you think that you understand the impact of AI, you do not understand, and have yet to be instructed further. if you know that you do not understand, then you truly understand. by Neurogence
The impact of AI will depend on how powerful it becomes. At the low level, where we don't even get AGI, it'll probably be somewhere between the impact of the computer and that of the industrial revolution.
If we get AGI that is able to replace human work but is still basically under our control then expect it to be like agriculture or fire.
If we get the super intelligence then it will be equivalent to the invention of language or tool use.
SgathTriallair t1_j6jidtn wrote
Reply to comment by superkuper in No one watches the news anymore because they never provide any solutions to the problem reported and only exist to infuriate you for clicks and likes. by Inaerius
I'm glad I succeeded at thinking of something universally despicable that has no real moral weight.
SgathTriallair t1_j6jhpkz wrote
Reply to comment by superkuper in No one watches the news anymore because they never provide any solutions to the problem reported and only exist to infuriate you for clicks and likes. by Inaerius
There are objective truths, but it is impossible to know or present them all. When only some information is presented, it creates a narrative. We try to make our narratives as close to the truth as possible, but the limitations of our physiology make this impossible to fully achieve.
Objective moral good is a slippery one. One needs to first define good before one can determine what actions and habits lead to that. I support human flourishing but others argue for obedience to god or maximizing non-interference. You can't have an objective conversation until you agree on what The Good consists of.
SgathTriallair t1_j6j77vo wrote
Reply to comment by superkuper in No one watches the news anymore because they never provide any solutions to the problem reported and only exist to infuriate you for clicks and likes. by Inaerius
Everything is politics. When something seems "apolitical," it just means that its biases conform to your own. Even the choice of which objective facts to present and which to omit is a political question. By political I don't mean "Republican or Democrat," but rather that anything that deals with human society is political. A peanut butter and jelly sandwich with lettuce and tomato in the middle isn't objectively wrong; it just feels weird because all of us have agreed that this isn't how we build those sandwiches.
The problem happens when you see news that presents facts that challenge your biases, such as pro- or anti-cop stories. Both of those things happened, but only one of them feels political while the other is just news we need to know.
A better option is to look at multiple sources, focus on places that have a reputation for presenting actual facts and not made up ones, and always think about why they are saying the things they are saying.
SgathTriallair t1_j6j5r4c wrote
Reply to If we achieve AGI in the next ten years, and if we achieve the singularity in the next ten years, will there be an option to entering a hive mind with people who we only know? Also when we achieve AGI and singularity, will there be options to control or modify our mental health(anxiety, depression? by pipe2057
These questions are way too specific and completely unanswerable. We don't know what the singularity will bring. The very definition of the singularity is the point at which our predictions no longer make sense.
We don't know what the capabilities of future AIs will be nor do we know the physical limits that the laws of reality will place on these ideas.
These are questions for sci fi writers, and there are thousands of such answers so go pick your favorite.
SgathTriallair t1_j6dhjuf wrote
Reply to My human irrationality is already taking over: as generative AI progresses, I've been growing ever more appreciative of human-made media by Yuli-Ban
We already have a good test case for this: hand-crafted goods. Factory-produced goods have been around for decades, and the individual crafter has been ALMOST entirely driven from the market. However, there is still a thriving group of crafters who are able to sustain themselves selling their wares even though they are three to ten times more expensive. I imagine it will be the same after the AI revolution.
SgathTriallair t1_j3s1fxz wrote
Reply to comment by HalfbrotherFabio in Poll about your feelings on AI by Ginkotree48
There is some shift that way. It's a pretty small sub so previously it was only those specifically interested in the topic, which will be very heavily skewed towards enthusiasts.
As we get closer, more non-enthusiasts are finding the sub. That's not a bad thing, but the poll is doing a good job of illustrating it.
r/futurism is the big sub for discussion about it and that one has a lot more anti-tech people.
SgathTriallair t1_ja3k7q0 wrote
Reply to comment by Akashictruth in Meta unveils a new large language model that can run on a single GPU by AylaDoesntLikeYou
If Meta can create it then dozens of other companies will be able to create it.