DeveloperGuy75
DeveloperGuy75 t1_j9niz40 wrote
Reply to Why are we so stuck on using “AGI” as a useful term when it will be eclipsed by ASI in a relative heartbeat? by veritoast
Regardless of when we hit AGI, that's still different from ASI. Also, that's assuming it will automatically be able to improve itself once it reaches AGI. Everyone assumes that's going to be the case, but is that really going to happen?
DeveloperGuy75 t1_j9nfpl6 wrote
Reply to Ramifications if Bing is shown to be actively and creatively skirting its own rules? by [deleted]
It's a large language model without any sentience or control of its own. It can't do anything without human input.
DeveloperGuy75 t1_j9knt41 wrote
Reply to comment by MultiverseOfSanity in Would you play a videogame with AI advanced enough that the NPCs truly felt fear and pain when shot at? Why or why not? by MultiverseOfSanity
No dude... no computer is emotional right now, even if it might say so, because of how they work. ChatGPT, the most advanced thing out there right now, just predicts the next word. It's a transformer model that can process text backwards and forwards so that it can make more coherent predictions. That's it. That's all it does. It finds and mimics patterns, which is excellent for a large language model, especially given the data it has consumed. But it can't even do math and physics right, and I mean it's worse than a human. It doesn't "work out problems"; it's simply a "word calculator."

Also, you're using Occam's razor incorrectly. You could be a psychopath, a sociopath, or some other mentally unwell person who is certainly not "just like anyone else." Occam's razor means the simplest explanation for something is usually the correct one. Usually. And that's completely different from the context you're using it in.
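To see what "word calculator" means, here's a deliberately tiny sketch: a bigram model that just counts which word most often follows another and spits that out. Real transformers use learned attention over huge contexts rather than raw counts, so this is only an analogy for the core task (predict the next token), not how ChatGPT actually works. The corpus and function names are made up for illustration.

```python
from collections import Counter, defaultdict

# Toy "next-word predictor": count which word follows which in a tiny corpus.
corpus = "the cat sat on the mat and the cat slept on the rug".split()

next_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_counts[prev][nxt] += 1

def predict_next(word):
    """Return the word that most frequently followed `word` in the corpus."""
    return next_counts[word].most_common(1)[0][0]

print(predict_next("the"))  # -> "cat" ("cat" follows "the" twice, "mat" once)
```

No understanding, no reasoning about the sentence, no feelings: just pattern frequency turned into output. Scaling that idea up with learned weights is what makes large language models fluent, but the objective stays "produce a likely continuation."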
DeveloperGuy75 t1_j9e75av wrote
Reply to Would you play a videogame with AI advanced enough that the NPCs truly felt fear and pain when shot at? Why or why not? by MultiverseOfSanity
That's obviously an unethical question. Putting pain centers into an AI so that it feels pain is stupid and unconscionable. An AI should absolutely not be made to suffer.
DeveloperGuy75 t1_j9e6tsf wrote
Reply to comment by Standard_Ad_2238 in Would you play a videogame with AI advanced enough that the NPCs truly felt fear and pain when shot at? Why or why not? by MultiverseOfSanity
Wrong. Pain is just like any other information the brain gets; it just comes from different kinds of neurons and is processed in the pain centers of the brain. Emulating such things in an AI would be unethical, to say the least.
DeveloperGuy75 t1_j9e6i4s wrote
Reply to comment by ChronoPsyche in Would you play a videogame with AI advanced enough that the NPCs truly felt fear and pain when shot at? Why or why not? by MultiverseOfSanity
Better idea: shut down the game, put the psychopaths that developed the game in jail.
DeveloperGuy75 t1_j9e66mi wrote
Reply to comment by --FeRing-- in Would you play a videogame with AI advanced enough that the NPCs truly felt fear and pain when shot at? Why or why not? by MultiverseOfSanity
That's a solipsist argument. You might as well ask how you'd react toward actual people: how do you really know they're afraid?
DeveloperGuy75 t1_j8cjubb wrote
Reply to comment by Sad_Laugh_8337 in Generative AI comes to User Interface design! This is crazy. by RegularConstant
This "coming for your jobs" stuff is ridiculous. It's going to make it so people can do more themselves.
DeveloperGuy75 t1_j59irqi wrote
Reply to AGI by 2024, the hard part is now done ? by flowday
It needs to have curiosity and be able to ask questions, but then it might start asking what some might think are the wrong questions.
DeveloperGuy75 t1_ivwdhog wrote
Reply to comment by MattDaMannnn in Will Text to Game be possible? by Independent-Book4660
Just needs different and better training data
DeveloperGuy75 t1_itye78c wrote
Reply to comment by Ortus12 in AGI staying incognito before it reveals itself? by Ivanthedog2013
Exactly this and it won’t be an issue for a long time.
DeveloperGuy75 t1_itydog3 wrote
Reply to comment by beambot in AGI staying incognito before it reveals itself? by Ivanthedog2013
Because we don’t have the tech for it and aether isn’t real lol
DeveloperGuy75 t1_itgb5r4 wrote
Reply to comment by Phoenix5869 in Given the exponential rate of improvement to prompt based image/video generation, in how many years do you think we'll see entire movies generated from a prompt? by yea_okay_dude
Nobody’s “forcing” anything
DeveloperGuy75 t1_itg94ip wrote
Reply to comment by LittleTimmyTheFifth5 in Given the exponential rate of improvement to prompt based image/video generation, in how many years do you think we'll see entire movies generated from a prompt? by yea_okay_dude
Or... maybe an AI could parse all the films it learns from, figure out what story structures are, and make its own movies without needing scripts?
DeveloperGuy75 t1_isewc0c wrote
Reply to I wonder how the would will interact with those of us who get eye implants/AR contacts by crua9
Remember Google Glass? Everyone was speculating about that too. It didn't last long, and its users got called "glassholes" because of the possible invasion of everyone else's privacy.
DeveloperGuy75 t1_ja1gj2q wrote
Reply to comment by [deleted] in Ramifications if Bing is shown to be actively and creatively skirting its own rules? by [deleted]
Ok, that still doesn't prove anything. In order to have independent thought, you have to be able to freely query your environment and learn from it. ChatGPT (which Bing Chat is based on) cannot do that. It's a prediction engine that spits out patterns as completions of prompts. It does very well as a tool, but it is not conscious, nor does it have independent thought.