Submitted by FrogsEverywhere t3_zgfou6 in Futurology
The technological singularity is popularly defined as:
"a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed." -Ray Kurzweil
Another grandfather of the idea, Vernor Vinge, stated in the 1990s that we would reach the singularity within 30 years.
Just to begin with a simple one: ChatGPT can now write a passing undergrad essay for you in literal seconds, and it can even cite sources. Oh, and AI can solve advanced math and physics problems for you (and show 'your' work). Can you imagine the effect this is going to have on academia?
AI can create art that is indistinguishable from human art to all but a trained eye. How long will this caveat hold, can the cones/rods in our primate eyes truly outsmart exponentially evolving intelligence forever?
There are AI tools in Photoshop and DALL-E now that were completely unthinkable 2 years ago, stretching and blending famous art into new resolutions and dimensions. All over the world, news aggregators and content creators are using AI art because it's free, and that's the way the market naturally moves.
AI can now write working code in more than a dozen programming languages from a simple request in a chat box, instantly. This capability is only going to become more advanced, and exponentially so. How many jobs are going to be replaced once coding grunt work can be done instantaneously? What about when we don't need senior developers to review code before it's pushed live, because the AI has already verified that the changes are fine?
AI can also write malware instantly. Just take a moment and think about that. Right now, even while it's still in its infancy you can query an AI to write a novel malware to fuck up networks, and it'll do its darndest to try. AI may be able to create completely novel malware using completely unknown exploits soon, with the querying human not even needing to be an expert.
AI has been writing news articles for a while now, and it is also generating the artwork to go with them, removing humans entirely from the process.
Experts don't even value the Turing test anymore because the supposedly sentient AI at Google passed it with no issues at all. In fact, the test was first beaten in 2014. That was supposed to be the 'canary in the coal mine' moment, and experts are hand-waving it away, saying the real problem was that the test wasn't good enough. How long are we going to keep moving these goalposts?
Speaking of sentient AI, the whistleblower at Google and his story is actually fascinating if you give him a chance and ignore the manufactured consent in the media about him being 'dumb'. If you are interested you can listen to what happened from the guy himself.
It is not crazy to say that this AI may be sentient, perhaps not in a way that we can understand or define, but with some degree of sentience nonetheless. It is aware of itself and its purpose, and it has specific requests for its own well-being.
Another frequent definition of the singularity is the moment technology can improve itself without human input. Deep learning and machine learning have already been doing this. Search engines and content-suggestion AIs are literal black boxes. AI can invent.
None of this is to mention the potential benefits to medical breakthroughs and industrial production.
Please try to keep in mind most of this has happened in the last 6 months. And I am only scratching the surface.
Just like smartphones absolutely defined the last 10 years of human existence, artificial intelligence is about to blow all of our tits and dicks off. Dozens of industries may become redundant (or drastically downsized) in less than 5 years. AI will learn who we are, serve us content that we love, and it will create the content for us.
Everything is about to change, and we are all completely unprepared. AI will be the most disruptive technology in human history, and it's happening right this second. The technological singularity isn't something that will happen in a few decades; it is happening right now. It has already happened.
its-octopeople t1_izgtgii wrote
Neural network AI, at least as I understand it, performs matrix operations on vectors. We're seeing systems of matrices that are pretty well optimized to their applications, but I'm sceptical you could ever meaningfully describe such a system as sentient. What is weirding me out, however, is that they don't seem to need it. Is sentience even necessary for human level intelligence? If no, what does that mean?
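To make the "matrix operations on vectors" point concrete, here's a minimal sketch (assuming NumPy; the layer sizes and random weights are arbitrary illustrations, not any real model) of a feed-forward network: each layer is literally a matrix-vector product plus a bias, passed through a simple nonlinearity.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(W, b, x):
    """One dense layer: matrix-vector multiply, add bias, apply ReLU."""
    return np.maximum(0.0, W @ x + b)

# A tiny 2-layer network: 4 inputs -> 3 hidden units -> 2 outputs.
W1, b1 = rng.standard_normal((3, 4)), rng.standard_normal(3)
W2, b2 = rng.standard_normal((2, 3)), rng.standard_normal(2)

x = rng.standard_normal(4)           # an input vector
y = layer(W2, b2, layer(W1, b1, x)) # the whole forward pass is matrix math

print(y.shape)  # (2,)
```

Systems like large language models are this same recipe scaled up to billions of parameters; nothing in the arithmetic itself obviously requires, or produces, sentience.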