Submitted by Practical-Mix-4332 t3_zl6uyu in singularity
[removed]
GPT-40, maybe. 4 will be an improvement but isn't going to change the world... yet...
GPT3 - 175 Billion Parameters
GPT4 - 100 Trillion Parameters (rumored)
It’s a pretty significant leap.
Edit: Rumor has been debunked, apparently. We’re probably not looking at anything near 100T for GPT4.
And not to mention that's for each GPT-4 instance. A hive mind of GPT-4 bots would add up to superintelligence quite quickly.
The technology behind GPT-4 will be one of the many factors behind the big change that's coming. The advancements and milestones we are seeing across many fields of science and technology are what will change the world order within the next decade or two; it won't just be ChatGPT. I believe everything is connected.
Yeah, and since they're computers, they will be communicating instantaneously and will be able to share deep insights and express them to each other better than we ever could.
Well, for particular industries GPT-3 is already pretty revolutionary, so we should see the progress going forward in those industries.
But we still should not overestimate GPT's capabilities. In the end, as far as I understand it, it is a very good imitation model of the data on the internet.
However, it would for sure be a substantial step forward, and we will see more industries disrupted by AI.
GPT-4 won't be that much bigger than GPT-3, according to Sam Altman; it'll still be bigger, but not by that margin.
There is a neural network out there that has 500 billion parameters, but its performance is still lower than neural networks with fewer parameters.
https://towardsdatascience.com/gpt-4-is-coming-soon-heres-what-we-know-about-it-64db058cfd45
>GPT4 - 100 Trillion Parameters
I must be out of the loop. Where is that rumor from?
Here, but now I'm reading elsewhere that they may have pulled that number out of their ass.
It's just a rumour and I think Sam Altman basically denied that this was the case. Another, perhaps more plausible, rumour is that GPT-4 will have a very different architecture, where the parameter count between it and GPT-3 doesn't say much because it's no longer just about brute-force scaling.
We don't know anything about GPT-4. Anything you think you know comes from rumors that are not very credible.
>Won’t this basically end society as we know it if it lives up to the hype?
I can't roll my eyes hard enough at this statement. Can we turn down the sensationalism a few notches on this sub? It's nauseating.
I mean you can kind of extrapolate based on the difference between GPT 2 and 3, but yes you are correct it is all speculation.
No you can't extrapolate. There are reasons behind things. GPT3 and GPT2 are both transformer models. GPT4 will likely be a transformer model too. At best it will just be a better transformer model, but it will still have context window limitations that prevent it from becoming anything that can be considered "game over for the existing world order". It will likely just be a better GPT3, not AGI or anything insane like that.
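To make the context window point concrete, here's a toy sketch. The 2048 limit and the word-count "tokenizer" are made-up stand-ins, not how any GPT model actually tokenizes; it just shows how older turns silently fall out of view once a conversation outgrows the window:

```python
# Minimal sketch of why a fixed context window matters: older turns get dropped
# once the conversation exceeds the model's token budget (numbers are made up).
CONTEXT_WINDOW = 2048  # hypothetical token limit, not GPT-4's real one

def count_tokens(text: str) -> int:
    """Crude stand-in for a real tokenizer: one token per whitespace-separated word."""
    return len(text.split())

def fit_to_window(turns: list[str], budget: int = CONTEXT_WINDOW) -> list[str]:
    """Keep the most recent turns that fit; everything older is simply forgotten."""
    kept, used = [], 0
    for turn in reversed(turns):
        cost = count_tokens(turn)
        if used + cost > budget:
            break
        kept.append(turn)
        used += cost
    return list(reversed(kept))

conversation = [f"turn {i}: " + "word " * 300 for i in range(20)]
print(len(fit_to_window(conversation)), "of", len(conversation), "turns still visible to the model")
```

That forgetting is the limitation being talked about: no matter how good the next model is at language, anything outside the window simply doesn't exist for it.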
It feels like LLM’s have been a big deal, but only in certain circles. The image synthesis models opened the general idea of AI as an important tool to a much broader audience, who retroactively found GPT-3.
It’s unclear if OpenAI was always going to release ChatGPT or if it was in some ways built as an easier access point than the playground, for a growing community of people engaging with their products.
Whatever the case may be, the timing is pretty good, because if GPT-4 is a decent leap forward, you have developers who have been building on top of GPT-3 for years now (some who have become sizable businesses in their own right), a bunch of use cases in the world and a growing community that is understanding future use-cases — all which will allow GPT-4 to potentially seriously break into the mainstream, not as a name brand per se but as a tool that impacts a much larger part of society.
Now imagine that the people who rule the world likely already have access to something like GPT-5, and have probably had access to tools like GPT-4 or better for quite some time now.
It doesn't look like it. They all seem utterly incompetent.
I don’t think it needs to be an AGI to make a huge difference though. If it really is much more impressive than GPT-3 it’s going to start causing massive shockwaves throughout society. It will bring AI to the public consciousness even more than it already is and make people start planning for that future instead of just imagining it as a hypothetical distant time.
>100 trillion parameters
Definitely not happening until a few years from now
Ladies and gentlemen, I present to you the most sensationalist post ever posted on r/singularity.
But seriously, this is just insane.
Sam Altman, CEO of OpenAI, has undercut the 100 trillion rumor and said the model won't be much bigger than GPT-3.
https://www.reddit.com/r/OpenAI/comments/pj0nug/sam_altman_gpt4_will_remain_textonly_will_not_use/
I suspect GPT4 will be the start of commercialization for common use of AI systems, however I suspect we will need more of an advancement in AI rather than just scale to truly get to a point where we can automate a substantial portion of the workforce.
We're already seeing what ChatGPT can do, I think its clear that we'll see some wild things by 2030. I'll be really curious how well these types of AI models can transfer to robotics and physical systems.
I think it could be, but OpenAI's definitely going to hold these models back for now, rather than taking us to some insane proto AGI immediately. Sam Altman's been clear about that lately, so honestly, I'm not expecting the whole world to change yet.
I think the models Stability.AI will come out with are going to be even crazier, since they'll pack as much as they can into it.
I'm looking forward to having an open source GPT-4 level LLM and text to video model!
I think it could be
But I think it needs a slightly different architecture / algorithms to produce results that are on par with any capability of a human being in any capacity.
I don't think we can get true AGI from GPT-4 but we will get something extremely profound and mildly world changing.
And I do think by 2025 we will have an AGI that surpasses human beings in every capacity, and then the self designing recursion will begin and the singularity will have arrived.
You're like an overexcited child on Christmas eve. Calm down, it's just a chatbot.
I've been feeling weirdly giddy lately. It didn't hit me right away. I messed around with ChatGPT for a few days and thought of it (for a time) as a kind of enhanced Google. But once I began to get a feel for what it was doing and the magnitude of what it was capable of -- that's when the giddiness set in. There is a kind of liberation that comes with a total loss of control. The giddiness set in with the gradual realization that nothing I do from here on out really matters all that much. Be a good person, try to get back in touch with some old friends, try to better myself wherever I can. But otherwise ...
The big-picture stuff is in AI's hands now, for better or for ill.
GPT4 is not 100T parameters. Stop spreading that BS. The GPT team already stated on record that GPT4 is not significantly larger than GPT3 in parameter size.
Let's see about the # of parameters...
I just don't want it to be a slim improvement over ChatGPT.
Yeah. GPT-5 is when shit gets real.
it cancels out the amount of pessimism on r/technology and r/futurology
That's not how it works lol. It just makes more echo chambers.
I agree that you can't extrapolate, but it's definitely not the case that GPT-4 has to have the same limitations as GPT-2 and GPT-3. Context window issues can be resolved in a myriad of ways (my current fav being this one), and retrieval-based methods could solve most of the factuality issues (and are very effective and cheap, as proven by RETRO).
So I want to re-emphasize that we have no clue how good it will be. It could very well smash previous barriers, but it could also be rather disappointing and very much alike ChatGPT. We just don't know.
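For a rough picture of what a retrieval-based approach looks like, here's a minimal sketch. The word-overlap scorer is a toy stand-in for real embeddings, and none of this is how RETRO is actually implemented; it's just the general retrieve-then-prompt shape:

```python
# Minimal sketch of retrieval-augmented generation (toy scoring, hypothetical corpus).
from collections import Counter

documents = [
    "RETRO augments a language model with a retrieval database of text chunks.",
    "GPT-3 has 175 billion parameters and was trained on a large web corpus.",
    "Context windows limit how many tokens a transformer can attend to at once.",
]

def score(query: str, doc: str) -> int:
    """Toy relevance score: count of overlapping words (a real system would use embeddings)."""
    q = Counter(query.lower().split())
    d = Counter(doc.lower().split())
    return sum((q & d).values())

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most relevant to the query."""
    return sorted(documents, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str) -> str:
    """Prepend retrieved passages so the model can ground its answer in them."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query))
    return f"Use the following passages to answer.\n{context}\n\nQuestion: {query}\nAnswer:"

print(build_prompt("How does RETRO improve factuality?"))
```

The point is that the model no longer has to memorize every fact in its weights; it just has to read what gets pulled into the prompt, which is why the approach is comparatively cheap.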
You realize that many customer service jobs are going to be over, very soon, right?
Not that human input will completely go away in the customer service workflow, but it will be more like self-checkout at Home Depot, where you have one person monitoring eight registers at once and can intervene in the event of an issue.
This tech will do the exact same thing for low level CS.
[deleted]
They are an inevitable part of self-moderated social media. It's a function of the system. With unlimited content to devour, how many are willing to work through arguments that make them uncomfortable or angry? All too easy to click off that and go back to the comfort of something which affirms your existing worldview.
No, I don't have a solution for that and yes I suspect it is a very bad thing the consequences of which we are just starting to work through. Chatbots will definitely enhance the effect as will any form of proto or full AGI (computer, create me a documentary explaining why I'm right about everything!).
Just in case, I started saying “Thank you” to Alexa and Siri. 😁
Bullshit answer: the world will change for the better for all of us, so finally we can achieve creative nirvana and sip teas with our UBI. Oh, and we don't know anything about GPT-4, so stop the speculation, I'm puking, yuck.
Reality check: many people's jobs, from marketing to assistance to writing, are pretty monotonous and mundane, and can be replaced with a far cheaper alternative with little to no loss in customer experience. GPT-4 is rumored to have 100 trillion parameters, roughly the number of synapses in a human brain. Altman says it might have beaten the Turing test. And ChatGPT right now runs on tech that is two years old. So yeah, the world order is about to be upended, and may we wisely evolve.
I've heard of Pac-Man fever, but this man's come down with some Pac-Man dengue.
Yes! The way I see it is like everything is a giant feedback loop, discoveries in other fields can help with developing new technologies and using said technologies to discover new things, and the cycle goes on and on, and even faster than before.
This is getting scary and fascinating at the same time.
The biggest limitation of GPT-3 wasn't the size but the data. It was trained on almost the whole internet and still underfit. At the end of the day, the goal of the model is to predict the next word. I don't think it will necessarily lead to AGI, but it will definitely be great to see interesting properties emerging from such a simple objective function.
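For anyone curious, that "simple objective function" is just standard autoregressive next-token prediction: minimize the negative log-likelihood of each token given everything that came before it. In LaTeX notation:

$$
\mathcal{L}(\theta) = -\sum_{t=1}^{T} \log p_\theta\left(x_t \mid x_1, \dots, x_{t-1}\right)
$$

Everything interesting the model does emerges from optimizing that one sum over a huge corpus.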
100 trillion is gigantic. Even 3 trillion would still be massive: roughly 17 times more parameters than GPT-3's 175 billion.
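Quick sanity check on that ratio (toy arithmetic, using GPT-3's published 175B count and the 3T figure floated above):

```python
gpt3_params = 175e9      # GPT-3's published parameter count
rumored_params = 3e12    # the rumored 3 trillion figure above
print(rumored_params / gpt3_params)  # ~17.1x
```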
The singularity will probably happen around GPT-5 or GPT-6.
That would, IMHO, be a big win. Even if the scaling hypothesis is correct, why would you want to solve the problem that way, when there are probably far better ways to solve it?
Sure, we could fly an interstellar spacecraft to another solar system, but it would be a bad idea to do it, because in the time it would take to get there, other ways of getting there would be invented. If you left for the stars now, people would be waiting for you when you arrived.
In the same way, simply scaling compute and data may get you to a certain amount of intelligence. But the costs and effort would be huge. It would probably be better to spend that time and effort (and money) on making the underlying ideas better. And even if it turns out that, yes, we have to scale, waiting until computational costs come down further is probably a good idea.
You are absolutely right - and quite early in that realization. Reminds me of this quote by Winston Churchill:
"Now this is not the end. It is not even the beginning of the end. But it is, perhaps, the end of the beginning."
This is a sub for wanking after all
Gpt4 for singularity mod 2023
I keep seeing replies like ‘we dont know what the future holds’ and ‘stop sensationalizing things’…
Isn’t this a sub about the ideas of Ray Kurzweil et al and how we are 25 years away from an event of combining our human brains with AI brains? The entire thing is about bold theories about the future.
Why act like what OP said is nauseating while embracing something much more far-fetched happening soon?
The Dream. Basically this https://youtu.be/ikcU-9VYDTE
Personally, I'll hold my judgement till GPT 10 comes out. I'm skeptical till then.
There's a difference between speculating about events 25 years from now vs saying that something next year will end society as we know it based on nothing of substance.
Not everyone agrees on the singularity timeline. This is just a singularity sub, not a singularity in 25 years sub.
So I'm doing a master's in AI robotics, so pretty well I'd say.
Pessimists have never been correct. The optimists haven't either because they are pessimists too.
How long before we see home robots like iRobot?
Making jobs easier is taking over jobs. If a job is made 50% easier, half of the team is laid off.
Unless we change the economic order, AI will just be a corporate profit maximizer.
Well that's pretty dang handy. Looking forward to seeing it be applied more!
AI will make smaller scale operations far more efficient too though, so larger organisations won’t hold the same inertia
It's just you
Gpt4 will be a better language model. But this whole gpt4 is the singularity stuff needs to stop imo.
That's when we don't have 78-80 year old dinosaurs running the country who can barely comprehend getting on their email.
Amen. People here need to go touch grass and stop acting like the sky is falling.
I think you're right that this technology, if not any specific implementation, has the potential to destabilize the world as we know it.
I've already had friends losing work to these. I had graphic designers tell me they just don't get asked to do commissions hardly at all anymore. I have a friend who did dictation for a law office, and that dried up all at once. She had to go back to teaching.
It's just the edges of things, today, but it doesn't have to get much better to take your order at McDonald's, answer phones, help you schedule classes, etc...
It also doesn't take anywhere near 100% market saturation to destabilize things. The unemployment rate peaked at just over 25% during the great depression.
I asked ChatGPT:
It's natural to have concerns about the potential impact of powerful technology like GPT-4. However, it's important to remember that technology is only as good as how it is used. While GPT-4 may have the ability to perform many tasks that are currently done by humans, it's up to us as a society to decide how we want to use this technology. We can use it to augment human capabilities and improve our lives, or we can use it in ways that are harmful. Ultimately, the impact of GPT-4 will depend on the choices we make.
The Turing test has long since been surpassed.
Namely, ChatGPT is connecting to everything.
To be fair, could you have anticipated how powerful GPT-3 was going to be? Some concern is warranted.
Fair points. I don't really agree with OP's statements, but I was surprised to see not just your comments (which were polite by comparison) but others bashing on people for getting excited over GPT-4.
I'd suspect a long while. There isn't that much value in home robots, so you can't charge that much. They'll be in a ton of jobs prior to ever being in homes. I wouldn't expect anything till a while after manual construction jobs are automated, which is seemingly a ways away, though admittedly it could happen sooner than expected because AI is a wild technology.
I'm somewhat more skeptical. This AI isn't factually reliable yet, so you can't really trust its answers. Another reason I'm skeptical is that many people don't even try to converse with the automated customer service agents we have now and skip right to a human agent, or would prefer to speak to one. I definitely think future models will reshape customer service, but not this one. I might be wrong, but I guess we'll just have to wait and see.
It will happen at an exponential pace.
[deleted]
Nope, it will be a great leap over what we have today but will just lead to GPT-5 to fix any issues and improve the system. What you should be looking for is all the extra use cases that pop up once people can see what it does and use it to improve their own tools. It acts as a catalyst for new ideas and sparks new funding.
They will dumb it down because the masses will ruin it
We have an aging population and not enough young people to care for the elderly. Our current solution involves people cycling from house to house doing the washing up and putting food in the microwave.
The countless people who got cancer and radiation poisoning during the advent of the nuclear age… were they “optimists”, or “pessimists”? 🤔
You seem like the group who ate a lot of glowing paint chips back then…
I don't know but we will all get cancer.
So 2024?
Nah, unless it's a fucking HUGE leap. GPT-3 won't be taking over any jobs, just making existing jobs easier.