ItsAConspiracy t1_it3sum3 wrote
Reply to comment by Fantastic-Climate-84 in The End of Moore’s Law: Silicon computer chips are nearing the limit of their processing capacity. But is this necessarily an issue? Copenhagen Institute for Futures Studies by CPHfuturesstudies
Maybe it depends on your definitions of "Moore's Law" and "end." From the article:
> In the 15 years from 1986 to 2001, processor performance increased by an average of 52 percent per year, but by 2018, this had slowed to just 3.5 percent yearly – a virtual standstill.
I'm feeling that. I got my first computer around 1986 and those first fifteen years were incredible. A new computer was way faster than one just a couple years old. RAM and disk space were growing like crazy.
Ain't like that anymore. I bought a MacBook Pro eight years ago and it doesn't even seem slow. New ones that cost about the same have the same RAM and just double the storage. This is not the Moore's Law we enjoyed in the '90s.
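To put rough numbers on that gap, here's a quick back-of-the-envelope sketch using only the article's two growth rates (everything else here is just illustration):

```python
# Compounding comparison of the article's two figures (illustrative only).
fast = 1.52 ** 15   # 52% per year sustained for 15 years
slow = 1.035 ** 15  # 3.5% per year sustained for 15 years
print(f"52%/yr for 15 years:  ~{fast:,.0f}x the starting performance")
print(f"3.5%/yr for 15 years: ~{slow:.1f}x the starting performance")
# Roughly 534x vs 1.7x -- "virtual standstill" checks out.
```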
Fantastic-Climate-84 t1_it3ulbr wrote
I get it, really I do. I'm just being glib. As I've gone through a few of these articles, each has made some variation of that point.
Weird Al had a song called "It's All About the Pentiums" which, in the late nineties, called out how a computer you took home from a box store was already out of date by the time you opened the box.
The consumer isn’t on the bleeding edge any more, but that doesn’t mean the end of Moore's Law. It’s way better for us at this point. That the consumer isn’t being “punished” or forced to upgrade by the advances in tech is a great thing. The consumer being the backbone of the tech industry was never going to last, but we’re nowhere near dead in the water yet.
ItsAConspiracy t1_it3vr0x wrote
I wouldn't say it's better. Those years were tremendous fun. You could keep running your old stuff if you wanted, but if you had money you didn't because the new stuff was so much better.
Fantastic-Climate-84 t1_it3w9fz wrote
>if you had the money
Xist3nce t1_it4a2gy wrote
A 3-story house was worth what my car is now. Kids could afford a car on a part-time job. You could support a family of 4 comfortably on minimum wage. Everyone could have had money then.
SatanLifeProTips t1_it5lx02 wrote
A microwave oven was $1700 in 1970s money. Now it’s $39.97 at Walmart. Cars and houses got expensive. Everything else got insanely cheap. A new T-shirt is five bucks!
And you can furnish a home for $0 from the Craigslist free section. If you are handy with a paintbrush you can actually furnish a home quite nicely. Moving out in the '80s had you living with cinder-block furniture (stolen from a local construction site). Now some students can equip a suite and live large.
Once your rent is covered everything else is easy. Live by a place with good mass transit and you don’t need a car. I live in a dense city with light rail. Modern e-scooters have 16 km of range, can cook along at 50 km/h or faster (illegally, but no one cares), and you can take them on the train. It’s brilliant. Wear good rain gear and you can commute faster than your car.
Xist3nce t1_it5ni8o wrote
How much money do you think living in a modern place costs? Find me one where you can afford rent on $7.50 an hour and I’ll acknowledge it. My grandpa got his house for a month of work. I’m not even allowed to get a house because I don’t make enough, even though the mortgage would be way lower than my rent.
SatanLifeProTips t1_it5omb6 wrote
Here a basement suite or a one-bedroom apartment will set you back $1200-$1700/mo, but minimum wage is $15.65/hr CAD. That’s just minimum wage, however, and few work for that. Even McDicks is paying $20+ or you can’t get anyone. And medical is free.
Buying a place is going to need a decent career. Housing is super expensive to buy in the cities and places with great transit.
America’s $7.50 minimum wage is basically third-world poverty. But that is a system designed to trap people in poverty.
Xist3nce t1_it5uqrq wrote
Bingo there. Only way out is to make an absurd amount of money, unfortunately if you have to work 40-50 hours to make ends meet it’s hard to give up rest to work on skills. It’s all a trap and the tipping point is coming.
Fantastic-Climate-84 t1_it4bq8r wrote
Their follow-up statement was that they didn’t have the latest and greatest, just enjoyed that it was out there. That said, inflation is a raging birch, I’m with you there.
ItsAConspiracy t1_it3xcvk wrote
I was making eight bucks an hour for most of that time, but it was still fantastic.
Now it doesn't matter how much money you have, you're still not going to buy that kind of performance leap every couple years. Everything's just gonna stay about the same, with small incremental improvements.
That's the end of Moore's Law. We're going to be stuck with pretty much the same computers we have now, until someone invents a whole new computing technology that's not based on silicon chips.
Fantastic-Climate-84 t1_it3y2x5 wrote
Dude, now you’re being glib.
Families couldn’t afford a new computer every two years to keep up with schools, and people struggled to get new laptops for university. That you were able to afford it — shit, so was I — doesn’t make it ideal.
The way we work with computers has already changed dramatically over the last five years. It’s now possible to work from your phone! You can hook up an adapter to an HDMI cable and run that to a TV, use Bluetooth devices for mouse and keyboard, and off you go.
I do 90% of my work from a tablet today. I would never have dreamed that possible for the work I do.
You’re choosing to ignore the dynamic swing occurring, which is another element to every. Single. One. Of these articles.
ItsAConspiracy t1_it3zkfc wrote
Dude, I was making like a buck and a half over minimum wage. Don't tell me how awful Moore's Law was for people without money. I barely had any and thought it was fantastic. In any case, doesn't matter whether we like it or not, point is that it's gone.
As for phones, I have an iPhone 6s and my girlfriend has a 13, and they're not all that different.
But sure, people are still engineering clever new things. That's great, but it's not Moore's Law, which was an absolute tsunami of raw new computing power every year.
Sylvurphlame t1_it4h16i wrote
> As for phones, I have an iPhone 6s and my girlfriend has a 13, and they’re not all that different.
To an extent, I think that’s because software developers have to account for people having older phones. Apps don’t fully utilize the performance capability of smartphones because they have to assume somebody has a three or four year old device.
Also, I kind of feel like if you’re not noticing a difference between an iPhone 6S and a 13, either you just don’t ask much of your phone or your girlfriend is severely underutilizing hers. :)
Fantastic-Climate-84 t1_it411q9 wrote
Now you’re just being dishonest.
> As for phones, I have an iPhone 6s and my girlfriend has a 13, and they’re not all that different.
Really? Really.
> But sure, people are still engineering clever new things.
And what handles the computations and functions of those new things? The absolute powerhouses that sit in our pockets — well, not yours, but other pockets.
Again, that you say you barely had enough money but were buying a new computer/processor/GPU every two years — because that’s what it took to keep up from the 2000s to about 2016 — tells me you’re not being honest.
I’m hopping off this comment train.
ItsAConspiracy t1_it44kxf wrote
I didn't say I bought a new computer every two years. I said people with money did. Doesn't mean I sat around being depressed about it. I was still super excited to see it all happening, and I got to experience it when we upgraded at work, in addition to the few upgrades I managed at home.
And all this is a side issue to that measly 3.5% annual improvement we have now.
But please, yes, hop off, this is getting unpleasant.
Fantastic-Climate-84 t1_it45vok wrote
ItsAConspiracy t1_it5362d wrote
Yeah that's great, but that's just regular technological progress. Of course that will continue. That's not the same as Moore's Law, which was a doubling of performance every 18 to 24 months over a long period of time. If there had been a Moore's Law for cars, they'd get millions of miles per gallon by now.
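For a sense of scale, here's a rough sketch of what that doubling cadence compounds to; the 1970 starting point and the 15 mpg figure are assumptions made purely for the analogy:

```python
# Hypothetical "Moore's Law for cars": double the mpg every 18-24 months.
base_mpg = 15          # assumed 1970 starting point, for illustration only
years = 2022 - 1970
for months_per_doubling in (18, 24):
    doublings = years * 12 / months_per_doubling
    print(f"Doubling every {months_per_doubling} months for {years} years: "
          f"~{base_mpg * 2 ** doublings:.1e} mpg")
```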
Fantastic-Climate-84 t1_it5554g wrote
The point was that, even with pistons, adding more doesn’t mean better performance.
It’s no wonder you don’t see a difference when you’re still using tech that’s almost a decade old. Try keeping up, and you’ll notice a difference.
That said, crazy that your MacBook and phone are still working and able to be used, hey? Sure is rough for the consumer these days. Couldn’t use a ten year old computer back in 2008, let alone a phone.
Bleeding edge cuts both ways. AI, drones, tablets replacing laptops, laptops replacing desktops, phones being the bulk of where we compute, but you’re still complaining.
ItsAConspiracy t1_it5ejgn wrote
Sure there's a difference. But in terms of sheer compute it's still just 3.5% annually, according to OP's article. That's not Moore's Law. Tech progress continues but Moore's Law is still dead until we get a whole new chip tech. It's not complaining to just recognize reality.
Fantastic-Climate-84 t1_it5hnx2 wrote
https://reddit.com/r/Futurology/comments/y91xtu/_/it44stu/?context=1
You’re just not worth talking to.
Key_Abbreviations658 t1_it7ng3r wrote
But if you didn’t, you still had the same computer. It’s not like your computer got worse; you just had much better options.
Apokolypze t1_it6k52a wrote
points at $1600+ 4090 prices
Still true.
Plastic-Wear-3576 t1_it4qf2m wrote
Eh. Computer speeds have definitely improved in other ways. SSDs can make an otherwise slow computer fast.
It's like in video games: in terms of textures, games today don't really look much better than games from 5 or 6 years ago.
But lighting has improved immensely.
People will find ways to continue to improve, physical limits be damned.
Fantastic-Climate-84 t1_it4sj1h wrote
Totally agree with you.
Even if transistor counts were stagnant and materials science stood still, chipset design has gotten way more efficient. The boards are laid out more efficiently, GPU and other system memory bottlenecks are just gone, and kids these days don’t even talk about GHz any more.
Say what you will about the games themselves, but I’ve been able to play Civ 6 on my phone for a few years now. To me, a gamer who remembers Civ 2 refusing to run on a computer that cost twice as much as my current phone, it’s kinda magical.
Plastic-Wear-3576 t1_it4szic wrote
I ran into a scenario years ago when StarCraft 2 came out. I bought it, and it completely crushed my computer beneath its boot.
Convincing my parents I all of a sudden needed a new computer was a stressful conversation.
Nowadays you just expect a game to run on your PC unless you have an older PC and the game is a true ship-of-the-line, nuts-to-butts, eye-watering game.
Fantastic-Climate-84 t1_it4ud7k wrote
> Nowadays you just expect a game to run on your PC unless you have an older PC and the game is a true ship of the line nuts to butts eye watering game.
Even then, today you just get the console version instead haha
I was selling computers when Doom 3 came out. That game made us a lot of commission. StarCraft 2, too. Kids like you were a big reason for our bonuses!
Plastic-Wear-3576 t1_it4uj6b wrote
Haha. I'm glad to be of service!
Evethewolfoxo t1_it3whfg wrote
I believe we’re kinda stuck in the consumer market and nearing the edge for CPUs, at least temporarily.
However, no one can deny that GPUs have done nothing but improve year after year. I think that’s our current frontier in the consumer market while companies figure out tech like DLSS, RTX, and making transistors tinier.
ItsAConspiracy t1_it407ad wrote
Yeah GPUs are a bright spot. But partly it's because they're massively parallel and can just keep getting bigger and more power-hungry.
Another bright spot is neural chips, which aren't so much about Moore's Law as about getting better at specialized machine-learning architectures.
metekillot t1_it5nwvr wrote
Computer technology is only about a century old. I'm sure that 100 years after the first metal sword was cast, people thought they were nearing the limits of metallurgy.
ItsAConspiracy t1_it78x1n wrote
We're definitely not nearing the limits of computation in general, just silicon chips specifically. We went from mechanical relays to vacuum tubes to silicon chips, now we need something else for the next big leap.
Cabana_bananza t1_itajcxw wrote
I think we will see the forerunners to computronium over the next twenty years. You have big companies like Element Six (De Beers) working with others on creating better carbon semiconductors and researching their use in computation.
The precision with which they can manipulate diamonds as they grow has improved by leaps and bounds over the past 40 years, from the large X-ray diamond plates for satellites in the '80s to today's ability to control and inlay imperfections in the diamond structure.
It's starting to resemble what I picture when Kurzweil talks about computronium.
[deleted] t1_it4jq4w wrote
[deleted]
frankyseven t1_it4n9ci wrote
Umm, you realize that the M-series processors from Apple are incredibly fast and efficient, faster than anything else on the market? The new processors are a massive leap forward in processing and power management.
danielv123 t1_it4ps0y wrote
Faster? No. More power efficient? Yes. Amazing chips.
frankyseven t1_it4r0jg wrote
The M1 was faster than the i7 and whatever AMD's competing chip was called when it was released. Maybe not on paper, but there were plenty of tests done around the time it was released showing that it was the fastest on the market, and that was the Air, not the Pro. Now, some of those speed gains are due to the OS being optimized for the chip and all of the other hardware, but it was still the fastest on the market.
Regardless, Apple is one of the few companies really pushing the cutting edge with their computers.
MadDocsDuck t1_it4woul wrote
Yes and no. The real problem here is already in the way that you perceive their marketing material, because the i7 hasn't been Intel's top chip in each generation for quite some time. Then you have to consider the different wattages of the laptops compared (especially if you compare a MacBook Air, which is more focused on efficiency), because the "regular" chips vary vastly in power target and thus performance. And then there are the desktop chips, which are a whole different story to begin with. And on top of all that come the asynchronous release cycles, so when Apple releases something in June but this year's competing products haven't released yet, they are essentially comparing them to year-old technology.
Then there is the whole issue of selecting the software for the benchmarks. Not just the OS makes a difference but also the individual programs you select.
Don't get me wrong, I like the chips and I wish that more companies focused on efficiency like Apple did with the M1 chips (although I hear it is a different story with the M2 chips now). But every company will select the test suites to be as much in their favour as possible, and when you compare the Mac platform to Windows there is always that inherent difference that programs are just not the same between the two.
danielv123 t1_it6gsi8 wrote
Yes, the Apple chip won hands down in workloads they added hardware acceleration for, like some video editing workflows. It doesn't make the CPU faster in general, though. There is a reason why you haven't seen data centers full of M1 Macs like with the old PlayStations.
[deleted] t1_it4xjmn wrote
[deleted]
ChicagoThrowaway422 t1_it7foec wrote
The MHz battles of the 90s were insane. You'd have the fastest computer for maybe a month before someone else's parents bought one, then you'd have to wait three years before your parents could afford a new one, putting you briefly back at the top of the pile.
Sniffy4 t1_it6g1og wrote
SSDs have been a far bigger performance improvement in user experience over the last 10-12 years than any gains in CPU speed.
Apokolypze t1_it6jzv1 wrote
MacBooks aside, enthusiast PC hardware has been pushed along by the massive gaming industry. The 30-series GPUs from Nvidia were a massive generational leap forward from the 20-series. Both Intel and AMD are making big strides in the CPU space with 13th gen and Zen 4 respectively. Speaking of RAM, DDR5 is finally actually here, and in a big way.
Running a system (especially RAM) from 8+ years ago in the gaming space, while technically feasible, could not compare to the capabilities of a modern enthusiast system.
Halvus_I t1_it8p43f wrote
DirectStorage too. First on consoles, now PCs.
[deleted] t1_itcabg9 wrote
[deleted]