Fantastic-Climate-84 t1_it3gsnv wrote

2021 - https://builtin.com/hardware/moores-law

2020 - https://www.google.com/amp/s/www.wired.com/beyond-the-beyond/2020/03/preparing-end-moores-law/amp

2019 - https://www.google.com/amp/s/www.cnet.com/google-amp/news/moores-law-is-dead-nvidias-ceo-jensen-huang-says-at-ces-2019/

2018 - https://www.google.com/amp/s/steveblank.com/2018/09/12/the-end-of-more-the-death-of-moores-law/amp/

2017 - https://www.computer.org/csdl/magazine/cs/2017/02/mcs2017020007/13rRUypGGeJ

2016 - https://www.nature.com/news/the-chips-are-down-for-moore-s-law-1.19338

2015 - https://www.economist.com/the-economist-explains/2015/04/19/the-end-of-moores-law

2014 - https://www.businessinsider.com/great-graphic-is-this-the-end-of-moores-law-2014-1

2013 - https://www.pcworld.com/article/457384/the-end-of-moores-law-is-on-the-horizon-says-amd.html

2012 - https://techland.time.com/2012/05/01/the-collapse-of-moores-law-physicist-says-its-already-happening/

2011 - https://www.google.com/amp/s/www.forbes.com/sites/alexknapp/2011/03/30/the-end-of-moores-law/amp/

2010 - https://www.google.com/amp/s/techcrunch.com/2010/08/23/the-end-of-moores-law-a-love-story/amp/

2009 - https://archive.nytimes.com/bits.blogs.nytimes.com/2009/05/22/counting-down-to-the-end-of-moores-law/

2008 - was a good year, people were pretty optimistic

2007 - https://www.reuters.com/article/us-intel-moore-idUSN1846650820070919

2006 - https://www.indybay.org/newsitems/2006/05/18/18240941.php (had to switch over to Bing to find results this far back)

2005 - https://slate.com/technology/2005/12/the-end-of-moore-s-law.html#:~:text=Dec%2020%2C%2020053%3A15%20PM%20Until%20recently%2C%20Moore%E2%80%99s%20Law%2C,a%20chip%20could%20hold%20a%20few%20dozen%20transistors.

2004 - some guy named Moore had an issue with the Ten Commandments being in front of state buildings and legal battles ensued. Can’t find anything in two minutes here.

2003 - https://dl.acm.org/doi/10.1109/MC.2003.1250885

2002 - https://spectrum.ieee.org/the-death-of-moores-law-will-spur-innovation

This is a long way of saying “is it that time of year already”.

531

ItsAConspiracy t1_it3sum3 wrote

Maybe it depends on your definitions of "Moore's Law" and "end." From the article:

> In the 15 years from 1986 to 2001, processor performance increased by an average of 52 percent per year, but by 2018, this had slowed to just 3.5 percent yearly – a virtual standstill.

I’m feeling that. I got my first computer around 1986 and those first fifteen years were incredible. A new computer was way faster than one just a couple years old. RAM and disk space were growing like crazy.

Ain’t like that anymore. I bought a MacBook Pro eight years ago and it doesn’t even seem slow. New ones that cost about the same have the same RAM and just double the storage. This is not the Moore’s Law we enjoyed in the ’90s.
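
To put numbers on the article’s two growth rates (my own back-of-the-envelope, not the article’s math), compounding is what makes the gap so stark:

```python
# Rough compounding check on the article's figures (my arithmetic, not the article's).
years = 15

fast = 1.52 ** years   # 52% per year, the 1986-2001 pace
slow = 1.035 ** years  # 3.5% per year, the post-2018 pace

print(f"52%/yr for {years} years:  ~{fast:.0f}x faster")   # ~534x
print(f"3.5%/yr for {years} years: ~{slow:.1f}x faster")   # ~1.7x
```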

80

Fantastic-Climate-84 t1_it3ulbr wrote

I get it, really I do. I’m just being glib. As I’ve gone through a few of these articles, each has made some variation of that point.

Weird Al had a song called “It’s All About the Pentiums” which, in the late nineties, called out how a computer you took home from a box store was already out of date by the time you opened the box.

The consumer isn’t on the bleeding edge any more, but that doesn’t mean the end of Moore’s law. It’s way better for us at this point. That the consumer isn’t being “punished” or forced to upgrade by advances in tech is a great thing. The consumer being the backbone of the tech industry was never going to last, but we’re nowhere near dead in the water yet.

36

ItsAConspiracy t1_it3vr0x wrote

I wouldn't say it's better. Those years were tremendous fun. You could keep running your old stuff if you wanted, but if you had money you didn't because the new stuff was so much better.

20

Fantastic-Climate-84 t1_it3w9fz wrote

>if you had the money

17

Xist3nce t1_it4a2gy wrote

A 3-story house was worth what my car is now. Kids could afford a car on a part-time job. You could support a family of 4 comfortably on min wage. Everyone could have had money then.

13

SatanLifeProTips t1_it5lx02 wrote

A microwave oven was $1700 in 1970s money. Now it’s $39.97 at Walmart. Cars and houses got expensive. Everything else got insanely cheap. A new T-shirt is five bucks!

And you can furnish a home for $0 from Craigslist’s free section. If you are handy with a paintbrush you can actually furnish a home quite nicely. Moving out in the ’80s had you living with cinder block furniture (stolen from a local construction site). Now some students can equip a suite and live large.

Once your rent is covered everything else is easy. Live by a place with good mass transit and you don’t need a car. I live in a dense city with light rail. Modern E-scooters have 16km of range, can cook along at 50kph or faster (illegally but no one cares) and you can take them on the train. It’s brilliant. Wear good rain gear and you can commute faster than your car.

5

Xist3nce t1_it5ni8o wrote

How much money do you think living in a modern place is? Find me one that you can afford rent in for $7.50 and I’ll acknowledge it. My grandpa got his house for a month of work. I’m not even allowed to get a house because I don’t make enough even though the mortgage is way lower than my rent.

1

SatanLifeProTips t1_it5omb6 wrote

Here, a basement suite or a one-bedroom apartment will set you back $1200-$1700/mo, but min wage is $15.65/hr CAD. That’s just minimum wage, however, and few work for that. Even McDicks is paying 20+ or you can’t get anyone. And medical is free.

Buying a place is going to need a decent career. Housing is super expensive to buy in the cities and places with great transit.

America’s $7.50 min wage is basically 3rd world poverty. But that is a system designed to trap people in poverty.

1

Xist3nce t1_it5uqrq wrote

Bingo there. The only way out is to make an absurd amount of money; unfortunately, if you have to work 40-50 hours to make ends meet, it’s hard to give up rest to work on skills. It’s all a trap and the tipping point is coming.

1

Fantastic-Climate-84 t1_it4bq8r wrote

Their follow-up statement was that they didn’t have the latest and greatest, just enjoyed that it was out there. That said, inflation is a raging birch, I’m with you there.

0

ItsAConspiracy t1_it3xcvk wrote

I was making eight bucks an hour for most of that time, but it was still fantastic.

Now it doesn’t matter how much money you have, you’re still not going to buy that kind of performance leap every couple years. Everything’s just gonna stay about the same, with small incremental improvements.

That's the end of Moore's Law. We're going to be stuck with pretty much the same computers we have now, until someone invents a whole new computing technology that's not based on silicon chips.

4

Fantastic-Climate-84 t1_it3y2x5 wrote

Dude, now you’re being glib.

Families couldn’t afford a new computer every two years to keep up with schools, and people struggled getting new laptops for universities. That you were able to afford it — shit, so was I — doesn’t make it ideal.

The way we work with computers over the last five years has already changed dramatically. It’s now possible to work from your phone! You can hook up an adapter to an HDMI cable and run that to a TV, use Bluetooth devices for mouse and keyboard, and off you go.

I do 90% of my work from a tablet today. To do what I do, I would never have dreamed that possible.

You’re choosing to ignore the dynamic swing occurring, which is another element to every. Single. One. Of these articles.

1

ItsAConspiracy t1_it3zkfc wrote

Dude, I was making like a buck and a half over minimum wage. Don't tell me how awful Moore's Law was for people without money. I barely had any and thought it was fantastic. In any case, doesn't matter whether we like it or not, point is that it's gone.

As for phones, I have an iPhone 6s and my girlfriend has a 13, and they're not all that different.

But sure, people are still engineering clever new things. That's great, but it's not Moore's Law, which was an absolute tsunami of raw new computing power every year.

3

Sylvurphlame t1_it4h16i wrote

> As for phones, I have an iPhone 6s and my girlfriend has a 13, and they’re not all that different.

To an extent, I think that’s because software developers have to account for people having older phones. Apps don’t fully utilize the performance capability of smartphones because they have to assume somebody has a three or four year old device.

Also, I kind of feel like if you’re not noticing a difference between an iPhone 6S and a 13, either you just don’t ask much of your phone or your girlfriend is severely underutilizing hers. :)

3

Fantastic-Climate-84 t1_it411q9 wrote

Now you’re just being dishonest.

> As for phones, I have an iPhone 6s and my girlfriend has a 13, and they’re not all that different.

Really? Really.

> But sure, people are still engineering clever new things.

And what handles the computations and functions of those new things? The absolute powerhouses that sit in our pockets — well, not yours, but other pockets.

Again, that you could say you barely had enough money but were buying a new computer/processor/GPU every two years — because that’s what it took to keep up from the 2000s to about 2016 — tells me you’re not being honest.

I’m hopping off this comment train.

2

ItsAConspiracy t1_it44kxf wrote

I didn't say I bought a new computer every two years. I said people with money did. Doesn't mean I sat around being depressed about it. I was still super excited to see it all happening, and I got to experience it when we upgraded at work, in addition to the few upgrades I managed at home.

And all this is a side issue to that measly 3.5% annual improvement we have now.

But please, yes, hop off, this is getting unpleasant.

2

ItsAConspiracy t1_it5362d wrote

Yeah that's great, but that's just regular technological progress. Of course that will continue. That's not the same as Moore's Law, which was a doubling of performance every 18 to 24 months over a long period of time. If there had been a Moore's Law for cars, they'd get millions of miles per gallon by now.
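
For what it’s worth, here’s that car analogy worked through with rough numbers of my own choosing (a ~25 mpg baseline and a doubling every two years since the early ’70s, both assumptions purely for illustration):

```python
# Hypothetical: apply Moore's-law-style doubling (every two years) to fuel economy.
# The baseline mpg and start year are rough assumptions, purely for illustration.
base_mpg, start_year, now = 25, 1971, 2022

doublings = (now - start_year) / 2        # ~25.5 doublings in 51 years
mpg_now = base_mpg * 2 ** doublings

print(f"{mpg_now:,.0f} mpg")              # ~1.2 billion mpg, so "millions" is conservative
```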

1

Fantastic-Climate-84 t1_it5554g wrote

The point was that, even with pistons, adding more doesn’t mean better performance.

It’s no wonder you don’t see a difference when you’re still using tech that’s almost a decade old. Try keeping up, and you’ll notice a difference.

That said, crazy that your MacBook and phone are still working and able to be used, hey? Sure is rough for the consumer these days. Couldn’t use a ten year old computer back in 2008, let alone a phone.

Bleeding edge cuts both ways. AI, drones, tablets replacing laptops, laptops replacing desktops, phones being the bulk of where we compute, but you’re still complaining.

−1

ItsAConspiracy t1_it5ejgn wrote

Sure there's a difference. But in terms of sheer compute it's still just 3.5% annually, according to OP's article. That's not Moore's Law. Tech progress continues but Moore's Law is still dead until we get a whole new chip tech. It's not complaining to just recognize reality.

1

Key_Abbreviations658 t1_it7ng3r wrote

But if you didn’t, you still had the same computer. It’s not like your computer got worse, you just had much better options.

1

Plastic-Wear-3576 t1_it4qf2m wrote

Eh. Computer speeds have definitely improved in other ways. SSDs can make an otherwise slow computer fast.

It’s like in video games. In terms of textures, games today don’t really look much better than games from 5 or 6 years ago.

But lighting has improved immensely.

People will find ways to continue to improve, physical limits be damned.

2

Fantastic-Climate-84 t1_it4sj1h wrote

Totally agree with you.

Even if transistor count were stagnant and material science null, the design of the chipsets has gotten way more efficient. The boards are more efficiently designed, GPU and other system memory bottlenecks are just gone, and kids these days don’t even talk about GHz any more.

Say what you will about the games themselves, but I’ve been able to play Civ 6 on my phone for a few years now. To me, a gamer who remembers Civ 2 not being playable on a computer that cost twice as much as my current phone, it’s kinda magical.

2

Plastic-Wear-3576 t1_it4szic wrote

I ran into a scenario years ago when StarCraft 2 came out. I bought it, and it completely crushed my computer beneath its boot.

Convincing my parents I all of a sudden needed a new computer was a stressful experience.

Nowadays you just expect a game to run on your PC unless you have an older PC and the game is a true ship of the line nuts to butts eye watering game.

2

Fantastic-Climate-84 t1_it4ud7k wrote

> Nowadays you just expect a game to run on your PC unless you have an older PC and the game is a true ship of the line nuts to butts eye watering game.

Even then, today you just get the console version instead haha

I was selling computers when Doom 3 came out. That game made us a lot of commission. StarCraft 2, too. Kids like you were a big reason for our bonuses!

2

Evethewolfoxo t1_it3whfg wrote

I believe we’re kinda stuck in the consumer market and nearing the edge… for CPUs, at least temporarily.

However, no one can deny that GPUs have done nothing but improve year after year. I think that’s our current frontier in the consumer market while companies figure out tech like DLSS, RTX, and making transistors tinier.

7

ItsAConspiracy t1_it407ad wrote

Yeah GPUs are a bright spot. But partly it's because they're massively parallel and can just keep getting bigger and more power-hungry.

Another bright spot is neural chips, which aren't so much about Moore's law as getting better at specialized machine-learning architectures.

9

metekillot t1_it5nwvr wrote

Computer technology is only about a century old. I’m sure that 100 years after they cast the first metal sword, they thought they were nearing the limits of metallurgy.

2

ItsAConspiracy t1_it78x1n wrote

We're definitely not nearing the limits of computation in general, just silicon chips specifically. We went from mechanical relays to vacuum tubes to silicon chips, now we need something else for the next big leap.

1

Cabana_bananza t1_itajcxw wrote

I think we will see the forerunners to computronium over the next twenty years. You have big companies like Element 6 (De Beers) working with others on creating better carbon semiconductors and researching their use in computation.

The precision with which they can manipulate the diamonds as they grow has improved by leaps and bounds over the past 40 years, from the large X-ray diamond plates for satellites in the ’80s to today’s ability to control and inlay imperfections in the diamond structure.

It’s starting to resemble what I think of when I picture Kurzweil talking about computronium.

1

[deleted] t1_it4jq4w wrote

[deleted]

5

frankyseven t1_it4n9ci wrote

Umm, you realize that the M processors from Apple are incredibly fast and efficient, faster than anything else on the market. The new processors are a massive leap forward in processing and power management.

3

danielv123 t1_it4ps0y wrote

Faster? No. More power efficient? Yes. Amazing chips.

5

frankyseven t1_it4r0jg wrote

The M1 was faster than the i7 and whatever the AMD chip is called when it was released. Maybe not on paper, but there were plenty of tests done around the time it was released showing that it was the fastest on the market, and that was the Air, not the Pro. Now, some of those speed gains are due to the OS being optimized for the chip and all of the other hardware, but it was still the fastest on the market.

Regardless, Apple is one of the few companies really pushing the cutting edge with their computers.

−2

MadDocsDuck t1_it4woul wrote

Yes and no. The real problem here is already in the way that you perceive their marketing material, because the i7 hasn’t been Intel’s top chip in each generation for quite some time. Then you have to consider the different wattages of the laptops compared (especially if you compare a MacBook Air, which is more focused on efficiency), because the “regular” chips vary vastly in power target and thus performance. And then there are the desktop chips, which are a whole different story to begin with. And on top of all that come the asynchronous release cycles, so when Apple releases something in June but this year’s competing products haven’t released yet, they are essentially comparing them to year-old technology.

Then there is the whole issue of selecting the software for the benchmarks. Not just the OS makes a difference but also the individual programs you select.

Don’t get me wrong, I like the chips and I wish more companies would focus on efficiency like Apple did with the M1 chips (although I heard it is a different story with the M2 chips now). But every company will select test suites that are as much in their favour as possible, and when you compare the Mac platform to Windows there is always that inherent difference that programs are just not the same between the two.

7

danielv123 t1_it6gsi8 wrote

Yes, the Apple chip won hands down in workloads they added hardware acceleration for, like some video editing workflows. That doesn’t make the CPU faster in general, though. There is a reason why you haven’t seen data centers full of M1 Macs like with the old PlayStations.

2

ChicagoThrowaway422 t1_it7foec wrote

The MHz battles of the 90s were insane. You'd have the fastest computer for maybe a month before someone else's parents bought one, then you'd have to wait three years before your parents could afford a new one, putting you briefly back at the top of the pile.

2

Sniffy4 t1_it6g1og wrote

SSDs have been a far bigger performance improvement in user experience over the last 10-12 years than any gains in CPU speed.

1

Apokolypze t1_it6jzv1 wrote

MacBooks aside, enthusiast PC hardware has been pushed along by the massive gaming industry. The 30-series GPUs from Nvidia were a massive generational leap forward from the 20-series. Both Intel and AMD are making big strides in the CPU space with 13th gen and Zen 4 respectively. Talking of RAM, DDR5 is finally, actually here, and in a big way.

Running a system (especially RAM) from 8+ years ago in the gaming space, while technically feasible, could not compare to the capabilities of a modern enthusiast system.

1

Halvus_I t1_it8p43f wrote

DirectStorage too. First on consoles, now PCs.

1

ReddFro t1_it42s06 wrote

Neat, but if you’re trying to say we’ve been saying this same shit for 20 years, I’ll point out from reviewing your own article list that the older stuff talks about Moore’s law ending “soon”, while the newer stuff says it’s already dead. In fact several of the older articles even point to right around now as the end (one from 2011 said the early 2020s; one from 2013 said it would be dead in about 10 years).

8

Fantastic-Climate-84 t1_it44stu wrote

They all really have the same theme.

“New tech isn’t as much of an improvement over last year’s model”.

The point I’m making is that this article comes out every single year, saying the same thing as last year’s article. Sometimes it’s from CEOs that want to slow down how much they invest in R&D, sometimes it’s from investors prophesying the end of the growth of IT stocks, sometimes it’s the end users themselves.

Quantifiably, Moore’s law as originally stated has ended. The law pertained to the actual count of transistors on a chip, and how small we can make them. The “law” stated we would double the count every two years — well, it was first every year, then Moore revised it to every two years in 1975. That’s not the defining element of how we use tech any more, and chips are so small, and designs are changing so rapidly, that it’s still functionally in place. We haven’t stagnated on transistor count, it’s still going up, and with the varying ways of building chips into systems, tech has in no way plateaued.
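
As a quick illustration of what that two-year doubling cadence predicts (a toy projection anchored to the roughly 2,300 transistors on Intel’s 1971 4004; the comparison chip and year are my own choices, not anything from the thread):

```python
# Toy projection of "transistor count doubles every two years", anchored to the
# ~2,300-transistor Intel 4004 from 1971. Figures are rough and for illustration only.
def projected_transistors(year, base=2300, base_year=1971, doubling_years=2):
    return base * 2 ** ((year - base_year) / doubling_years)

print(f"{projected_transistors(2021):,.0f}")
# ~77 billion: the same ballpark as real 2021 silicon (Apple's M1 Max is ~57 billion),
# which is the point; counts are still roughly on the curve.
```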

10

ReddFro t1_it5cih7 wrote

The way my brother states it: the new Moore’s law is that the number of ways Moore’s law is defined doubles every 18 months.

6

Fantastic-Climate-84 t1_it5j7kz wrote

Love it, that sounds pretty accurate.

Honestly, I look forward to next year’s article for the same argument and the same conversations about the same topic.

2

MonkeyPawClause t1_it4uzii wrote

Maybe they applied Moore’s law to the pricing instead of the chips, for a change of pace.

2

Yashugan00 t1_it72nfd wrote

I roll my eyes every time I see a tech journal write this piece every few years. It’s been predicted since I graduated.

2

kushal1509 t1_itd7v5l wrote

Tech journos: "Well, now that you’re about to reach 3nm transistor sizes, Moore’s law is going to die, right?"

Scientist: "Nope, we will just stack the transistors on top of each other."

TJ: "No!! my "expert" predictions are wrong...again!!"😭😭

2

OptimisticSkeleton t1_it6lc3o wrote

Except that we really are approaching a place where any further reduction in transistor size creates electrical problems. This happens at 1nm and below, I believe. Companies are working on 2nm transistors right now.

1

___Price___ t1_it8g957 wrote

It’s been that way for years, so they have altered chip design layouts.

Even the idea of stacking, or cube-shaped models, etc.

Moore’s law will be dead when we have 1nm chips laid out in the most efficient form with superconductors, running in tandem with a quantum chip and artificial brain tissue, and even then, is that the actual limitation? Could antimatter silicon create faster computations by energy relativistically running backwards in time? We are still working on theories about different states of matter, and quantum loop theory is still pretty new. Saying Moore’s law is dead is saying science has figured it out and manufacturing has caught up.

2

Benton_Tarentella t1_itlsw2i wrote

Well, no, it wouldn’t take a full plateau for Moore’s law to be dead. The law is concerned with the rate of increase, so if progress slowed down (or sped up) significantly from doubling every two years, that would be the end of Moore’s Law, regardless of whether it is theoretically possible to continue.

1

___Price___ t1_itltunn wrote

That’s making the assumption exponential growth would stop.

Exponential growth would only stop because of economic limitations and a barrier of theory.

As of right now it’s nowhere near dead.

1