
xeneks t1_iv7z3mx wrote

Actually they do.

It lags the leading edge, though. I think that’s due to how long it takes to recode at the lower or base levels: replacing routines or libraries, writing entirely new codebases, or implementing algorithms that take advantage of unique or abstracted hardware.

Rewriting software with new codebases, new libraries, or upgraded dependencies often addresses software bloat. And if you upgrade the OS, you can often run more recent apps.

Mostly guessing here, but:

Take a bunch of computers that are less than a year old, with the best OS & software you can find. The software choices often work fine, but they are actually not very optimised. Sometimes they are brutal in their resource requirements.

Then take a bunch of computers that are more than five years old and install the best OS & software you can find. The software choices apply many code optimisations that take substantial advantage of the full set of hardware features.
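
To make that concrete, here is a minimal sketch of one way software can exploit hardware features: runtime CPU feature detection with a dispatch between an optimised path and a fallback. It assumes x86 with GCC or Clang; `__builtin_cpu_init` and `__builtin_cpu_supports` are the actual compiler builtins, while the two `sum_*` functions are hypothetical stand-ins for a tuned and a generic code path.

```c
#include <stdio.h>

/* Hypothetical stand-ins: a generic code path and one meant for wider SIMD.
   In real software these would be hand-vectorised or compiled per-target. */
static double sum_scalar(const double *a, int n) {
    double s = 0.0;
    for (int i = 0; i < n; i++) s += a[i];
    return s;
}

static double sum_avx2(const double *a, int n) {
    /* Placeholder: a real build would compile this with AVX2 intrinsics. */
    return sum_scalar(a, n);
}

int main(void) {
    double data[1000];
    for (int i = 0; i < 1000; i++) data[i] = 1.0;

    __builtin_cpu_init();  /* populate CPU feature flags (GCC/Clang, x86) */
    double (*sum)(const double *, int) =
        __builtin_cpu_supports("avx2") ? sum_avx2 : sum_scalar;

    printf("sum = %f\n", sum(data, 1000));
    return 0;
}
```

Libraries like glibc do this kind of dispatch automatically for routines such as memcpy, which is one way the same binary can run well across very different CPU generations.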

It’s another reason why old hardware is amazing and always worth keeping, repairing, maintaining, and even actively using, whether privately, professionally, or commercially.

It’s why even a low-end old mobile phone is worth spending hours on to repair, service, and make reliable.

Apply an upgraded OS or different apps, and suddenly the phone is a completely different machine: not only functional, but usable and even satisfying and enjoyable to use.

This is really easy to do with PCs, which typically run Windows or Linux, but with phones or Apple hardware it’s less possible due to the closed development environment.

I’ve done it using jailbroken Android stacks, though, and been very happy as old hardware suddenly performs on par with new hardware, with no additional resource, pollution, or water demands, and with recycling costs deferred.

When old hardware is reliable and operating consistently, or even just low cost but working well and repairable, you really warm to the manufacturer.

Source:

Personal/professional experience over 20+ years of getting new, optimal OSes & software working on old hardware, to avoid wasting the embedded resource, material, and carbon costs and the associated water and air pollution.

PS: 1 nm… low power! Low temperature! Awesome! I’m thinking this might create the first generations of computer, phone, and tablet hardware that stay functional in the field past two decades. I hope it can be adjusted to remain viable even if hardware exploits are discovered after a decade or more of use. Aside from microcode updates, air gaps, and isolation from networks, what other approaches are taken to keep hardware dependable?

6

wen_mars t1_iv8r7f8 wrote

> Mostly guessing here, but:
>
> Take a bunch of computers that are less than a year old, with the best OS & software you can find. The software choices often work fine, but they are actually not very optimised. Sometimes they are brutal in their resource requirements.
>
> Then take a bunch of computers that are more than five years old and install the best OS & software you can find. The software choices apply many code optimisations that take substantial advantage of the full set of hardware features.
>
> It’s another reason why old hardware is amazing and always worth keeping, repairing, maintaining, and even actively using, whether privately, professionally, or commercially.

This is not true. The actual reasons old hardware still works fine are that CPUs have not improved all that much in single-threaded performance over the last decade or so, and that RAM does not meaningfully affect performance unless you have too little of it. The only big change has been the transition from HDDs to SSDs, which is why loading and boot times have improved so much.

CPUs now have more cores than before, but most software does not take advantage of them.
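
To illustrate, here is a minimal sketch of what taking advantage of multiple cores actually requires: explicitly splitting work across threads, which much software never does. It assumes POSIX threads (compile with `-pthread`); the array-summing workload is just an illustrative stand-in.

```c
#include <pthread.h>
#include <stdio.h>

#define N (1 << 22)   /* illustrative workload: sum a large array */
#define THREADS 4

static double data[N];
static double partial[THREADS];

/* Each worker sums its own slice and writes to its own slot: no data race. */
static void *worker(void *arg) {
    long t = (long)arg;
    long lo = t * (N / THREADS), hi = lo + N / THREADS;
    double s = 0.0;
    for (long i = lo; i < hi; i++)
        s += data[i];
    partial[t] = s;
    return NULL;
}

int main(void) {
    pthread_t tid[THREADS];
    for (long i = 0; i < N; i++)
        data[i] = 1.0;

    for (long t = 0; t < THREADS; t++)
        pthread_create(&tid[t], NULL, worker, (void *)t);

    double total = 0.0;
    for (long t = 0; t < THREADS; t++) {
        pthread_join(tid[t], NULL);
        total += partial[t];
    }
    printf("sum = %.0f\n", total);  /* expect 4194304 */
    return 0;
}
```

Single-threaded code leaves the other cores idle, which is part of why an old quad-core often feels no slower than a new chip for everyday tasks.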

5