AtomKanister t1_iycydix wrote

>"Let's consider upending our infrastructure and putting millions of pounds' worth of existing and battle-proven code and hardware in flux so we can fuck around seeing if Intel has actually made a viable GPU-like product on their umpteenth attempt"

That's exactly how innovation happens, and missing out at crucial moments is how once-dominant players become irrelevant in the blink of an eye. See: Kodak, Blockbuster, Sears, Nokia.

It's valid to be skeptical of new developments (because a lot of them will be dead ends), but overdo it and you're setting yourself up for disaster.

−9

AtomKanister t1_iyanrqr wrote

Minecraft is honestly a pretty bad example: it's procedurally generated, so there's no "world" the game stores beforehand, just the rules for how a world can look. Most games aren't like that, though. Adventure games, shooters, and RPGs usually have hand-crafted worlds with very deliberately placed features.
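A minimal sketch of that idea (hypothetical helper, nothing like Minecraft's actual generator): terrain is derived deterministically from a seed and coordinates, so nothing about the world needs to be stored on disk.

```python
import hashlib

def terrain_height(seed: int, x: int, z: int) -> int:
    """Deterministically derive a terrain height from a world seed
    and block coordinates -- no stored world data needed."""
    digest = hashlib.sha256(f"{seed}:{x}:{z}".encode()).digest()
    return digest[0] % 64  # height in the range 0-63

# The same seed always reproduces the same terrain:
assert terrain_height(42, 10, 20) == terrain_height(42, 10, 20)
```

Ship the seed and the rules, and every player regenerates an identical world locally.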

And these are large. Many modern games weigh in at tens or even hundreds of gigabytes, most of which is world data. But there are still clever tricks to keep the size smaller than it would otherwise be: small texture patches can be tiled across larger areas, background terrain can use lower-resolution geometry, and so on.
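The tiling trick can be sketched in a few lines (toy example with a 2D grid standing in for a texture, not any engine's real API): a tiny patch is repeated to fill an arbitrarily large area, so only the patch has to be stored.

```python
def tile_texture(patch, width, height):
    """Expand a small texture patch to cover a larger area by
    repeating (tiling) it -- the stored asset stays small."""
    ph, pw = len(patch), len(patch[0])
    return [[patch[y % ph][x % pw] for x in range(width)]
            for y in range(height)]

patch = [[1, 2],
         [3, 4]]
big = tile_texture(patch, 4, 4)
# big == [[1, 2, 1, 2], [3, 4, 3, 4], [1, 2, 1, 2], [3, 4, 3, 4]]
```

GPUs do essentially this in hardware with repeat-wrap texture addressing, which is why a 256×256 patch can cover a whole terrain.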

And finally, the storage density of modern electronics is just extremely high.

3

AtomKanister t1_iwwx4hs wrote

Might also be the data. The open internet is full of images with related text that can be crawled, but you won't find many document scans with annotated bounding boxes out there.

That said, it's definitely doable. The paid services from cloud providers are all very high quality, so it's more likely an open-source availability issue than a fundamental one.

1