Submitted by Soupjoe5 t3_z58i6a in Futurology
garry4321 t1_ixvbqyl wrote
Reply to comment by avalonian422 in A bot that watched 70,000 hours of Minecraft could unlock AI’s next big thing by Soupjoe5
And CPUs have stagnated in power over the last 10 years. Unless we get a breakthrough, there’s zero chance that the game is going to simulate AI villagers over an infinite map.
aliokatan t1_ixvfw8j wrote
Garry, let me tell you: in the 8800GT days nobody could even dream of the word AI outside of Terminator, and here it is running on-device, mainstream.
garry4321 t1_ixwfzk2 wrote
I’m talking about the near future. Also, there are infinite villages, so each village you find now has to be rendered and simulated constantly in addition to your actions. Picture each villager as now being a separate instance of Minecraft running on your PC.
Also, referencing historical gains in computing ignores the current factual reality that the exponential growth of CPU power has stalled for the last 10+ years and we are getting smaller and smaller returns on investment into CPU power.
aliokatan t1_ixwuuy1 wrote
AI efficiency and processing power have exploded in the last few years alone
Crivos t1_ixvgudp wrote
I saw something about graphene chips instead of silicon ones upping the charts.
LastPlaceStar t1_ixwbrpk wrote
The fact that the map is infinite has absolutely zero relevance.
garry4321 t1_ixwfgav wrote
Infinite map means infinite villages, which means each new village you come across needs to be maintained in processing and simulated in real time. So yeah, the infinite map DOES matter. Each new village now needs to be simulated in addition to just simulating gameplay.
LastPlaceStar t1_ixwxkr8 wrote
The game doesn't process everything at the same time. That's what chunk loading is.
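Rough sketch of what that means, with invented names (not actual Minecraft internals, just an illustration): entities in chunks that aren't loaded simply never get ticked.

```java
// Toy sketch of "only loaded chunks get ticked". Every name here is invented
// for illustration; this is not actual Minecraft code.
import java.util.ArrayList;
import java.util.List;

interface Entity { void tick(); }

class Chunk {
    final List<Entity> entities = new ArrayList<>();
    boolean loaded; // true only while a player is within view distance
}

class World {
    final List<Chunk> chunks = new ArrayList<>();

    // One server tick: entities in unloaded chunks are simply frozen.
    // They cost nothing until a player comes back and the chunk reloads.
    void tick() {
        for (Chunk c : chunks) {
            if (!c.loaded) continue;  // skip everything far away from players
            for (Entity e : c.entities) {
                e.tick();             // villagers (or builder AIs) only "think" here
            }
        }
    }
}
```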
RedditFuelsMyDepress t1_ixx2yxj wrote
I guess the confusion is how they would change the village based on AI behavior if that AI isn't always being simulated in the background. Obviously they don't need to render it graphically when you're not there, but you'd expect the village to change when you leave and come back. Could they just quickly run the simulation, or "predict" what changes would have occurred, when the chunk is loaded in? I'm genuinely asking, because I don't know how this stuff usually works.
GOOD_BONE_N_CALCIUM t1_ixyjtlk wrote
Hell, they could even bake into the seed how things will proceed and only resolve it once observed/loaded/reloaded, rather than in real in-world time.
garry4321 t1_ixzrlyb wrote
That's my point. If you want AI robots always building while you are not there, the game has to keep those chunks loaded and simulate the AI building. Perhaps not graphically, but it still has to process it exactly the same as if you were there. Otherwise the game just has to fake it and slap in a pre-built castle as if the AI built it, but then that's not AI at all, that's just placing pre-builds.
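Something like this "catch up on load" shortcut is what I mean by faking it. A hypothetical sketch, with invented names and numbers, not real Minecraft code:

```java
// Hypothetical "catch up on load" sketch. Names and numbers are invented;
// this is the kind of shortcut meant by "faking it", not real Minecraft code.
class BuilderVillage {
    long lastSimulatedTick;  // game time when the chunk was last unloaded
    int blocksPlaced;        // coarse progress on whatever the AI is building

    static final double ASSUMED_BLOCKS_PER_TICK = 0.01; // assumed average build rate

    // Called when the player returns and the chunk is loaded again.
    void onChunkLoad(long currentTick) {
        long elapsed = currentTick - lastSimulatedTick;
        // Instead of running the real AI the whole time, apply a cheap statistical
        // guess of what it probably would have built, then resume full simulation
        // now that the player can actually observe it.
        blocksPlaced += (int) (elapsed * ASSUMED_BLOCKS_PER_TICK);
        lastSimulatedTick = currentTick;
    }
}
```

The offline part there isn't the real AI doing real planning, which is exactly the problem.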
Emu1981 t1_ixwsz31 wrote
>And CPUs have stagnated in power over the last 10 years.
You are joking, right? 2012 saw the release of Intel's 3rd-gen Core processors and AMD's Piledriver CPUs. The 13900K is around 250% faster than the i7-3770K in the single-core Geekbench 5 benchmark and nearly 400% faster than the FX-8350; in the multi-core benchmark that lead increases to around 700% and 800% respectively (a bit unfair given that the 3770K is 4c/8t, the FX-8350 is 8c/8t and the 13900K is 24c/32t). That isn't even taking into account that the iGPU/dGPU would likely be used for running the AI models instead of the CPU; you cannot deny that there has been a huge uplift in GPU performance from 2012 to now.
TL;DR: CPUs may have stagnated with only minor performance gains from each new generation from around 2011 to 2017, but from 2017 to now there have been constant, noticeable improvements with each new generation, due to AMD actually putting up some decent competition with their Zen architecture.
garry4321 t1_ixzsrfy wrote
I'm not saying it hasn't improved at all. I'm saying that Moore's law is dead and the multiplication of processing power each year is getting smaller and smaller. Processors used to double in processing power roughly every 2 years.
You even prove my point. The 3770K was released in early 2012. The i9-13900K, which is the TOP OF THE FUCKING LINE CPU, was JUST released.
Over 10 years later, and the top-of-the-line CPU is only 250% faster than a chip released back in 2012? We were still flying the space shuttle in 2012. Obama was still a president with black hair in 2012.
2.5x better CPU in over 10 years. NOT GREAT
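Taking that 2.5x at face value over roughly 10.5 years (early 2012 to late 2022), that's about 9% per year, versus the roughly 41% per year a strict doubling every 2 years would give. A quick sanity check of that arithmetic (plain Java, nothing Minecraft-specific):

```java
// Back-of-envelope check: implied annual growth from a 2.5x gain over ~10.5 years,
// versus a Moore's-law-style doubling every 2 years.
public class GrowthCheck {
    public static void main(String[] args) {
        double observed = Math.pow(2.5, 1.0 / 10.5) - 1;  // ~0.09, roughly 9% per year
        double doubling = Math.pow(2.0, 1.0 / 2.0) - 1;   // ~0.41, roughly 41% per year
        System.out.printf("observed: %.1f%%/yr, doubling every 2 years: %.1f%%/yr%n",
                observed * 100, doubling * 100);
    }
}
```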
Cale111 t1_iy2ucr0 wrote
Moore’s law is about transistor count, not speed. It’s still roughly in effect; it has just slowed slightly since 2010.
Idrialite t1_ixxnpmn wrote
Modern large neural networks usually run on GPUs, not CPUs
garry4321 t1_ixzr35f wrote
So you have to have quad 4090s to run Minecraft now, since you are rendering dozens to potentially hundreds or thousands of AIs building across an infinite map, which requires the areas they are in to be rendered at all times.
It's not going to happen.
Idrialite t1_ixzzmig wrote
I'm 99% sure it'll be a trivial amount of computation within 100 years. We still have plenty of avenues to explore.
garry4321 t1_iy0thtr wrote
Perhaps, but these people are talking like we should do it now, with zero understanding of how games work or how things get done in a computer game. You don't just simulate AI villages with full pathfinding, construction/deconstruction, crafting, etc. without huge PC requirements.
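To put a rough number on it: Minecraft targets 20 ticks per second, so the server has about 50 ms per tick for everything. A toy budget estimate, where the per-villager AI cost and village size are made-up assumptions, shows how quickly that gets eaten:

```java
// Rough tick-budget estimate. The 50 ms figure follows from the 20 ticks/s target;
// the per-villager cost and village size are assumptions, purely illustrative.
public class TickBudget {
    public static void main(String[] args) {
        double tickBudgetMs = 1000.0 / 20.0;  // 20 ticks per second, so 50 ms per tick
        double msPerVillagerAi = 0.5;         // assumed cost of pathfinding + build planning
        int villagersPerVillage = 20;         // assumed village size

        double maxVillages = tickBudgetMs / (msPerVillagerAi * villagersPerVillage);
        System.out.printf("~%.0f fully simulated villages before the whole tick budget is gone%n",
                maxVillages);
    }
}
```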