Nanohaystack t1_j5ukavy wrote

You'd have to fiddle with the firmware in any case to get such capabilities, even if you weren't using the router itself for computing. If you reeeeeaaaalllyyyy optimized a machine-learned model fitted precisely to the conditions of a particular room, then it could be possible. There are wifi routers out there on the more expensive side with beefy CPUs that have something like 1 GB of memory and can take a few hundred MB worth of firmware. Even stuff you can find off the shelf in a BestBuy now, like the Asus AX1800, carries 128MB of flash, which is sufficient for a rudimentary machine learning setup, though with its 256MB of RAM and 4-core 1.5GHz ARM Cortex, it would be rather slow at training a model and would definitely need external storage for swap space.
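To put rough numbers on that RAM constraint: assuming float32 weights and an Adam-style optimizer (weights + gradients + two moment buffers, roughly 4x the weight memory during training), here's a back-of-the-envelope sketch of how quickly a model outgrows a 256MB router. The parameter counts and the "leave half the RAM for the OS" budget are illustrative assumptions, not measurements of any real firmware.

```python
# Back-of-the-envelope check: can training a small model fit in a
# router like the AX1800 (256 MB RAM)? Parameter counts below are
# hypothetical examples, not a real model architecture.

def training_footprint_bytes(n_params: int, bytes_per_param: int = 4) -> int:
    """Rough training memory: weights + gradients + two optimizer
    moment buffers (Adam-style) -> ~4x the raw weight memory."""
    return 4 * n_params * bytes_per_param

ROUTER_RAM = 256 * 1024**2  # 256 MB total
BUDGET = ROUTER_RAM // 2    # assume half is eaten by the OS/firmware

for n_params in (100_000, 1_000_000, 10_000_000):
    footprint = training_footprint_bytes(n_params)
    fits = footprint < BUDGET
    print(f"{n_params:>10,} params -> {footprint / 1024**2:6.1f} MB, fits: {fits}")
```

By this estimate even a 10M-parameter model already blows past the budget during training, which is why inference-only (or offloading entirely) is the more realistic path on this class of hardware.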

If I were approaching such a task today, I'd use two or three access points as "sensors," with a jerry-rigged radio driver streaming raw data straight to a dedicated machine learning setup. I've met tech-wiz guys who are in the business of optimizing trained neural networks, and they do some very impressive stuff, but even then, I'd be surprised if a run-of-the-mill home router CPU didn't burst into flames under all this load.
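The "sensor AP streams raw data to an ML box" idea could look something like the sketch below: the access point does no inference itself and just forwards raw samples over UDP. Everything here is hypothetical — `read_raw_samples()` is a stand-in for whatever a patched radio driver would actually expose, and the address/port are made up.

```python
# Minimal sketch: access point as a dumb sensor that forwards raw RF
# samples to a central machine over UDP. No real driver is used here;
# read_raw_samples() is a placeholder returning fake I/Q floats.

import socket
import struct

ML_BOX = ("127.0.0.1", 9999)  # hypothetical address of the training machine
CHUNK = 256                   # samples per datagram

def read_raw_samples(n):
    """Placeholder for the jerry-rigged driver; 2 floats (I, Q) per sample."""
    return [0.0] * (2 * n)

def stream_once(sock):
    samples = read_raw_samples(CHUNK)
    # Pack as little-endian float32 so the receiver is architecture-agnostic.
    payload = struct.pack(f"<{len(samples)}f", *samples)
    sock.sendto(payload, ML_BOX)
    return len(payload)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
print(stream_once(sock))  # 512 floats * 4 bytes = 2048 bytes per datagram
sock.close()
```

The point of the design is that the router only needs to sustain a network stream, which it's built for, while the heavy matrix math happens on hardware that won't melt.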

2

Nanohaystack t1_j5tizx0 wrote

Echolocation has been a thing for a while; it's just that the normal radio background noise made it impractical to develop deterministic echolocation techniques for heavily trafficked environments, though attempts were made even in the early 2000s. This is essentially the same thing we saw in The Dark Knight back in 2008. Using machine learning to process such massive amounts of data is what enabled this application of a well-known technology.

38