
dashingstag t1_j2ibzjj wrote

Biggest difference imo is that quantum computing is analog (continuous) while classical computing is digital (discrete).

This means quantum computing can potentially model the real world directly, while classical computing can only ever be an abstraction at best.

Classical computing will still win on the energy tradeoff for discrete problems like simple decision making, but quantum should be better for real-world modelling like pathfinding, where all paths can be considered at once.

How I would explain it: if you turn on a smoke machine in a maze, the smoke spreads and tries all paths at once, so the edge of the smoke (and hence the smoke as a whole) always knows whether it has found an exit. That is like quantum. Classical computing is like having multiple people try all the routes, but they still have to come back and report to each other before anyone knows where the exit is.
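For what it's worth, the smoke picture maps neatly onto classical breadth-first search: the frontier ("edge of the smoke") advances along every open corridor each step. A minimal sketch, with a made-up toy maze (`smoke_fill` and `MAZE` are just illustrative names, and of course this is an ordinary classical algorithm, not actual quantum behavior):

```python
from collections import deque

# Toy maze: '#' walls, '.' open floor, 'S' start, 'E' exit.
MAZE = [
    "#########",
    "#S..#...#",
    "#.#.#.#.#",
    "#.#...#E#",
    "#########",
]

def smoke_fill(maze):
    """Breadth-first flood fill: the frontier (the 'smoke edge')
    advances one cell along every open path per step, so the first
    time it touches 'E' it has found a shortest route."""
    rows, cols = len(maze), len(maze[0])
    start = next((r, c) for r in range(rows) for c in range(cols)
                 if maze[r][c] == "S")
    frontier = deque([(start, 0)])
    seen = {start}
    while frontier:
        (r, c), dist = frontier.popleft()
        if maze[r][c] == "E":
            return dist  # steps taken to reach the exit
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (nr, nc) not in seen and maze[nr][nc] != "#":
                seen.add((nr, nc))
                frontier.append(((nr, nc), dist + 1))
    return None  # no exit reachable

print(smoke_fill(MAZE))
```

The "multiple people" version is the same search run one explorer at a time, which is why they need a meeting point to compare notes before anyone knows the answer.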
