Comments

AideNo621 t1_j2hdaas wrote

Quantum computers are not better computers. They are good for different tasks: they are good at solving tasks that current computers take very long to compute. It's almost impossible to explain how they work in layman's terms, because it's quantum technology, which is complete weird shit.

Most likely we will never see quantum home computers. Maybe we'll see some quantum capabilities added to normal classical computers, but currently the tech is mostly run in lab environments, because there's lots of interference that will break the computation, and it needs to be supercooled to almost absolute zero.

An example of a task that is very tedious for a normal computer but easy for a quantum computer is so called "traveling salesman problem". You have a bunch of cities on a map that you want to visit and you need to compute the most optimal route. This problem is easy to solve approximately, but extremely hard to solve in a way that guarantees your solution really is the most optimal one. For normal computers, the more points you have on the map, the longer it will take to compute, and this growth is probably around exponential, so you get to a point where it's simply not practical to even try, because it would take weeks, months, years. But quantum computers can somehow consider all the possibilities at the same time, making the total time quite short. At least that's how I understand it.

The problem with this is that current computer security and cryptography is based on certain things being extremely difficult for normal computers to calculate. But there will be a point in the development of quantum computing where a quantum computer can break through the best current security in minutes. So new security algorithms will have to be developed, and they are currently being worked on.
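
To give a feel for how quickly the traveling-salesman brute force blows up, here's a tiny sketch (plain Python, purely illustrative; the city coordinates are made up) that checks every possible route:

```python
import itertools
import math

# Made-up city coordinates, just for illustration.
cities = {"A": (0, 0), "B": (3, 4), "C": (6, 1), "D": (2, 7)}

def route_length(route):
    # Total distance of the round trip: city -> city -> ... -> back to start.
    legs = zip(route, route[1:] + route[:1])
    return sum(math.dist(cities[a], cities[b]) for a, b in legs)

# Brute force: try every ordering of the cities and keep the shortest.
# With n cities there are n! orderings, which is why this stops being
# practical very quickly as the map grows.
best = min(itertools.permutations(cities), key=route_length)
print(best, round(route_length(best), 2))
```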

94

suvlub t1_j2hpza8 wrote

>An example of a task that is very tedious for a normal computer but easy for a quantum computer is so called "traveling salesman problem".

Not true. Travelling salesman is an NP-complete problem, and quantum computers aren't believed to be able to solve those efficiently either. See this diagram: P is what traditional computers are good at, BQP is what quantum computers are good at.

An example of a problem that quantum computers can solve (and classical probably can't) is prime factorization.

54

hbrthree t1_j2hviol wrote

So previous poster is full of shit lol?

12

suvlub t1_j2hwusy wrote

They were largely right until the example. To be fair, this is a common mistake: for the sake of simplicity, or out of laziness, P and NP-complete problems are often explained as two opposite categories without mentioning all the other ones, so when people then hear that quantum computers can (easily) solve problems outside of P, they jump to the conclusion that they can solve NP-complete problems.

25

Oatz3 t1_j2hwcc7 wrote

Yes, quantum still can't solve traveling salesmen

8

AunKnorrie t1_j2huhwx wrote

Ah, You must have had professional training in the field ;)

4

suvlub t1_j2hwyfk wrote

Am programmer with masters in software engineering. Quantum computing is just something I'm vaguely interested in. I'd like to learn more about it, but shit's mind-boggling.

11

AunKnorrie t1_j2ilrrj wrote

Perhaps I should read up myself. I have an engineering degree in IT, so I had to read about Church's thesis. Quantum is something new to me though.

3

coolthesejets t1_j2i3e3w wrote

Would you say the existence of Shor's algorithm means prime factorization is definitely not NP-complete?

4

suvlub t1_j2ibswe wrote

It's a strong indication, but we still don't have a proof that P != NP, so no, not definitely.

4

AideNo621 t1_j2ih99i wrote

Thanks for the correction. I was only reproducing what I read about the topic.

3

Yamidamian t1_j2j39b2 wrote

How is prime factorization ‘unsolvable’ on a classic computer? It seems like something any programmer could pound out a simple program to do pretty easily.

It would be slow as heck for really big values, due to recursion involved, but it would eventually give a solution.

Is there some kind of math definition of ‘solved’ that I’m unfamiliar with?

1

suvlub t1_j2j4qhu wrote

Sorry, I was sloppy. "Solve in polynomial time" is what I meant.
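
In other words, a naive factoring program like the sketch below (illustrative only) does "solve" the problem for any input; it just slows down catastrophically as the number of digits grows, and "solve in time polynomial in the number of digits" is the bar that matters:

```python
def trial_division(n):
    """Naive factoring: always works, but the loop runs up to sqrt(n),
    which grows exponentially with the number of digits of n."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(15))                 # [3, 5] -- instant
print(trial_division(999983 * 1000003))   # two six/seven-digit primes -- still quick
# A 2048-bit RSA modulus has over 600 digits; this same loop would need
# around 10^308 iterations, which is the "not in polynomial time" problem.
```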

1

lunaticloser t1_j2hmghe wrote

Isn't there already a quantum-computer-safe encryption algorithm?

I seem to recall the problem with that algorithm was it was easily solved by regular computers 😅

11

sterexx t1_j2hnr7k wrote

NIST has been running a competition for quantum resistant encryption algos and somewhat recently announced some finalists for upcoming standards. They wouldn’t have any interest in ones not resistant to classical methods. If you can recall which algo you’re thinking of, though, I’d be interested to see

https://www.nist.gov/news-events/news/2022/07/nist-announces-first-four-quantum-resistant-cryptographic-algorithms

17

lunaticloser t1_j2ho8ox wrote

I can't. I might be misremembering but I thought some mathematician had devised such an algorithm even before the first quantum computer ever existed. Like way back in the 80s or so.

4

sterexx t1_j2hootb wrote

Ah okay! Yeah it’s fascinating how long they’ve been able to work on this stuff without having any actual hardware. Kinda like Turing’s machine!

If you can imagine how the machine works, you can design programs for it. Shor’s algorithm, which breaks RSA and other venerable public key cryptography systems (if you had a quantum computer to run it on), was made in 1994

6

Cryptizard t1_j2hyxko wrote

The algorithms behind the new NIST standards have mostly been known for a long time (since the 90s) but it took a while to refine them and be confident in their security.

2

warren_stupidity t1_j2i0a3a wrote

Things I learned: governments have archives of encrypted communications that they will decrypt as soon as they have a viable QC up to the task. It’s sort of the encryption apocalypse.

3

fenton7 t1_j2i53oi wrote

The largest number ever factored by a real quantum computer using Shor's algorithm was 21, back in 2012, so don't hold your breath. In 2019 an attempt was made to factor the number 35 using Shor's algorithm on an IBM Q System One, but the algorithm failed because of accumulating errors. It's tech that sounds great in science fiction, but actually building a working quantum computer with enough qubits and low enough noise to do anything useful may be an impossible engineering problem.

1

warren_stupidity t1_j2iermh wrote

right - it might be in the 'fusion reactor' mode of breakthroughs that never actually pan out into something functional. Or not.

1

thecoat9 t1_j2irs6z wrote

>which is complete weird shit.

I move that "weird shit" be generally used in place of "quantum technology" world wide.

3

MeteorOnMars t1_j2hurbf wrote

For some kinds of problems, a classical computer considers combinations one at a time:

ABC

ACB

BAC

BCA

CAB

CBA

While a quantum computer does it all in one step:

ABC/ACB/BAC/BCA/CAB/CBA

This is because, in some ways, quantum particles can be in more than one state at the same time!

Now imagine I had used ABCDEFGH for the example. You would be scrolling all day to get to the end of the classic computer example.
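
You can see the blow-up with a couple of lines of (ordinary, non-quantum) Python; this only illustrates the counting on the classical side:

```python
from itertools import permutations
from math import factorial

print(["".join(p) for p in permutations("ABC")])
# ['ABC', 'ACB', 'BAC', 'BCA', 'CAB', 'CBA'] -- the 6 orderings above

print(factorial(len("ABCDEFGH")))   # 40320 orderings for 8 letters
print(factorial(20))                # ~2.4 * 10^18 orderings for 20 items
```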

92

Alpha-Sierra-Charlie t1_j2imgzo wrote

So it can essentially run a lot of calculations in parallel?

11

Futechteller t1_j2injfn wrote

It is much stranger than this; there is nothing that happens in our normal macro-scale life that is comparable to it. Things on a small scale behave way differently than our intuition allows for. The actual probabilities of different outcomes interact with each other in a physical-like way. The single-photon "double slit" experiment demonstrates this: one photon goes through two holes with equal probabilities, and the probabilities bounce off each other, making a new probability distribution, as if they were waves bouncing off each other in the ocean. Nobody really knows why, and the smartest people on the planet disagree as to why. What they do agree on is how to use this weirdness to make huge calculations very easily.
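
For anyone who wants to see what "probabilities interacting" means in numbers, here's a toy calculation (the amplitudes are made up; this isn't a simulation of the actual experiment):

```python
import math

# Two possible paths a photon could take to the same spot on the screen.
# Quantum mechanics assigns each path an amplitude; the chance of the outcome
# is |amplitude|^2, and it's the amplitudes (not the chances) that add up.
path1 = 1 / math.sqrt(2)
path2 = -1 / math.sqrt(2)          # same size, opposite phase

print(abs(path1) ** 2)             # ~0.5 -- reachable via path 1 alone
print(abs(path2) ** 2)             # ~0.5 -- reachable via path 2 alone
print(abs(path1 + path2) ** 2)     # 0.0 -- together they cancel: a dark fringe
```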

13

MeteorOnMars t1_j2inhym wrote

Yes. But, only for some kinds of calculations that are looking at combinations or choices between things.

7

digitalindigo t1_j2hfo1k wrote

Quantum computing is a way of using the properties of quantum mechanics, which is the physics that describes how very small things behave, to perform calculations that are faster than is possible using classical computers.

Classical computers use bits to store information and perform calculations. A bit is a unit of information that can be either a 0 or a 1. Quantum computers use quantum bits, or qubits, to store information. Qubits can represent both a 0 and a 1 at the same time, which allows quantum computers to perform many calculations at once.

Because of this, quantum computers have the potential to solve certain problems much faster than classical computers. For example, they could be used to break encryption codes that would take a classical computer many years to crack. However, quantum computers are still very experimental and there are many challenges to overcome before they can be used for practical applications.

55

CriminalizeGolf t1_j2hi1rd wrote

Was this written by chatGPT?

91

digitalindigo t1_j2hi3nt wrote

Yup 😂

33

CriminalizeGolf t1_j2jwh4l wrote

ChatGPT is like the Midjourney of language AIs. It can spit out some pretty impressive stuff, but it all kinda sounds the same once you learn to recognize it.

1

digitalindigo t1_j2jyahv wrote

Yeah, it's a great launch point, but definitely formulaic. It lacks personality but does a lot of the information gathering and structuring. This just happened to be an example question it uses that was one of the best explanations I'd seen.

1

zshadowjon t1_j2i7rkg wrote

You can tell because the second paragraph sucks lol

14

Working_Early t1_j2iq4s4 wrote

Do we yet understand how a qubit can represent 0 and 1? Or is that still undiscovered/still being studied? I'd appreciate the insight!

3

Chaosfox_Firemaker t1_j2j1d3q wrote

A qubit is (essentially) a probability of being a zero or a one. Or at least sort of. Quantum mechanics has this thing called probability amplitudes, which are complex numbers rather than a regular percentage, but when you square their magnitudes you get a normal probability.

Because these are weird, you don't just have the one degree of freedom you could represent with an ordinary probability. Each qubit can instead be represented by a position on a "Bloch sphere".

So you can think of each gate as doing some math on those qubits.

At the end of the whole thing, though, you end up with a series of classical bits.
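
A concrete (and heavily simplified) example of "a gate doing some math", using the standard Hadamard gate on one qubit; this is just the state-vector arithmetic, not tied to any particular quantum library:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])              # the classical bit 0, written as a state
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)     # Hadamard gate: a 2x2 matrix of amplitudes

after_one = H @ ket0
print(np.abs(after_one) ** 2)            # [0.5 0.5] -> a 50/50 qubit if measured now

after_two = H @ after_one
print(np.round(np.abs(after_two) ** 2))  # [1. 0.] -> the "1" amplitudes cancelled out

# Interference between amplitudes (not ordinary probabilities) is the resource
# quantum algorithms exploit; measuring at the end still yields classical bits.
```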

2

digitalindigo t1_j2izn2b wrote

I think it's more that the programming is able to use each bit of data as an 'if this, then that' context, much like the human brain can but without the biological limits. It requires a ton of energy but allows it to run simultaneous possibilities against each other without needing to reach entire conclusions sequentially.

1

fwubglubbel t1_j2hfibn wrote

As proven by the other responses, the answer to your question is No. After years of searching, I have never found anyone who can clearly and effectively explain how quantum computing works.

16

village_aapiser OP t1_j2hirsu wrote

If no one is able to explain it, how are people actually managing to develop it?

4

Late-Pomegranate3329 t1_j2hk8re wrote

I think this comes down to it being such a complicated topic, that simplifying it to an easy-to-understand way requires losing so much complexity that it's watered-down to this meh explanation. People do understand it, but finding a way to turn the equations and theory into words everyone understands is quite hard.

14

Vishnej t1_j2hkhvj wrote

Poorly.

Slowly.

At extremely low temperatures, where thermal energy and electrical charge behave a bit differently.

>Several hardware companies, such as Google, Honeywell, IBM, and Intel, have built gate model quantum computers, that are now available in the marketplace. These devices must function at very low temperatures, and require expensive refrigeration technology. It turns out that it is very challenging to build reliable qubits, and very difficult to incorporate them into chips. All of these companies are presently working to scale up their machines.

It has been a largely theoretical math-physics construct for much of its existence, first proposed in 1980 and 1981. Every time it goes another step further toward being physically realized, people celebrate, but compared with the progress of conventional digital computing, we're still in the 1940s or earlier.

A quantum computing algorithm is a highly specialized arrangement of hardware, and the unit parts don't rapidly generalize to other tasks the way digital computing does. Only certain tasks are amenable at all to quantum algorithms being developed for them. If we can find one task ("solving a specific encryption algorithm with a specific key length in bits") that it's well suited for, we could in theory do it incomparably faster than digital computers ever could (e.g. "solve the puzzle in 2^64 guesses instead of 2^128 guesses, which would take longer than the universe has existed").

This point, where it's indisputable that quantum computing is faster for a specific task, is termed "quantum supremacy", and it's part of proving that you have a working quantum computer, whatever's inside the black box. We're starting to get to the point where quantum supremacy is arguably relevant for real physical devices, though not yet to the point that it's reshaping our world for even one practical computing task. As far as I can tell, everyone expects that first task to be breaking conventional encryption algorithms, which will then give way to post-quantum encryption algorithms built on principles that quantum algorithms don't help with nearly as much.
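
To put rough numbers on that 2^64-versus-2^128 comparison (back-of-the-envelope only, using a made-up rate of a billion guesses per second):

```python
SECONDS_PER_YEAR = 60 * 60 * 24 * 365
GUESSES_PER_SECOND = 1e9   # made-up rate, just for a sense of scale

for label, guesses in [("2^128 guesses (classical)", 2**128),
                       ("2^64 guesses (quadratic quantum speedup)", 2**64)]:
    years = guesses / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{label}: ~{years:.2e} years")

# 2^128 guesses (classical): ~1.08e+22 years (the universe is ~1.4e+10 years old)
# 2^64 guesses (quadratic quantum speedup): ~5.85e+02 years
```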

4

kernal42 t1_j2hjf1w wrote

Consider searching for an item in an unsorted list of length N. There's no classical algorithm that lets you find your item in fewer than N/2 queries, on average. This probably makes sense naively.

Grover's algorithm, a quantum computer algorithm, can find your item in sqrt(N) queries.

This seems impossible, but it works because quantum computing is fucking magic.

Edit: had algorithm name wrong.

16

kernal42 t1_j2hk0s2 wrote

To add, more seriously: there are other quantum algorithms that would revolutionize (or disrupt) our lives as we know them. The most obvious example is Shor's algorithm which, like Grover's above, does something more efficiently than we know how to with classical computers; in this case, factoring numbers. This matters because a majority of public-key encryption algorithms rely on the difficulty of factoring large numbers. If/when someone figures out how to build a large enough quantum computer, all messages sent with this encryption (future or past) will be trivially decrypted. This breaks so much.

NB elliptic curve public-key cryptography doesn't save us here either: a variant of Shor's algorithm breaks it too, which is why the post-quantum schemes being standardized are built on different math (lattices, hashes, and so on).

5

Cryptizard t1_j2hzmlw wrote

That’s not really how Grover’s algorithm works. It can find the correct preimage of a function with only O(sqrt(N)) calls to the function. It can’t find things in a list in less than O(N) time because it would take that long just to read the list elements into registers.
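
For a sense of scale, plugging numbers into the usual query-count estimates (roughly N/2 function calls on average classically versus about (pi/4) * sqrt(N) calls for Grover; no quantum simulation involved):

```python
import math

for N in (10**6, 10**12, 2**128):
    classical_calls = N / 2
    grover_calls = (math.pi / 4) * math.sqrt(N)
    print(f"N = {N:.1e}: classical ~{classical_calls:.1e} calls, "
          f"Grover ~{grover_calls:.1e} calls")
```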

3

Terrible_Toe t1_j2hdhd2 wrote

First off, Lex Fridman had a really good podcast on this. Quantum computers use a different kind of processor that is specialized for a specific problem, and can solve in days or even hours a problem that could take decades on today's supercomputers. A classic computer is great at solving many problems but doesn't specialize in any specific one of them. The use case I find interesting is using quantum computing to solve complex medicinal, biological, and chemistry problems, i.e. developing new medicines in an extremely accurate and efficient process.

14

[deleted] t1_j2hfq5v wrote

A true quantum computer can perform the calculations a normal computer does in fractions of the time. For instance, you could run a program to solve for the asymmetric cryptography keys for a Bitcoin (hacking ownership of the Bitcoin) that could take years to complete on a modern computer. A quantum computer is so much faster it could do this in fractions of a second. The digital cryptographic cybersecurity solutions used for banking and stock market accounts would also be similarly at risk of being made obsolete in fractions of a second. Luckily, true quantum computers as they exist now are only functional for maybe seconds at a time in isolated development sandboxes as they are only operational at -273 Celsius and below. It's not a technology that could make it to being a household device, probably, ever.

6

TayoEXE t1_j2hjkq9 wrote

"-273 Celsius and below"

Wait, isn't that Absolute Zero? How can there be temperatures lower than that?

4

Raevix t1_j2hpftg wrote

Absolute zero is -273.15C, which is frequently rounded to -273C for popular consumption. However, that 0.15C is like a buffer the size of an ocean to a scientist.

6

TayoEXE t1_j2hqizs wrote

Huh, interesting. That makes sense though!

2

[deleted] t1_j2jr8z7 wrote

Now you're understanding the real problem with the technology EVER becoming operational at any practical level.

2

TayoEXE t1_j2jtfz1 wrote

Time to break the laws of thermodynamics!

2

C00lerking t1_j2ieyh1 wrote

Here is how I think of it (could be wrong, could be too simple).

Traditional computers give information in binary. For example: it is either this or it is not this; it is either on or it is not on.

Quantum computers provide a spectrum of answers: it is on, it is off, it is somewhat on.

Some computational problems (the things computers solve at their root level) are fine with binary math and don’t benefit from what quantum offers. Counting problems for example. Some computational problems, particularly ones that require advanced math will benefit from computational capabilities of a quantum computer.

Hope this helps.

This answer by the way is right, kind of right and wrong all at the same time. QUANTUM!

5

No_Ninja3309_NoNoYes t1_j2hv3z1 wrote

The quantum world deals with the very small. It's really hard to grasp because we're not able to imagine that kind of scale. The atom is like a miniature solar system with lots of empty space. At the beginning of the previous century, Rutherford bombarded gold atoms with alpha particles (helium nuclei), and some of the particles were scattered back. There were other strange experiments with light and slits which made people think that light was made of waves, yet other experiments showed it also behaved as particles: the photoelectric effect showed that light can interact with matter, producing electricity.

Theoreticians came up with mathematical functions that correspond to probabilities: quantum wave functions. In this theory some things are unknowable. Einstein scoffed that 'God does not play dice!' But the theory has been tested. We can compute probabilities. This is done with special integrals, mathematical operations. But the gist of it is that particles are not localized. They can go through potential barriers: quantum tunneling. Its equivalent in our world would be walking through a wall. And there is the Heisenberg uncertainty principle, which states that you can't know everything about a particle. You can measure certain properties accurately, but that will prevent you from knowing more about other properties.

The wave functions, when turned into numbers through the appropriate operations, give a probability between zero and one. However, in the quantum world everything is much quicker than in our world, and very little energy makes quite a difference. Quantum states want to be in their ground state. You can think of them as pendulums at rest most of the time. If you give them a quantum of energy, they want to go back. This happens in such a short time that none of us can really imagine it.

Unfortunately, we need to bring qubits out of their ground state to do meaningful computation. This means shielding them from outside influences. Which currently requires a special mix of helium isotopes, tubing, and other stuff. This part is hard to shrink.

Also, because the quantum world is busy chaos and decoherence (falling back to the ground state) exists, quantum computers make many errors. Algorithms to correct them are still in development.

Qubits work fast and due to their nature can explore possibilities that classic computers simply can't. They could be useful for AI systems which will have impact for consumers. For the foreseeable future the resources required are unaffordable for almost everyone.

But never say never. If we get better algorithms and someone figures out how to keep the qubits coherent longer at somewhat higher temperatures, then perhaps, just by having many qubits, we could get closer to your wish.

3

Kheras t1_j2idz53 wrote

A traditional CPU is measured in actions per second. So 4GHz is 4 billion calculations per second.

It works by performing these actions on each cycle of an internal clock, and these actions are based in binary (0 or 1).

A qubit has more than two states. So by encoding a qubit’s states you can ‘use’ more bits per clock. It’s a multiplicative performance increase. And additional tricks/encoding can make it roughly exponential.

3

Sobutai t1_j2iiy80 wrote

When I was in college, one of my friends was a math major. I asked him what math he was studying for, and he said, "Quantum Mathematics." When I asked him what that even meant, he looked up at me with eye bags that wouldn't pass a TSA check in and a dead stare, then said "weird, space math."

I'm going to assume it's like that but with computers.

3

Grayman222 t1_j2hhajz wrote

https://www.youtube.com/watch?v=Eak_ogYMprk

A lot of encryption revolves around multiplying two primes, e.g. 3 * 5 = 15 (just with way bigger numbers). A classic computer has to try candidates in sequence and brute-force which primes make up 15, if any. A quantum computer can be asked "15?" and the superposition is somehow trying all the answers at once, giving you 3 * 5 in a single step.

General-purpose CPUs have bits, mostly 64-bit now; that's how complex a task they can do per cycle. Even a small quantum computer that can solve the basic problem above needs to be supercooled with lasers and such, though. So at least today it can't instantly crack any hard drive or mine all the bitcoins left, because those tasks are too large to fit in the instructions per clock the processor can do.

2

Grayman222 t1_j2hhl00 wrote

Quantum science in general I understand as: quantum particles are smaller than electrons but act similar, except that

by default they exist in a superposition of positive and negative at the same time until you decide to measure them,

and they exist in pairs, and applying a charge to one also flips its twin, even if separated far apart, so there should not be a cause/effect between the two.

1

WWDB t1_j2hht4r wrote

Let’s say there is a football game between the Rams and the Vikings. A classic computer will pick a winner based on which team has the better record.

A quantum computer will analyze every possible statistic available about both teams to mankind and come up with a prediction.

2

1AlphaGeek1 t1_j2hhv6k wrote

Quantum computing takes a different approach than normal computing does. As far as my understanding goes, they are more likely to specialize in a certain area than a classical computer is. Take for example a problem that can have a lot of variables and possibly trillions of combination answers. A classical supercomputer might fail at this task because the brute force computing it does would require memory that it simply cannot possess. A quantum computer might instead create a model where patterns may emerge, with the best combination of patterns being the answer. This is however a very rudimentary way of explaining the differences.

As for the possibility of there being quantum PCs, the chances seem very slim. For one, quantum computers require their processors to be supercooled; we are talking, at the very least, only a few degrees above absolute zero. They may become more common, but quantum PCs don't seem achievable, at least for now.

2

[deleted] t1_j2hzi5m wrote

For the ELI5 answer:

A normal computer will compute things as True or False, so a binary 1 or a 0. This forces the computer into only two possible answers for any given question. The issue is that no matter how big or small the problem is, it has to be solved using that exact same method; it just scales accordingly, whether it's 2+2 or some calculus equation. This is very time consuming, and CPUs today get around it with sheer horsepower.

A quantum computer understands that not everything is solved with a True or False statement. It has the capability of engaging with more possible options right from the start. It asks "How do all of these bits relate to my problem?" rather than "Is this true? No. Is this true? Yes. Is this true? No."

This might not be entirely accurate but I think it illustrates the point well enough.

2

DropsTheMic t1_j2i2khj wrote

ChatGPT got you homie:

Prompt: explain it to me like I'm a drunk five year old: what is the difference between classic computing and quantum computing

Sure! Classic computing is like using a regular old calculator to do math. It can only do one thing at a time, and it can only do things that you tell it to do using a special language called code. Quantum computing is like using a super fancy calculator that can do lots of math problems at the same time, and it can do things that you might not have even thought to ask it to do. It's really fast and powerful, but it's also really hard to understand and use.

2

Fisk77 t1_j2i5qxn wrote

Classic computing represents information in 0 and 1 binary digits (bits). Quantum computing bits (qubits) can represent any mix of 0 and 1 at once, which, for certain problems, allows quantum computers to process information exponentially faster than classic computers.

2

Late-Pomegranate3329 t1_j2hck3z wrote

From my understanding and keeping it short and sweet. No. You will never run a QC as your home computer. They do have some very cool uses, but you will only see them pop up in a few places. Things like better computer security and more accurate weather forecasts. Most of the other stuff will be in ways that a consumer wouldn't really see.

1

village_aapiser OP t1_j2hcwet wrote

It's hard to believe that we won't have any PC-sized quantum computers for personal use in the 2050s.

−2

Late-Pomegranate3329 t1_j2hi8xo wrote

You've probably already gotten a satisfactory answer, but just in case. It's not that it can't happen, but that it doesn't make sense to happen. In 2050, it may be possible to have one set up in a home environment, but they work in such a different way than normal computers and they have such a different use case, that they wouldn't make for a good home PC. It's like this: I can buy a demilitarized APC, but just because I can doesn't mean that it would make a better commuting vehicle than a normal car.

2

[deleted] t1_j2hdbbx wrote

[removed]

1

Late-Pomegranate3329 t1_j2hiiib wrote

They may have working QCs that they are doing research on/with, but that's an enormous jump to having a complete consumer product ready for use by the general public.

1

dynedain t1_j2hnxuf wrote

You missed the news last week - they’re available now starting at $8K

https://www.tomshardware.com/news/spinq-introduces-trio-of-portable-quantum-computers

Sure, there's nothing to do with them yet and they are very pricey. But the first home computers were the same way, and in 20 years they went from an expensive hobbyist toy to ubiquitous in most middle-to-upper-class homes.

I think it's entirely reasonable that accessible and cheap-ish quantum computers will be available as personal/home devices. If nothing else, because we'll need quantum chips in our devices to handle encryption, since cheap quantum computing will destroy classical encryption algorithms.

2

Late-Pomegranate3329 t1_j2i4un8 wrote

I love being proven wrong. I did indeed miss that news. I do think at that price some people will buy one just because, but it's still not going to take the place of a home computer. The problems they are good at are not what we use classical computers for. I can, however, see add-on chipsets (QPU?) that could be used for the few cases where quantum computers overlap with the problems normal consumers have.

I'm still up in the air about whether they would/could be added to mobile devices. They have such strict operating conditions that I don't see the cost of manufacturing and operating them, plus the space they take up, being outweighed by something like using an encryption method that's less secure from a classical point of view but harder for quantum, and using that to send data to a quantum node that passes it along to the end user with better encryption. But I leave that to those working on the bleeding edge and eagerly await all the cool stuff they'll make.

1

solsbarry t1_j2hvels wrote

These people who keep saying this are fools. They have no idea what the future will bring.

1

winkydevil t1_j2hcyco wrote

This is a gross oversimplification, but a classical computer will think about A and then B and then C. A quantum computer can think about ABC all at the same time.

1

village_aapiser OP t1_j2hd2qc wrote

So instead of going through things sequentially, it will consider all the options at the same time? So it will be a lot faster, especially for AI stuff, right? What does quantum physics have to do with it?

2

[deleted] t1_j2hiato wrote

[deleted]

6

Late-Pomegranate3329 t1_j2hko6d wrote

This is a fantastic way to explain the difference between classical and quantum. It's so simple but gives a very clear and easy-to-understand comparison. I may borrow this for the future.

1

Cryptizard t1_j2hzr4r wrote

That’s not a very good explanation. If that were true, then quantum computers would be able to efficiently solve any problem in NP, which is not the case.

1

guidedhand t1_j2houpi wrote

ELI10: some problems get much harder when they get bigger. Rather than the difficulty scaling linearly (e.g. y = a*x), it can scale polynomially or exponentially (e.g. y = a**x).

This rule isn't necessarily true for all problems on a quantum computer. You can have problems that take an exponential amount of time to solve on a classical computer but only a polynomial amount of time on a quantum one.

1

Seemose t1_j2hpmvc wrote

Classical computer - calculates slowly, with an output answer of "yes" or "no"

Quantum computer - calculates very quickly, with an output answer of "maybe"

1

r2k-in-the-vortex t1_j2hy0xw wrote

Suppose you had a die and wanted to know how it rolls. Well, you could build a physics simulator and a 3D model and run it on a classical computer to find out, or you could just roll the die and let reality sort it out.

Quantum computing is more like the second option, except with particles and systems that exhibit quantum behavior. The important bit is that it's not just rolling the die: it's possible to map computation tasks onto the behavior of these quantum systems. That is an incredibly powerful concept; it can compute some things that are just plain not practical to compute on classical computers, because it would take longer than the age of the universe to run the computation.

The hard part is mapping useful computational tasks onto quantum computing. It's very much not like the programming we're used to on classical computers; it's very demanding in terms of actually understanding the physics involved, which is very complicated. The other hard part is building a good enough quantum computer; that's not easy, and all the presently working quantum computers have significant limitations.

In summary, don't expect to directly use a quantum computer any time soon, but if things go well, you may be able to use something created using computations done on a quantum computer that couldn't have otherwise been computed.

For example, battery manufacturers have tried to find better materials for making batteries using quantum computers. The particular experiment didn't yield useful results, but it's a good example of how one day results of quantum computing might reach a consumer.

1

Tupcek t1_j2hyi4f wrote

A normal computer has to try every combination one at a time (or one per core).
A quantum computer can try all of them at once.
It's very useful if you have a lot of combinations to try, like testing which molecule works best as a treatment for X.
It isn't very useful for tasks like "if X then Y" or "keep track of Z".
So basically, it wouldn't enhance your smartphone in any meaningful way, but in use cases where you have a lot of combinations, it's light years ahead of any other computer.

1

joenurses t1_j2i4c3o wrote

Classic is like a light switch on off. Quantum is like a switch with more settings. Yellow green blue etc. these are called states. More states helps with certain types of tasks.

1

war2death t1_j2i6q80 wrote

Simple answer: don't worry about it. It's going to be another 100 years before an individual will be able to afford one, let alone have a portable version, within your lifetime.

1

dashingstag t1_j2ibzjj wrote

The biggest difference, IMO, is that quantum is analog (continuous) while classic computing is digital (discrete).

This means quantum computing can potentially model the real world, while classic computing can only ever be an abstraction at best.

Classical computing will still be better in terms of energy tradeoff to calculate discrete problems like simple decision making, but quantum will be better for real world modelling like pathfinding where all paths can be considered at once.

How I would explain it: if you turn on a smoke machine in a maze, the smoke spreads and tries all paths at once, and the edge of the smoke (and hence the whole smoke) always knows whether it found an exit. That's like quantum. Whereas classical computing is like having multiple people try all the routes, but they still have to come back to each other before they all know where the exit is.

1

mic-cavemave t1_j2ifxkn wrote

It's all about speed and having multiple tasks to undertake, using Less power to do it, from old slow chips of the past, Pentium , to a single bio chip computer , to quantum going from binary, numeric, to automatically create even smaller and higher advanced bio micro technology. Technologically, we are still not, where we're supposed to be, based on historical evidence of other civilisations that have risen and fallen.

1

[deleted] t1_j2ih0f0 wrote

I'm sure it'll have all sorts of real world uses but it'll just end up trying to serve more ads to people.

1

Solonotix t1_j2iltb7 wrote

Here's my take on an ELI5 answer

Imagine you're travelling to some destination. A straight line is the shortest path. However, traditional computing requires tracing the path with Right/Left directions. Quantum computing provided a way to estimate a turn much like degrees on a compass. This is because of the superposition of states (not relevant to this explanation).

When measured, these will "collapse" (as-is the term for describing wave-form probabilities resolving to concrete values) into the Left/Right directions we're familiar with, but it will have arrived at the destination/solution with fewer total turns.

Sometimes this isn't a better solution. If you have a straight line path already, then you don't need to use Quantum Computing for it (they are generally slower at resolving simple problems). If you have a path that is all Left/Right turns, then Quantum Computing will likely save you no time (number crunching is often better performed on Traditional Computers).

Hope this helps

1

bahuchha t1_j2iltfm wrote

In classic computers, there is a small unit called a "bit". It can be either "on" or "off".

In quantum computers, the small unit is called a "qubit" (Q for quantum). It can be in an "on", "off", or "I don't know" state.

In layman's terms, quantum mechanics has a state of "I don't know", which basically means you don't know until you look. It's like when you were in school and your parents wanted to know whether you were studying. They had to peep into your room to really see: you may be studying, you may be dancing, you may be watching p@rn, etc. At a minute scale, a lot of things behave like you in that room; we don't know until we peep in to see what they are doing.

Coming back to quantum computers, having that third state helps certain calculations take less time than on classic computers.

Hope it helps.

1

Ps1on t1_j2inf1p wrote

Maybe I'm too late, but there's one other application that might one day reach the average consumer, and that's quantum encryption. The basic idea is that if you send out signals with specific spin states (each could be either up or down), then nobody can intercept that signal without you knowing it, because intercepting the signal will mess with it, causing some of those spin states to flip.

Now you could argue that they could piece together the original information, but with real signals that is extremely unlikely. You have to consider that most real data is extremely large, literally millions or billions of individual bits. Plus it already has a kind of encoding, because the information will probably be in a format that is only readable to a certain program. You can think about it like this: if you opened an image file in a text editor you could still see the information in the image, but it would look like gibberish. If you then changed random values in it, it would be impossible to decode that information again. Plus you will also get scrambled information on your end, because the would-be hacker will have messed with your signal too.

But yeah, whether this will ever reach average consumers, who knows.

1

REmarkABL t1_j2inxq1 wrote

The biggest benefit is security: there is no way to predict or measure a qubit without changing it, so it enables encryption keys that are theoretically impossible to crack.

Another is entanglement: a quantum-linked particle and its partner show correlated results at the exact same moment no matter how far apart they are (though, counterintuitively, this can't be used to send information faster than light).

The third is processing speed: for certain problems, a quantum computer can work through all the possible combinations in a list at once, where conventional computers have to check one at a time (incredibly quickly, but still one at a time).

1

OneChrononOfPlancks t1_j2ip094 wrote

In simplest terms, a quantum computer will never be better for your purposes as an average consumer. They'll be for science and crypto nerds.

1

PlayAccomplished3706 t1_j2ip2nn wrote

Benefit us? Depends on who you are. For example, if you like the total collapse of encryption, then yes.

1

CautiousImpact5032 t1_j2ipoja wrote

It was explained to me that normal computers work on a yes/no type system. It has to do each yes/no one at a time, that's where processing time comes into play. With a quantum computer it is able to do all yes/no questions at once, so basically instant computers.

1

Green-Future_ t1_j2irrm4 wrote

Useful comments here but if still not understood look at the recent post in r/OurGreenFuture - it might help.

1

fade2black244 t1_j2ivvt0 wrote

To simplify: classical computers use 0 or 1, known as "bits". Quantum computers use "qubits", which exist in a superposition state that can be 0 and 1 (false and true) at the same time. This allows them to perform certain complex calculations exponentially faster than classical computers. Classical computers will likely still be a thing, offloading those computations to quantum computers, perhaps over a quantum internet.
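
If it helps to see the bookkeeping behind "0 and 1 at the same time", here's a toy sketch (plain Python with NumPy, purely illustrative and not how real quantum hardware is programmed) of one qubit as a pair of amplitudes and what measuring it does:

```python
import numpy as np

# One qubit = two amplitudes: (amplitude of reading 0, amplitude of reading 1).
qubit = np.array([3/5, 4/5])          # any pair whose squared magnitudes sum to 1

probabilities = np.abs(qubit) ** 2
print(probabilities)                   # [0.36 0.64] -> 36% chance of 0, 64% of 1

# Measurement collapses the superposition to a plain classical bit.
bit = np.random.choice([0, 1], p=probabilities)
print(bit)                             # 0 or 1, nothing in between
```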

1

zethenus t1_j2iw3l3 wrote

Not a Computer Scientist. I don’t have a Computer Science degree. The way I understood it is this.

Computer works on 1 and 0. So if you apply that over a light bulb. 1 is on. 0 is off. So 8 bit is 8 light bulbs in a row. 16 bit is 16 light bulbs in a row. 32 is … so on and so on.

Each combo of 1 and 0 are used to represent something. Thus 10111001 = something. (I just made that up. No clue what it might represent)

For the majority of computers today. We managed to arrange millions of light bulbs in a grid and layers upon layers on top of each other on a single surface, and that is a single computer chip. That’s the computing today.

Now imagine we managed to do that on a cube which has 6 surfaces vs 1 on today’s computer chips. Thus one computer chip in Quantum Computing is essentially 6 computer chips of today. How much more data can be processed in a single Quantum Computing chip vs a single Computer chip today?

That’s how I know understood it. However I’ve not idea how accurate this is, if at all.

1

WindblownSquash t1_j2jlg4e wrote

Idk if this is quantum computing this is just better chip design lol

1

musingofrandomness t1_j2izc1q wrote

I am a layman when it comes to quantum computing, but I have heard it described similarly to the jump in register size in traditional computers. Every bit you add doubles the computing power and addressable memory.

So a 16 bit computer is orders of magnitude more capable than an 8 bit computer and our current generation of 64 bit computers are orders of magnitude over that.

With quantum computers instead of each additional bit increasing capacity by the power of 2, it is increased by the power of 6 (as it was explained to me at the time). So a 16 Qbit quantum computer would theoretically outperform an equivalent clock speed 64 bit digital computer.

This could be incorrect, but it was how it was presented to me by someone who works in the field.
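
The version of this I've seen quantified (which may be what the person in the field was getting at): the state of n qubits is described by 2^n complex amplitudes, so merely storing it on a classical machine doubles in size with every qubit added. A quick illustration, assuming 16 bytes per double-precision complex amplitude:

```python
BYTES_PER_AMPLITUDE = 16   # one complex number at double precision

for n in (20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * BYTES_PER_AMPLITUDE / 2**30
    print(f"{n} qubits -> {amplitudes:,} amplitudes, ~{gib:,.2f} GiB to store the state")
```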

1

KeyboardRacc00n t1_j2j5iy9 wrote

I read an article (which was probably not great) that said quantum computers can borrow from parallel universes via superposition. Then it went on to explain that what that means is just being able to use energy via superposition??? Idk man, but I was just curious what anyone who knows more about quantum computing thinks about that, because from what I'm reading here, quantum computing is just a step up from binary, right?

And just because I can see the replies coming: no, I do not believe it; yes, it's probably a shit article; I have nothing to do with this article besides taking an interest in the subject of quantum computing. Please don't insult me, insult the article.

1

Jeff_72 t1_j2j8192 wrote

Classic: 1 or 0

Quantum: 1 or 0 or (1 and 0 at the same time)

1

Futurology-ModTeam t1_j2j8caq wrote

Rule 2 - Submissions must be futurology related or future focused.

1

WindblownSquash t1_j2jlmn6 wrote

Quantum computing uses things found in quantum physics, right? Light reflection and readout is simple optics learned in early physics.

Think of it like this: instead of working with the light, they work with the atoms that create that light.

1

NOLALaura t1_j2i6y7l wrote

Doesn’t the term quantum refer to microscopic size?

0

Hentai_Yoshi t1_j2hgamg wrote

Quantum computing is based on quantum principles, so it is really fucking hard to put into layman's terms. I took quantum mechanics 1 and 2, and I also took many classes on electrical/computer engineering logic. I won't even bother to try to explain it. You need to understand so many different things, things beyond what most undergrad physics students know. Plus you'd ideally need a good understanding of computer engineering.

−6

Romeos_Crying t1_j2hmyxi wrote

You can definitely describe quantum computers in simple terms compared to our current binary computers. OP is not asking for the physics, engineering, and detailed principles, they want a simple explanation, which pretty much every comment I have seen has described some aspect of.

2