BobbyThrowaway6969 t1_j6lg0ft wrote

The CPU is a mathematician that sits in the attic working on a new theory.

The GPU is hundreds of thousands of 2nd graders working on 1+1 math all at the same time.

These days, the CPU is more like 8 mathematicians sitting in the attic, but you get the point.

They're both suited for different jobs.

The CPU could update the picture that you see on the display, but that's grunt work.

Edit: I don't mean the cores in a GPU are stupid, just that their instruction set isn't as complex & versatile as a CPU's.

663

TheRomanRuler t1_j6m5ezh wrote

That is actually a good visualization, thank you.

120

Dysan27 t1_j6nwk50 wrote

No no. This is a good visualization.

72

JackSartan t1_j6or86g wrote

I knew exactly what that was before clicking on it. Mythbusters and paintball is a good mix. If anyone is confused, that's the video to watch.

21

Icolan t1_j6ou77p wrote

Wow, I don't know how I have not seen that before.

4

Thrawn89 t1_j6mvpr4 wrote

It's a great explanation, but there are a few issues with the metaphor's correctness.

The kids in one classroom are all working on the exact same step of their individual problems at the same time. The classroom next door is on a different step for their problems. The entire school is the GPU.

Also, replace the kids with undergrads, and note that they don't work on 1+1 problems; they work on the exact same kind of problems the CPU does.

To translate: the reason they are undergrads and not mathematicians is that GPUs are clocked lower than CPUs, so they don't do the individual work as fast. However, the gap between mathematicians and kids was a few orders of magnitude too wide.

Also, they do work on problems of the same complexity. GPUs have been general compute platforms rather than strictly graphics processors ever since the programmable shader model was introduced, making them Turing complete. Additionally, a GPU shader can be as complex as a C program these days.

The classroom in this analogy is what DX calls a wave, and each undergrad is a lane.

In short, there is no large difference between a GPU and a CPU besides the fact that the GPU uses what is called a SIMD (single instruction, multiple data) architecture, which is what this analogy was trying to convey.

Programs, whether CPU machine code or GPU machine code, are basically a list of steps to perform. A CPU runs the program by going through each step and running it on a single instance of state. A GPU, however, runs the same step on multiple instances of state at the same time before moving on to the next step. An instance of state could be a pixel, a vertex, or just a generic compute instance.
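
A minimal CUDA sketch of that difference (the function names and sizes here are mine, purely for illustration): the CPU routine walks through the instances one at a time, while the GPU kernel is written as the steps for a single instance and the hardware runs it for many instances at once, one per lane.

```cuda
#include <cstdio>

// GPU: every thread (lane) runs this same list of steps, but each one on
// its own instance of state (here, one array element).
__global__ void scale_add(float* data, float a, float b, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // which instance am I?
    if (i < n)
        data[i] = a * data[i] + b;  // same step, run for many instances at once
}

// CPU: one worker walks through the instances one after another.
void scale_add_cpu(float* data, float a, float b, int n) {
    for (int i = 0; i < n; ++i)
        data[i] = a * data[i] + b;
}

int main() {
    const int n = 1 << 20;              // ~a million "instances of state"
    float* d = nullptr;
    cudaMallocManaged(&d, n * sizeof(float));
    for (int i = 0; i < n; ++i) d[i] = float(i);

    // Launch enough 256-lane groups to cover all n instances.
    scale_add<<<(n + 255) / 256, 256>>>(d, 2.0f, 1.0f, n);
    cudaDeviceSynchronize();

    std::printf("%f\n", d[42]);         // expect 2*42 + 1 = 85
    cudaFree(d);
    return 0;
}
```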

27

espomatte t1_j6n3owq wrote

Sir, this is an ELI5

51

Thrawn89 t1_j6n6gt9 wrote

Sir, read rule 4.

−2

ap0r t1_j6nbo15 wrote

As a person who has over ten years of experience building and repairing computers, I understood what you meant, but I also can see how a layperson would not understand anything you wrote.

27

Yancy_Farnesworth t1_j6nladg wrote

> In short there is no large difference between GPU and CPU besides the GPU uses what is called SIMD (single instruction, multiple data) architecture which is what this analogy was trying to convey.

The GPU is heavily geared towards floating point operations, while the CPU is less so. CPUs used to rely on a separate FPU chip; transistors eventually got small enough that the FPU could fit on the CPU. Then the need for dedicated floating point performance skyrocketed with the rise of 3D games, which ultimately required a separate dedicated chip that could do absurd numbers of floating point operations in parallel, resulting in the GPU.

This floating point performance is why GPUs are a great tool for AI/ML and why Nvidia came to dominate hardware dedicated to AI/ML applications.

4

Thrawn89 t1_j6no21t wrote

GPUs are not better at floating point operations; they are just better at doing them in parallel, per SIMD, like any other operation that benefits from SIMD.

In fact, floating point support is generally not quite as good as on a CPU. Some GPUs do not even natively support double precision, or don't natively support every floating point operation. Then there are the denorm behaviors and rounding modes, which vary from one implementation to the next. Many GPUs take shortcuts by not implementing a full FPU internally and converting to fixed point instead.
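
To make the trade-off concrete, here's a small hedged CUDA sketch (the kernel names and sizes are illustrative, not taken from any particular product): double precision is available in the programming model, but on many consumer GPUs the FP64 kernel would run at a small fraction of the FP32 rate because the chip has far fewer double-precision units, which you would see if you timed the two launches.

```cuda
#include <cstdio>

// Same arithmetic, two precisions. Both kernels compile and run on CUDA
// hardware, but many consumer GPUs have far fewer FP64 units than FP32
// units, so timing these launches (e.g. with cudaEvent) would show the
// double version running much slower than the float version.
__global__ void axpy_f32(float a, const float* x, float* y, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

__global__ void axpy_f64(double a, const double* x, double* y, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float  *xf, *yf;
    double *xd, *yd;
    cudaMallocManaged(&xf, n * sizeof(float));
    cudaMallocManaged(&yf, n * sizeof(float));
    cudaMallocManaged(&xd, n * sizeof(double));
    cudaMallocManaged(&yd, n * sizeof(double));
    for (int i = 0; i < n; ++i) { xf[i] = 1.0f; yf[i] = 2.0f; xd[i] = 1.0; yd[i] = 2.0; }

    axpy_f32<<<(n + 255) / 256, 256>>>(3.0f, xf, yf, n);
    axpy_f64<<<(n + 255) / 256, 256>>>(3.0,  xd, yd, n);
    cudaDeviceSynchronize();

    std::printf("float: %f  double: %f\n", yf[0], yd[0]);  // both print 5.000000
    cudaFree(xf); cudaFree(yf); cudaFree(xd); cudaFree(yd);
    return 0;
}
```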

−1

BobbyThrowaway6969 t1_j6p2not wrote

Double precision is the black sheep of the family; it was just thrown in for convenience. GPUs skimp on double precision because what do you care if a vertex is a millionth of a pixel off or a billionth? Graphics has no use for double precision, so why make the chip more expensive to produce?

Compute programming might need it but not for the general public.

3

Thrawn89 t1_j6p76g9 wrote

Agreed, which is why it's wrong to say that GPUs are better at floating point operations than CPU.

1

BobbyThrowaway6969 t1_j6pcmfu wrote

Depends on how you look at it. Their circuitry can handle vector math more efficiently.

2

Thrawn89 t1_j6pdf0b wrote

No, most GPUs haven't had vector instructions for maybe a decade. Modern GPUs use SIMD waves for parallelization with scalar instructions.
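
Roughly what "scalar instructions across a SIMD wave" looks like, sketched in CUDA terms (a warp is CUDA's name for what DX calls a wave; the kernel and variable names here are illustrative): each lane executes ordinary scalar loads and adds, and the parallelism comes from the hardware running the same instruction for all 32 lanes of the warp at once, not from vector opcodes.

```cuda
#include <cstdio>

// One warp (wave) of 32 lanes. Every instruction below is scalar from a
// single lane's point of view; the hardware runs it for all 32 lanes at once.
__global__ void warp_sum(const int* in, int* out) {
    int lane = threadIdx.x;          // this thread's lane index within the warp
    int v = in[lane];                // scalar load, one element per lane

    // Tree reduction: at each step a lane adds the value held by the lane
    // 'offset' positions above it. __shfl_down_sync moves data between lanes
    // of the same warp without touching memory.
    for (int offset = 16; offset > 0; offset >>= 1)
        v += __shfl_down_sync(0xffffffff, v, offset);

    if (lane == 0) *out = v;         // lane 0 ends up holding the warp's total
}

int main() {
    int *in, *out;
    cudaMallocManaged(&in, 32 * sizeof(int));
    cudaMallocManaged(&out, sizeof(int));
    for (int i = 0; i < 32; ++i) in[i] = i + 1;   // 1..32

    warp_sum<<<1, 32>>>(in, out);                 // one block of exactly one warp
    cudaDeviceSynchronize();

    std::printf("sum = %d\n", *out);              // expect 528 = 32*33/2
    cudaFree(in);
    cudaFree(out);
    return 0;
}
```

The `__shfl_down_sync` call is a warp-level intrinsic that passes a value from one lane to another within the same wave, which only works cheaply because the lanes are executing in lockstep.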

2

BobbyThrowaway6969 t1_j6p131z wrote

I left "1+1 math problems at the same time" pretty vague on purpose. Math in my analogy isn't referring to processor arithmetic, it refers to "stuff" a processor can do. They don't all have to be on the same task. Some can handle vertices while others handle pixels.

>they work on the exact same kind of problems the CPU does.

They can do arithmetic the same way, sure, but you wouldn't exactly expect to be able to communicate with a mouse & keyboard using one of the cores in a GPU.

The instruction set for a GPU (based around arithmetic) is definitely nothing like the instruction set of a CPU lol. That's what I meant by 2nd grader vs mathematician.

3

Thrawn89 t1_j6p8b42 wrote

Each wave can only work on a single task. You can't process vertices and pixels in the same wave (classroom). Other cores (classrooms) can be working on other tasks, though, which is what I said above.

1

Ancient-Ad6958 t1_j6mkl09 wrote

this made me chuckle. someone please draw this

10

AndarianDequer t1_j6mrm0z wrote

Yes, and please draw the rest of the characters around the house that do everything else.

2

Horndog2015 t1_j6n2tbo wrote

I can see the GPU walking around pushing the characters. Some of the fans waving a giant leaf at parts to cool them. The heat sink is talking to the CPU, saying to put the heat in 'em. The keyboard is delegating duties to the CPU, and the GPU is saying "alright then." The RAM is auditing everything, and the HDD is like "Wow! This is an all-you-can-eat buffet!" The OS is just saying "I'll throw some papers on your desk. Figure it out." The MoBo is preaching to everyone about how it brings everyone together. The disc player just says "Whatever! I'm calling in today!"

3

Ancient-Ad6958 t1_j6n6ong wrote

And make it cartoonish enough so we can use it to teach kids how computers work

2

Agifem t1_j6mqhze wrote

>but that's grunt work.

Excellent ELI5 explanation, through and through to the last word.

7

BigDisk t1_j6mu8to wrote

You just made me stop and work out whether hundreds of thousands of 2nd graders really are more expensive than 8 mathematicians.

I still could not come up with an answer.

EDIT: I'm getting downvoted and "um, ackshually"'d because of a dumb joke. Never change, Reddit.

4

king_27 t1_j6mvk9n wrote

I honestly have no idea what a good salary is for a mathematician, but according to Google the median salary is around $100k p/a in the US.

Let's say we're paying the 2nd graders in cookies and juice boxes. Even if you're only spending $1 per child, that's still $100k per day at a minimum. The math checks out.

6

RhynoD t1_j6nkzsz wrote

  1. It may be thousands of cores in the GPU, but probably not hundreds of thousands.

  2. The GPU is almost its own complete computer system that comes with everything else needed to run, whereas the CPU is only the CPU and maybe an OK heat sink and fan. The GPU unit comes with its own cooling solution, its own RAM, essentially its own motherboard to control the chips and interface with the actual motherboard, etc.

So to take the analogy way too far: the CPU is just the PhD mathematician, but you still have to pay for his office and air conditioning and all the paper he needs to do the work and the whole campus full of TAs and whatnot.

The GPU is like paying for the entire elementary school complete with teachers, cafeteria, and supplies, and you drop that entire school onto your campus next to the PhD's office.

3

rob_allshouse t1_j6o83cf wrote

These are incorrect.

The GPU and CPU are similar. A “graphics card” has all of these things. A GPU in an SoC would have similar limitations to a CPU.

But consumers don't buy GPUs; companies like MSI do, and they integrate them into a graphics card. Consumers do buy CPUs.

2

LiamTheHuman t1_j6o9di5 wrote

GPU is often used to refer to the graphics card as a whole

5

Zombieattackr t1_j6ohnr9 wrote

Yep, no reason a CPU couldn't output video; in fact some do! And I'm not even referring to integrated graphics in APUs, I'm talking about Arduinos and stuff. Hook them up to little LED matrices, 32x128 OLED displays, or whatever else you desire, and boom, you have a CPU giving a display output. These are just very low resolution, low refresh rate, and often black and white. You could obviously do better with a real CPU, but it would still be really bad.

1