
BobbyThrowaway6969 t1_j6p2not wrote

Double precision is the black sheep of the family; it was mostly thrown in for convenience. Consumer GPUs have little to no double-precision hardware, because what do you care if a vertex is a millionth of a pixel off or a billionth? Graphics has no use for doubles, so why make the chip more expensive to produce?

Compute workloads might need it, but nothing the general public runs does.
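To put some numbers on both points, here's a minimal CUDA sketch (the kernels, names, and figures are mine, not from the thread): the same multiply-add in float and double, plus the worst-case float rounding error for a pixel coordinate on a 4K-wide screen. On most consumer GPUs the double kernel runs at a small fraction of the float one's throughput, commonly somewhere between 1/16 and 1/64 depending on the card, because the die carries very few FP64 units.

```cuda
// Sketch only: same scale-and-offset in float and double, plus the host-side
// arithmetic showing how small float rounding error is for screen coordinates.
#include <cmath>
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale_f32(float* v, float s, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) v[i] = v[i] * s + 0.5f;      // FP32 path: plentiful units
}

__global__ void scale_f64(double* v, double s, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) v[i] = v[i] * s + 0.5;       // FP64 path: far fewer units on consumer parts
}

int main() {
    const int n = 1 << 20;
    float*  f; cudaMallocManaged(&f, n * sizeof(float));
    double* d; cudaMallocManaged(&d, n * sizeof(double));
    for (int i = 0; i < n; ++i) { f[i] = float(i); d[i] = double(i); }

    scale_f32<<<(n + 255) / 256, 256>>>(f, 1.0f / n, n);
    scale_f64<<<(n + 255) / 256, 256>>>(d, 1.0  / n, n);
    cudaDeviceSynchronize();

    // Float has a 24-bit significand, so relative error <= 2^-24.
    // Across a 3840-pixel-wide screen that's well under a thousandth of a pixel.
    printf("worst-case float error at x=3840: %g pixels\n",
           3840.0 * std::ldexp(1.0, -24));

    cudaFree(f); cudaFree(d);
    return 0;
}
```

Compile with `nvcc` to try it; the printed bound is roughly 0.00023 px, already far below anything a display can show.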

3

Thrawn89 t1_j6p76g9 wrote

Agreed, which is why it's wrong to say that GPUs are better at floating-point operations than CPUs.

1

BobbyThrowaway6969 t1_j6pcmfu wrote

Depends on how you look at it. Their circuitry can handle vector math more efficiently.

2

Thrawn89 t1_j6pdf0b wrote

No, most GPUs haven't had vector instructions for about a decade. Modern GPUs parallelize by running scalar instructions across SIMD waves.
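A rough CUDA sketch of what that looks like in practice (names are mine): each thread's code is scalar, even where the source is written with a `float4`, and the SIMD-style throughput comes from the hardware running 32 threads per warp (AMD: wavefront) in lockstep.

```cuda
// Sketch of the SIMT/"wave" model: one vertex per thread, scalar math per thread,
// parallelism supplied by the 32-wide warp rather than a vector ALU.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void transform(const float4* in, float4* out, float scale, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one vertex per thread
    if (i >= n) return;

    float4 v = in[i];
    // Written as "vector math", but the compiler emits a scalar multiply per
    // component; the parallel width comes from the warp, not the instruction.
    v.x *= scale; v.y *= scale; v.z *= scale; v.w = 1.0f;
    out[i] = v;
}

int main() {
    const int n = 1024;
    float4 *in, *out;
    cudaMallocManaged(&in,  n * sizeof(float4));
    cudaMallocManaged(&out, n * sizeof(float4));
    for (int i = 0; i < n; ++i) in[i] = make_float4(float(i), float(i), float(i), 1.0f);

    transform<<<(n + 255) / 256, 256>>>(in, out, 0.5f, n);
    cudaDeviceSynchronize();

    printf("out[10] = (%g, %g, %g, %g)\n", out[10].x, out[10].y, out[10].z, out[10].w);
    cudaFree(in); cudaFree(out);
    return 0;
}
```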

2