
dryingsocks t1_jdewgvj wrote

USB4 is Thunderbolt which carries PCIe?

36

[deleted] t1_jdexhp4 wrote

Yeah but it's still not quite there speed-wise. The bottleneck is still there.

3

dryingsocks t1_jdeyicc wrote

surely it's plenty for most people, especially in a laptop

19

QuietGanache t1_jdf3f1y wrote

Thunderbolt tops out at 40Gb/s (5GB/s), while PCIe gen 4 x16 tops out at about 32GB/s. This means that things like textures and new geometry will load much more slowly, so while it might be fine for CAD, you'll encounter issues with gaming, especially with modern engines that stream much more refined levels of detail.
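The Gb-vs-GB units here are easy to trip over, so a quick back-of-envelope sketch may help; the figures are rounded public link rates, not measurements:

```python
# Rough link-bandwidth comparison; figures are rounded spec numbers.
def gbits_to_gbytes(gbps: float) -> float:
    """Convert gigabits/s to gigabytes/s."""
    return gbps / 8

thunderbolt = gbits_to_gbytes(40)   # TB3/TB4: 40 Gb/s total -> 5 GB/s
pcie4_lane = 1.969                  # PCIe 4.0: ~1.969 GB/s per lane (16 GT/s, 128b/130b)
pcie4_x16 = 16 * pcie4_lane         # ~31.5 GB/s

print(thunderbolt)                        # 5.0
print(round(pcie4_x16, 1))                # 31.5
print(round(pcie4_x16 / thunderbolt, 1))  # ~6.3x gap
```

So even before protocol overhead, the Thunderbolt link is roughly a sixth of what a full gen 4 x16 slot offers.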

22

sFXplayer t1_jdgdraa wrote

Last I checked, GPUs don't saturate a gen 4 x16 link when gaming, unless the game implements DirectStorage, which not many do as of now.

10

G1ntok1_Sakata t1_jdgoorq wrote

Thing is that 40Gbps is only 1.25x the bandwidth of x8 PCIe gen 2. Accounting for USB overhead, I wouldn't be surprised if it was equal to or just barely above x8 PCIe gen 2 bandwidth. Remember how the RX 6500 got flak on release for being x4 only, because it took a huge performance hit on x4 PCIe gen 3? Note that x4 PCIe gen 3 happens to be about the same bandwidth as x8 PCIe gen 2. 40Gbps is pretty tiny, especially if there's overhead to deal with.
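The arithmetic in this comparison checks out; a small sketch with the usable per-lane rates (rounded, and before any USB/Thunderbolt protocol overhead):

```python
# Usable per-lane rates in Gb/s after line encoding (rounded spec figures)
pcie2_lane = 4.0    # PCIe 2.0: 5 GT/s with 8b/10b encoding -> 4 Gb/s
pcie3_lane = 7.88   # PCIe 3.0: 8 GT/s with 128b/130b encoding -> ~7.88 Gb/s

gen2_x8 = 8 * pcie2_lane   # 32 Gb/s
gen3_x4 = 4 * pcie3_lane   # ~31.5 Gb/s, roughly the same as gen2 x8
tb = 40.0                  # Thunderbolt raw link rate, Gb/s

print(tb / gen2_x8)        # 1.25, the figure quoted above
```

Which is why a Thunderbolt eGPU lands in the same bandwidth class as the RX 6500's hobbled x4 gen 3 link.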

1

celaconacr t1_jdfioj0 wrote

Razer Core X and similar eGPU enclosures seem to do OK. I've seen an RTX 3080 benchmarked at about 20% slower than in a desktop, slightly better if you don't have to send the graphics output back down the Thunderbolt connection. A large amount of memory on the card certainly helps with the bandwidth. I can't really see it being an issue for most graphics cards when heat would throttle them first in a laptop.

9

Svenskensmat t1_jdgwklj wrote

A desktop RTX 3080 maxes out at around 16GB/s of bandwidth utilisation over PCIe. 32Gb/s is the equivalent of 4GB/s, so it's not nearly enough.

3

celaconacr t1_jdhs888 wrote

It doesn't particularly matter what it maxes out at if the utilisation over time is much lower. E.g. a spike loading a new texture set would potentially take four times as long, but if there isn't another texture set queued to load right behind it, that's a small hit to performance.
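To put that spike argument in numbers, here's a hypothetical 512MB texture burst at each link speed; the burst size and rates are illustrative assumptions, not benchmarks:

```python
texture_mb = 512                       # hypothetical texture burst size
desktop_mb_s = 16 * 1024               # ~16 GB/s peak over a desktop PCIe slot
egpu_mb_s = 4 * 1024                   # ~4 GB/s over the TB3 PCIe tunnel

t_desktop = texture_mb / desktop_mb_s  # ~0.031 s
t_egpu = texture_mb / egpu_mb_s        # ~0.125 s

# A one-off extra stall of under 100 ms; it only compounds if
# bursts arrive back to back faster than the link can drain them.
print(round(t_egpu - t_desktop, 3))
```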

The main bottleneck for modern graphics cards usually isn't the PCIe link. As I noted, performance tests show about a 20% hit on a desktop-class card with current games. Before eGPUs existed, similar tests were done with GPUs running on 4 and 8 lanes, with similar results. The result will vary by game, texture volume and the card's memory: the more memory the card has, the less it will use PCIe.

For the future, Thunderbolt 5 will be 80Gb/s bi-directional, or up to 120Gb/s in one direction with Bandwidth Boost, so it will be even less of an issue. PCIe 5 is a similar doubling, but again utilisation over time is likely low.
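Those headline rates in GB/s terms, for comparison with the earlier figures (rounded spec numbers):

```python
tb5_sym = 80 / 8     # TB5: 80 Gb/s bi-directional -> 10 GB/s each way
tb5_boost = 120 / 8  # TB5 Bandwidth Boost: 120 Gb/s one way -> 15 GB/s
tb3 = 40 / 8         # TB3/TB4 for comparison -> 5 GB/s

print(tb5_sym, tb5_boost, tb5_sym / tb3)  # 10.0 15.0 2.0
```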

1

Sol33t303 t1_jdgy4ue wrote

>Thunderbolt tops out at 40Gb/s, PCIe gen 4 x16 tops out at 32GB/s

Last I knew, GPUs don't use nearly that much PCIe bandwidth if you're not SLI-ing or something.

It has it available, but doesn't use it.

Could become more relevant as DirectStorage becomes a thing, though.

2

QuietGanache t1_jdh4eez wrote

That's reasonable, I hadn't considered where the textures and geometries are coming from.

1

Iintl t1_jdgxcvz wrote

It's not. TB3 enclosures are known to offer significantly worse performance than plugging the card directly into a desktop, and the performance gap only widens as GPUs get more powerful (and demand more bandwidth). Off the top of my head, a moderately powerful GPU like the RTX 3080 could see anywhere from a 20% to 50% performance drop in an eGPU enclosure.

2

Sol33t303 t1_jdgyc8x wrote

How does that compare to laptop dGPUs, though? Those are already neutered performance-wise for heat and power-consumption reasons, both of which also get worse as you scale up to more powerful GPUs, just as the PCIe bandwidth penalty does.

1

Purple_Form_8093 t1_jdhto8y wrote

An x4 link falls flat for most midrange-and-up graphics cards. Thunderbolt/USB4 is really cool, just not really for eGPUs. It works, technically, but it has its own issues.

3