nipsen t1_jdekv15 wrote

Oh, good. Finally USB is fast enough that we don't have to worry about internal motherboard contacts. Now we can just have a small enclosure and plug an external card into any laptop-sleeve over USB! Perfect!

But wait, dear nerds, hearken to this! - A company will now allow you to buy a specially fitted module, for absurd amounts of money, that you can /slot into the back of only a single type of laptop/ via USB4! Not bad, eh!!! Eh!!!

(Also, here's some flak about MXM eGPU solutions, because it's not Mac enough)

Seriously, though - the keyboard modules are fantastic. That'd save me a day of pulling fused plastic pips off the plate on the back of the motherboard on a laptop, to get another keyboard replacement in. Why not sell the laptop on that? "No need to replace your entire laptop if the spacebar breaks".

39

[deleted] t1_jdeon3h wrote

USB is not nearly fast enough to carry a modern GPU.

34

dryingsocks t1_jdewgvj wrote

USB4 is Thunderbolt which carries PCIe?

36

[deleted] t1_jdexhp4 wrote

Yeah but it's still not quite there speed-wise. The bottleneck is still there.

3

dryingsocks t1_jdeyicc wrote

surely it's plenty for most people, especially in a laptop

19

QuietGanache t1_jdf3f1y wrote

Thunderbolt tops out at 40Gb/s; PCIe gen 4 x16 tops out at 32GB/s - bytes, not bits. This means that things like textures and new geometries will load much more slowly, so while it might be fine for CAD, you'll encounter issues with gaming, especially with modern engines that use much more refined levels of detail.
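
To put the unit mismatch in perspective, here's a quick back-of-the-envelope sketch (raw link rates from the figures above; real Thunderbolt PCIe tunnelling loses a bit more to overhead):

```python
# Thunderbolt is quoted in gigaBITS per second, PCIe 4.0 x16 in
# gigaBYTES per second -- convert before comparing.
tb_gbits = 40            # Thunderbolt 3/4 link rate, Gb/s
pcie4_x16_gbytes = 32    # PCIe 4.0 x16, GB/s

tb_gbytes = tb_gbits / 8              # 5.0 GB/s
print(pcie4_x16_gbytes / tb_gbytes)   # 6.4 -- the slot is over 6x wider
```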

22

sFXplayer t1_jdgdraa wrote

Last I checked, GPUs don't saturate a gen 4 x16 link when gaming, unless the game implements DirectStorage, which as of now not many do.

10

G1ntok1_Sakata t1_jdgoorq wrote

Thing is that 40Gbps is only 1.25x the bandwidth of x8 PCIe gen2. Accounting for USB overhead, I wouldn't be surprised if it was equal to or just barely above x8 PCIe gen2 bandwidth. Remember how the RX 6500 got flak on release for being x4 only, because it took a huge perf hit on x4 PCIe gen3? Note that x4 PCIe gen3 happens to be about the same bandwidth as x8 PCIe gen2. 40Gbps is pretty tiny, especially if there is overhead to deal with.
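
For anyone who wants to check that arithmetic, a minimal sketch using the commonly published usable per-lane rates (after line-code overhead):

```python
# Usable per-lane PCIe rates: gen2 = 4 Gb/s (8b/10b encoding),
# gen3 ~= 7.88 Gb/s (128b/130b encoding).
def pcie_link_gbps(per_lane_gbps: float, lanes: int) -> float:
    return per_lane_gbps * lanes

gen2_x8 = pcie_link_gbps(4.0, 8)     # 32 Gb/s
gen3_x4 = pcie_link_gbps(7.88, 4)    # ~31.5 Gb/s, about the same pipe
print(40 / gen2_x8)                  # 1.25 -- the ratio quoted above
```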

1

celaconacr t1_jdfioj0 wrote

Razer Core X and similar eGPU enclosures seem to do OK. I have seen an RTX 3080 rated at about 20% slower than in a desktop - slightly better if you don't have to send the graphics output back down the Thunderbolt connection. A large amount of memory on the card can certainly help with the bandwidth. I can't really see it being an issue for most graphics cards, when heat would stop them first in a laptop anyway.

9

Svenskensmat t1_jdgwklj wrote

A desktop RTX 3080 maxes out at around 16GB/s of bandwidth utilisation over PCIe. 32Gb/s is the equivalent of 4GB/s, so it's not nearly enough.

3

celaconacr t1_jdhs888 wrote

It doesn't particularly matter what it maxes out at if the utilisation over time is much lower. E.g. a spike loading a new texture set would potentially have to wait 4 cycles rather than 1, but if there isn't another texture set waiting to be loaded right after, it's a small hit on performance.

The main bottleneck for modern graphics cards usually isn't the PCIe lanes. As I said, performance tests show about a 20% hit on a desktop-class card with current games. Before eGPUs existed, similar tests were done with GPUs running on 4 and 8 lanes, with similar results. The result will vary by game, texture volume and the card's memory: the more memory the card has, the less it will use PCIe.

Looking ahead, Thunderbolt 5 will be 80Gb/s bi-directional, with up to 120Gb/s in one direction, so it will be even less of an issue. PCIe 5 is a similar increase, but again utilisation over time is likely to be low.
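
A toy illustration of the utilisation-over-time point (the burst size and link rates are made up for the example, not measured):

```python
# One-off 2 GB texture-streaming burst over two links: the eGPU link
# shows a visible hitch, but only while the burst lasts.
burst_gbytes = 2.0
for name, gbytes_per_s in [("PCIe 4.0 x16", 32.0), ("TB3 tunnel, approx.", 4.0)]:
    print(f"{name}: {burst_gbytes / gbytes_per_s * 1000:.0f} ms")
# PCIe 4.0 x16: 62 ms
# TB3 tunnel, approx.: 500 ms
```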

1

Sol33t303 t1_jdgy4ue wrote

>Thunderbolt tops out at 40Gb/s, PCIe gen 4 x16 tops out at 32GB/s

Last I knew, GPUs don't use nearly that much PCIe bandwidth if you're not SLI-ing or something.

It has it available, but doesn't use it.

Could become more relevant as DirectStorage becomes a thing, though.

2

QuietGanache t1_jdh4eez wrote

That's reasonable, I hadn't considered where the textures and geometries are coming from.

1

Iintl t1_jdgxcvz wrote

It's not. TB3 enclosures are known to offer significantly worse performance than plugging the card directly into a desktop, and the performance difference only increases as GPUs get more and more powerful (and demand greater bandwidth). Off the top of my head, a moderately powerful GPU like the RTX 3080 could see anywhere from a 20% to 50% performance drop when put into an eGPU enclosure.

2

Sol33t303 t1_jdgyc8x wrote

How does that compare to laptop dGPUs though? Those are already neutered performance-wise for heat and power consumption reasons, both of which also get worse as you scale up to more powerful GPUs, just as happens with PCIe bandwidth.

1

Purple_Form_8093 t1_jdhto8y wrote

A 4x link falls flat for most midrange-and-up graphics cards. Thunderbolt/USB4 is really cool, just not really for eGPU. It works. Technically. But it has its own issues.

3

plutoniaex t1_jdf1eic wrote

Why not just expose the PCIe as an external port then?

5

Sol33t303 t1_jdgz6uf wrote

There's nothing fundamentally stopping a manufacturer from doing that.

But it's designed as an internal connector:

- Full 16x slots are big - bigger than any external connector I can think of off the top of my head - and so are 8x slots.

- You're going to get an unpleasant surprise if you disconnect something while running, without compatible hardware and without jumping through the hoops you have to on the software side to shut down power to a PCIe device (and not accidentally power down something like your internal SATA or USB controller). Same goes for connecting something; there's a sketch of those hoops after this list.

- The devices themselves are generally designed for internal use (exposed fans, exposed PCBs that you could shock with ESD, etc.).

- You can't expect users to know that, e.g., plugging in a PCIe device might reroute PCIe lanes away from, say, your NVMe controller to the new device. That will make buyers *very* unhappy and send support tickets through the roof: you don't expect your VGA connector to stop working because you've used up all your USB slots, for example, but to an inexperienced user that's what it would look like.

- And I'm sure there are plenty more reasons an external PCIe connector is a terrible idea.
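
For a flavour of the hoops in the second point, a minimal sketch of the soft-remove dance on Linux, which exposes PCIe hotplug through sysfs (the device address below is hypothetical, and this needs root):

```python
# Soft-remove a PCIe device before physically unplugging it.
from pathlib import Path

egpu = Path("/sys/bus/pci/devices/0000:03:00.0")  # hypothetical eGPU address
(egpu / "remove").write_text("1")   # detach the driver and delete the device

# After plugging back in, force the bus to re-enumerate:
Path("/sys/bus/pci/rescan").write_text("1")
```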

2

[deleted] t1_jdexa56 wrote

[deleted]

2

ChrisFromIT t1_jdf0wnu wrote

Nope. USB4's bandwidth in symmetric mode is 80 Gbps, while PCIe 4.0 x16 is 32 GBps. The difference is big B vs little b - bytes vs bits. USB4 is 80 Gbits per second, or 10 GBytes per second, while PCIe 4.0 x16 is 32 GBytes per second.

USB4's bandwidth would be the equivalent of PCIe 2.0 x16 or PCIe 3.0 x8 - just slightly higher bandwidth than that.
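
Spelled out with the same numbers (raw link rates only, before protocol overhead):

```python
usb4_v2_gbits = 80                   # symmetric USB4 2.0, gigaBITS/s
usb4_v2_gbytes = usb4_v2_gbits / 8   # = 10 gigaBYTES/s

pcie4_x16_gbytes = 32                # PCIe 4.0 x16
pcie2_x16_gbytes = 8                 # PCIe 2.0 x16 (~= PCIe 3.0 x8)

print(usb4_v2_gbytes / pcie4_x16_gbytes)  # 0.3125: under a third of gen4 x16
print(usb4_v2_gbytes > pcie2_x16_gbytes)  # True: "just slightly higher"
```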

5

djk29a_ t1_jdfm1ro wrote

OCuLink seems to do better than Thunderbolt, for reasons that are unclear given my zero research so far.

2

ChrisFromIT t1_jdf0bhu wrote

USB4 2.0 almost has enough bandwidth to carry a modern GPU. It would have a bit more bandwidth than a PCIe 3.0 x8 slot, or a bit more than a PCIe 2.0 x16 slot. And with a desktop 4090, you only lose about 6% performance if you use a PCIe 2.0 x16 slot vs. a PCIe 4.0 x16 slot.

−3

nipsen t1_jdewl7i wrote

..Without going into a huge rant here about why pci-e exists, and how absurdly obsolete it is - usb4 has 80Gbps transfer rates (same as pci-e x16, in ..theory).

−5

QuietGanache t1_jdf3ixa wrote

I think you're mixing and matching your bits and bytes.

11

JaggedMetalOs t1_jdfob8p wrote

> Why not sell the laptop on that?

They already did that when they first launched as the "modular, repairable laptop". I don't think it's a bad thing they're branching out into areas like performance/gaming.

5

Pineappl3z t1_jdglq3m wrote

They're using PCIe x8 for the expansion bay on the Framework 16, not USB4. Here's the GitHub repo for the electrical design of the Expansion Bay. The theoretical maximum power available to the expansion card in this early design is 210W with supplemental wall power.
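
Rough numbers on why a direct x8 link beats tunnelling over USB4 (this assumes the bay runs PCIe 4.0, which the comment above doesn't actually state):

```python
pcie4_per_lane_gbytes = 1.97             # ~usable GB/s per gen4 lane
bay_gbytes = pcie4_per_lane_gbytes * 8   # ~15.8 GB/s for the x8 bay
usb4_tunnel_gbytes = 40 / 8              # 5 GB/s raw, less after overhead
print(bay_gbytes / usb4_tunnel_gbytes)   # ~3.2x the external-link ceiling
```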

2

DriftingMemes t1_jdfwzqi wrote

>Seriously, though - the keyboard modules are fantastic. That'd save me a day of pulling fused plastic pips off the plate on the back of the motherboard on a laptop, to get another keyboard replacement in. Why not sell the laptop on that? "No need to replace your entire laptop if the spacebar breaks".

What brand are you working on? Maybe I'm spoiled, but all mine are on a tray: you just pop the tray out and replace it.

1

financialmisconduct t1_jdhyqhb wrote

ThinkPad: pull three screws, open the laptop, slide keyboard up 8mm, lift

1

nipsen t1_jdi1dom wrote

The rest of us have keyboard modules that are fused to the chassis, I'm afraid. So to replace a very flimsy plastic part, it's necessary to replace the whole front cover, or to dig through the whole laptop and pop it off the fused plastic contacts.

1

financialmisconduct t1_jdi1k4f wrote

None of my various laptops have that

Is it cheap consumer-grade shit?

1

nipsen t1_jdi8t1x wrote

Yes and no, so to speak. My dearly bought Thinkbook has it. So does any Asus of any price range, and most HPs made in recent years. Literally anything in a slim form factor will have this solution, with the keyboard plastic-welded to the top chassis.

Ironically, a lot of the actual consumer-grade shit, like the Lenovo Yoga etc., inherited its design from the old elephant-euthanasia brick devices and has detachable keyboard modules. But the module is actually glued and fused to what doubles as the back panel for the mainboard, which is also the solution on many of the older IBM-ish Lenovos. The keyboard module itself is not difficult to produce or replace on these devices, and is just attached with a standard ribbon cable. But actually getting at it requires some form of OEM-specific voodoo.

I am told by entirely reliable industry insiders that this is not done to make sure laptops have to be specced per region, to keep these companies' artificial regional offices going, at all. Or that the keyboard is one of the few remaining things that can be glued to the laptop, so that when it breaks the whole thing has to be replaced - meaning it's a great way to make sure random consumers also buy warranty "deals". I'm told none of these things are involved, at all.

1