big_dog_2k (OP) wrote:
Reply to comment by poems_4_you in [D] How to get the fastest PyTorch inference and what is the "best" model serving framework? by big_dog_2k
Thanks! I've now seen a pretty consistent theme from people that Triton is worth it. I might bite the bullet and invest more time in getting the ONNX conversions right.
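
For anyone landing here later, a minimal sketch of what that conversion step looks like (not from the original comment; the model, input shape, and opset version are placeholder assumptions) is exporting the PyTorch model with `torch.onnx.export` so the resulting `model.onnx` can be dropped into a Triton model repository:

    # Minimal ONNX export sketch -- model, shapes, and opset are illustrative only.
    import torch
    import torchvision

    model = torchvision.models.resnet18(weights=None).eval()  # placeholder model
    dummy_input = torch.randn(1, 3, 224, 224)                 # example input shape

    torch.onnx.export(
        model,
        dummy_input,
        "model.onnx",
        input_names=["input"],
        output_names=["output"],
        # dynamic batch dimension so Triton can do dynamic batching
        dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
        opset_version=17,
    )

Getting the dynamic axes and opset right up front tends to be where most of the conversion pain shows up, so it's worth validating the exported graph (e.g. with onnxruntime) before wiring it into Triton.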