robdupre t1_iu7z0uu wrote on October 29, 2022 at 6:59 AM
Reply to [D] How to get the fastest PyTorch inference and what is the "best" model serving framework? by big_dog_2k
We use ONNX models deployed using Nvidia's TensorRT. We have been impressed with it so far.