jukujala t1_iu7x05z wrote on October 29, 2022
Reply to [D] How to get the fastest PyTorch inference and what is the "best" model serving framework? by big_dog_2k
Has anyone tried converting ONNX to a TF SavedModel and serving it with TF Serving? TF has historically been good at inference.
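For anyone wanting to try this route, a minimal sketch of the conversion step using the `onnx-tf` package. The input path `model.onnx` and the export directory `exported_savedmodel` are placeholders, and this assumes compatible `onnx`, `onnx-tf`, and `tensorflow` versions are installed:

```python
import onnx
from onnx_tf.backend import prepare  # pip install onnx-tf

# Load the exported ONNX graph (placeholder path).
onnx_model = onnx.load("model.onnx")

# Convert to a TensorFlow representation.
tf_rep = prepare(onnx_model)

# Write out a TF SavedModel directory that TF Serving can load.
tf_rep.export_graph("exported_savedmodel")
```

The resulting directory can then be mounted into the `tensorflow/serving` Docker image (TF Serving expects a numeric version subdirectory, e.g. `models/mymodel/1/`) and queried over its REST or gRPC API.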