BestSentence4868 t1_iu8gi68 wrote
Reply to comment by big_dog_2k in [D] How to get the fastest PyTorch inference and what is the "best" model serving framework? by big_dog_2k
Yep! Fire up Triton (I used their Docker container), install PyTorch via pip or just put it in the Dockerfile, and you're off to the races! I actually just deployed Triton + PyTorch + Flask for a web app this week :)
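For anyone curious, a minimal sketch of that workflow might look like the following. The image tag, port numbers, and model-repository path are assumptions for illustration; Triton expects a model repository laid out as `<repo>/<model_name>/<version>/`, and a PyTorch model served through the Python backend would keep its `model.py` there.

```shell
# Sketch only -- image tag and local paths are assumptions, check NGC for current tags.

# Pull NVIDIA's prebuilt Triton Inference Server container
docker pull nvcr.io/nvidia/tritonserver:22.10-py3

# Run Triton, mounting a local model repository into the container.
# Ports: 8000 = HTTP inference, 8001 = gRPC, 8002 = metrics.
docker run --rm \
  -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v "$(pwd)/models:/models" \
  nvcr.io/nvidia/tritonserver:22.10-py3 \
  tritonserver --model-repository=/models
```

If you need extra Python packages (e.g. `pip install torch` for the Python backend), you can bake them in with a small Dockerfile that uses the Triton image as its base, which is what the "put it in the Dockerfile" option above refers to.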
big_dog_2k OP t1_iu8gxg1 wrote
Wow! I did not know that! I think I have answers to my questions now.
BestSentence4868 t1_iu8h0kj wrote
Feel free to DM me with any further questions.