Submitted by GPUaccelerated t3_yf5jm3 in deeplearning
suflaj t1_iu1yx11 wrote
I don't think upgrading is ever worth it. It's easier to just scale horizontally, i.e. buy more hardware and spread requests across it (rough sketch below).
The hardware you do inference on in production usually isn't bought anyway; it's mostly rented, so that doesn't matter. And if you're running models on an edge device, you don't have much choice.
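A minimal sketch of what I mean, with a dummy `load_model` and dummy requests standing in for a real model and real traffic:

```python
# Horizontal scaling sketch: instead of one bigger GPU, run the same
# model on several identical workers and spread requests across them.
from concurrent.futures import ProcessPoolExecutor

NUM_WORKERS = 4  # e.g. four cheap boxes instead of one expensive one

def load_model():
    # Placeholder: in practice, load your trained model onto one device.
    return lambda x: x * 2

def handle_request(payload):
    model = load_model()  # in a real worker this is loaded once, not per request
    return model(payload)

if __name__ == "__main__":
    requests = list(range(100))  # dummy inference requests
    with ProcessPoolExecutor(max_workers=NUM_WORKERS) as pool:
        results = list(pool.map(handle_request, requests))
    print(results[:5])
```

Throughput scales roughly linearly with workers as long as requests are independent, which inference requests usually are.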
GPUaccelerated OP t1_iu4tflp wrote
This makes sense. Scaling horizontally is usually the way to go. Thank you for commenting!
But I would argue that hardware for inference is actually bought more often than one would assume. I have many clients who purchase mini-workstations for settings where data processing and inference jobs run on the same premises, to limit latency and data travel.
suflaj t1_iu4ue5y wrote
Well, that is your clients' choice. It's not cost-effective to buy Quadros when you could just rent them as you go, especially given their low resale value. It's not like there are many places where you can't rent a nearby server with sub-10 ms, or at worst sub-100 ms, latency.
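To put rough numbers on the rent-vs-buy question (all prices are made-up assumptions, not quotes):

```python
# Back-of-the-envelope break-even: buying a workstation GPU vs renting
# a comparable cloud GPU by the hour. All figures are illustrative.
PURCHASE_PRICE = 5000.0  # assumed workstation-GPU price, USD
RESALE_VALUE = 1500.0    # assumed resale value at end of life, USD
RENT_PER_HOUR = 1.50     # assumed cloud rate for a comparable GPU, USD

net_cost = PURCHASE_PRICE - RESALE_VALUE
break_even_hours = net_cost / RENT_PER_HOUR
print(f"Buying pays off after ~{break_even_hours:,.0f} rented hours "
      f"(~{break_even_hours / 720:.1f} months of 24/7 use)")
```

Under these made-up numbers, buying only wins if the card runs close to 24/7 for months on end; at lower utilization, renting wins, and the low resale value only makes buying worse.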
GPUaccelerated OP t1_iu4wl57 wrote
That's right, but sometimes data sensitivity prevents the use of the cloud.
suflaj t1_iu4zaqo wrote
Well, then it's a matter of trust: every serious cloud provider has a privacy policy claiming nothing is logged. Of course, you don't have to trust that, but it is a liability for the cloud provider, so you get to shift the blame if something goes wrong. And I'd argue that for most companies the word of a cloud provider means more than your word, since they have much more to lose.
It's also standard practice to use end-to-end encryption, and some even use end-to-end encrypted models. I don't really see how our company could handle personal data and retain samples in a GDPR-compliant way without proprietary models in the cloud.
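For illustration, the transport side of that looks something like this (using the `cryptography` package; the key exchange and the encrypted-model part are out of scope, and the payload is a dummy):

```python
# End-to-end encryption sketch: the payload is encrypted on-premises
# and only decrypted inside the trusted inference service, so the
# cloud transport layer never sees plaintext.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice pre-shared or via a KMS, not generated inline
cipher = Fernet(key)

patient_record = b'{"age": 54, "scan_id": "dummy"}'  # dummy payload
token = cipher.encrypt(patient_record)               # this is what leaves the premises
# ... token travels through the cloud provider ...
assert cipher.decrypt(token) == patient_record       # only readable at the endpoint
```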
GPUaccelerated OP t1_iuimm3t wrote
Right, but in the medical field, for example, it's not a trust issue. It's a matter of laws that prevent patient data from leaving the physician's premises.
suflaj t1_iuioq7c wrote
Which laws?
GPUaccelerated OP t1_iuitwy2 wrote
Not exactly sure; I'm not a lawyer. But it's something that gets taken very seriously by a lot of my medical-field clients. It's definitely a constraint on their side, not mine. I just help those specific clients go on-prem.
suflaj t1_iujhefz wrote
I asked for the specific law so I could show you that it cannot apply to end-to-end encrypted systems, which either partly destroy the information, or ensure that whatever leaves the premises is comprehensible to nothing but the model, with formal proof that it is infeasible to crack.
These are all long-solved problems; the only hard parts are hashing without losing too much information, and making the encryption compact enough to both fit into the model and be comprehensible to it (toy sketch below).
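As a toy example of the hashing direction (dimensions and seed made up for illustration): project features on-prem with a fixed random matrix and ship only the sign pattern, in the spirit of SimHash, so what leaves the premises is similarity-preserving but hard to invert:

```python
# Irreversible-but-useful hashing sketch: a fixed random projection
# followed by sign(), a la SimHash. Only the binary code leaves the
# premises; the raw features never do.
import numpy as np

rng = np.random.default_rng(seed=0)        # fixed seed = shared transform
projection = rng.normal(size=(2048, 256))  # raw dim -> compact code dim

def anonymize(features: np.ndarray) -> np.ndarray:
    """Project raw on-prem features to a compact, hard-to-invert code."""
    return np.sign(features @ projection)

raw = rng.normal(size=2048)  # stand-in for features extracted on-prem
code = anonymize(raw)        # only this is sent to the cloud model
print(code.shape, code[:8])
```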