GPUaccelerated OP t1_iu4tflp wrote

This makes sense. Scaling horizontally is usually the case. Thank you for commenting!

But I would argue that hardware for inference is actually bought more often than one would assume. I have many clients who purchase mini-workstations for settings where data processing and inference jobs run on the same premises, to limit latency and data travel.

1

suflaj t1_iu4ue5y wrote

Well, that is your clients' choice. It's not cost-effective to buy Quadros when you could just rent them as you go, especially given their low resale value. There aren't many places where you can't rent a nearby server with sub-10 ms, or at worst sub-100 ms, latency.
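
For a rough sense of that trade-off, here's a back-of-the-envelope sketch (all numbers are made-up assumptions, not real quotes):

```python
# Rough buy-vs-rent comparison for an inference GPU.
# Every number here is an illustrative assumption, not a real price.

purchase_price = 5000.0      # assumed price of a workstation-class (Quadro-style) card
resale_value = 1500.0        # assumed resale value after the useful period
useful_years = 3
utilization = 0.10           # fraction of hours the card actually runs inference

cloud_rate_per_hour = 0.60   # assumed on-demand rate for a comparable cloud GPU

hours_used = useful_years * 365 * 24 * utilization
own_cost_per_used_hour = (purchase_price - resale_value) / hours_used

print(f"owning:  ~${own_cost_per_used_hour:.2f} per used hour")
print(f"renting: ~${cloud_rate_per_hour:.2f} per used hour")
# At low utilization renting tends to win; at sustained high utilization
# owning can come out ahead -- which is exactly the trade-off being argued here.
```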

2

GPUaccelerated OP t1_iu4wl57 wrote

That's right, but sometimes data sensitivity prevents the use of the cloud.

1

suflaj t1_iu4zaqo wrote

Well, then it's a matter of trust: every serious cloud provider has a privacy policy that claims nothing is logged. Of course, you don't have to trust this, but it is a liability for the cloud provider, so you get to shift the blame if something goes wrong. And I'd argue that for most companies the word of a cloud provider means more than your word, since they've got much more to lose.

It's also standard practice to use end-to-end encryption, with some providers using end-to-end encrypted models. I don't really see how our company could handle personal data and retain samples in a GDPR-compliant way without proprietary models in the cloud.
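
As a minimal sketch of the "data is unreadable outside the trusted endpoints" idea, here's symmetric encryption of a payload before it leaves the premises, using Fernet from the `cryptography` package. The payload and key handling are hypothetical, and running a model directly on encrypted inputs (homomorphic encryption) is a much heavier tool not shown here:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, provisioned only to the trusted endpoints
cipher = Fernet(key)

patient_record = b'{"age": 54, "scan_id": "local-only"}'   # hypothetical payload
ciphertext = cipher.encrypt(patient_record)                # this is what travels to the cloud

# Only a holder of the key (e.g. the inference container you control) can recover it:
assert cipher.decrypt(ciphertext) == patient_record
```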

2

GPUaccelerated OP t1_iuimm3t wrote

Right, but in the medical field, for example, it's not a trust issue. It's a matter of laws that prevent patient data from leaving the physician's premises.

1

suflaj t1_iuioq7c wrote

Which laws?

1

GPUaccelerated OP t1_iuitwy2 wrote

Not exactly sure, I'm not a lawyer. But it's something that gets taken very seriously by a lot of my medical-field clients. It's definitely a requirement on their side, not mine. I just help those specific clients go on-prem.

1

suflaj t1_iujhefz wrote

I asked for the specific law so I could show you that it cannot apply to end-to-end encrypted systems: in those systems, the information is either partly destroyed, or whatever leaves the premises is comprehensible to nothing but the model, with formal proof that it is infeasible to crack.

These are all long-solved problems. The only hard part is hashing without losing too much information, or making the encryption compact enough to both fit into the model and be comprehensible to it.
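
As a toy sketch of the "hashing without losing too much information" idea, the feature-hashing trick maps raw tokens into a fixed-size vector; collisions destroy some information, which is exactly the utility/privacy trade-off mentioned above. The dimension and tokens here are made up:

```python
import hashlib

def hashed_features(tokens, dim=32):
    # Map each token to a bucket of a fixed-size vector via a hash.
    vec = [0] * dim
    for tok in tokens:
        h = int(hashlib.sha256(tok.encode()).hexdigest(), 16)
        vec[h % dim] += 1          # collisions fold different tokens into one bucket
    return vec

# The model sees only the hashed vector; the original tokens never leave the premises.
print(hashed_features(["diabetes", "male", "age_54"]))
```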

2