zlacker

[return to "Zoom terms now allow training AI on user content with no opt out"]
1. danShu+1u1 2023-08-06 21:15:54
>>isodev+(OP)
Tangentially related, but a number of telehealth operations (hospitals, therapists, etc.) use Zoom -- I suspect because their clients can connect over a browser without an app or an account.

When you join a Zoom session over the browser, you don't sign a TOS. And I assume that actual licensed medical establishments are under their own TOS provisions compatible with HIPAA requirements. Training on voice-to-text transcripts, etc. would be a pretty huge privacy violation, particularly for services like therapy: both because there are demonstrable attacks on AI models that extract training data from them, and because that data would presumably be accessible to the employees/contractors validating that it was fit for training.

Out of curiosity, has anyone using telehealth checked with their doctor/therapist to see what Zoom's privacy policies are for them?

2. infamo+3S1 2023-08-07 00:09:41
>>danShu+1u1
IANAL, but I did health tech for 10 years and had my fair share of interactions with lawyers asking questions about stuff I built.

HIPAA applies to the provider. Patients have no responsibility to ensure the tech used by their care provider is secure or that their medical records don't wind up on Twitter. HIPAA puts that burden on the care provider, placing civil and sometimes criminal liability on providers who fail to go to great lengths here.

In practice, this means lawyers working with the care providers have vendors sign contracts certifying that the business associate is in compliance with HIPAA and follows all of the same rules (search: HIPAA BAA).

Additionally, you can be in compliance with HIPAA and still fax someone's medical records.

3. halduj+ST1 2023-08-07 00:24:28
>>infamo+3S1
I don’t think the question is about Zoom’s safeguards, which are audited and, as you say, almost certainly stronger than what HIPAA requires, but rather whether Zoom can use the stored PHI for product development, where the law appears ambiguous.

4. johndh+G32 2023-08-07 01:53:37
>>halduj+ST1
Imo the law basically says you can do this with PHI:

- De-identify it, then do whatever you want with it (a rough sketch of what that can look like is below)

- Use it to provide some service for the covered entity, but not for anyone else

- Enter a special research contract if you want to use it slightly de-identified for some other specific purpose

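For concreteness, a minimal sketch (Python) of what the first option can look like under the Safe Harbor method. The field names and the identifier subset here are illustrative assumptions, not an official schema; the actual Safe Harbor rule (45 CFR 164.514(b)(2)) requires removing all 18 identifier categories:

    # Hypothetical subset of the 18 Safe Harbor identifier categories.
    SAFE_HARBOR_IDENTIFIERS = {
        "name", "street_address", "phone", "email", "ssn",
        "medical_record_number", "ip_address", "photo_url",
    }

    def deidentify(record: dict) -> dict:
        """Return a copy of `record` with direct identifiers dropped
        and quasi-identifiers generalized, Safe Harbor style."""
        out = {k: v for k, v in record.items()
               if k not in SAFE_HARBOR_IDENTIFIERS}

        # All date elements except the year have to go.
        if "service_date" in out:              # e.g. "2023-08-07"
            out["service_year"] = out.pop("service_date")[:4]

        # Ages over 89 collapse into a single "90+" bucket.
        if isinstance(out.get("age"), int) and out["age"] > 89:
            out["age"] = "90+"

        # ZIP codes keep only the first three digits (zeroing them
        # entirely for sparsely populated ZIP3 areas is omitted here).
        if "zip" in out:
            out["zip"] = out["zip"][:3]

        return out

    print(deidentify({
        "name": "Jane Doe", "ssn": "000-00-0000", "age": 93,
        "zip": "94305", "service_date": "2023-08-07",
        "diagnosis": "F41.1",
    }))
    # -> {'age': '90+', 'zip': '943', 'diagnosis': 'F41.1',
    #     'service_year': '2023'}
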
5. halduj+x82 2023-08-07 02:36:25
>>johndh+G32
One note is that the act of de-identification itself requires accessing PHI when done retroactively. This may be institutional policy or specific to covered entities, but per the privacy office lawyers, such access (apart from a small dataset) itself requires a permitted use; only then can you de-identify the data and use it freely.

As with all things HIPAA, this only becomes a problem when HHS starts looking, and I’m sure in practice many people ignore this tidbit (if in fact it is the law and not just Stanford policy).
