When you join a Zoom session over the browser, you don't sign a TOS. And I assume that actual licensed medical establishments are under their own TOS provisions that are compatible with HIPPA requirements. Training on voice-to-text transcription, etc., would be a pretty huge privacy violation, particularly in the scope of services like therapy. Both because there are demonstrable attacks on AIs to get training data out of them, and because presumably that data would then be accessible to employees/contractors who were validating that it was fit for training.
Out of curiosity, has anyone using telehealth checked with their doctor/therapist to see what Zoom's privacy policies are for them?
https://blog.zoom.us/answering-questions-about-zoom-healthca...
The UK, for example, has hundreds of private mental health practitioners (therapists, psychologists, etc.) who provide their services directly to clients. They almost universally use off-the-shelf technology for video calling, messaging, and reporting.
Covered entities (including the EMR and hospital itself) can use protected health information for quality improvement without patient consent, and can use deidentified data freely.
Where this gets messy is that deidentification isn’t always perfect even if you think you’re doing it right (especially if via software) and reidentification risk is a real problem.
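To make the software de-id point concrete, here's a toy sketch (purely illustrative, not anything Zoom or an EMR vendor actually runs) of a regex-based scrubber and the kind of quasi-identifiers it leaves behind:

    import re

    # Toy de-identification pass: strip obvious identifiers with regexes.
    # Real pipelines layer NER models and dictionaries on top of this,
    # and they still miss things.
    PATTERNS = {
        "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
        "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
        "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
        "DATE":  re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    }

    def deidentify(text: str) -> str:
        for label, pattern in PATTERNS.items():
            text = pattern.sub(f"[{label}]", text)
        return text

    transcript = ("Patient Jane Doe, seen 3/14/23, is the night-shift "
                  "pharmacist at the only hospital in Millbrook. "
                  "Callback: 555-867-5309.")
    print(deidentify(transcript))
    # -> Patient Jane Doe, seen [DATE], is the night-shift pharmacist
    #    at the only hospital in Millbrook. Callback: [PHONE].
    # The name slips through, and even if it didn't, "night-shift
    # pharmacist at the only hospital in Millbrook" is plenty to
    # re-identify someone.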
To my understanding business associates can train on deidentified transcripts all they want as the contracts generally limit use to what a covered entity would be allowed to do (I haven’t seen Zoom’s). I know that most health AI companies from chatbots to image analysis do this. Now if their model leaks data that’s subsequently reidentified this is a big problem.
Most institutions therefore have policies more stringent than HIPAA and treat software deidentified data as PHI. Stanford for example won’t allow disclosure of models trained on deidentified patient data, including on credentialed access sources like physionet, unless each sample was manually verified which isn’t feasible on the scale required for DL.
Edit: Zoom’s BAA: https://explore.zoom.us/docs/en-us/baa.html
“Limitations on Use and Disclosure. Zoom shall not Use and/or Disclose the Protected Health Information except as otherwise limited in this Agreement or by application of 42 C.F.R. Part 2 with respect to Part 2 Patient Identifying Information, for the proper management and administration of Zoom…”
“Management, Administration, and Legal Responsibilities. Except as otherwise limited in this BAA, Zoom may Use and Disclose Protected Health Information for the proper management and administration of Zoom…”
Not sure if “proper management and administration” has a specific legal definition or would include product development.
Edit 2: My non-expert reading of this legal article suggests they can. https://www.morganlewis.com/-/media/files/publication/outsid...
“But how should a business associate interpret these rules when effective management of its business requires data mining? What if data mining of customer data is necessary in order to develop the next iteration of the business associate’s product or service? … These uses of big data are not strictly necessary in order for the business associate to provide the contracted service to a HIPAA-covered entity, but they may very well be critical to management and administration of the business associate’s enterprise and providing value to customers through improved products and services.
In the absence of interpretive guidance from the OCR on the meaning of ‘management and administration’, a business associate must rely almost entirely on the plain meaning of those terms, which are open to interpretation.”
HIPAA applies to the provider. Patients have no responsibility to ensure the tech used by their care provider is secure or that their medical records don't wind up on Twitter. HIPAA puts that burden on the care providers by placing civil and sometimes criminal liability on them for not going to great lengths here.
In practice, this means lawyers working with the care providers have those companies sign legal contracts ensuring the business associate is in compliance with HIPAA and follows all of the same rules (search: HIPAA BAA).
Additionally, you can be in compliance with HIPAA and still fax someone's medical records.
(FERPA is to higher ed in the US what HIPAA is to healthcare.)
Analog line fax is HIPAA compliant because it is not "stored"
Using a cloud fax provider will immediately put you out of compliance for this reason, unless you have a HIPAA-compliant cloud fax service; those are rare.
Are there examples of healthcare ai chatbots trained on de-id data btw? If you're familiar would love to see.
What's your line of work out of curiosity?
- De-identify it, then do whatever you want with it
- Use it to provide some service for the covered entity, but not for anyone else
- Enter a special research contract if you want to use it slightly de-identified for some other specific purpose
The privacy issues here are bottomless, and so are the legal issues.
Attorney-client privilege is an interesting case.
"Privacy issues" is a meaningless phrase to me when divorced from the law. Do you mean, like, ethically concerning? This term in the contract is neither uncommon nor illegal.
As with all things HIPAA, this only becomes a problem when HHS starts looking and I’m sure in practice many people ignore this tidbit (if in fact this is the law and not Stanford policy).
Not that I’m an expert on the nuance here, but I think it gives them permission to use PHI, especially if spun in the correct way, which then gives them permission to de-identify it and do whatever they want with it.
My experience has been that it’s pretty easy to spin something into QI.
> Are there examples of healthcare ai chatbots trained on de-id data btw? If you're familiar would love to see.
https://loyalhealth.com/ is one I’ve recently heard of that trains on de-id’d PHI from customers.
> What's your line of work out of curiosity?
Previously founded a health tech startup and now working primarily as a clinician and researcher (NLP) with some side work advising startups and VCs.
HIPAA is the correct abbreviation of the Health Insurance Portability and Accountability Act, which, as an aside, doesn't necessarily preclude someone from training on patient data.
HIPPA is the unnecessarily capitalized spelling of a (quite adorable) crustacean found in the Indo-Pacific and consumed in an Indonesian delicacy known as yutuk.
All of this is a lot of BS about nothing.