zlacker

Zoom terms now allow training AI on user content with no opt out

submitted by isodev+(OP) on 2023-08-06 12:15:51 | 1581 points 506 comments
[view article] [source] [go to bottom]

NOTE: showing posts with links only
10. _ddzr+6c[view] [source] 2023-08-06 13:51:36
>>isodev+(OP)
Thankfully nothing like this is in Jitsi Meet’s TOS: https://jitsi.org/meet-jit-si-terms-of-service/

It never ceases to amaze me how companies choose the worst software!

◧◩
16. Maxiou+Kc[view] [source] [discussion] 2023-08-06 13:56:35
>>ginko+Fa
You can self host the enterprise tier "User and meeting metadata are managed in the public cloud while the meeting traffic including video, audio and data-sharing goes through the Meeting Connector in your company network." https://explore.zoom.us/docs/en-us/plan/enterprise.html
◧◩
18. pavlov+Vc[view] [source] [discussion] 2023-08-06 13:57:39
>>lopati+pb
Google also used to read email contents for targeted advertising. This was a major point of contention when Gmail was introduced in 2004. They stopped 13 years later:

https://www.fastcompany.com/4041720/heres-why-it-took-gmail-...

Most people just don’t care though.

◧◩
38. rany_+Ff[view] [source] [discussion] 2023-08-06 14:11:17
>>ta1243+Te
I think so, given that it mentions the case of the client being a paying customer:

> 31.3 Data Processing Addendum. If you are a business, enterprise, or education account owner and your use of the Services requires Zoom to process an End User’s personal data under a data processing agreement, Zoom will process such personal data subject to Zoom’s Global Data Processing Addendum.

Though it limits the scope of the data collection: https://explore.zoom.us/docs/doc/Zoom_GLOBAL_DPA.pdf

46. pawelw+4g[view] [source] 2023-08-06 14:13:21
>>isodev+(OP)
Does this mean that ZOOM is basically using every attendee's audio and video stream to train their models? How do they define the "Service Generated Data"?

I made a video-conferencing app for virtual events (https://flat.social). No audio or video is ever recorded; packets are only forwarded to subscribed clients. Frankly, while designing the service it didn't even cross my mind to use this data for anything other than the online event itself.
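The forwarding model described above is essentially what a selective forwarding unit (SFU) does: media packets are relayed to subscribed peers and never written to disk. A minimal sketch of the idea (illustrative only, not flat.social's actual code; all names are made up):

```python
class SelectiveForwardingUnit:
    def __init__(self):
        # publisher id -> set of subscriber delivery callbacks
        self.subscriptions = {}

    def subscribe(self, publisher_id, deliver):
        self.subscriptions.setdefault(publisher_id, set()).add(deliver)

    def on_packet(self, publisher_id, packet):
        # Forward immediately to each subscriber; the packet is never
        # stored, transcoded, or inspected by the server.
        for deliver in self.subscriptions.get(publisher_id, ()):
            deliver(packet)

received = []
sfu = SelectiveForwardingUnit()
sfu.subscribe("alice", received.append)
sfu.on_packet("alice", b"rtp-payload")
```

Since nothing is retained server-side, there is simply no recording to train on.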

◧◩
47. westco+eg[view] [source] [discussion] 2023-08-06 14:14:09
>>jxf+6f
You will have to excuse me if I don’t trust a company that kicks off users at the behest of the PRC!

https://techcrunch.com/2020/06/11/zoom-admits-to-shutting-do...

Quibbles over the definition of phrases like “Customer Content” and “Service Generated Data” are designed to obfuscate meaning and confuse readers into thinking that the headline is wrong. It is not wrong. This company does what it wants to, obviously, given its complicity with a regime that is currently engaging in genocide.

https://www.bbc.com/news/world-asia-china-22278037.amp

Why do you trust them to generate an AI model of your appearance and voice that could be used to destroy your life? I don’t.

◧◩◪
48. jsnell+ig[view] [source] [discussion] 2023-08-06 14:14:35
>>sjmuld+Lb
You're both misquoting and misunderstanding. Misquoting in that you clipped out the "to the extent and in the manner permitted under applicable Law". And misunderstanding since the text was talking about "service generated data", not about "customer data". That's basically data generated by their system (e.g. debug logs). It's not the data you entered into the system (contact information), the calls you made, the chats you sent, etc.

Also, the linked document is effectively a license for the intellectual property rights. The data protection side of things would be covered by the privacy policy[0]. This all seems pretty standard?

[0] https://explore.zoom.us/en/privacy/

◧◩
56. ebiest+hh[view] [source] [discussion] 2023-08-06 14:19:41
>>_ddzr+6c
How does Jitsi handle 500-person+ conference calls these days? This is the killer Zoom feature - it looks like Jitsi can handle up to 500 now: https://jaas.8x8.vc/#/comparison

That's personally not enough for many remote companies. So if we're going to have to have Zoom on our machines anyway (to handle an all-company meeting), why not just use it for the rest?

◧◩
70. sickil+Wi[view] [source] [discussion] 2023-08-06 14:29:03
>>tikkun+Ec
I like https://meet.jit.si/ the most.
71. tikkun+Yi[view] [source] 2023-08-06 14:29:13
>>isodev+(OP)
I've posted an Ask HN thread about which other companies are doing this: >>37022000
◧◩
93. throwa+Kk[view] [source] [discussion] 2023-08-06 14:39:45
>>morkal+Nc
Education is schizophrenic.

One part constantly fears it's missing a beat and jumps on new tech without thinking about it.

Another part believes that kids are able to construct all human knowledge by playing together in a field.

Education Technology seems to focus on selling education machines [1] (now with AI) to the first group, while the second group focuses on resenting any form of testing at all. Which leads, indeed, to a huge legal minefield* that will be shirked off to government for 'not providing leadership' years down the road.

* If you are in any way involved with a school, ask them how many, and importantly what %, of non-statutory requests for comment from government agencies they've responded to; you may be surprised how low the number is, or whether they even count. Despite talking about 'leadership', not many walk the talk.

[1] https://www.youtube.com/watch?v=jTH3ob1IRFo

120. rolph+Bn[view] [source] 2023-08-06 14:55:19
>>isodev+(OP)
The Dawn of A.I. Mischief Models

https://slate.com/technology/2022/08/4chan-ai-open-source-tr...

Adversarial machine learning explained: How attackers disrupt AI and ML systems

https://www.csoonline.com/article/573031/adversarial-machine...

How to attack Machine Learning ( Evasion, Poisoning, Inference, Trojans, Backdoors)

https://towardsdatascience.com/how-to-attack-machine-learnin...

AI Security and Adversarial Machine Learning 101

https://towardsdatascience.com/ai-and-ml-security-101-6af802...

The Road to Secure and Trusted AI

https://adversa.ai/report-secure-and-trusted-ai/

◧◩◪◨
134. rolph+4p[view] [source] [discussion] 2023-08-06 15:02:31
>>kortex+Bm
set yourself up with a couple of vices [coffee, smokes] and have a look here for things you can do:

>>37022623 [a number of links regarding how to play with bots and bork training by "malforming" your inputs]

148. Hamcha+Mq[view] [source] 2023-08-06 15:12:25
>>isodev+(OP)
How does this work with GDPR?

As far as I'm aware (not a lawyer), you must provide an easy opt-out from data collection and usage, and you must not force your employees into such agreements[1]. ChatGPT already got blocked over GDPR issues, and with new regulations coming forward I really don't see how they think this can be the right call.

1. https://commission.europa.eu/law/law-topic/data-protection/r...

200. DrZoot+Mw[view] [source] 2023-08-06 15:49:46
>>isodev+(OP)
Great, what could possibly go wro.. >>37013704
205. deusex+qx[view] [source] 2023-08-06 15:53:44
>>isodev+(OP)
I guess the equivalent terms for Google Meet are much more privacy friendly:

> Control over your data

> ...

> Google does not store video, audio, or chat data unless a meeting participant initiates a recording during the Meet session.

https://support.google.com/meet/answer/9852160

Though, I suppose this isn't exactly the same as a TOS.

◧◩◪
211. albert+ny[view] [source] [discussion] 2023-08-06 15:59:40
>>accoun+Yk
Yep! I realize this kind of sounds like Ray Bradbury's imagined dystopian future where a fully automated house continues to go about its programmed routine after all its inhabitants had died in a nuclear event that obliterated the rest of the city.

https://en.wikipedia.org/wiki/There_Will_Come_Soft_Rains_(sh...

◧◩
216. icco+nz[view] [source] [discussion] 2023-08-06 16:05:34
>>galley+hs
I am not a lawyer, but https://explore.zoom.us/docs/doc/Zoom-hipaa.pdf reads as though, as long as they don't disclose your information, they aren't in violation.
218. buildb+Az[view] [source] 2023-08-06 16:06:46
>>isodev+(OP)
How to enable end to end encryption in Zoom: https://support.zoom.us/hc/en-us/articles/360048660871-End-t...

(Presuming of course that their closed source software really E2E encrypts without a backdoor)

◧◩
223. _throw+aA[view] [source] [discussion] 2023-08-06 16:09:51
>>galley+hs
While I am not an expert in the details, this seems aligned with HIPAA to me at a high level in the following sense. While HIPAA got marketed as protecting medical data (privacy), it really was intended to make medical data shareable (portability). Think of it like a trojan horse: get this in with that. Or a misdirection: look over here, while some other thing happens. Therefore, automatic Zoom transcripts of tele-health appointments are remarkably well-aligned with the intent of HIPAA.

Think how much more sharable and more complete digital medical records can be now. (And the breakthroughs that may come of it! Etc., etc.)

To wit, "As much as there's a law, HIPAA supposed to prevent people from revealing your medical data, suppose to be protected, which is completely false. 20% of the population works in health care. That 20% of pop can see the med data. the med data you are not ware of is being sent to insurance companies, forwarded to government health info exchanges...." - Rob Braxman Tech, "Live - Rant! Why I am able to give you this privacy information, + Q&A", https://youtube.com/watch?v=ba6wI1BHG9A

224. chrisc+fA[view] [source] 2023-08-06 16:10:17
>>isodev+(OP)
The PRC instructing a Zoom employee to plant child pornography on Chinese dissidents' accounts should have been enough to stop companies from using Zoom: https://www.justice.gov/opa/pr/china-based-executive-us-tele...
◧◩◪
235. shagie+UC[view] [source] [discussion] 2023-08-06 16:23:04
>>accoun+Yk
Nine Planets With No Intelligent life - https://www.bohemiandrive.com/npwil

Not so much dystopian... as philosophical. Though, Uranus was both.

◧◩◪◨
248. dctoed+pF[view] [source] [discussion] 2023-08-06 16:35:14
>>kortex+Bm
>> You agree to grant and hereby grant

"Hereby grant" means the grant is (supposedly) immediately effective even for future-arising rights — and thus would take precedence (again, supposedly) over an agreement to grant the same rights in the future. [0]

(In the late oughts, this principle resulted in the biotech company Roche Molecular becoming a part-owner of a Stanford patent, because a Stanford researcher signed a "visitor NDA" with Roche that included present-assignment language, whereas the researcher's previous agreement with Stanford included only future-assignment language. The Stanford-Roche lawsuit on that subject went all the way to the U.S. Supreme Court.)

[0] https://toedtclassnotes.site44.com/Notes-on-Contract-Draftin...

◧◩
264. rst+TH[view] [source] [discussion] 2023-08-06 16:48:53
>>_ddzr+6c
What I take to be the TOS for Google Meet (it's a little hard to tell!) makes no specific reference to AI, but does mention use of customer data for "developing new technologies and services" more generally. https://policies.google.com/terms#toc-permission
◧◩◪◨
275. Shelnu+kL[view] [source] [discussion] 2023-08-06 17:04:23
>>albert+ny
This also reminds me of "They Will Not Return" by John Ayliff

https://johnayliff.itch.io/they-will-not-return

◧◩
288. MaxBar+uN[view] [source] [discussion] 2023-08-06 17:13:45
>>galley+hs
Perhaps slightly off-topic: the U.S. Department of Health and Human Services (HHS) seem to be paying particular attention to security/privacy as it relates to providers of medical services using online tracking services. In a recent open letter they mentioned Meta/Facebook and Google Analytics by name. I imagine communication services like Zoom are also on their minds.

* https://www.ftc.gov/news-events/news/press-releases/2023/07/...

* https://www.ftc.gov/business-guidance/blog/2023/07/ftc-hhs-j...

◧◩◪◨⬒
290. qingch+ON[view] [source] [discussion] 2023-08-06 17:14:55
>>eh9+qB
Surprised me too...

"The company has previously acknowledged that much of its technology development is conducted in China and security concerns from governments abound."

https://techcrunch.com/2020/06/11/zoom-admits-to-shutting-do...

◧◩◪
300. r2b2+GO[view] [source] [discussion] 2023-08-06 17:18:59
>>jonas2+Wt
Jitsi App Privacy:

https://apps.apple.com/us/app/jitsi-meet/id1165103905

And Zoom:

https://apps.apple.com/us/app/id546505307

Looks like one company likes to gobble data more than the other even if both privacy policies are gobble-open.

◧◩
308. r4inde+WR[view] [source] [discussion] 2023-08-06 17:36:01
>>Sunspa+AJ
The Flatpak wrapper for Zoom is not made or endorsed by Zoom, Inc. as indicated in its description [1].

I am definitely not a fan of Zoom either and had my own issues with the Linux client, but if the problems you describe are unique to the Flatpak and not in the official Linux distribution, you can't blame Zoom for that.

[1] https://flathub.org/apps/us.zoom.Zoom

323. zakemb+qX[view] [source] 2023-08-06 18:07:24
>>isodev+(OP)
This is not new. These terms were quietly updated on 1st April 2023. Looks like very few people noticed it until now.

https://web.archive.org/web/20230401045359/https://explore.z...

◧◩◪◨⬒
348. jfkimm+0d1[view] [source] [discussion] 2023-08-06 19:44:27
>>samspe+ya1
There's now also https://github.com/vector-im/element-call.

They have SFU support as of recently, so it should scale similarly to Jitsi et al.

◧◩
362. WhereI+Vk1[view] [source] [discussion] 2023-08-06 20:28:50
>>blindr+OG
Where were you when Microsoft trained their models on everyone's github repo without their consent?

Where were you when Microsoft announced the exact same for Teams? https://www.microsoft.com/en-us/microsoft-365/blog/2022/06/1...

By allowing Microsoft to do as they please, we collectively gave up our right to privacy; we deserve what's happening.

I chose not to use their products at all; it starts from there if you care.

◧◩
363. WhereI+kl1[view] [source] [discussion] 2023-08-06 20:31:46
>>galley+hs
If Microsoft is allowed to do it, why can't Zoom?

https://www.microsoft.com/en-us/microsoft-365/blog/2022/06/1...

◧◩◪◨
374. Tokume+au1[view] [source] [discussion] 2023-08-06 21:17:01
>>wkat42+Nd1
Have you experienced anything like this other commenter mentioned?

>>37022878

◧◩
377. johndh+rv1[view] [source] [discussion] 2023-08-06 21:24:53
>>danShu+1u1
Looks like they have a separate offering, Zoom for Healthcare that presumably has different terms and conditions.

https://blog.zoom.us/answering-questions-about-zoom-healthca...

◧◩◪◨⬒⬓
395. dctoed+AG1[view] [source] [discussion] 2023-08-06 22:38:28
>>mafuy+BD1
> Simply "hereby grant" should suffice.

Not necessarily — in some circumstances, the law might not recognize a present-day grant of an interest that doesn't exist now but might come into being in the future. (Cf. the Rule Against Perpetuities. [1])

The "hereby grants and agrees to grant" language is a fallback requirement — belt and suspenders, if you will.

[1] https://en.wikipedia.org/wiki/Rule_against_perpetuities

◧◩
410. halduj+8Q1[view] [source] [discussion] 2023-08-06 23:51:11
>>danShu+1u1
IANAL but “Zoom for Healthcare” is a business associate under HIPAA and treated as an extension of the provider with some added restrictions.

Covered entities (including the EMR and hospital itself) can use protected health information for quality improvement without patient consent and deidentified data freely.

Where this gets messy is that deidentification isn’t always perfect even if you think you’re doing it right (especially if via software) and reidentification risk is a real problem.

To my understanding business associates can train on deidentified transcripts all they want as the contracts generally limit use to what a covered entity would be allowed to do (I haven’t seen Zoom’s). I know that most health AI companies from chatbots to image analysis do this. Now if their model leaks data that’s subsequently reidentified this is a big problem.

Most institutions therefore have policies more stringent than HIPAA and treat software-deidentified data as PHI. Stanford, for example, won't allow disclosure of models trained on deidentified patient data, including on credentialed-access sources like PhysioNet, unless each sample was manually verified, which isn't feasible at the scale required for DL.
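The pitfall of software de-identification described above is easy to illustrate: pattern-based redaction catches structured identifiers but misses free-text ones, which is exactly where re-identification risk comes from. A toy sketch (purely illustrative, not any real vendor's pipeline):

```python
import re

# Regexes for a few structured identifiers (SSN, date, email).
# Real Safe Harbor de-identification covers 18 identifier categories.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b\d{2}/\d{2}/\d{4}\b"), "[DATE]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),
]

def deidentify(text):
    for pattern, tag in PATTERNS:
        text = pattern.sub(tag, text)
    return text

note = "Pt DOB 01/02/1985, contact jdoe@example.com, SSN 123-45-6789."
clean = deidentify(note)
# Structured identifiers are redacted, but a phrase like "the mayor of
# Springfield" in free text would pass through untouched.
```

Even a note that looks clean after this pass can remain re-identifiable from context, which is why institutions layer stricter policy on top.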

Edit: Zoom’s BAA: https://explore.zoom.us/docs/en-us/baa.html

“Limitations on Use and Disclosure. Zoom shall not Use and/or Disclose the Protected Health Information except as otherwise limited in this Agreement or by application of 42 C.F.R. Part 2 with respect to Part 2 Patient Identifying Information, for the proper management and administration of Zoom…”

“Management, Administration, and Legal Responsibilities. Except as otherwise limited in this BAA, Zoom may Use and Disclose Protected Health Information for the proper management and administration of Zoom…”

Not sure if “proper management and administration” has a specific legal definition or would include product development.

Edit 2: My non-expert reading of this legal article suggests they can. https://www.morganlewis.com/-/media/files/publication/outsid...

“But how should a business associate interpret these rules when effective management of its business requires data mining? What if data mining of customer data is necessary in order to develop the next iteration of the business associate’s product or service? … These uses of big data are not strictly necessary in order for the business associate to provide the contracted service to a HIPAA-covered entity, but they may very well be critical to management and administration of the business associate’s enterprise and providing value to customers through improved products and services.

In the absence of interpretive guidance from the OCR on the meaning of ‘management and administration’, a business associate must rely almost entirely on the plain meaning of those terms, which are open to interpretation.”

◧◩
417. kkylin+pV1[view] [source] [discussion] 2023-08-07 00:41:32
>>danShu+1u1
Related to this, anyone know if Zoom has a separate offering for education (universities, schools, etc)? I teach at a university, and not only do we use Zoom for lectures etc, but also for office hours, meetings, etc, where potentially sensitive student information may be discussed. I'm probably not searching for the right thing; all I found was this: https://explore.zoom.us/docs/doc/FERPA%20Guide.pdf

(FERPA is to higher ed in the US what HIPAA is to healthcare.)

419. cobych+RW1[view] [source] 2023-08-07 00:55:14
>>isodev+(OP)
Interesting when taken together with things like the story from the other day about emerging AI/ML-based acoustic keystroke attacks, which mentioned that they can be trained via Zoom calls: >>37013704 / https://www.bleepingcomputer.com/news/security/new-acoustic-...
432. menset+u82[view] [source] 2023-08-07 02:35:50
>>isodev+(OP)
https://cyberscoop.com/zoom-china-doj-eric-yuan/

Their executives love money as much as Apple so they comply to crush wrongthink about how amazing the CCP is.

◧◩◪◨
434. halduj+v92[view] [source] [discussion] 2023-08-07 02:45:14
>>johndh+f32
> Haha wow this is a great post. I am a lawyer and you may have solved a problem I recently encountered. So you think this is saying that generic language in the Zoom BAA constitutes permission to de-identify?

Not that I’m an expert on the nuance here, but I think it gives them permission to use PHI, especially if spun in the correct way, which then gives them permission to de-identify it and do whatever they want with it.

My experience has been that it’s pretty easy to spin something into QI.

> Are there examples of healthcare ai chatbots trained on de-id data btw? If you're familiar would love to see.

https://loyalhealth.com/ is one I’ve recently heard of that trains on de-id’d PHI from customers.

> What's your line of work out of curiosity?

Previously founded a health tech startup and now working primarily as a clinician and researcher (NLP) with some side work advising startups and VCs.

◧◩
441. cade+Ij2[view] [source] [discussion] 2023-08-07 04:18:24
>>imiric+qP
Enjoyed the Black Mirror reference, and it will hopefully add to the enjoyable pop-culture cross-linking with https://en.wikipedia.org/wiki/HumancentiPad.

> WHY WON’T IT READ?!

◧◩◪
449. halduj+sx2[view] [source] [discussion] 2023-08-07 06:29:52
>>vondur+Ho2
Forgive me for being pedantic but this is like nails on a chalkboard to me.

HIPAA is the correct abbreviation of the Health Insurance Portability and Accountability Act, which, as an aside, doesn't necessarily preclude someone from training on patient data.

HIPPA is the unnecessarily capitalized spelling of a (quite adorable) crustacean found in the Indo-Pacific and consumed in an Indonesian delicacy known as yutuk.

https://en.wikipedia.org/wiki/Hippa_adactyla

◧◩
454. raphma+WK2[view] [source] [discussion] 2023-08-07 08:16:13
>>Hamcha+Mq
At least the German ToS (updated April 10) do not contain any AI provisions.

https://explore.zoom.us/de/privacy/

◧◩
459. 88j88+253[view] [source] [discussion] 2023-08-07 11:25:48
>>aparna+Kd2
Remember when Zoom lied about having strong encryption, and sharing data without permission? https://arstechnica.com/tech-policy/2021/08/zoom-to-pay-85m-...
◧◩◪◨⬒
466. aparna+Mz3[view] [source] [discussion] 2023-08-07 14:26:03
>>tailsp+Jq2
Thanks for your question - we have clarified our position in this blog. We do not use video, audio and chat content to train our AI models without customer consent. Please read more here https://blog.zoom.us/zooms-term-service-ai/
◧◩◪◨
474. catlif+254[view] [source] [discussion] 2023-08-07 16:37:58
>>Tokume+HA1
An example of a federated learning scheme: https://ai.googleblog.com/2017/04/federated-learning-collabo...

Perhaps zero knowledge is a poor choice of terms on my part as it is used in ZKP (as you pointed out). What I meant is “privacy-preserving”.
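The linked scheme's core idea can be sketched in a few lines: each client computes a model update locally, and only the (dataset-size-weighted) average of those updates reaches the server, never the raw training data. A toy federated-averaging round (illustrative only; real systems like the one in the Google post add secure aggregation on top):

```python
def federated_average(client_updates, client_sizes):
    """Average client update vectors, weighting each client by its
    local dataset size. Only these vectors leave the devices."""
    total = sum(client_sizes)
    dim = len(client_updates[0])
    return [
        sum(u[i] * n for u, n in zip(client_updates, client_sizes)) / total
        for i in range(dim)
    ]

# Two clients with 3 and 1 local examples; the server sees only
# the update vectors, not the underlying data.
avg = federated_average([[1.0, 0.0], [0.0, 1.0]], client_sizes=[3, 1])
# avg == [0.75, 0.25]
```

This is "privacy-preserving" in the sense above: the server learns an aggregate, not any individual's content.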

◧◩◪◨⬒
481. jech+9B4[view] [source] [discussion] 2023-08-07 18:46:48
>>samspe+ya1
> Do you happen to know of others by any chance.

There's Galene, <https://galene.org>. It's easy to deploy, uses minimal server resources, and the server is pretty solid. The client interface is still a little awkward, though. (Full disclosure, I'm the main author.)

◧◩◪◨⬒⬓⬔
484. michae+VR4[view] [source] [discussion] 2023-08-07 19:42:33
>>webmin+SH3
Following up on this point, we’ve updated our terms of service (in section 10.4) to further confirm that we will not use audio, video, or chat customer content to train our artificial intelligence models without your consent.

You can see that now clearly stated in our blog: https://blog.zoom.us/zooms-term-service-ai/

◧◩◪◨⬒⬓
487. rexree+wV4[view] [source] [discussion] 2023-08-07 19:57:07
>>aparna+Mz3
Do you have a public, published trustworthy AI framework that you use to guide your AI projects? Something like https://www.cognilytica.com/trustworthy-ai-workshop/ ? Would be good to see what decisions and processes you follow to guide your AI efforts, how you work with suppliers and partners, consent and disclosure policies, and how you communicate internally and externally.
494. freedu+3a5[view] [source] 2023-08-07 21:11:49
>>isodev+(OP)
They updated their TOS again today. https://explore.zoom.us/en/terms/
◧◩◪◨⬒⬓⬔⧯
495. Norber+8b5[view] [source] [discussion] 2023-08-07 21:17:54
>>michae+VR4
This addresses concerns about Zoom Video Communications, Inc. itself using e.g. recordings for purposes of training their own AI models. It does not address the potentially much greater risks arising from the company potentially selling access to the collection of zoom recordings to other companies for purposes of training AI models of such other companies. Here’s a somewhat-in-depth analysis: https://zoomai.info/
503. TahoeB+sCc[view] [source] 2023-08-09 23:14:39
>>isodev+(OP)
The original posting about “no opt-out” was either incorrect, or prior to the current terms of service, per the following from the Zoom site (https://blog.zoom.us/zooms-term-service-ai/): “It’s important to us at Zoom to empower our customers with innovative and secure communication solutions. We’ve updated our terms of service (in section 10.4) to further confirm that we will not use audio, video, or chat customer content to train our artificial intelligence models without your consent.” (Unless that is a “distinction without a difference”, meaning: how does one opt out?)
◧◩◪◨
504. r4inde+iei[view] [source] [discussion] 2023-08-11 17:22:15
>>Sunspa+Z81
I never said that the issues are unique to the Flatpak. I said if they are, then you shouldn't blame Zoom.

The reason why I commented in the first place is because you explicitly mentioned the Flatpak of the Zoom client which stood out to me.

It is my understanding that Flatpak sandboxes apps [1], which could cause various issues if the app is not expecting to be run inside one or if the permissions are misconfigured.

But it certainly doesn't have to. Of course the app itself can be buggy. My point is that an official release should be checked before reporting bugs.

[1] https://docs.flatpak.org/en/latest/sandbox-permissions.html

[go to top]