Unequivocally, yes.
LLMs have proved themselves to be useful, at times very useful, sometimes invaluable assistants that work in different ways than we do. If sticking health data into a training set for some other AI could create another class of AI that can augment humanity, great!! Patient privacy and the law can f*k off.
I’m all for the greater good.
You misread the post I was responding to. They were suggesting health data with PII removed.
Second, LLMs have proved that an AI given effectively unlimited training data can deliver breakthroughs in capability. But LLMs are not the whole universe of AIs. Some other AI tool, distinct from LLMs, that ingests health data en masse could deliver health and human-longevity outcomes that outweigh an individual's right to privacy.
If transformers can benefit from scale, why not some other AI technology, existing or yet to be discovered?
We should be supporting a Common Crawl for health records, digitizing old health records, and shaming/forcing hospitals, research labs, and clinics into submitting all their data for a future AI to wade into and understand.
If that’s the case, let’s put it on the ballot and vote for it.
I’m tired of big tech making policy decisions by “asking forgiveness rather than permission” and getting away with everything.
If there truly is some breakthrough and all we need is everyone’s data, then tell the population, sell it to the people, and let us vote on it!
To me this says that OpenAI would have access to ill-gotten raw patient data and would do the PII stripping themselves.
> If that’s the case, let’s put it on the ballot and vote for it.
This vote will mean "faster horses" for everyone. Exponential progress by committee is almost unheard of.