zlacker

[return to "A journey into the shaken baby syndrome/abusive head trauma controversy"]
1. rapate+Vf5 2023-09-27 05:37:06
>>rossan+(OP)
This doesn’t surprise me. We have massive systemic issues in medical science and care delivery.

- Medical science handles variation by simply assuming that large enough samples will average it out. This loses a ton of information, as the "average person" is a construct that almost certainly doesn't exist.

- news media on medical science glosses over all uncertainties in the name of clickbaity sensationalism.

- lawyers are incentivized by our adversarial legal system to adopt aggressively hyperbolic interpretations of the science to sue people and extract money.

- medical associations then tweak policies to protect against malpractice

Run this loop enough times and lots of noise gets amplified.
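
As a minimal sketch of the first point (all numbers hypothetical, not from the thread): when a population is made of distinct subgroups, the sample mean can describe an "average person" that matches no actual individual.

```python
# Hypothetical drug response in two subgroups: one improves, one worsens.
import statistics

subgroup_a = [8, 9, 10, 11, 12]     # strong positive response
subgroup_b = [-10, -9, -8, -7, -6]  # negative response

population = subgroup_a + subgroup_b
avg = statistics.mean(population)

print(avg)                 # 1.0 -- a mild "average benefit"
print(avg in population)   # False -- no individual actually has it
```

The averaged result suggests a small benefit for everyone, while in this toy setup the treatment helps half the group and harms the other half.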

My hope is that AI+sensors ushers in the era of truly personalized medicine.

2. idoubt+so5 2023-09-27 07:00:41
>>rapate+Vf5
Basically, you want to replace statistics ("large enough samples will average out variation") with AI. I'm afraid that's cargo cult instead of science.

AI can lie, which means a "truly personalized medicine" would sometimes poison its patients. See for instance Donald Knuth's experiment with ChatGPT, starting at "Answer #3 is fouled up beautifully!", which contains some totally wrong AI answers: https://www-cs-faculty.stanford.edu/~knuth/chatGPT20.txt

Of course medical science could make better use of statistics, get help from AI, and discern more profiles (e.g. one US adult out of two is obese, and it's often unclear how to adjust medication to a person's mass). But that's a long process, with no obvious path, and very distinct from the magical "AI will solve it all".

3. Shorel+4Z9 2023-09-28 12:40:52
>>idoubt+so5
Current AI is deeply rooted in probability and statistics, so it would actually increase the use of statistics.

I'm not saying a false positive or a false negative cannot happen. I am saying that we would have better estimates of both, according to probability theory.

Also: below a sufficiently small margin of error, false positives and false negatives are basically impossible to prevent entirely. And that's science.
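
A minimal sketch of the point about estimating both error rates (toy confusion counts, not from the thread): given a hypothetical test's tally of true/false positives and negatives, the false positive rate and false negative rate can each be estimated directly.

```python
# Hypothetical screening results for a diagnostic test.
tp, fp, fn, tn = 90, 5, 10, 895

fpr = fp / (fp + tn)  # P(test positive | actually negative)
fnr = fn / (fn + tp)  # P(test negative | actually positive)

print(f"FPR = {fpr:.3f}, FNR = {fnr:.3f}")
```

Neither rate is zero, and with a fixed test, tightening the decision threshold to shrink one rate generally inflates the other.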
