- Medical science handles variation largely by assuming that large enough samples will average it out. This throws away a ton of information, since the “average person” is a construct that almost certainly doesn’t exist (see the sketch after this list).
- news media coverage of medical science glosses over all uncertainty in the name of clickbaity sensationalism.
- lawyers are incentivized by our adversarial legal system to adopt aggressively hyperbolic interpretations of the science to sue people and extract money.
- medical associations then tweak their guidelines to protect against malpractice claims
Run this loop enough times and lots of noise gets amplified.
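To make the first point concrete, here is a minimal sketch (all numbers hypothetical) of how a population average can hide patient-level variation: half the cohort responds well to a treatment, half responds badly, and the "average effect" describes almost nobody.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 10_000
# Two hypothetical subgroups with opposite responses to the same treatment
# (units are arbitrary, e.g. change in some biomarker).
responders = rng.normal(loc=+8.0, scale=2.0, size=n // 2)
non_responders = rng.normal(loc=-6.0, scale=2.0, size=n // 2)
cohort = np.concatenate([responders, non_responders])

mean_effect = cohort.mean()
print(f"population mean effect: {mean_effect:+.2f}")  # around +1, looks mild

# How many patients actually experience something close to the "average" effect?
near_average = np.abs(cohort - mean_effect) < 1.0
print(f"patients within +/-1 of the mean: {near_average.mean():.1%}")  # tiny fraction
```

The trial would report a small positive average effect, even though no individual patient experiences anything like it.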
My hope is that AI + sensors ushers in an era of truly personalized medicine.
AI can lie, which means a "truly personalized medicine" built on it would sometimes poison its patients. See for instance Donald Knuth's experiment with ChatGPT, where the note beginning "Answer #3 is fouled up beautifully!" flags some totally wrong AI answers: https://www-cs-faculty.stanford.edu/~knuth/chatGPT20.txt
Of course medical science could make better use of statistics, get help from AI, and discern more patient profiles (e.g. one US adult out of two is obese, and it's often unclear how to adjust medication doses for body mass). But that's a long process, with no obvious path, and quite distinct from the magic of "AI will solve it all".
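To illustrate the mass-adjustment point, a hedged sketch with made-up numbers: compare a one-size-fits-all adult dose against simple weight-based (mg/kg) dosing, and see how far the per-kg exposure drifts once a patient is far from the "average" 70 kg adult.

```python
FIXED_DOSE_MG = 500.0      # hypothetical flat dose labelled for "adults"
TARGET_MG_PER_KG = 7.0     # hypothetical weight-based target

for weight_kg in (50, 70, 90, 120, 150):
    fixed_exposure = FIXED_DOSE_MG / weight_kg        # mg/kg actually received from the flat dose
    weight_based_dose = TARGET_MG_PER_KG * weight_kg  # what mg/kg dosing would prescribe
    print(f"{weight_kg:>3} kg: flat dose = {fixed_exposure:4.1f} mg/kg "
          f"(target {TARGET_MG_PER_KG} mg/kg, i.e. {weight_based_dose:.0f} mg)")
```

A 150 kg patient on the flat dose receives less than half the per-kg exposure of a 70 kg patient, which is exactly the kind of gap that "dose for the average adult" papers over.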