You're spelling out a specific process in detail--which is the only reason I'm picking on details. Do you have anything documenting what you're describing?
From what I remember, Apple's system was proposed but never shipped. They proposed hashing your photos locally and comparing them to a local database of known CSAM images. Only when there was a match would they transmit the photos for manual confirmation. The first link below describes Apple's proposal [1].
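The match-against-a-known-database flow is simple to sketch. To be clear, this is a rough illustration, not Apple's system: the real proposal used a perceptual "NeuralHash" plus cryptographic blinding so the device never learns the database, whereas this stand-in just uses SHA-256 over the file bytes:

```python
import hashlib

# Stand-in for the known-CSAM hash database that would ship on-device.
# (Apple's proposal used blinded perceptual hashes, not plain SHA-256.)
KNOWN_HASHES: set[str] = set()

def photo_hash(data: bytes) -> str:
    """Hash a photo's bytes. A perceptual hash would go here in the
    real design, so near-duplicates also match."""
    return hashlib.sha256(data).hexdigest()

def flag_for_review(photos: dict[str, bytes]) -> list[str]:
    """Return names of photos whose hash matches the local database;
    only these would ever be transmitted for manual confirmation."""
    return [name for name, data in photos.items()
            if photo_hash(data) in KNOWN_HASHES]
```

The key property being argued about: photos that don't match stay on the device, and nothing leaves until the manual-confirmation step.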
I believe what did ship is an algorithm that detects novel nude imagery on-device and gives some sort of warning when kids send or receive that data. None of that involves checks against Apple's servers.
I do think other existing photo services scan only the photos you've uploaded to their cloud.
I'm happy to make corrections. To my knowledge, what you're describing hasn't been done so far.
[1] https://www.hackerfactor.com/blog/index.php?/archives/929-On...
This article is a few years old, but has more of a plain-English, third party explanation: https://appleinsider.com/articles/20/01/21/what-apple-surren...
It's fair not to trust Apple or any company, but Google and a lot of other companies were scanning the cloud versions without the negative press Apple got. My understanding is Apple proposed scanning on-device because images were encrypted in the cloud. Uploading images and running a manual review process seems like a big ongoing cost.
Personally, I don't think Apple is doing anything with photos it stores in the cloud.
Like the first article says, technically they could, because they store the encryption key for user convenience. Turning on Advanced Data Protection should take away their ability to decrypt photos. But there are a whole bunch of caveats if you're talking about all their cloud data, and that has changed over the years.