That's not what Apple's plans state. The comparisons are done on phone, and are only escalated to Apple if there are more than N hash matches, at which point they are supposedly reviewed by Apple employees/contractors.
Otherwise, they'd just keep doing it on the material that's actually uploaded.
> Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child-safety organizations. Apple further transforms this database into an unreadable set of hashes, which is securely stored on users’ devices.
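The gist of the threshold step can be sketched in a few lines. This is only an illustration of "count on-device matches, escalate past a threshold" — Apple's actual design is far more involved (perceptual NeuralHash rather than exact hashes, a blinded hash database, and threshold secret sharing so the server learns nothing below the threshold). The hash values and the threshold constant here are made up for the example.

```python
# Hypothetical sketch of threshold-gated escalation. Not Apple's real
# protocol: the real system uses perceptual hashes against a blinded
# database plus cryptographic threshold secret sharing.
THRESHOLD = 30  # placeholder value, not an official figure

# Stand-in for the (in reality unreadable/blinded) known-CSAM hash set.
KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}

def should_escalate(image_hashes):
    """Count matches against the known set on-device; flag for human
    review only if the count exceeds the threshold."""
    matches = sum(1 for h in image_hashes if h in KNOWN_HASHES)
    return matches > THRESHOLD
```

Below the threshold, nothing is surfaced at all; only once the count is exceeded do the matched items become reviewable, which is the property the quoted design doc is describing.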