It is my understanding that Apple implemented a program that generates a hash of a file, compares it to a blacklist, and notifies Apple in the event of a match. It's not entirely clear, but it appears this check only runs when files are uploaded to iCloud.
The blacklist itself is not maintained by Apple, but by the US government or a third party like NCMEC, which means Apple can't be sure content that isn't child abuse imagery hasn't made it onto the list. Perceptual hashes probably can't be abused to target non-image/video content because they're an inherently image-oriented technology.
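To make the matching mechanism concrete, here is a minimal sketch of perceptual hashing using a simple "average hash" scheme. This is an illustrative stand-in, not Apple's actual algorithm (Apple's system uses a proprietary neural-network-based hash called NeuralHash), but the principle is the same: visually similar images produce bit strings that differ in only a few positions, and a match is declared when the Hamming distance to a blacklist entry falls below some threshold.

```python
# Illustrative "average hash" (aHash) sketch -- NOT Apple's NeuralHash.
# Similar images yield nearby 64-bit hashes; matching is near-match, not exact.

def average_hash(pixels):
    """Compute a 64-bit hash from an 8x8 grayscale image, given as a list
    of 8 rows of 8 intensity values. Each bit is 1 if the corresponding
    pixel is brighter than the image's mean intensity."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blacklist(h, blacklist, threshold=10):
    """True if h is within `threshold` bits of any blacklisted hash."""
    return any(hamming_distance(h, entry) <= threshold for entry in blacklist)

# A slightly brightened copy of an image still hashes close to the original,
# so it still matches the blacklist entry derived from the original.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
brightened = [[min(255, p + 5) for p in row] for row in original]
blacklist = {average_hash(original)}
print(matches_blacklist(average_hash(brightened), blacklist))  # True
```

The tolerance to small perturbations is the whole point of a perceptual hash (a cryptographic hash would miss a recompressed or resized copy), but it is also why the contents of the blacklist matter so much: whoever controls the list controls what gets flagged.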
Apple could, however, cause such a program to match on different criteria with a simple update, and such a change would likely be difficult to detect. Most of us assume Apple wouldn't voluntarily do such a thing, but it's entirely plausible they could be compelled to do it involuntarily. The US government has already attempted to compel Apple to create a tool to compromise the security of an iPhone, and might eventually have succeeded in court had it not gained access by other means. That fight took place in public, but the next one might well take place in secret.
In what way does the OP not understand what was implemented? This gives Apple the ability to monitor any file on your device. Not only that, but all we have is the government's pinky promise that the database contains only CSAM. There is no verification. How do we know it isn't going to end up filled with gay pornography involving consenting adults when the "feature" is rolled out in Saudi Arabia?
How do you know that the image classifier on your iPhone, that exists today, and detects faces, isn't also detecting whether the photo has a gun in it?
We don't know, but a photo of a gun isn't going to break up your marriage, cost you your job, make your friends hate you, and get you thrown in jail for weeks.
Someone finding a way to plant this kind of material on your phone does that.
Besides, the fact that something that can be misused already exists isn't a very good defense for creating something else that widens the opportunity for abuse significantly.