You can also include "vf=format:film-grain=no" in the config itself to start with film grain disabled by default.
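For reference, that's just a one-line entry (a sketch, assuming "the config" here means a standard mpv.conf; the option string is taken verbatim from above):

    # mpv.conf: start every file with film grain synthesis disabled
    vf=format:film-grain=no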
I do sometimes end up with AV1 for streaming-only stuff, but most of that looks like shit anyway, so some (more) digital smudging isn’t going to make it much worse.
The problem you see with AV1 streaming isn't the film grain synthesis; it's the bitrate. Netflix uses film grain synthesis to save bandwidth (e.g. 2-5 Mbps for 1080p, ~20 Mbps for 4K), while 4K Blu-ray is closer to 100 Mbps.
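To put those figures in per-title terms, a rough back-of-the-envelope sketch in Python for a 2-hour movie (bitrates picked from the ranges above; decimal GB):

    # Approximate size of a 2-hour movie at each bitrate.
    SECONDS = 2 * 3600
    for label, mbps in [("1080p AV1 stream", 4),
                        ("4K AV1 stream", 20),
                        ("4K Blu-ray", 100)]:
        gigabytes = mbps * 1e6 / 8 * SECONDS / 1e9
        print(f"{label}: ~{gigabytes:.1f} GB")
    # -> ~3.6 GB, ~18 GB and ~90 GB: the stream has a small
    #    fraction of the bits to spend, grain synthesis or not.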
If AV1+FGS is given anywhere near a comparable bitrate to other codecs (especially when encoding from an uncompressed source like a high-resolution film scan), it will absolutely demolish a codec without FGS on both bitrate and detail. The tech is just getting a bad rap because Netflix is aiming for the minimal cost that delivers "good enough" rather than maximal quality.
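Roughly how the split works: the encoder denoises, compresses the clean frames, and ships a small grain model; the decoder re-synthesizes grain on top. A toy numpy sketch of that idea (the real AV1 tool fits an autoregressive model with per-intensity scaling, not the single strength scalar used here):

    import numpy as np
    from scipy.ndimage import uniform_filter

    def encoder_side(frame):
        # frame: float grayscale array. The 3x3 mean filter is a
        # stand-in for a real denoiser.
        clean = uniform_filter(frame, size=3)
        # Boil the removed grain down to parameters; here one scalar.
        grain_strength = (frame - clean).std()
        return clean, grain_strength  # only these get encoded/sent

    def decoder_side(clean, grain_strength, seed=1234):
        # Regenerate statistically similar grain from a seed; the
        # actual noise samples are never in the bitstream.
        rng = np.random.default_rng(seed)
        return clean + rng.normal(0.0, grain_strength, clean.shape)

The bitrate win comes from the encoder no longer spending bits trying to reproduce noise exactly; it only has to carry the clean image plus a handful of model parameters.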
I’m ok with that for things where I don’t care that much about how it looks (do I give a shit if I lose just a little detail on Happy Gilmore? Probably not), and I agree that faking the grain probably gets you a look closer to the original if you’re going to erase the grain for better compression. But if I want actual high quality from a film source, faked grain is no good: if you’re having to fake it, you’ve already sacrificed a lot of picture quality, because (again) the grain is the picture, and you only get rid of it by discarding information from the picture.
But for anything modern, the film grain was likely added during post-production. So it really is just random noise, and there’s no reason it can’t be recreated (much more efficiently) on the client side.
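That's the whole trick: if both ends agree on the noise model and the seed, the grain regenerates deterministically at no bandwidth cost (AV1's film grain parameters do include a seed, though the model is fancier than the plain Gaussian used in this two-line illustration):

    import numpy as np

    # Same model + same seed on both ends -> bit-identical grain,
    # with zero noise samples in the bitstream.
    enc = np.random.default_rng(42).normal(0.0, 1.0, (4, 4))
    dec = np.random.default_rng(42).normal(0.0, 1.0, (4, 4))
    assert np.array_equal(enc, dec)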