There are also no scene rules for AV1, only for H.265 [1].
With the SVT-AV1 encoder you can achieve better quality in less time than with x265; you just have to use the right presets. See the encoding-results section:
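For reference, this is roughly what an SVT-AV1 encode looks like through ffmpeg's libsvtav1 wrapper. The preset and CRF values below are illustrative examples, not the settings from the linked paper:

```shell
# Encode with SVT-AV1 via ffmpeg. -preset trades speed for quality
# (lower = slower/better, ~4-6 is a common quality/speed middle ground);
# -crf controls quality (lower = better); -g sets the keyframe interval.
ffmpeg -i input.mkv -c:v libsvtav1 -preset 6 -crf 30 -g 240 output.mkv
```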
https://www.spiedigitallibrary.org/conference-proceedings-of...
You can also include "vf=format:film-grain=no" in the config itself to start with no film grain by default.
FGS makes a huge difference at moderately high bitrates for movies that are very grainy, but many people seem to really not want it for HQ sources (see sibling comments). With FGS off, it's hard to find any sources that benefit from AV1 at the bitrates you'd torrent rather than stream.
I do sometimes end up with AV1 for streaming-only stuff, but most of that looks like shit anyway, so some (more) digital smudging isn't going to make it much worse.
The problem you see with AV1 streaming isn't the film grain synthesis; it's the bitrate. Netflix uses film grain synthesis to save bandwidth (e.g. 2-5 Mbps for 1080p, ~20 Mbps for 4K), while a 4K Blu-ray is closer to 100 Mbps.
If AV1+FGS is given anywhere close to a comparable bitrate to other codecs (especially when encoding from an uncompressed source like a high-res film scan), it will absolutely demolish a codec without FGS on both bitrate and detail. The tech is just getting a bad rap because Netflix is aiming for minimal cost to deliver "good enough" rather than maximal quality.
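SVT-AV1 exposes FGS through its `film-grain` and `film-grain-denoise` parameters, which ffmpeg passes through via `-svtav1-params`. The strength value below is an illustrative example, not a recommendation:

```shell
# Denoise the real grain out of the source and signal synthetic grain
# to be regenerated at decode time. film-grain takes a strength of 0-50
# (8 here is just an example); film-grain-denoise=1 enables the denoise
# pass so the encoder doesn't spend bits on the original noise.
ffmpeg -i scan.mkv -c:v libsvtav1 -preset 4 -crf 25 \
  -svtav1-params "film-grain=8:film-grain-denoise=1" out.mkv
```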
Most new UHD releases, yes, but otherwise Blu-rays primarily use H.264/AVC.
This problem is only just now starting to get solved in SVT-AV1 with the addition of community-created psychovisual optimizations... features that x264 had over 15 years ago!
https://wiki.x266.mov/docs/encoders/SVT-AV1
https://jaded-encoding-thaumaturgy.github.io/JET-guide/maste...
Bigger PT sites with strict rules don't allow it yet and are actively discussing/debating it. Netflix Web-DLs being AV1 is definitely pushing that; the codec has to be a selectable option during upload.
I'm ok with that for things where I don't care that much about how it looks (do I give a shit if I lose a little detail on Happy Gilmore? Probably not), and I agree that faking the grain probably gets you closer to the original look if you're going to erase the grain for better compression. But if I want actual high quality from a film source, faked grain is no good: if you're having to fake it, you've already sacrificed a lot of picture quality (because, again, the grain is the picture; you only get rid of it by discarding information from the picture).
But for anything modern, the film grain was likely added during post-production. So it really is just random noise, and there’s no reason it can’t be recreated (much more efficiently) on the client-side.
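The client-side idea can be sketched in a few lines: seed a pseudo-random generator, shape the noise so neighboring pixels correlate, and add it to the decoded frame. This is a toy illustration, not AV1's actual FGS algorithm (which uses a 2D autoregressive model with per-intensity scaling signaled in the bitstream); all names and parameters here are made up:

```python
import numpy as np

def synth_grain(shape, ar_coeff=0.8, scale=8.0, seed=1234):
    """Toy grain synthesis: seeded noise run through a simple 1st-order
    autoregressive filter so the grain has spatial structure instead of
    being pure white noise. Loosely inspired by AV1's FGS model."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(shape)
    # correlate each pixel with its left neighbor
    for x in range(1, shape[1]):
        noise[:, x] += ar_coeff * noise[:, x - 1]
    return scale * noise

# Add synthetic grain to a flat gray "decoded" frame after decoding.
frame = np.full((4, 8), 128.0)
grainy = np.clip(frame + synth_grain(frame.shape), 0, 255)
```

Because the generator is seeded from signaled parameters rather than stored per-pixel, the decoder regenerates the grain for free instead of the encoder spending bits on it.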