zlacker

[return to "Netflix’s AV1 Journey: From Android to TVs and Beyond"]
1. ls612+G8[view] [source] 2025-12-05 01:12:35
>>Charle+(OP)
On a related note, why are release groups not putting out AV1 WEB-DLs? Most 4K stuff is H.265 now, but if AV1 is what the source supplies, surely passing it through without re-encoding would be better?
◧◩
2. Dwedit+qa[view] [source] 2025-12-05 01:28:00
>>ls612+G8
Because pirates are unaffected by the patent situation with H.265.
◧◩◪
3. ls612+Ja[view] [source] 2025-12-05 01:30:43
>>Dwedit+qa
But isn’t AV1 just better than H.265 now, regardless of the patents? The only downside is limited compatibility.
◧◩◪◨
4. phanta+ug[view] [source] 2025-12-05 02:23:54
>>ls612+Ja
I avoid AV1 downloads when possible because I don’t want to have to figure out how to disable film grain synthesis, and then deal with whatever damage disabling it causes to apparent quality on a video that was encoded with it in mind. I just don’t want any encoding that supports that, if I can stay away from it.
◧◩◪◨⬒
5. Wowfun+Dk[view] [source] 2025-12-05 03:01:51
>>phanta+ug
With HEVC you just don't have the option to disable film grain because it's burned into the video stream.
◧◩◪◨⬒⬓
6. phanta+Nl[view] [source] 2025-12-05 03:14:46
>>Wowfun+Dk
I’m not looking to disable film grain, if it’s part of the source.
◧◩◪◨⬒⬓⬔
7. Mashim+TD[view] [source] 2025-12-05 07:10:14
>>phanta+Nl
Does AV1 add it if it's not part of the source?
◧◩◪◨⬒⬓⬔⧯
8. phanta+qq1[view] [source] 2025-12-05 12:53:27
>>Mashim+TD
I dunno, but if there is grain in the source, it may erase it (discarding information) and then invent new grain (noise) later.
◧◩◪◨⬒⬓⬔⧯▣
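[ed.: the "erase it, then invent new grain" description above roughly matches the published design of AV1 film grain synthesis: the encoder denoises the source and stores only grain parameters, and the decoder regenerates statistically similar but brand-new grain. The numpy sketch below illustrates that idea only; it is not the actual AV1 algorithm, and every function name and parameter in it is made up for illustration.]

```python
# Toy sketch of the AV1 film-grain-synthesis idea: the encoder denoises
# the source and only stores grain *parameters*; the decoder regenerates
# statistically similar (but new) grain. Not the real AV1 algorithm;
# all names and parameters here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def denoise(frame, k=3):
    """Crude k x k box blur standing in for the encoder-side denoiser."""
    pad = k // 2
    padded = np.pad(frame, pad, mode="edge")
    out = np.zeros_like(frame, dtype=float)
    h, w = frame.shape
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

def synthesize_grain(shape, ar_coef=0.5, strength=8.0):
    """Spatially correlated grain from a simple 2-D autoregressive
    filter, loosely mirroring AV1's AR-based grain model."""
    white = rng.standard_normal(shape)
    grain = np.zeros(shape)
    for y in range(1, shape[0]):
        for x in range(1, shape[1]):
            grain[y, x] = (ar_coef * 0.5 * (grain[y - 1, x] + grain[y, x - 1])
                           + white[y, x])
    return strength * grain

h, w = 64, 64
clean = np.tile(np.linspace(50.0, 200.0, w), (h, 1))    # smooth "scene"
source = clean + 8.0 * rng.standard_normal((h, w))      # scene + real grain

denoised = denoise(source)                   # this is what gets compressed
shown = denoised + synthesize_grain((h, w))  # what the decoder displays

# The synthesized grain is new noise: its correlation with the grain
# that was actually in the source is essentially zero.
corr = np.corrcoef((source - clean).ravel(), (shown - denoised).ravel())[0, 1]
```

The sketch makes the point both sides are arguing: the original per-pixel grain really is discarded (information loss), while the regenerated grain only matches it statistically, not pixel-for-pixel.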
9. Wowfun+9M1[view] [source] 2025-12-05 14:45:57
>>phanta+qq1
I'm skeptical of this (I think they avoid adding grain to the AV1 stream that they add to the other streams; of course, all grain is artificial in modern times), but even if it's true: all grain is noise! It's random noise from the sensor. There's nothing magical about it.
◧◩◪◨⬒⬓⬔⧯▣▦
10. phanta+hZ1[view] [source] 2025-12-05 15:43:36
>>Wowfun+9M1
The grain’s got randomness because the distribution and size of the grains are random, but it’s not noise; it’s the “resolution limit” (if you will) of the picture itself. The whole picture is grain. The film is grain. Displaying that is accurately displaying the picture. Erasing it for compression’s sake is tossing out information, and adding it back later is just an effect that adds noise.

I’m OK with that for things where I don’t care that much about how it looks (do I give a shit if I lose just a little detail on Happy Gilmore? Probably not), and I agree that faking the grain probably gets you a look closer to the original if you’re going to erase the grain for better compression. But if I want actual high quality from a film source, then faked grain is no good, since if you’re having to fake it, you’ve definitely already sacrificed a lot of picture quality (because, again, the grain is the picture; you only get rid of it by discarding information from the picture).

◧◩◪◨⬒⬓⬔⧯▣▦▧
11. Wowfun+n63[view] [source] 2025-12-05 20:46:52
>>phanta+hZ1
If you’re watching something from the 70s, sure. I would hope synthesized grain isn’t being used in this case.

But for anything modern, the film grain was likely added during post-production. So it really is just random noise, and there’s no reason it can’t be recreated (much more efficiently) on the client side.