So, counterintuitively, noise reduction improves compression ratios. In fact, a large part of what modern video codecs do is determine which portion of the video IS noise that can be discarded, and which bits are visually important...
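(Not from the thread, but a quick way to see that first point: a minimal sketch in Python/numpy that compresses a smooth synthetic frame and a noisy copy of it with plain zlib standing in for a real codec. The frame contents and noise level are made up purely for illustration.)

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "frame": a smooth gradient, i.e. a highly compressible signal.
frame = np.tile(np.linspace(0, 255, 640), (480, 1)).astype(np.uint8)

# The same frame with mild sensor-style Gaussian noise added.
noisy = np.clip(frame + rng.normal(0, 8, frame.shape), 0, 255).astype(np.uint8)

def ratio(img: np.ndarray) -> float:
    """Compression ratio of the raw pixel bytes under zlib (stand-in for a codec)."""
    raw = img.tobytes()
    return len(raw) / len(zlib.compress(raw, 9))

print(f"clean frame: {ratio(frame):6.1f}:1")   # compresses extremely well
print(f"noisy frame: {ratio(noisy):6.1f}:1")   # the noise eats the ratio
```

The smooth frame compresses by orders of magnitude more than the noisy one, because the noise is incompressible entropy the encoder has to spend bits describing.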
Or to put it another way: to me it would be similarly disingenuous to describe e.g. dead code elimination or vector path simplification as "just a compression algorithm" because the resulting output is smaller than it would otherwise be. I think part of what has my hackles raised is that it claims to improve video clarity, not to optimise for size. IMO compression algorithms do not and should not make such claims; if an algorithm aims (even if only secondarily) to affect subjective quality, then it has a transformative aspect that requires both disclosure and consent.
It's in the loop of the compression and decompression algorithm.
Video compression has used tricks like this for years: for example, reducing noise before encoding and then adding it back in after the decode cycle. Visual noise doesn't need to be precise, so removing it before compression and then approximating it on the other end saves a lot of bits.
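(The general shape of that trick, very roughly; this is only an illustrative Python sketch, not any particular codec's actual film-grain tooling: measure the grain, strip it before the encoder sees the frame, and synthesize statistically similar grain after decoding.)

```python
import numpy as np

rng = np.random.default_rng(1)

def box_blur(img: np.ndarray, k: int = 3) -> np.ndarray:
    """Crude denoiser; a real pipeline would use something much better."""
    padded = np.pad(img.astype(np.float32), k // 2, mode="edge")
    out = np.zeros(img.shape, dtype=np.float32)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def encode_side(frame: np.ndarray):
    """Strip the grain and keep only its statistics; the denoised frame
    (plus a couple of parameters) is what the real encoder would compress."""
    denoised = box_blur(frame)
    grain = frame.astype(np.float32) - denoised
    grain_sigma = float(grain.std())   # a real codec fits a richer grain model
    return denoised.astype(np.uint8), grain_sigma

def decode_side(decoded: np.ndarray, grain_sigma: float) -> np.ndarray:
    """Re-add grain that is statistically similar to what was removed.
    The exact noise values don't matter, only that it looks the same."""
    synthetic = rng.normal(0, grain_sigma, decoded.shape)
    return np.clip(decoded + synthetic, 0, 255).astype(np.uint8)

# Toy round trip: noisy source -> denoise + grain params -> (codec would sit here) -> re-grain.
source = np.clip(128 + rng.normal(0, 8, (480, 640)), 0, 255).astype(np.uint8)
clean, sigma = encode_side(source)
restored = decode_side(clean, sigma)
```

Only the denoised frame plus a couple of grain parameters need to survive encoding; the exact noise values on the viewer's end are different, but they look the same, which is the whole point.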
I think the first comment is why they would position noise reduction as being both part of their compression and a way to improve video clarity.