So why isn’t AV1 ranked higher? The post doesn’t say, so we can only speculate. It looks like they prefer hardware decoding over software decoding, even if that means using an older codec. If true, that would make sense: hardware decoding is much easier on the client’s power and battery consumption.
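For what it’s worth, this is something a client can actually probe. Here’s a minimal sketch (assuming a browser context and the standard Media Capabilities API; the codec string and bitrate are illustrative) of how a player might check whether AV1 decoding would be power-efficient, which in practice usually means hardware-accelerated:

```typescript
// Sketch: ask the browser whether AV1 decode is supported and power-efficient.
// "powerEfficient" generally corresponds to hardware decoding, though the spec
// only guarantees it reflects battery-friendly playback.
async function checkAv1Decode(): Promise<void> {
  const result = await navigator.mediaCapabilities.decodingInfo({
    type: "media-source", // MSE playback, as a streaming player would use
    video: {
      contentType: 'video/mp4; codecs="av01.0.05M.08"', // illustrative AV1 codec string
      width: 1920,
      height: 1080,
      bitrate: 2_000_000, // illustrative: ~2 Mbps
      framerate: 30,
    },
  });
  console.log(
    `AV1: supported=${result.supported}, ` +
      `smooth=${result.smooth}, powerEfficient=${result.powerEfficient}`
  );
}

checkAv1Decode();
```

A player that only serves AV1 when `powerEfficient` is true would behave exactly like the speculation above: fall back to an older, hardware-decoded codec rather than burn battery on software AV1.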
But then why start work on AV2 before AV1 has even reached a majority of devices? I’m sure they have an answer; they’re just not sharing it here.