It's not obvious whether there's any automated way to reliably detect the difference between "use of HDR" and "abuse of HDR". But you could probably catch the most egregious cases, like "every single pixel in the video has brightness above 80%".
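A minimal sketch of that egregious-case check, assuming decoded frames arrive as luminance arrays normalized to [0, 1] (the 80% threshold is the one from the comment, not anything a real platform uses):

```python
import numpy as np

def is_egregious_frame(luma, threshold=0.80):
    """Crudest abuse check: every single pixel brighter than 80% of
    full scale. `luma` is an HxW luminance array in [0, 1]
    (an assumption about the decoder's output format)."""
    return bool(np.all(luma > threshold))

def is_egregious_video(frames):
    # Only flag the video if the condition holds for every frame.
    return all(is_egregious_frame(f) for f in frames)
```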
"HDR abuse" makes it sound bad just because the video is bright? Wouldn't that hurt the person posting it, since I'd just skip over a bright video?
Sorry if I'm phrasing this all wrong, don't really use TikTok
HDR is meant to be so much more intense that it should really be limited to immersive, full-screen, long-form content: movies, TV shows, etc.
It's not what I want for non-immersive videos you scroll through, ads, etc. I'd be happy if the OS disabled it whenever the video isn't in full-screen mode, unless you're building a video editor or something.
Sure, in the same way that advertising should never work since people would just skip over a banner ad. In an ideal world, everyone would uniformly go "nope"; in our world, it's very much analogous to the loudness war: https://en.wikipedia.org/wiki/Loudness_war
Eventually it'll wear itself out, just like every other overuse of something new.
OTOH, pointing a flashlight at your face is at least impolite. I would put a dark filter on top of HDR videos until a video is clicked for watching.
That sounds like a job our new AI overlords could probably handle. (But that might be overkill.)
For things filmed with HDR in mind, it's a benefit. It's a bummer that things always get taken to the extreme.
I know how bad the support for HDR is on computers (particularly Windows and cheap monitors), so I avoid consuming HDR content on them.
But I just purchased a new iPhone 17 Pro, and I was very surprised at how these HDR videos on social media still look like shit on apps like Instagram.
And even worse, the HDR video I shoot with my iPhone looks like shit even when playing it back on the same phone! After a few tries I just had to turn it off in the Camera app.
My idea is: for each frame, grayscale the image, then count what percentage of the screen is above the standard white level. If more than 20% of the image is >SDR white level, then tone-map the whole video to the SDR white point.
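A rough sketch of that heuristic, assuming frames are decoded to linear RGB in absolute nits with BT.2020 primaries, and taking 203 nits as the SDR reference white (per ITU-R BT.2408); the 20% area threshold is the one from the comment:

```python
import numpy as np

SDR_WHITE_NITS = 203.0  # SDR reference white in an HDR container (BT.2408)
BT2020_LUMA = np.array([0.2627, 0.6780, 0.0593])  # assuming BT.2020 primaries

def needs_sdr_tone_map(frames, area_threshold=0.20):
    """Return True if any frame has more than 20% of its pixels above
    SDR reference white, i.e. the whole video should be tone-mapped
    down to the SDR white point."""
    for frame in frames:                 # frame: HxWx3 array, linear nits
        luminance = frame @ BT2020_LUMA  # per-pixel luminance in nits
        if np.mean(luminance > SDR_WHITE_NITS) > area_threshold:
            return True
    return False
```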
99.9% of people expect HDR content to get capped / tone-mapped to their display's brightness setting (sketched below).
That way, HDR content is just magically better. I think this is already how HDR works on non-HDR displays?
For the 0.1% of people who want something different, it should be a toggle.
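For what it's worth, the cap / tone-map behavior is easy to sketch. The 600-nit display peak below is a made-up example value, and real tone mappers are fancier than either of these:

```python
import numpy as np

def cap_to_display(nits, display_peak=600.0):
    """Literal cap: hard-clip scene luminance at the display's peak.
    Cheap, but crushes all highlight detail above the peak."""
    return np.minimum(nits, display_peak)

def rolloff_to_display(nits, display_peak=600.0):
    """Reinhard-style rolloff: compress highlights smoothly toward the
    display peak instead of clipping, closer to what OSes actually do."""
    return nits / (1.0 + nits / display_peak)
```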
Unfortunately I think this is either (A) amateur-hour enshittification, like with their keyboards 10 years ago, or (B) Apple specifically liking how it works, since it forces you to see their "XDR tech" even though it's a horrible experience day to day.
Unless you're using a video editor or something, everything should just be SDR when it's within a user interface.
The solution is for social media to be SDR, not for the UI to be HDR.
So all that music producers got out of compressing their music was clipping, not extra loudness on playback.
I have zero issues, and only an exceptional image, on Windows 11 with a PG32UQX.
Exactly this. I usually do not want high-dynamic-range audio, because that means it's either too quiet at some times, or loud enough to annoy the neighbors at others, or both.
KDE Wayland went the better route and uses gamma 2.2.
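Presumably meaning a pure power-law EOTF for SDR content rather than the piecewise sRGB curve; as a minimal sketch:

```python
def gamma22_eotf(signal):
    """Gamma 2.2: map an SDR signal value in [0, 1] to relative linear
    light with a pure power law (not the piecewise sRGB function)."""
    return signal ** 2.2
```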
The latest Android release has a setting that is the HDR version of "volume leveling".
The "normal" video should aim to be moderately bright on average; the extra peak brightness is good for contrast in dark scenes. Other comments comparing this to the loudness war are apt. Some music streaming services enforce loudness normalization to solve this: any brickwalled song just gets played back a bit quieter when the app is acting as a radio (see the sketch below).
Instagram could enforce this too, but it seems unlikely unless it actually affects engagement.
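Loudness normalization is basically a measured gain applied at playback. A crude sketch, assuming raw samples in [-1, 1] and a made-up -14 dB target (real services measure LUFS per ITU-R BT.1770 rather than plain RMS):

```python
import numpy as np

TARGET_DB = -14.0   # common streaming loudness target (assumption)

def playback_gain_db(samples, target_db=TARGET_DB):
    """Crude RMS-based leveling: a brickwalled (very loud) track gets a
    negative gain, i.e. it simply plays back quieter. Real services use
    LUFS measurement per ITU-R BT.1770 instead of raw RMS."""
    rms = np.sqrt(np.mean(np.square(samples)))   # samples in [-1, 1]
    loudness_db = 20.0 * np.log10(rms + 1e-12)   # dBFS of the RMS level
    return target_db - loudness_db               # gain to apply, in dB
```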