Since KDE Plasma 6 recently shipped with HDR support, I decided to check how some of the HDR-tagged movies I'd watched previously look with it. I was surprised to see that they are rather dark, even in scenes under direct sunlight. I 'tested' the brightness by opening the movie in HDR and SDR side by side and manually adjusting the SDR brightness to compare.
Most scenes, including those outdoors and in sunlight, seem equivalent to about 200 nits in SDR. Only highlights (like the sky when it isn't a large part of the shot, or glimpses of the outside in indoor scenes) seem to reach 700-800 nits. I thought there was some kind of baked-in ABL (auto brightness limiter) in the files, but then I found a scene (Starkiller Base firing in SW:TFA) that got ~~brighter than the SDR brightness slider goes~~ (while covering half the screen, too), so that's not it. (Edit: I think this is an error caused by the HDR-to-SDR mapping. In some frames the laser becomes gray, and thus much darker in SDR than in HDR. In other frames 800-1000 nits seems right. Still the brightest scene I could find.) There seems to be a conscious decision to keep most scenes at the same brightness.
Are movies supposed to be like that? I'd have thought the cameras would capture the brightness accurately and that's what you'd see, with minimal modification. What's the point of HDR if there isn't a brightness difference between a sunny scene and a cloudy one? Granted, the highlights have a lot more detail instead of being crushed, and that's good. But I'm pretty sure the ones I saw in the theater were not this dark in most scenes.
I've tried a few web-DLs and Blu-rays. They all seem to have this issue.
Increasing the contrast from the player seems to work, and I guess I'll just find a good default for that and forget about it eventually, unless you have a suggestion. I expected more from the fabled HDR, though.
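For what it's worth, if the player happens to be mpv, its HDR-to-SDR tone mapping can be tuned directly instead of raw contrast. The option names below are from mpv's manual; the specific values are just a starting point, not a recommendation:

```ini
# mpv.conf -- only relevant when playing HDR content on an SDR output
tone-mapping=bt.2390     # how highlights roll off (alternatives: hable, mobius)
target-peak=203          # assumed peak brightness of the output, in nits
hdr-compute-peak=yes     # measure each scene's actual peak instead of trusting metadata
```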
Sorry if this doesn't fit here
Edit: Bit the bullet and booted Windows to analyse the files. They are indeed dim for my taste (due to a low max brightness and/or baked-in ABL), with high brightness only being used in highlights while the rest sits at 200-400 nits. I took some screenshots (they bloom badly, since they're SDR screenshots of HDR content). The cursor is positioned on the sky in most of them and on a bright area in the rest:

I took another screenshot right before this shot, while they're still in the ship, but forgot to save it. The tiny bit of visible sky was 600 nits there, so I think this file does have baked-in ABL.

Star Wars turned out to be pretty good with a 1000-nit target in general, but the desert is still dim for some reason (below 50 nits in this scene!).

Also tried Spider-Verse on a suggestion. It wasn't that bright, but I think that's fine for animation.
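For anyone wanting to sanity-check nit readings like the ones above: HDR10 encodes absolute luminance with the SMPTE ST 2084 "PQ" transfer function, and a small sketch of it shows why a ~200-nit scene still sits barely above the middle of the signal range, with everything above reserved for highlights. Constants are straight from the ST 2084 spec:

```python
# SMPTE ST 2084 (PQ) transfer function, as used by HDR10.
# Maps absolute luminance in nits (0-10000) to a 0-1 signal and back.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    """Luminance in nits -> PQ signal value (0-1)."""
    y = nits / 10000.0
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

def pq_decode(signal: float) -> float:
    """PQ signal value (0-1) -> luminance in nits."""
    e = signal ** (1 / m2)
    y = (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)
    return 10000.0 * y

# A full-frame ~200-nit scene only uses about 58% of the signal range;
# everything above is headroom for highlights.
print(round(pq_encode(200), 2))           # 0.58
print(round(pq_decode(pq_encode(200))))   # 200 (round-trip)
```

This also explains why tools report exact nit values per pixel: unlike SDR gamma, PQ is an absolute scale, so the file genuinely says "this sky is 600 nits".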

The latency numbers quoted for displays (i.e. the 8-9 ms or 40 ms figures) include any frame buffering the display may or may not do. If the quoted number is less than the frame time, it's safe to assume the display isn't buffering whole frames before showing them. And total latency is sometimes even lower, like when vsync is disabled.
That's not to say the game is rendered from top left to bottom right as it is displayed, but since the render time has to fit within the frame time, you can be certain the render started one frame time before it finished, and the frame is displayed on the next vsync (if vsync is enabled). That's 22 ms for 45 fps, plus another 16 ms for a worst-case vsync miss at 60 Hz, plus 10 ms of display latency, for about 48 ms total. Majora's Mask at 20 fps would have 50 ms of render + 8 ms of display = 58 ms of latency, assuming it doesn't miss vsync either.
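The arithmetic above can be sketched as a tiny helper. The model is an assumption on my part (render fills one whole frame, a missed vsync adds one refresh period, display latency is the panel's quoted figure), not a rigorous pipeline simulation:

```python
# Back-of-the-envelope worst-case input-to-photon latency, per the
# reasoning above. All inputs are assumptions, not measured values.
def worst_case_latency_ms(fps: float, vsync_hz: float, display_ms: float,
                          vsync_miss: bool = True) -> float:
    frame_ms = 1000.0 / fps                              # render started one frame time before it finished
    miss_ms = 1000.0 / vsync_hz if vsync_miss else 0.0   # worst-case missed vsync adds one refresh period
    return frame_ms + miss_ms + display_ms

# 45 fps game on a 60 Hz display with ~10 ms of display latency:
print(round(worst_case_latency_ms(45, 60, 10), 1))               # 48.9
# Majora's Mask at 20 fps, 8 ms display, assuming no vsync miss:
print(worst_case_latency_ms(20, 60, 8, vsync_miss=False))        # 58.0
```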