It’s not hard to do vulkan layers, but first xlibre needs to support getting colorspace data from the application, triggering HDR on the monitor, and doing the basic transforms.
Some context, because the terms HDR and SDR are dumb:
A color space is gamut/primaries + transfer/gamma + whitepoint. As far as I, and the compositor, care, the most "unique" parts here are the transfer and the gamut. The "HDRness" of a colorspace, as far as I (and most users) am concerned, comes from the transfer, but this isn’t really the relevant distinction, because content is authored in colorspaces.
Most legacy un-colormanaged applications are either bt.1886 (rec.709), which is common on TVs, or sRGB, which is more common on PC displays. These are "SDR" colorspaces. There are others, but for un-colormanaged or "SDR" applications, this is all we really care about. Colormanaged applications can be treated differently (and should be).
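To make the transfer difference concrete, here’s a small sketch comparing the sRGB piecewise decode with BT.1886, assuming the common simplification of BT.1886 as a pure 2.4 power with zero black level:

```python
# Decode (EOTF) for the two common "SDR" transfers.
# BT.1886 is simplified here to a pure 2.4 power (zero black level),
# which is the usual approximation.

def srgb_eotf(v: float) -> float:
    """sRGB piecewise decode: code value [0,1] -> linear light [0,1]."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def bt1886_eotf(v: float) -> float:
    """Simplified BT.1886 decode: pure gamma 2.4."""
    return v ** 2.4

# The same code value produces noticeably different light output:
print(srgb_eotf(0.5))    # ~0.214
print(bt1886_eotf(0.5))  # ~0.189
```

So "sRGB content" and "rec.709/bt.1886 content" with identical pixel values don’t mean the same light on screen, which is why the compositor has to know which one it was handed.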
The common HDR colorspaces are rec.2100 with either the HLG or PQ transfer. HDR metadata can be added on top to modify them, but handling it is not a hard requirement. When it comes to displaying, HLG is probably the "easiest" to do, but PQ is by far the most common.
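As a sketch of what the PQ transfer actually is: it’s an absolute-luminance curve where code value 1.0 means 10,000 nits. The constants below are the ones from SMPTE ST 2084:

```python
# SMPTE ST 2084 (PQ) transfer, as used by rec.2100 PQ.
# Code value 1.0 corresponds to an absolute 10,000 cd/m2 (nits).

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Absolute luminance in nits -> PQ code value [0,1]."""
    y = max(nits, 0.0) / 10000.0
    ym = y ** M1
    return ((C1 + C2 * ym) / (1.0 + C3 * ym)) ** M2

def pq_decode(code: float) -> float:
    """PQ code value [0,1] -> absolute luminance in nits."""
    e = code ** (1.0 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)
    return y * 10000.0

print(pq_encode(100))    # ~0.508: SDR-ish white sits around half the code range
print(pq_encode(10000))  # 1.0
```

The curve spends roughly half its code range below ~100 nits, which is why it holds up so much better than a plain gamma at high peak luminance.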
Games often use the scRGB colorspace, which is a linear transfer with the primaries of sRGB, doing some weird hijinks with out-of-range values to expand it. You need to convert scRGB to rec.2100 with either of the transfers, because no monitors accept it directly (linear is a lot of data :D).
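That conversion is basically a matrix plus an encode. A sketch, assuming the usual scRGB convention that (1,1,1) maps to 80 nits, and using the rounded BT.709 -> BT.2020 primary-conversion matrix from BT.2087:

```python
# Sketch: linear scRGB -> rec.2100 PQ.
# Assumptions: scRGB (1,1,1) = 80 nits (the usual scRGB reference white),
# and the 3x3 matrix is the standard BT.709 -> BT.2020 primary conversion
# (rounded coefficients, per BT.2087).

SCRGB_WHITE_NITS = 80.0

M_709_TO_2020 = (
    (0.6274, 0.3293, 0.0433),
    (0.0691, 0.9195, 0.0114),
    (0.0164, 0.0880, 0.8956),
)

# ST 2084 (PQ) constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = max(nits, 0.0) / 10000.0
    ym = y ** M1
    return ((C1 + C2 * ym) / (1.0 + C3 * ym)) ** M2

def scrgb_to_rec2100_pq(r: float, g: float, b: float):
    """Linear scRGB triple -> rec.2100 PQ code values."""
    # Re-express the linear values in BT.2020 primaries.
    rgb2020 = tuple(
        row[0] * r + row[1] * g + row[2] * b for row in M_709_TO_2020
    )
    # Scale to absolute nits, clamp negatives, then PQ-encode.
    return tuple(pq_encode(max(c, 0.0) * SCRGB_WHITE_NITS) for c in rgb2020)

# scRGB white lands around 80 nits -> ~0.49 PQ per channel; values > 1.0
# are the "hijinks" that carry HDR highlights.
print(scrgb_to_rec2100_pq(1.0, 1.0, 1.0))
```

Note the clamp: scRGB can also go negative to reach outside the sRGB gamut, and a real implementation would gamut-map rather than just clip.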
The current method compositors seem to be going with to implement SDR->HDR is to have a dedicated SDR plane on the GPU, then use color-pipeline operations to transform that into "HDR", presumably using a LUT, but I don’t know what all operations are available to work with here (https://melissawen.github.io/blog/2025/05/19/drm-info-with-kms-color-api).
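If it does go through a LUT, the transfer part of SDR->PQ is just a per-channel 1D table. A sketch, assuming 203 nits for SDR white (the BT.2408 reference, and exactly the knob that should be configurable); the primaries change would be a separate matrix (CTM) stage in a KMS-style pipeline:

```python
# Sketch: build a per-channel 1D LUT taking 8-bit sRGB code values to
# PQ code values at a chosen SDR white level. This only covers the
# transfer; BT.709 -> BT.2020 primaries would be a separate matrix stage.

SDR_WHITE_NITS = 203.0  # BT.2408 reference; should be user-configurable

# ST 2084 (PQ) constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def srgb_eotf(v: float) -> float:
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def pq_encode(nits: float) -> float:
    y = nits / 10000.0
    ym = y ** M1
    return ((C1 + C2 * ym) / (1.0 + C3 * ym)) ** M2

def build_sdr_to_pq_lut(white_nits: float = SDR_WHITE_NITS, size: int = 256):
    """LUT[i] = PQ code value for 8-bit sRGB code value i."""
    return [
        pq_encode(srgb_eotf(i / (size - 1)) * white_nits)
        for i in range(size)
    ]

lut = build_sdr_to_pq_lut()
print(lut[0], lut[255])  # black -> ~0, sRGB white -> ~0.58 PQ
```

The `white_nits` parameter is the whole "to what luminance do you map SDR" question in one number, which is why it wants to be a user setting rather than a constant.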
The actual specifics of the mapping should be user-configurable (or compositor-configurable, not really sure here). There are a lot of more nuanced things to think about, like "to what luminance do you map 'SDR' white in 'HDR'", which really should be user-configurable.
I think modeling the color data on the wayland protocols themselves is fine; they are sufficiently detailed, and staying in line with them will only help applications port to xlibre.