Re: Incorrect Algorithm and mDCv for HLG (From last night's meeting)

Hi Everyone,

In broadcast we commonly differentiate between signaling and metadata.  Signaling covers identification of the color primaries and transfer function, and defines the matrix coefficients for conversion between YCbCr and RGB.  MDCV could be considered metadata (in this case, ST 2086 static metadata).

With PQ, if MDCV drops away, the transfer function is still absolute, so we are still OK, but we may not get optimal compression of highlights if the consumer display has inferior peak brightness.  In this case, because PQ is absolute, only the highlights are affected: the focal range between black and graphic white (roughly 0-400 nits) is probably supported even by cheaper displays.  In live linear, we use 203 nits as reference/graphic white (BT.2408-6).
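
As a quick sketch of what "absolute" means here (my own illustration, using the ST 2084 constants; values rounded):

    import math

    # PQ (ST 2084) constants.
    M1 = 2610 / 16384
    M2 = 2523 / 4096 * 128
    C1 = 3424 / 4096
    C2 = 2413 / 4096 * 32
    C3 = 2392 / 4096 * 32

    def pq_inverse_eotf(nits):
        """Absolute luminance in cd/m^2 -> non-linear PQ signal in [0, 1]."""
        y = nits / 10000.0
        return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

    print(f"{pq_inverse_eotf(203):.3f}")  # ~0.58: graphic white is always ~58% signal
    print(f"{pq_inverse_eotf(400):.3f}")  # ~0.65: top of the focal range

A 203-nit pixel lands at ~58% PQ signal whether or not any metadata survives; only levels above the display's capability need remapping.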

For HLG, because a gamma adaptation is applied to the entire signal range, shadows and midtones are shifted as well (the display brightness alters the variable system gamma; see BT.2100).  That gamma adaptation applies a fixed look which we think alters the appearance and affects the way shaders act on the content.  Consider the Hunt Effect, which changes the subjective appearance based on brightness.  MDCV metadata would not strictly be required, but we think it can help optimize the tone mapping by identifying the mastering display’s variable-gamma adjustment at the point of content creation.  Of course, as Greg mentions, that information might be dropped, in which case the optimal tone mapping won’t occur.  That would be acceptable in our opinion, but not optimal.
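
As a quick sketch of that variable gamma (using the BT.2100 system-gamma formula; the 18% mid-grey probe is just an illustrative choice):

    import math

    def hlg_system_gamma(l_w):
        """BT.2100 variable system gamma for a display of nominal peak l_w nits
        (the formula is specified for roughly 400-2000 nits)."""
        return 1.2 + 0.42 * math.log10(l_w / 1000.0)

    for l_w in (400.0, 1000.0, 2000.0):
        gamma = hlg_system_gamma(l_w)
        y_mid = 0.18 ** gamma  # relative display light of an 18% scene mid-grey
        print(f"L_W={l_w:6.0f} nits  gamma={gamma:.3f}  mid-grey -> {y_mid:.4f}")

The mid-grey output falls from ~0.170 at 400 nits to ~0.103 at 2000 nits, i.e. the shift runs across the whole tonal range, not just the highlights.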

Best,
Chris

From: Coppa, Greg <gmcoppa@cbs.com>
Date: Wednesday, September 13, 2023 at 9:29 AM
To: Simon Thompson - NM <Simon.Thompson2@bbc.co.uk>, Sebastian Wick <sebastian.wick@redhat.com>
Cc: Pekka Paalanen <pekka.paalanen@collabora.com>, public-colorweb@w3.org <public-colorweb@w3.org>
Subject: [EXTERNAL] Re: Incorrect Algorithm and mDCv for HLG (From last night's meeting)
I could not agree more with Simon re:


  *   “In a large, live production, the signal may traverse a number of production centres and encode/decode hops before distribution.  Each of these will need to have a method for conveying this metadata for it to arrive at the distribution point.”

In live sports production we are challenged not only with getting a basic metadata parameter such as VPID (video payload identifier, SMPTE ST 352) through all processes, linear and file-based, but also with ensuring it is correct, which is even more difficult.  And when a device such as a monitor is set to use VPID to adjust its settings dynamically and the VPID is incorrect, the result is incorrectly displayed video.

Best regards

greg

From: Simon Thompson - NM <Simon.Thompson2@bbc.co.uk>
Date: Wednesday, September 13, 2023 at 7:08 AM
To: Sebastian Wick <sebastian.wick@redhat.com>
Cc: Pekka Paalanen <pekka.paalanen@collabora.com>, "public-colorweb@w3.org" <public-colorweb@w3.org>
Subject: Re: Incorrect Algorithm and mDCv for HLG (From last night's meeting)
Resent-From: Gregory Coppa <gmcoppa@cbs.com>, <public-colorweb@w3.org>
Resent-Date: Wednesday, September 13, 2023 at 7:07 AM

Hi Sebastian

>The Static HDR Metadata's mastering display primaries are used to
>improve the quality of the correction in the panels by limiting the
>distance of the mappings they have to perform. I don't see why this
>would not equally benefit HLG.

>Regarding composition: the metadata has to be either recomputed or
>discarded. Depending on the target format, the metadata on the
>elements can be useful to limit the distance of the mappings required
>to get the element to the target format.

I’m not sure that the minimum-distance mapping is necessarily the best; it would depend on the colour space in which you’re performing the process.  In dark colours, the minimum distance may lift the noise floor too.  Chris Lilley has an interesting page looking at the effect in a few different colour spaces; I’ll see if I can find it.
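
As a toy illustration of the dark-colour problem (my own sketch, not Chris Lilley's page; assuming "minimum distance" means Euclidean distance in linear BT.2020 RGB, where per-channel clipping to the unit cube is the closest-point mapping):

    # BT.2020 luminance weights.
    W = (0.2627, 0.6780, 0.0593)

    def closest_point_clip(rgb):
        """Minimum Euclidean-distance mapping onto the [0, 1] RGB cube."""
        return tuple(min(max(c, 0.0), 1.0) for c in rgb)

    def luminance(rgb):
        return sum(w * c for w, c in zip(W, rgb))

    dark = (0.004, 0.001, -0.003)  # near-black, one channel out of gamut
    print(luminance(dark))                      # ~0.00155
    print(luminance(closest_point_clip(dark)))  # ~0.00173: the floor is lifted

Clipping a negative channel can only add light, so near black every out-of-gamut noise excursion is pushed upwards.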
As I said in the previous email, a significant proportion of HDR is currently being created whilst looking at the SDR downmapping on an SDR monitor – the HDR monitor may be used for checking, but not for mastering the image.
When an HDR monitor is used, there are still a number of issues:

  *   The HLG signal is scene-referred and I would expect that there will be colours outside the primaries of this display, as it’s not usual to clip signals in broadcast workflows.  (Clipping leads to ringing in scaling filters and issues with compression; this is why the signal has headroom and footroom for overshoots, as sketched after this list.)  The signal does not describe the colours of pixels on a screen.
  *   In a large, live production, the signal may traverse a number of production centres and encode/decode hops before distribution.  Each of these will need to have a method for conveying this metadata for it to arrive at the distribution point.
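
A minimal sketch of that headroom and footroom (assuming the BT.2100 10-bit narrow-range mapping, where the nominal signal [0, 1] maps to codes 64-940 and codes 4-1019 are valid):

    def quantise_10bit_narrow(e):
        """Map a non-linear signal value E' to a 10-bit narrow-range code."""
        return min(max(round(876 * e + 64), 4), 1019)

    print(quantise_10bit_narrow(0.0))    # 64   (nominal black)
    print(quantise_10bit_narrow(1.0))    # 940  (nominal peak)
    print(quantise_10bit_narrow(-0.05))  # 20   (footroom: overshoot preserved)
    print(quantise_10bit_narrow(1.05))   # 984  (headroom: overshoot preserved)
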
As the transform from the scene-referred HLG signal to a display is standardised, I don’t think additional metadata is needed (a sketch follows below).  If it is included, software rendering to the screen must not expect it to be present.
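
For reference, a minimal sketch of that standardised transform (the BT.2100 HLG inverse OETF followed by the OOTF; constants per BT.2100):

    import math

    # Constants from the BT.2100 HLG OETF.
    A = 0.17883277
    B = 1.0 - 4.0 * A                # 0.28466892
    C = 0.5 - A * math.log(4.0 * A)  # 0.55991073

    def hlg_inverse_oetf(e):
        """Non-linear HLG signal E' in [0, 1] -> normalised scene-linear light."""
        if e <= 0.5:
            return (e * e) / 3.0
        return (math.exp((e - C) / A) + B) / 12.0

    def hlg_ootf(rgb_scene, l_w=1000.0):
        """Scene-linear RGB -> display light (cd/m^2) on a display of peak l_w."""
        gamma = 1.2 + 0.42 * math.log10(l_w / 1000.0)  # variable system gamma
        y_s = (0.2627 * rgb_scene[0] + 0.6780 * rgb_scene[1]
               + 0.0593 * rgb_scene[2])                # scene luminance
        return [l_w * (y_s ** (gamma - 1.0)) * c for c in rgb_scene]

    # EOTF = OOTF(OETF^-1(E')); on a 1000-nit display, a 75% HLG signal
    # renders at ~203 cd/m^2, i.e. BT.2408 graphic white.
    scene = [hlg_inverse_oetf(0.75)] * 3
    print(hlg_ootf(scene, l_w=1000.0))

Everything the display needs is a function of its own peak luminance, which is why no per-stream metadata is strictly required.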
Best Regards
Simon

Received on Wednesday, 13 September 2023 14:54:55 UTC