Re: Incorrect Algorithm and mDCv for HLG (From last night's meeting)

I could not agree more with Simon re:


  *   “In a large, live production, the signal may traverse a number of production centres and encode/decode hops before distribution.  Each of these will need to have a method for conveying this metadata for it to arrive at the distribution point.”
In live sports production we are challenged not only with getting a basic metadata parameter, e.g. the VPID (SMPTE ST 352 Video Payload Identifier), through all processes (linear and file-based); ensuring it is correct is even more difficult. And when a device such as a monitor is set to use the VPID to adjust its settings dynamically and the VPID is incorrect, the result is incorrectly displayed video.

Best regards

greg

From: Simon Thompson - NM <Simon.Thompson2@bbc.co.uk>
Date: Wednesday, September 13, 2023 at 7:08 AM
To: Sebastian Wick <sebastian.wick@redhat.com>
Cc: Pekka Paalanen <pekka.paalanen@collabora.com>, "public-colorweb@w3.org" <public-colorweb@w3.org>
Subject: Re: Incorrect Algorithm and mDCv for HLG (From last night's meeting)
Resent-From: Gregory Coppa <gmcoppa@cbs.com>, <public-colorweb@w3.org>
Resent-Date: Wednesday, September 13, 2023 at 7:07 AM

Hi Sebastian

>The Static HDR Metadata's mastering display primaries are used to
>improve the quality of the correction in the panels by limiting the
>distance of the mappings they have to perform. I don't see why this
>would not equally benefit HLG.

>Regarding composition: the metadata has to be either recomputed or
>discarded. Depending on the target format, the metadata on the
>elements can be useful to limit the distance of the mappings required
>to get the element to the target format.

I’m not sure that the minimum-distance mapping is necessarily the best; it depends on the colour space in which you perform the process. In dark colours, the minimum-distance mapping may lift the noise floor too. Chris Lilley has an interesting page looking at the effect in a few different colour spaces; I’ll see if I can find it.
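As a small illustration of why the mapping strategy and colour space matter for dark colours (this sketch is mine, not from the thread; the two strategies and the Rec. 709 luma weights are illustrative choices): per-channel clipping, the nearest in-gamut point under Euclidean distance in linear RGB, raises the luminance of a dark out-of-gamut colour, whereas desaturating toward grey preserves it.

```python
# Hedged sketch: compare two simple gamut-mapping strategies in linear RGB.
# "Minimum distance" here means per-channel clipping to [0, 1], the nearest
# point in the RGB cube; "desaturate" blends toward a grey of equal luminance.

def luminance(rgb):
    # Rec. 709 luma weights (illustrative choice of analysis space).
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def clip_min_distance(rgb):
    # Nearest in-gamut point under Euclidean distance in linear RGB.
    return tuple(min(max(c, 0.0), 1.0) for c in rgb)

def desaturate_to_gamut(rgb):
    # Blend toward a grey of equal luminance until all channels fit [0, 1];
    # bisection on the blend factor t (t = 0 is grey, t = 1 is the input).
    y = luminance(rgb)
    lo, hi = 0.0, 1.0
    for _ in range(50):
        t = (lo + hi) / 2
        cand = tuple(y + t * (c - y) for c in rgb)
        if all(0.0 <= c <= 1.0 for c in cand):
            lo = t
        else:
            hi = t
    return tuple(y + lo * (c - y) for c in rgb)

# A dark, saturated colour just outside the cube (negative blue channel):
dark = (0.02, 0.01, -0.01)
print(luminance(clip_min_distance(dark)))    # clipping raises the luminance
print(luminance(desaturate_to_gamut(dark)))  # desaturation preserves it
```

For a dark colour near the noise floor, the clipped result is brighter than the original, which is one way the minimum-distance approach can lift dark noise.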
As I said in the previous email, a significant proportion of HDR content is currently created while looking at the SDR down-mapping on an SDR monitor; the HDR monitor may be used for checking, but not for mastering the image.
When an HDR monitor is used, there are still a number of issues:

  *   The HLG signal is scene-referred, and I would expect there to be colours outside the primaries of this display, as it is not usual to clip signals in broadcast workflows. (Clipping leads to ringing in scaling filters and to issues with compression; this is why the signal has headroom and footroom for overshoots.)  The signal does not describe the colours of pixels on a screen.
  *   In a large, live production, the signal may traverse a number of production centres and encode/decode hops before distribution.  Each of these will need to have a method for conveying this metadata for it to arrive at the distribution point.
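The headroom/footroom point can be made concrete with a sketch of 10-bit narrow-range quantisation (the nominal and reserved code values follow BT.2100/BT.709 conventions; the helper function is mine):

```python
# Hedged sketch: 10-bit "narrow range" quantisation, which leaves footroom
# below black (codes 4..63) and headroom above nominal peak (codes 941..1019)
# so that filter ringing and other overshoots need not be clipped.

BLACK, PEAK = 64, 940          # nominal black / nominal peak code values
MIN_CODE, MAX_CODE = 4, 1019   # codes 0-3 and 1020-1023 are reserved

def quantise_narrow_10bit(e):
    """Map a nominal [0, 1] signal (overshoots allowed) to a 10-bit code."""
    code = round(BLACK + e * (PEAK - BLACK))
    # Only the reserved codes are forbidden; overshoots survive quantisation.
    return min(max(code, MIN_CODE), MAX_CODE)

print(quantise_narrow_10bit(0.0))   # 64  (nominal black)
print(quantise_narrow_10bit(1.0))   # 940 (nominal peak)
print(quantise_narrow_10bit(1.05))  # 984 (an overshoot, preserved not clipped)
```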

As the transform from a scene-referred HLG signal to a display is standardised, I don’t think additional metadata is needed.  If it is included, software rendering to the screen must not expect it to be present.
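For reference, the standardised path is the BT.2100 HLG inverse OETF followed by the OOTF, whose system gamma depends only on the display’s nominal peak luminance, not on per-content metadata. A minimal greyscale sketch (the function names are mine; constants are BT.2100’s):

```python
import math

# Hedged sketch of the BT.2100 HLG path from scene-referred signal to display
# light: inverse OETF, then an OOTF whose system gamma depends only on the
# display's nominal peak luminance Lw. No per-content metadata is required.

A, B, C = 0.17883277, 0.28466892, 0.55991073  # BT.2100 HLG constants

def hlg_inverse_oetf(e):
    """Non-linear signal E' in [0, 1] -> normalised scene linear light."""
    if e <= 0.5:
        return (e * e) / 3.0
    return (math.exp((e - C) / A) + B) / 12.0

def hlg_ootf_gamma(lw):
    """System gamma for a display with nominal peak luminance Lw (cd/m^2)."""
    return 1.2 + 0.42 * math.log10(lw / 1000.0)

def hlg_display_luminance(e_y, lw):
    """Display luminance (cd/m^2) for a luma signal, greyscale case."""
    ys = hlg_inverse_oetf(e_y)          # normalised scene luminance
    return lw * (ys ** hlg_ootf_gamma(lw))

print(hlg_ootf_gamma(1000.0))               # 1.2 on a 1000 cd/m^2 display
print(hlg_display_luminance(1.0, 1000.0))   # peak signal -> ~1000 cd/m^2
```

Because the gamma is a function of the display alone, every conforming renderer reaches the same result with no signalled metadata, which is the basis of the argument above.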
Best Regards
Simon

Received on Wednesday, 13 September 2023 13:27:26 UTC