RE: Help with HLG profile

Simon, Tim (and Andrew) have nicely presented their view of video, HLG, and HDR in many venues, and have done so on this reflector. Dolby, and some others across the industry, see HDR and television in a different "light". While not intending to stimulate an extended debate of every point, I would like to address a few of the points made so that a full diversity of views is represented here.
On scene vs display referencing:
Simon: "we designed HLG precisely as relative scene luminance (just like Rec.709)."
Lars Borg: "Presumably the camera operator optimized the camera rendering for a pleasing appearance on the reference display, so 'scene' is a misnomer, as it's always been also with Rec. 709."
Craig R.: "is it reasonable to describe HLG as encoding relative scene luminance?"

Let's start from how television has evolved. The camera signal is put into a non-linear format so that 8-10 bits can be used. The non-linearity was not explicitly designed but came from CRT characteristics. Professional CRTs had settled on a luminance of about 100 nits. BT.709 specified an encoding function (OETF) that, when concatenated with the CRT decoding function, produced a nice result. With the demise of the CRT, the decoding function needed to be made explicit so it could be implemented consistently across new display technologies. The ITU-R first created BT.1886 to specify the relative display curve, and then BT.2035 to tie it to an absolute luminance (100 nits, for a reference display in a reference environment). For HDR, the legacy gamma curve was not satisfactory for representing a much wider range, so PQ was designed and then standardized, first by SMPTE in ST 2084, then by ITU-R in BT.2100. BT.2100 is a more complete spec than BT.709 as it includes specs for the reference OETF, EOTF, and OOTF, as well as the viewing environment and some parameters of the reference display. Consumer displays, non-reference displays, and non-reference viewing environments are not addressed in BT.2100 (just as they were not addressed for the HDTV family of BT.709, BT.1886, and BT.2035). Other than being a more elegantly laid out and more complete specification, we (Dolby) don't see much if any difference between legacy HDTV and PQ HDR (other than, of course, dynamic range). In both HDTV and PQ HDR, cameras shoot scenes, the camera shaders watch on reference monitors in a (typically) dim environment and adjust the cameras for the desired pictures; then the signal is distributed to consumers. The signals for HDTV and PQ HDR are, in our view of things, display referenced.
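As a concrete illustration of the decoding function mentioned above, here is a minimal sketch of the BT.1886 reference EOTF (the explicit replacement for the CRT decoding curve), defaulting to the BT.2035 reference peak of 100 nits; the parameter names are mine:

```python
def bt1886_eotf(v, lw=100.0, lb=0.0, gamma=2.4):
    """BT.1886 reference EOTF: non-linear signal V in [0,1] -> screen luminance (nits).

    lw/lb are the display white and black luminances; with lw=100 and lb=0
    this reduces to L = 100 * V**2.4, i.e. the legacy gamma curve tied to
    an absolute 100 nit peak.
    """
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma                # gain
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))    # black lift
    return a * max(v + b, 0.0) ** gamma
```

With a non-zero black level the curve lifts the shadows, which is why BT.1886 parameterizes the display rather than fixing a single power law.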
We see the HDTV gamma signal as display referenced; originally by the nominal 100 nit CRT brightness, and currently by the flat panel display set to 100 nits per BT.2035 (thus adding units of luminance to the otherwise dimensionless gamma signal). What happens with consumer displays in homes is out of our control, but of course we hope they will show the correct picture when the room is dim, and an appropriately brighter picture to compensate for brighter rooms. The place where the term "scene referred" does make sense is for matching cameras, and BBC is of course correct that it is proper to get back to scene light in that case.
On live television and streaming vs film:
Craig R.: "From the perspective of someone only peripherally involved in the motion picture or broadcast television industry, it seems that HLG provides a good solution for live broadcasts whereas PQ provides a better model for motion picture production. Is that a reasonable assumption?" "It is not so clear how the PQ model can be used for live streaming of content and it seems that this aspect was the primary motivation for HLG. How is this aspect handled in a PQ world?" "On the point about 'grading', in talking to people from the BBC and NHK, I understand that the main reason to develop HLG was that for many use cases there is no opportunity for grading, for example for live broadcasts. In practice, I would have thought that PQ would be used for grading of film production and would be converted to HLG at the time of broadcast."
Simon: "... some time ago, the BBC prepared the Blue Planet series as HLG."
Tim: "I am unaware of live sporting events broadcast using PQ."
In order to get HDR technology to consumers quickly, film content with OTT delivery was the first target. Broadcast systems take a long time to penetrate and generally require standards. Films have high-quality assets and many could be regraded for HDR. The film community has very high standards, so this provided the most critical review of the technology we had developed. There is nothing about PQ that limits it to post-produced films or makes it inappropriate for live television. There is an advantage to HLG in early live productions where the monitors are legacy SDR. That is the "degree of compatibility" (ITU-R language) with SDR monitors. PQ requires a PQ monitor, or a 3D LUT in front of an SDR monitor, to look acceptable. In the future all monitors will be HDR/BT.2100, so this is a short-term advantage for live production in HLG. The (very beautiful) BBC productions of "Blue Planet II" and "Earth: One Amazing Day" that were issued on Blu-ray are, like all HDR Blu-ray discs, in PQ. (Also, the Earth: One Amazing Day disc is provided in Dolby Vision.)
(Sorry but a minor plug for Dolby here.)
Reviewer on the Dolby Vision versions: "The main draw for me is that this is presented on 4K UHD in Dolby Vision AND Dolby Atmos. If you have the equipment to see and hear films in these formats, then prepare to be blown away. Dolby Vision really improves upon HDR and HDR 10. Don't get me wrong - PE2 is breathtaking in HDR, but this goes a step beyond."
Re PQ in live broadcast:
Craig R.: "It is not so clear how the PQ model can be used for live streaming of content and it seems that this aspect was the primary motivation for HLG. How is this aspect handled in a PQ world?"
There have been a number of broadcast trials with PQ. The FIFA World Cup had a 4K HDR production supplied to broadcasters worldwide. The camera signals were in Sony's proprietary S-Log3 format and were then converted to both HLG and PQ. In the U.S. the PQ version was made available by FOX Sports to devices with their OTT client, and by Comcast on their VOD service. Some live sport events in PQ:

*         World Cup https://www.avsforum.com/how-watch-2018-world-cup-4kuhd-hdr/ https://www.sportsvideo.org/2018/07/02/fubotv-becomes-first-vmvpd-to-launch-4k-hdr10-with-2018-fifa-world-cup-matches/

*         French Open 2017 https://ultra-k.fr/roland-garros-2017-ultra-hd-hdr-dolby-vision-dolby-atmos-ac-4/

*         UEFA Champions League final 2017 https://www.svgeurope.org/blog/headlines/sporttech-2017-real-and-virtual-worlds-of-football-with-uefa-and-bt-sport/

*         BT Sport football HDR trial to mobiles: http://home.bt.com/tech-gadgets/tech-news/bt-sport-high-definition-hdr-live-broadcast-champions-league-11364256662397

PQ is used for live streaming, and is of course the predominant format in use for non-live streaming, e.g.:

*         Netflix

*         Amazon

*         Vudu

*         iTunes
Multiple broadcast standards, namely ATSC and DVB, specify the use of both HLG and PQ.
Why some prefer PQ vs HLG:
PQ has a wider dynamic range than HLG. I characterize the PQ signal dynamic range as 0.005 nits (lower threshold of visibility in the reference environment) to 10k nits, or 2M:1. While 10k nit monitors are not yet available (though Sony showed one at CES), there are consumer monitors in the 1k-2k nit range, thus offering dynamic range of 200k-400k:1. PQ has more code values allocated to the deep black range making it genuinely possible to have detail in the dark areas. On the top end, considering the production guidelines in BT.2408 (diffuse white at 200 nits), HLG can represent highlights 5x brighter (presuming 1k nit display) while PQ can represent highlights 50x brighter (but of course displayed highlight luminance depends on actual display capability).
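For reference, the 0.005 to 10,000 nit range discussed above comes straight out of the PQ EOTF. A minimal sketch of the ST 2084 / BT.2100 PQ EOTF (constants as published in those specs):

```python
def pq_eotf(e):
    """ST 2084 / BT.2100 PQ EOTF: non-linear signal E' in [0,1] -> absolute luminance (nits)."""
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875
    ep = e ** (1 / m2)
    return 10000.0 * (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)
```

The curve is very steep at the bottom: the lowest tenth of the signal range stays below about a third of a nit, which illustrates the generous code allocation to deep blacks mentioned above.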
With PQ, the creator of the image can be confident it will look the same in another reference viewing situation. Whites and mid-tones will always be reproduced at the same luminance on a reference monitor. As the HLG monitor luminance is not specified (a "relative system"), the HLG "look" is dependent on the monitor luminance, which directly affects the luminance of mid-tones and whites (a visualization of this is shown at https://www.dolby.com/us/en/technologies/dolby-vision/image-level-shifts-with-hlg.pdf). Some favor PQ over HLG for this reason. Blu-ray only uses PQ; the UHDA specifies only PQ as a mastering format (but does recognize HLG usage for broadcast). Hollywood was very influential in setting those specifications.
Best regards,

Craig Todd
Sr. VP and Dolby Fellow
Dolby Laboratories, Inc.
1275 Market St.
San Francisco, CA 94103, USA
T  415-558-0221  M 415 672-0221
www.dolby.com | ct@dolby.com

From: Tim Borer [mailto:tim.borer@bbc.co.uk]
Sent: Tuesday, June 19, 2018 9:44 AM
To: Lars Borg <borg@adobe.com>; Craig Revie <Craig.Revie@FFEI.co.uk>; Phil Green <green@colourspace.demon.co.uk>
Cc: Leonard Rosenthol <lrosenth@adobe.com>; Max Derhak <Max.Derhak@onyxgfx.com>; Simon Thompson-NM <Simon.Thompson2@bbc.co.uk>; public-colorweb@w3.org
Subject: Re: Help with HLG profile


Hi Lars,



The reason I discussed the consumer display was to show that there was little sense in archiving broadcast content in a display referred format when there was no single reference monitor on which it should be displayed.
HLG is defined in ITU-R Recommendation BT.2100. Nowhere in that specification is there a defined luminance for a reference display. The reference display for HLG may, in principle, have any peak luminance. BT.2100 does mention a 1000 nit display for HLG, but that is by no means a defined reference peak luminance. The recommendation makes it clear that the display may have a wide range of peak luminances. What BT.2100 does define is a reference viewing condition, which is basically a dim environment with a surround luminance (i.e. the backdrop for the display) of 5 nits.
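BT.2100 accommodates this range of reference display peak luminances by varying the system gamma of the HLG reference OOTF with the display's nominal peak luminance. A sketch of that relationship, using the extended-model formula noted in BT.2100 (a simplification; it is stated there for a limited range of peak luminances around the nominal 1000 nit display):

```python
import math

def hlg_system_gamma(lw):
    """HLG reference OOTF system gamma for a display of nominal peak luminance lw (nits).

    gamma = 1.2 at the nominal 1000 nit display, and rises or falls
    logarithmically as the peak luminance changes.
    """
    return 1.2 + 0.42 * math.log10(lw / 1000.0)
```

So a 2000 nit display uses a gamma of roughly 1.33 rather than 1.2, which is how the same relative signal is adapted to displays of different brightness.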
Of course Planet Earth II was graded. It was graded on a Dolby PRM-4220 monitor (I think that was the model number). The Dolby monitor used a custom (internal) LUT so that it functioned as an HLG display. The peak luminance was 600 nits. Actually that seemed a bit bright (according to the colourist). As you well know, many people now use a 1000 nit Sony BVM-X300 for HDR grading (and, of course, other grading monitors are available). In general programmes may be graded (or shaded, for live) on a variety of monitors. Parts of the programme may well be graded/shaded on different monitors (at different peak luminances). That is a difference from movie production. Consequently there is no one single reference monitor to which the programme is naturally display referred. Since television programmes do not have a specific reference monitor (which is also true, though to a lesser extent, with SDR), it makes good sense to use a scene referred signal, and to archive in that scene referred format. This is what has always been done for broadcast TV.



Best regards,
Tim

On 19/06/2018 16:06, Lars Borg wrote:
The consumer display is of no interest here.
Compare with a printed catalog: the D50 light booth is the reference viewing condition for the catalog. The catalog print is carefully verified in a D50 light booth, but is actually used in completely different viewing environments, just as happens for TV viewing.

Was Planet Earth II HLG produced without color grading?

Thanks,
Lars Borg  |  Principal Scientist  |  Adobe  |  p. 408.536.2723  |  c. 408.391.9479  |  borg@adobe.com


On 6/18/18, 11:43 AM, "Tim Borer" <tim.borer@bbc.co.uk> wrote:

Hi Craig and all,
What Lars says sounds very logical, but I think it is incorrect both in general and for HLG.
The key to what he says is broadcast material, by which I assume he means finished programmes. The key part is "broadcast". To understand this it is useful to reprise the difference between movies and television, which has been greatly accentuated by the development of HDR.
With movies (not my specialist field) the viewer is always in a standardised movie theatre. A conventional movie theatre is specified to have a peak luminance of 48 cd/m^2. Since movie theatres are standardised it does make sense both to grade and to archive a display referred signal, which is what is done.
TV, on the other hand, does not control the display or the viewing environment. Modern CRT displays, before their replacement by flat panels, had a peak brightness of about 200 cd/m^2 (hereafter "nits" to simplify my typing). Modern SDR flat panels went up to as high as 600 nits. Now, with the advent of HDR, consumer HDR TVs can be as bright as 1600 nits and increasing. Mobile phones may be a few hundred nits and perhaps as much as 600 nits. At the same time the viewing environment can vary greatly, from a dim living room (on a "movie night"), to a bright daytime living room for live sport (think Wimbledon Tennis in the UK), to a train in the dark or a bus in bright sunlight for mobiles. In short, a broadcast programme, in contrast to a movie, does not know how brightly it will be shown, nor the viewing environment. But with TV the display does "know" its own brightness and either "knows" or can estimate the viewing conditions (as I said previously this is a second order effect, and so precise knowledge is not required). It therefore makes sense for the programme to be stored in a scene referred format and for the display to render to its capabilities and the viewing environment.
With TV, unlike movies, content may be graded under a variety of conditions on displays with different luminances. Obviously the ideal is a controlled grading suite, but this is not always available particularly for live and for news. For example news is quite often shaded using (carefully selected) consumer monitors in a relatively bright environment.
So when archiving movie content it makes sense that it should be a display referred signal.
With broadcast material, archiving a scene referred signal makes more sense. Indeed this is what has been done throughout the history of TV, where the recording of the scene referred signal (e.g. Rec.709) has been archived. HLG simply continues conventional broadcast practice.
Best regards,
Tim
________________________________
From: Lars Borg [borg@adobe.com]
Sent: 18 June 2018 15:04
To: Craig Revie; Tim Borer; Phil Green
Cc: Leonard Rosenthol; Max Derhak; Simon Thompson-NM; public-colorweb@w3.org
Subject: Re: Help with HLG profile
Hi Craig,

If you're archiving already broadcast material, then a display referred profile seems the most appropriate, as it tells you what the approver saw.
If you're archiving original footage that has not been broadcast, nor approved on a display, then a camera profile would be appropriate for future processing.
These would be general recommendations, not specific to HLG.
Is HLG an exception to these general rules?

Thanks,
Lars Borg  |  Principal Scientist  |  Adobe  |  p. 408.536.2723  |  c. 408.391.9479  |  borg@adobe.com


On 6/18/18, 5:51 AM, "Craig Revie" <Craig.Revie@FFEI.co.uk> wrote:

Hi Tim,

Thanks for this detailed response which I hope will be helpful in the PQ/HLG discussion. From the perspective of someone only peripherally involved in the motion picture or broadcast television industry, it seems that HLG provides a good solution for live broadcasts whereas PQ provides a better model for motion picture production. Is that a reasonable assumption? How do you expect that HDR BBC and NHK dramas will be produced?

I also see a statement on the World Cup 2018 link: "The production workflow for the World Cup is not native HLG as Host Broadcaster Services have defined their own HDR workflow." Can you provide more details about this?

@Lars, given Tim's description, it would seem that a 'scene-referred' profile would make more sense than a display referred profile if the intent is to allow archival of images encoded as HLG.

Best regards,
_Craig

From: Tim Borer <tim.borer@bbc.co.uk>
Sent: 18 June 2018 10:17
To: Lars Borg <borg@adobe.com>; Craig Revie <Craig.Revie@FFEI.co.uk>; Phil Green <green@colourspace.demon.co.uk>
Cc: Leonard Rosenthol <lrosenth@adobe.com>; Max Derhak <Max.Derhak@onyxgfx.com>; Simon Thompson-NM <Simon.Thompson2@bbc.co.uk>; public-colorweb@w3.org
Subject: Re: Help with HLG profile

P.S. For more information see:
World Cup 2018 in UHD HDR on BBC iPlayer (https://www.bbc.co.uk/rd/blog/2018-05-uhd_hdr_world_cup_2018).
The Royal Wedding in High Dynamic Range (https://www.bbc.co.uk/rd/blog/2018-05-ultra-high-definition-dynamic-range-royal-wedding-uhd-hdr)
Tim
On 18/06/2018 09:46, Tim Borer wrote:

There are one or two inaccuracies in the thread, so I thought it worthwhile to introduce a few facts to correct these. Hope this is useful.

It IS reasonable to describe HLG as relative scene luminance. There has been a lot of debate by proponents of PQ (which is absolute/non-relative display referred encoding) that this is not really so. Nevertheless we designed HLG precisely as relative scene luminance (just like Rec.709).

I have often heard it said that there is no standard Rec.709 production (because camera operators, it is claimed, universally adjust their cameras). The claim is that by tweaking the camera the picture somehow becomes display referred. Even if it were true that cameras are always adjusted (not so) this would not make the signal display referred. If you doubt this simply look at the dimensions of the signal. HLG is dimensionless (a relative signal) and PQ has dimensions of candelas per square metre (nits). All that adjusting a 709/HLG camera does is to produce an "artistic" modification to the signal. The signal still represents relative scene referred light, just not the actual scene (but, rather, one that the producer wished had existed). Adjusting the camera does not convert a dimensionless signal into a dimensional one.
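Tim's dimensional argument can be made concrete with the HLG OETF itself, which maps normalised (relative, dimensionless) scene linear light to a dimensionless signal. A sketch using the constants published in BT.2100:

```python
import math

# BT.2100 HLG OETF constants
A = 0.17883277
B = 1 - 4 * A                  # ~0.28466892
C = 0.5 - A * math.log(4 * A)  # ~0.55991073

def hlg_oetf(e):
    """HLG OETF: normalised scene linear light E in [0,1] -> non-linear signal E' in [0,1].

    Square-root (gamma-like) below 1/12, logarithmic above. No luminance
    units appear anywhere: the signal is relative scene light.
    """
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C
```

Camera adjustments change which relative scene light values are fed into this function, but nothing in the signal path attaches candelas per square metre to it.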

In fact a great deal of live television is produced using a standard Rec 709 OETF. This includes almost all live sport (especially soccer), live light entertainment, and news. This encompasses a large part of broadcast television output. In sport it is often a contractual obligation to use the Rec 709 OETF. In other instances the producers often do not like knees because, as typically implemented, they can distort flesh tones. A further consideration is that in a multicamera shoot it is essential to match all the cameras. This is difficult if they don't use a standard Rec 709 setting (often cameras may have different firmware versions, which means that setting up a camera does not necessarily mean the same thing even on the same model of camera). It is not the case that "the camera's linearity response is tweaked by the operator for various reasons". This is not really viable in a multicamera live shoot, and the shaders don't have time to do it live. Similarly for live production the gamut is standard Rec 709 (if necessary clipped from the wider camera taking gamut). This is necessary to ensure consistent colours for sporting strips both between cameras and at different venues and different games. It is not unusual for sporting strips to be outside 709, and it is important that such colours are treated in a consistent way, so gamut mapping, other than simple clipping, is not viable (bear in mind that footage from different games may often be shown as part of the commentary). So, to emphasise, the fact is that a great deal of television IS produced using the standard Rec 709 OETF with the standard 709 gamut (without gamut mapping).

Scene referred conversions need to be used appropriately. When used appropriately they do not create significant colour shifts. Scene referred conversion should be used when matching camera outputs (as opposed to matching the picture that is seen on the display - they are not the same). Display referred conversions are used to ensure that the displayed image is the same. There are different, distinct, use cases for these two types of conversion. It is a mistake to assume that all conversions should be display referred. As an example consider the recent coverage of the Royal Wedding. This was shot using a mixture of HLG HDR cameras and HD cameras (using Rec 709). The production architecture was that shown in ITU-R Report BT.2408-1 2018 (note the -1 version) (freely available at https://www.itu.int/pub/R-REP-BT.2408-1-2018), figure 4, page 14. In this workflow you will note that there are many scene referred Rec 709 to HLG conversions. This is so that pictures can be shaded using standard Rec 709 monitors (which is a requirement when the majority of viewers are watching in 709). Note, in particular, that the final Rec 709 SDR output is converted from the HLG signal using a scene referred conversion. We estimate that the Royal Wedding was viewed by 1.8 billion viewers. The international feed was derived from the HLG signal as described, using a scene referred conversion.
The colour was NOT distorted by this conversion (and, clearly, the producers would not have allowed distortion for such a prestigious broadcast). On the other hand, if one were producing primarily for an HDR audience then you would use the alternative architecture in BT.2408 (fig 3, page 12). Here shading is performed primarily on the HDR monitor (with an SDR monitor fed by a DISPLAY referred conversion, so that the shader can check that nothing untoward is happening on the SDR signal). Note that in a joint HDR SDR production you can give primacy either to shading in HDR (for a majority HDR audience) or to shading in SDR (for a mainly SDR audience - the current situation). But you cannot prioritise both. We have found that shading in SDR gives very good quality HDR as well as SDR. NHK have in mind producing for a primarily HDR audience and, therefore, they favour the production workflow of figure 3. To summarise, scene referred conversions do not produce significant colour shifts when they are used properly. Indeed without using scene referred conversions it is not possible to match the look of HDR and SDR output when shading in SDR.

There are few HDR broadcasts in Europe yet, though a number are in the pipeline. To the best of my knowledge all these broadcasts will be using HLG. Similarly, broadcasters that produce live content (such as sport) in the US also favour HLG. The BBC has made our Planet Earth II series available in UHD HDR HLG on catch-up (OTT) television, and we are showing some of the Soccer World Cup matches in HLG HDR OTT. Most OTT movie distribution currently uses PQ (HDR10). This is possible because of the non-live workflow. I am unaware of live sporting events broadcast using PQ. YouTube distribute in both HLG and HDR10. Since production is easier in HLG (particularly with prosumer equipment) most of the user-generated HDR content on YouTube is HLG.

Best regards,
Tim

Dr Tim Borer MA, MSc, PhD, CEng, MIET, SMIEEE, Fellow SMPTE
Lead Engineer
Immersive & Interactive Content
BBC Research & Development
BBC Centre House, 56 Wood Lane, London W12 7SB
T: +44 (0)30304 09611  M: +44 (0)7745 108652
On 12/06/2018 05:20, Lars Borg wrote:
Hello Craig,

See below.


On 6/9/18, 9:12 PM, "Craig Revie" <Craig.Revie@FFEI.co.uk> wrote:

Hi Lars,

Thanks for your reply.

First of all on terminology, is it reasonable to describe HLG as encoding relative scene luminance?

Not really. I wouldn't.
It's an approximation of scene colorimetry just like 709 or 2020.
How accurate is it in real production?
And what accuracy do you need? Why?
I haven't seen any published tests.
But here's what we know from 709 production.
I've yet to find anyone that sets their camera in reference 709 mode in real production.
Typically 709 productions do not use the standard OETF found in the 709 spec.
The camera's linearity response is tweaked by the operator for various reasons.
Extra suppression of the darks as a means to reduce noise.
And the typical knee for highlight rolloff. (Which HLG now includes in a standard way)
Maybe there are options for gamut mapping (which is non-linear)
We usually don't call these tweaks color grading, but rather camera setup.
Either tweak changes the contrast relations from true scene colorimetry.
These tweaks are not indicated in metadata (as HLG has no such, and other formats seem to be proprietary or incomplete), so you have no knowledge on how to undo them in post.

I am not very familiar with the technical details of the cameras used for broadcast television but my assumption was that the controls allow white balance, aperture control and selection of filters. I also assume that the sensor is linear

So far I agree with you.

and that there is minimal processing of the signal. If so, the assumption of relative scene luminance (as described in a number of papers) would seem to be at least a reasonable approximation. Are these assumptions incorrect?

I doubt it. As I noted above, there are many in-camera processing options. So the signal might be non-linear.

I think that for the uses outlined by Simon on this thread, it would be helpful to have a V4 ICC profile even if there are some limitations (which there may well be). Is this something you could provide? It seems to me that the more useful of the two profiles would be the 'HLG scene profile' although depending on the result of our discussion about terminology this may need a different description.

I have yet to find a use case for an HLG scene profile.
Please explain how you would use it, and what workflow.

Primarily I see post production uses for an HLG reference display profile.
With this profile I can mix display-referred content across HLG, PQ, 709 media.
For example, the colors in a commercial are display-referred and the repurposing to another media should preserve those colors.  Scene-referred conversions wouldn't cut it, so I would not use an HLG scene profile for this.

On the point about 'grading', in talking to people from the BBC and NHK, I understand that the main reason to develop HLG was that for many use cases there is no opportunity for grading, for example for live broadcasts.

Grading, no.
But camera matching, yes, also for live broadcasts.
It is the rare event where all cameras are of the same make, model, and revision, including lens and lights.
In practice, cameras at major sports events include both HDR and 709 cameras, different sensors, multiple brands, different light setups, different codecs.
Same sports jersey => very different colors.
So some effort is often spent to make cameras produce similar colors on output.
Same sports jersey => similar colors.
Most likely that matching process undermines true scene colorimetry.
BT.2087 shows conversions from 709 to 2020.
My study shows that doing this 'scene'-referred (case 2) creates significant color shifts.

In practice, I would have thought that PQ would be used for grading of film production and would be converted to HLG at the time of broadcast.

Are there movie HLG broadcasts in Europe yet?
PQ for movie grading, yes.
Although Adobe Premiere is agnostic and lets you grade (display-referred) HLG content as well.
But it seems movie distribution is mostly PQ (HDR10), not HLG. Think Netflix, etc.
I like this diagram from Yoeri Geutskens <https://www.linkedin.com/in/yoerigeutskens/>
More names in the PQ circle than in the HLG circle.

Lars

Best regards,
_Craig
________________________________
From: Lars Borg <borg@adobe.com>
Sent: 10 June 2018 04:53:58
To: Craig Revie; Phil Green
Cc: Leonard Rosenthol; Max Derhak; Simon Thompson-NM; public-colorweb@w3.org
Subject: Re: Help with HLG profile

Hi Craig,

A very sensible question.
Rec. 2100 gives you two reference decodings for HLG media: scene referred and display referred @ 1000 nits.
Scene versus display is not a metadata item, but rather a choice by the reader.
Presumably the camera operator optimized the camera rendering for a pleasing appearance on the reference display, so 'scene' is a misnomer, as it's always been also with Rec. 709.
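The two reference decodings of Rec. 2100 can be sketched as follows (luminance-only, and assuming the nominal 1000 nit reference display with system gamma 1.2; the real display-referred OOTF applies the gamma to luminance and scales the RGB components, so this is a simplification):

```python
import math

# BT.2100 HLG constants
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_inverse_oetf(ep):
    """Scene-referred decode: signal E' in [0,1] -> normalised relative scene linear light."""
    if ep <= 0.5:
        return ep * ep / 3.0
    return (math.exp((ep - C) / A) + B) / 12.0

def hlg_display_luminance(ep, lw=1000.0, gamma=1.2):
    """Display-referred decode (luminance-only sketch): signal -> nits on an lw-nit display."""
    ys = hlg_inverse_oetf(ep)
    return lw * ys ** gamma
```

The scene-referred result is dimensionless; nits only appear once a display peak luminance is chosen, which is exactly the scene-versus-display choice being discussed.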

At the consumer end, each HDR TV set (PQ or HLG) is expected to apply some form of re-rendering based on its assumed luminance (assumed as it is not actually measured or calibrated). That would be your metadata source. However, we should not expect that they implement anything like what's indicated in Rec. 2100. Power limits, motion enhancements, user preferences, etc. are out of scope for Rec.2100. This makes it rather meaningless to try to model anything other than the reference display.
In a separate expert forum, a majority stated that even the best consumer HDR TV sets are so bad, they should never be used for color grading.
So this begs the question, who would need a parameterized HLG decoder?

BTW, the HLG scene profile is trivially implementable in V4. The reference display profile is a little bit more complex in V4, but doable.

Thanks,
Lars Borg  |  Principal Scientist  |  Adobe  |  p. 408.536.2723  |  c. 408.391.9479  |  borg@adobe.com


On 6/8/18, 9:46 PM, "Craig Revie" <Craig.Revie@FFEI.co.uk> wrote:

Hi Max and Phil,
I hesitate to ask, as this may show my ignorance, but I had thought that HLG encodes relative scene luminance and does not carry any metadata. In my understanding, the choice of reference white is made by the camera/cameraman shooting the scene, and the camera signal is encoded as HLG, which makes this model ideal for broadcast television.
You seem to anticipate having max and min luminance values - where do these come from?
What does the profile represent and how do you anticipate it being used?
As I say, this may just show my ignorance...
_Craig

On 8 Jun 2018, at 23:18, Phil Green <green@colourspace.demon.co.uk> wrote:

In my understanding yes, but you would not be able to pass in the max and min luminance so would need a different profile for each condition supported.
Phil

On 08/06/2018 16:28, Leonard Rosenthol wrote:
Can this profile be defined WITHOUT the calculator?

Leonard

From: Max Derhak <Max.Derhak@onyxgfx.com>
Date: Wednesday, June 6, 2018 at 2:43 PM
To: Simon Thompson-NM <Simon.Thompson2@bbc.co.uk>, "public-colorweb@w3.org" <public-colorweb@w3.org>
Subject: RE: Help with HLG profile
Resent-From: public-colorweb@w3.org
Resent-Date: Wednesday, June 6, 2018 at 2:43 PM

Simon has indeed helped me, and I'm very grateful for his assistance.  As it turns out there was a small bug in my matlab/octave code.

Based on my matlab/octave code I have been able to create iccMAX profiles for both narrow-range and full-range encodings of Rec. 2100 using HLG curves. I created XML representations of the profiles and used the iccFromXML tool from the reference implementation to create the profiles for testing with the iccMAX reference implementation CMM.

Significant features of these prototype profiles include:

  *   Full floating point based algorithmic encoding of RGB to XYZ and XYZ to RGB tags using calculator processing elements
  *   Uses a D65 based PCS
  *   Max and min luminances can be passed in as CMM environment variables to adjust the HLG curves
  *   The profiles use display relative Y=100 for max white for default relative intent processing
  *   Scene-relative luminances can be supported with CMM luminance matching using the illuminant white point luminance of the spectral viewing conditions tag (part of the Profile Connection Conditions - PCC).  (Note: using CMM control options for luminance-based matching allows scene-relative luminances to be used and adjusted for.  The PCC can be externally substituted to define an alternate white point luminance for luminance scaling, but matching values for the CMM environment variables are also needed to ensure that the corresponding HLG curves are used.)

An Interoperability Conformance Specification (ICS) is needed to help define specific profile and CMM requirements for this type of profile.  That is something that the ICC display working group will work on.

None of this can be directly accomplished using a single V4 profile.  V4 uses display-relative colorimetry with a D50 PCS (with chromatic adaptation applied).  By playing around with defining the media white point, one can achieve a level of luminance scaling using absolute intent, with some possible interoperability issues to contend with.  The HLG algorithm cannot be directly encoded in V4; LUTs and inverse LUTs using interpolation need to be populated for a fixed scene luminance condition, with different profiles created for different scene luminances.
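To illustrate what the V4 workaround involves, here is a minimal Python sketch (not the actual profile code) that bakes the BT.2100 HLG inverse OETF into a fixed sampled 1D LUT of the kind a V4 curve tag would have to carry:

```python
import math

# BT.2100 HLG constants
A = 0.17883277
B = 1.0 - 4.0 * A                  # 0.28466892
C = 0.5 - A * math.log(4.0 * A)    # 0.55991073

def hlg_inverse_oetf(signal):
    """Non-linear HLG signal in [0, 1] -> normalized linear scene light."""
    if signal <= 0.5:
        return signal * signal / 3.0
    return (math.exp((signal - C) / A) + B) / 12.0

# A V4 profile cannot carry the formula itself, so it would store a
# sampled curve, e.g. 1024 entries for a lut16Type-style table, valid
# only for the one luminance condition it was built for:
lut = [hlg_inverse_oetf(i / 1023.0) for i in range(1024)]
```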

Regards,

Max Derhak (PhD)
Principal Scientist

From: Simon Thompson-NM [mailto:Simon.Thompson2@bbc.co.uk]
Sent: Tuesday, May 29, 2018 8:52 AM
To: public-colorweb@w3.org
Subject: Re: Help with HLG profile


Hi Chris, all,

I've already sent a code correction to Max, so hopefully that will fix the issue he was seeing.

I'll comment further inline, below.

From: Chris Lilley [mailto:chris@w3.org]
On 17-May-18 23:25, Max Derhak wrote:
Hi,

I have an action item in the ICC Display Working group to develop an HLG based iccMAX display profile.  In order to understand how to go about it I've first prototyped functionality in Matlab/Octave.

The attached zip file has my code along with the BT.2100 document that I'm using to implement.  The problem is that for a display profile you need to have both device-to-PCS as well as PCS-to-device transforms.

The device to PCS transform is conceptually implemented in HLG_FullToXYZ.m, and the PCS to device is conceptually implemented in HLG_XYZToFull.m  (I'm using full 0.0 to 1.0 range encoding in this case).  Alternatively one could use the HLG_Narrow12ToXYZ.m and HLG_XYZToNarrow12.m to use the narrow 12-bit integer encoding.

The problem is that the OOTF (implemented in HLG_EOTF.m)  and inverse OOTF (implemented in HLG_invEOTF) functions are not logical inverses of each other (from what I can tell from the docs).
I recall hearing that these are not round-trippable.
The equations for HLG are reversible and we have used the reverse transforms for converting other HDR formats to HLG, converting SDR camera feeds to HLG, and converting HLG to SDR.  An example use case is given in: https://www.bbc.co.uk/rd/blog/2018-05-ultra-high-definition-dynamic-range-royal-wedding-uhd-hdr
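For reference, the per-channel HLG signal curves of BT.2100 and their exact analytic inverses can be sketched in Python (a hedged translation of the sort of curves in the Matlab/Octave code under discussion, not that code itself):

```python
import math

# BT.2100 HLG constants
A = 0.17883277
B = 1.0 - 4.0 * A
C = 0.5 - A * math.log(4.0 * A)

def hlg_oetf(e):
    """Normalized linear scene light in [0, 1] -> non-linear HLG signal."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

def hlg_inverse_oetf(s):
    """Exact analytic inverse of hlg_oetf."""
    if s <= 0.5:
        return s * s / 3.0
    return (math.exp((s - C) / A) + B) / 12.0

# The pair round-trips to floating-point precision:
for e in (0.0, 0.01, 1.0 / 12.0, 0.18, 0.5, 1.0):
    assert abs(hlg_inverse_oetf(hlg_oetf(e)) - e) < 1e-12
```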

On the other hand the BBC claim that the transcode looks "identical"
A more complete description of the process of transcoding between HDR formats is given in ITU-R BT.2390 section 7: https://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-BT.2390-4-2018-PDF-E.pdf
*Also it would be incredibly helpful to better understand what is the purpose of having an HLG ICC profile?
There are a few general use cases that we've thought of so far which apply equally to all HDR variants:
*       Storage of subtitles and other still images - currently still-image formats like PNG can only store a gamma value in the header - the only way to store non-gamma-encoded images is to embed an ICC profile.  Currently for PQ HDR, there's a draft proposal that looks for a given ICC profile filename in the header, which then overrides both the header and the profile.  We would prefer to have the correct ICC profiles available.
*       Allowing operating systems to correctly display images on non-gamma displays.
*       IP distribution platforms (both PC-based and set-top-box platforms will need to display video and background images) - e.g. for a video embedded in a webpage, you may have a video box in HDR and a surrounding webpage in sRGB, both of which need to be correctly displayed and need an ICC profile description.

Any transforms from SDR to HDR will need to place the SDR diffuse white at the correct signal level (given in ITU-R BT.2408) - I'm not sure how this is encoded into the transforms.
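One possible reading, as a hedged sketch rather than an established mapping from this thread: BT.2408 places HDR reference white (and hence SDR diffuse white) at the 75% HLG signal level, so an SDR-to-HLG transform could scale linear light so that SDR white lands there:

```python
import math

# BT.2100 HLG constants
A = 0.17883277
B = 1.0 - 4.0 * A
C = 0.5 - A * math.log(4.0 * A)

def hlg_inverse_oetf(s):
    """Non-linear HLG signal in [0, 1] -> normalized linear scene light."""
    if s <= 0.5:
        return s * s / 3.0
    return (math.exp((s - C) / A) + B) / 12.0

# Normalized scene light corresponding to the 75% signal level
# (about 0.265):
hlg_white_linear = hlg_inverse_oetf(0.75)

def sdr_to_hlg_scene_light(e_sdr):
    """Scale SDR linear light (diffuse white = 1.0) so that SDR diffuse
    white lands at the 75% HLG signal level recommended in BT.2408."""
    return e_sdr * hlg_white_linear
```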

Best Regards

Simon

--
Simon Thompson MEng CEng MIET
Project R&D Engineer
BBC Research and Development South Laboratory







--

Dr Tim Borer MA, MSc, PhD, CEng, MIET, SMIEEE, Fellow SMPTE

Lead Engineer

Immersive & Interactive Content

BBC Research & Development

BBC Centre House, 56 Wood Lane, London  W12 7SB

T:  +44 (0)30304 09611  M: +44 (0)7745 108652

Received on Tuesday, 31 July 2018 19:18:16 UTC