Re: Help with HLG profile

P.S. For more information see:
World Cup 2018 in UHD HDR on BBC iPlayer 
(https://www.bbc.co.uk/rd/blog/2018-05-uhd_hdr_world_cup_2018).
The Royal Wedding in High Dynamic Range 
(https://www.bbc.co.uk/rd/blog/2018-05-ultra-high-definition-dynamic-range-royal-wedding-uhd-hdr)
Tim

On 18/06/2018 09:46, Tim Borer wrote:
>
> There are one or two inaccuracies in the thread, so I thought it 
> worthwhile to introduce a few facts to correct them. Hope this is useful.
>
> It IS reasonable to describe HLG as relative scene luminance. There 
> has been a lot of debate from proponents of PQ (which is an 
> absolute, display referred encoding) that this is not 
> really so. Nevertheless we designed HLG precisely as relative scene 
> luminance (just like Rec.709).
>
> I have often heard it said that there is no standard Rec.709 
> production (because camera operators, it is claimed, universally 
> adjust their cameras). The claim is that by tweaking the camera the 
> picture somehow becomes display referred. Even if it were true that 
> cameras are always adjusted (not so) this would not make the signal 
> display referred. If you doubt this simply look at the dimensions of 
> the signal. HLG is dimensionless (a relative signal) and PQ has 
> dimensions of candelas per square metre (nits). All that adjusting a 
> 709/HLG camera does is to produce an “artistic” modification to the 
> signal. The signal still represents relative scene referred light, 
> just not the actual scene (but, rather, one that the producer wished 
> had existed). Adjusting the camera does not convert a dimensionless 
> signal into a dimensional one.
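>
> To make the distinction concrete, here is a minimal sketch (Python is 
> used purely for illustration; the constants are the normative ones from 
> BT.2100). Note that the HLG function never references a luminance in 
> cd/m², whereas the PQ function cannot be evaluated without one.
>
> import math
>
> # HLG OETF (BT.2100): relative scene linear light E in [0,1] to a
> # dimensionless signal E' in [0,1]. No luminance unit appears anywhere.
> def hlg_oetf(e):
>     a, b, c = 0.17883277, 0.28466892, 0.55991073
>     return math.sqrt(3.0 * e) if e <= 1.0 / 12.0 else a * math.log(12.0 * e - b) + c
>
> # PQ inverse EOTF (BT.2100): absolute display luminance F_D in cd/m²
> # (up to 10 000) to a signal. The cd/m² enters the maths explicitly.
> def pq_inverse_eotf(fd_cd_per_m2):
>     m1, m2 = 2610.0 / 16384.0, 2523.0 / 4096.0 * 128.0
>     c1, c2, c3 = 3424.0 / 4096.0, 2413.0 / 4096.0 * 32.0, 2392.0 / 4096.0 * 32.0
>     y = (fd_cd_per_m2 / 10000.0) ** m1
>     return ((c1 + c2 * y) / (1.0 + c3 * y)) ** m2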
>
> In fact a great deal of live television is produced using a standard 
> Rec709 OETF. This includes almost all live sport (especially soccer), 
> live light entertainment, and news. This encompasses a large part of 
> broadcast television output. In sport it is often a contractual 
> obligation to use the Rec 709 OETF. In other instances the producers 
> often do not like knees because, as typically implemented, they can 
> distort flesh tones. A further consideration is that in a multicamera 
> shoot it is essential to match all the cameras. This is difficult if 
> they don’t use a standard Rec 709 setting (often cameras may have 
> different firmware versions, which means that setting up a camera does 
> not necessarily mean the same thing  even on the same model of 
> camera). It is not the case that “the camera’s linearity response is 
> tweaked by the operator for various reasons”. This is not really 
> viable in a multicamera live shoot, and the shaders don’t have time to 
> do it live. Similarly for live production the gamut is standard Rec 
> 709 (if necessary clipped from the wider camera taking gamut). This is 
> necessary to ensure consistent colours for sporting strips both 
> between cameras and at different venues and different games. It is not 
> unusual for sporting strips to be outside 709, and it is important 
> that such colours are treated in a consistent way, so gamut mapping, 
> other than simple clipping, is not viable  (bear in mind that footage 
> from different games may often be shown as part of the commentary). 
> So, to emphasise, the fact is that a great deal of television IS 
> produced using the standard Rec 709 OETF using standard 709 gamut 
> (without gamut mapping).
>
> Scene referred conversions need to be used appropriately. When used 
> appropriately they do not create significant colour shifts. Scene 
> referred conversion should be used when matching camera outputs (as 
> opposed to matching the picture that is seen on the display – they are 
> not the same). Display referred conversions are used to ensure that 
> the displayed image is the same. There are different, distinct, use 
> cases for these two types of conversion. It is a mistake to assume 
> that all conversions should be display referred. As an example 
> consider the recent coverage of the Royal Wedding. This was shot using 
> a mixture of HLG HDR cameras and HD cameras (using Rec 709). The 
> production architecture was that shown in ITU-R Report BT.2408-1 2018 
> (note the -1 version) (freely available at 
> https://www.itu.int/pub/R-REP-BT.2408-1-2018), figure 4, page 14. In 
> this workflow you will note that there are many scene referred Rec 709 
> to HLG conversions. This is so that pictures can be shaded using 
> standard Rec 709 monitors (which is a requirement when the majority of 
> viewers are watching in 709). Note, in particular, that the final Rec 
> 709 SDR output is converted from the HLG signal using a scene referred 
> conversion. We estimate that the Royal Wedding was viewed by 1.8 
> billion viewers. The international feed was derived from the HLG 
> signal as described, using a scene referred conversion. The colour was 
> NOT distorted using this conversion (and, clearly, the producers would 
> not have allowed distortion for such a prestigious broadcast). On the 
> other hand if one were producing primarily for an HDR audience then 
> you would use the alternative architecture in BT.2408 (fig 3, page 
> 12). Here shading is performed primarily on the HDR monitor (with an 
> SDR monitor fed by a DISPLAY referred conversion, so that the shader 
> can check that nothing untoward is happening on the SDR signal). Note 
> that in a joint HDR SDR production you can give primacy to either 
> shading in HDR (for a majority HDR audience) or to shading in SDR (for 
> a mainly SDR audience – the current situation). But you cannot 
> prioritise both. We have found that shading in SDR gives very good 
> quality HDR as well as SDR. NHK have in mind producing for a primarily 
> HDR audience and, therefore, they favour the production workflow of 
> figure 3. To summarise, scene referred conversions do not produce 
> significant colour shifts when they are used properly. Indeed without 
> using scene referred conversions it is not possible to match the look 
> of HDR and SDR output when shading in SDR.
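>
> To give a flavour of what a scene-light (scene referred) Rec 709 to HLG 
> conversion involves, here is a minimal sketch along the lines of BT.2087 
> and BT.2408: undo the 709 OETF, re-express the primaries in BT.2020, 
> scale so that 100% SDR maps to the 75% HLG reference white level, and 
> re-encode with the HLG OETF. Treat it as an illustration of the signal 
> flow, not as the broadcast implementation.
>
> import math
>
> def rec709_inverse_oetf(v):
>     # BT.709 OETF inverted: SDR signal -> relative scene linear light
>     return v / 4.5 if v < 0.081 else ((v + 0.099) / 1.099) ** (1.0 / 0.45)
>
> def hlg_oetf(e):
>     # BT.2100 HLG OETF: relative scene linear light -> HLG signal
>     a, b, c = 0.17883277, 0.28466892, 0.55991073
>     return math.sqrt(3.0 * e) if e <= 1.0 / 12.0 else a * math.log(12.0 * e - b) + c
>
> # BT.2087 matrix: linear BT.709 RGB -> linear BT.2020 RGB
> M709_TO_2020 = [[0.6274, 0.3293, 0.0433],
>                 [0.0691, 0.9195, 0.0114],
>                 [0.0164, 0.0880, 0.8956]]
>
> def rec709_to_hlg_scene_light(rgb709):
>     lin709 = [rec709_inverse_oetf(v) for v in rgb709]
>     lin2020 = [sum(m * c for m, c in zip(row, lin709)) for row in M709_TO_2020]
>     # 0.265 is the scene-light level whose HLG OETF value is 0.75, so that
>     # 100% SDR white lands at the 75% HLG level recommended in BT.2408.
>     return [hlg_oetf(0.265 * c) for c in lin2020]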
>
> There are few HDR broadcasts in Europe yet though a number are in the 
> pipeline. To the best of my knowledge all these broadcasts will be 
> using HLG. Similarly broadcasters that produce live content (such as 
> sport) in the US also favour HLG. The BBC has made our Planet Earth II 
> series available in UHD HLG HDR on catch-up (OTT) television, and is 
> showing some of the Soccer World Cup matches in HLG HDR over OTT. Most OTT 
> movie distribution currently uses PQ (HDR10). This is possible because 
> of the non-live workflow. I am unaware of live sporting events 
> broadcast using PQ. YouTube distribute in both HLG and HDR10. Since 
> production is easier in HLG (particularly with prosumer equipment) 
> most of the user generated HDR content on YouTube is HLG.
>
> Best regards,
> Tim
>
> Dr Tim Borer MA, MSc, PhD, CEng, MIET, SMIEEE, Fellow SMPTE
> Lead Engineer
> Immersive & Interactive Content
> BBC Research & Development
> BBC Centre House, 56 Wood Lane, London  W12 7SB
> T:  +44 (0)30304 09611  M: +44 (0)7745 108652
> On 12/06/2018 05:20, Lars Borg wrote:
>> Hello Craig,
>>
>> See below.
>>
>>
>> On 6/9/18, 9:12 PM, "Craig Revie" <Craig.Revie@FFEI.co.uk> wrote:
>>
>>     Hi Lars,
>>
>>     Thanks for your reply.
>>
>>     First of all on terminology, is it reasonable to describe HLG as
>>     encoding relative scene luminance?
>>
>>
>> Not really. I wouldn’t.
>> It’s an approximation of scene colorimetry just like 709 or 2020.
>> How accurate is it in real production?
>> And what accuracy do you need? Why?
>> I haven’t seen any published tests.
>> But here’s what we know from 709 production.
>> I’ve yet to find anyone that sets their camera in reference 709 mode 
>> in real production.
>> Typically 709 productions do not use the standard OETF found in the 
>> 709 spec.
>> The camera’s linearity response is tweaked by the operator for 
>> various reasons.
>> Extra suppression of the darks as a means to reduce noise.
>> And the typical knee for highlight rolloff. (Which HLG now includes 
>> in a standard way)
>> Maybe there are options for gamut mapping (which is non-linear)
>> We usually don’t call these tweaks color grading, but rather camera setup.
>> Each of these tweaks changes the contrast relations from true scene colorimetry.
>> These tweaks are not indicated in metadata (as HLG has no such, and 
>> other formats seem to be proprietary or incomplete), so you have no 
>> knowledge on how to undo them in post.
>>
>>     I am not very familiar with the technical details of the cameras
>>     used for broadcast television but my assumption was that the
>>     controls allow white balance, aperture control and selection of
>>     filters. I also assume that the sensor is linear
>>
>>
>> So far I agree with you.
>>
>>     and that there is minimal processing of the signal. If so, the
>>     assumption of relative scene luminance (as described in a number
>>     of papers) would seem to be at least a reasonable approximation.
>>     Are these assumptions incorrect?
>>
>>
>> I doubt it. As I noted above, there are many in-camera processing 
>> options. So the signal might be non-linear.
>>
>>
>>     I think that for the uses outlined by Simon on this thread, it
>>     would be helpful to have a V4 ICC profile even if there are some
>>     limitations (which there may well be). Is this something you
>>     could provide? It seems to me that the more useful of the two
>>     profiles would be the ‘HLG scene profile’ although depending on
>>     the result of our discussion about terminology this may need a
>>     different description.
>>
>>
>> I have yet to find a use case for an HLG scene profile.
>> Please explain how you would use it, and what workflow.
>>
>> Primarily I see post production uses for an HLG reference display 
>> profile.
>> With this profile I can mix display-referred content across HLG, PQ, 
>> 709 media.
>> For example, the colors in a commercial are display-referred and the 
>> repurposing to another medium should preserve those colors. 
>>  Scene-referred conversions wouldn’t cut it, so I would not use an 
>> HLG scene profile for this.
>>
>>
>>     On the point about ‘grading’, in talking to people from the BBC
>>     and NHK, I understand that the main reason to develop HLG was
>>     that for many use cases there is no opportunity for grading, for
>>     example for live broadcasts.
>>
>>
>> Grading, no.
>> But camera matching, yes, also for live broadcasts.
>> It is the rare event where all cameras are of the same make, model, 
>> and revision, including lens and lights.
>> In practice, cameras at major sports events include both HDR and 709 
>> cameras, different sensors, multiple brands, different light setups, 
>> different codecs.
>> Same sports jersey => very different colors.
>> So some effort is often spent to make cameras produce similar colors 
>> on output.
>> Same sports jersey => similar colors.
>> Most likely that matching process undermines true scene colorimetry.
>> BT.2087 shows conversions from 709 to 2020.
>> My study shows that doing this ‘scene’-referred (case 2) creates 
>> significant color shifts.
>>
>>     In practice, I would have thought that PQ would be used for
>>     grading of film production and would be converted to HLG at the
>>     time of broadcast.
>>
>>
>> Are there movie HLG broadcasts in Europe yet?
>> PQ for movie grading, yes.
>> Although Adobe Premiere is agnostic and lets you grade 
>> (display-referred) HLG content as well.
>> But it seems movie distribution is mostly PQ (HDR10), not HLG. Think 
>> Netflix, etc.
>> I like this diagram from Yoeri Geutskens 
>> <https://www.linkedin.com/in/yoerigeutskens/>
>> More names in the PQ circle than in the HLG circle.
>>
>> Lars
>>
>>
>>     Best regards,
>>     _Craig
>>     ________________________________
>>     From: Lars Borg <borg@adobe.com>
>>     Sent: 10 June 2018 04:53:58
>>     To: Craig Revie; Phil Green
>>     Cc: Leonard Rosenthol; Max Derhak; Simon Thompson-NM;
>>     public-colorweb@w3.org
>>     Subject: Re: Help with HLG profile
>>
>>     Hi Craig,
>>
>>     A very sensible question.
>>     Rec. 2100 gives you two reference decodings for HLG media: scene
>>     referred and  display referred @ 1000 nits.
>>     Scene versus display is not a metadata item, but rather a choice
>>     by the reader.
>>     Presumably the camera operator optimized the camera rendering for
>>     a pleasing appearance on the reference display, so ‘scene’ is a
>>     misnomer, as it’s always been also with Rec. 709.
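>>
>>     For concreteness, here is a minimal sketch of the two reference
>>     decodings (Python for illustration; constants from Rec. 2100,
>>     assuming the nominal 1000 cd/m² reference display):
>>
>>     import math
>>
>>     def hlg_inverse_oetf(ep):
>>         # HLG signal -> relative scene linear light (the scene referred decode)
>>         a, b, c = 0.17883277, 0.28466892, 0.55991073
>>         return ep * ep / 3.0 if ep <= 0.5 else (math.exp((ep - c) / a) + b) / 12.0
>>
>>     def hlg_eotf(rgb_signal, peak=1000.0):
>>         # HLG signal -> absolute display light in cd/m² (the display referred
>>         # decode): inverse OETF followed by the OOTF, gamma 1.2 at 1000 cd/m².
>>         gamma = 1.2 + 0.42 * math.log10(peak / 1000.0)
>>         scene = [hlg_inverse_oetf(v) for v in rgb_signal]
>>         ys = 0.2627 * scene[0] + 0.6780 * scene[1] + 0.0593 * scene[2]
>>         return [peak * (ys ** (gamma - 1.0)) * e for e in scene]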
>>
>>     At the consumer end, each HDR TV set (PQ or HLG) is expected to
>>     apply some form of re-rendering based on its assumed luminance
>>     (assumed as it is not actually measured or calibrated). That
>>     would be your metadata source. However, we should not expect that
>>     they implement anything like what’s indicated in Rec. 2100. Power
>>     limits, motion enhancements, user preferences, etc. are out of
>>     scope for Rec.2100. This makes it rather meaningless to try to
>>     model anything other than the reference display.
>>     In a separate expert forum, a majority stated that even the best
>>     consumer HDR TV sets are so bad, they should never be used for
>>     color grading.
>>     So this begs the question, who would need a parameterized HLG
>>     decoder?
>>
>>     BTW, the HLG scene profile is trivially implementable in V4. The
>>     reference display profile is a little bit more complex in V4, but
>>     doable.
>>
>>     Thanks,
>>     Lars Borg  |  Principal Scientist  |  Adobe  |  p. 408.536.2723  |
>>     c. 408.391.9479  |  borg@adobe.com
>>
>>
>>     On 6/8/18, 9:46 PM, "Craig Revie" <Craig.Revie@FFEI.co.uk> wrote:
>>
>>     Hi Max and Phil,
>>     I hesitate to ask as this may show my ignorance but I had thought
>>     that HLG encodes relative scene luminance and does not carry any
>>     metadata. In my understanding the choice of the reference white
>>     is made by the camera/cameraman shooting the scene and the camera
>>     signal is encoded as HLG which makes this model ideal for
>>     broadcast television.
>>     You seem to anticipate having max and min luminance values -
>>     where do these come from?
>>     What does the profile represent and how do you anticipate it
>>     being used?
>>     As I say, this may just show my ignorance...
>>     _Craig
>>
>>     On 8 Jun 2018, at 23:18, Phil Green
>>     <green@colourspace.demon.co.uk> wrote:
>>
>>     In my understanding yes, but you would not be able to pass in the
>>     max and min luminance so would need a different profile for each
>>     condition supported.
>>     Phil
>>
>>     On 08/06/2018 16:28, Leonard Rosenthol wrote:
>>     Can this profile be defined WITHOUT the calculator?
>>
>>     Leonard
>>
>>     From: Max Derhak <Max.Derhak@onyxgfx.com>
>>     Date: Wednesday, June 6, 2018 at 2:43 PM
>>     To: Simon Thompson-NM <Simon.Thompson2@bbc.co.uk>,
>>     "public-colorweb@w3.org" <public-colorweb@w3.org>
>>     Subject: RE: Help with HLG profile
>>     Resent-From: <public-colorweb@w3.org>
>>     Resent-Date: Wednesday, June 6, 2018 at 2:43 PM
>>
>>     Simon has indeed helped me, and I’m very grateful for his
>>     assistance.  As it turns out there was a small bug in my
>>     matlab/octave code.
>>
>>     Based on my matlab/octave code I have been able to create iccMAX
>>     profiles for both narrow range and full range encoding of Rec
>>     2100 using HLG curves. I created XML representations of the
>>     profiles and used the iccFromXML tool from the reference
>>     implementation to create the profiles for testing using the
>>     iccMAX reference implementation CMM.
>>
>>     Significant features of these prototype profiles include:
>>
>>       *   Full floating point based algorithmic encoding of RGB to
>>     XYZ and XYZ to RGB tags using calculator processing elements
>>       *   Uses a D65 based PCS
>>       *   Max and min luminances can be passed in as CMM environment
>>     variables to adjust the HLG curves (see the sketch after this list)
>>       *   The profiles use display relative Y=100 for max white for
>>     default relative intent processing
>>       *   Scene relative luminances can be supported with CMM
>>     luminance matching using the spectral viewing conditions tag
>>     (part of Profile Connection Conditions – PCC) illuminant white
>>     point luminance.  (Note: Using CMM control options for luminance
>>     based matching by CMM allows for scene relative luminances to be
>>     used and adjusted for.  The PCC can be externally substituted to
>>     define an alternate white point luminance for luminance scaling,
>>     but matching values for CMM environment variables are also needed
>>     to ensure that the corresponding HLG curves are used).
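>>
>>     As a rough sketch of the luminance adjustment involved (plain Python
>>     rather than the calculator-element encoding; the formulas are the
>>     extended system gamma and black lift from BT.2100):
>>
>>     import math
>>
>>     def hlg_system_gamma(peak_cd_per_m2):
>>         # Extended system gamma: 1.2 at the nominal 1000 cd/m² peak.
>>         return 1.2 + 0.42 * math.log10(peak_cd_per_m2 / 1000.0)
>>
>>     def hlg_black_lift(black_cd_per_m2, peak_cd_per_m2):
>>         # User black level lift beta, applied to the signal as
>>         # max(0, (1 - beta) * E' + beta) before the reference EOTF.
>>         gamma = hlg_system_gamma(peak_cd_per_m2)
>>         return math.sqrt(3.0 * (black_cd_per_m2 / peak_cd_per_m2) ** (1.0 / gamma))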
>>
>>     An Interoperability Conformance Specification (ICS) is needed to
>>     help define specific profile and CMM requirements for this type
>>     of profile.  That is something that the ICC display working group
>>     will work on.
>>
>>     None of this can be directly accomplished using a single V4
>>     profile.  V4 uses display relative with a D50 PCS (with chromatic
>>     adaptation applied).  By playing around with defining the media
>>     white point one can achieve a level of luminance scaling using
>>     absolute intent with some possible interoperability issues to
>>     contend with. The HLG algorithm cannot be directly encoded in
>>     v4, and LUTs and inverse LUTs using interpolation need to be
>>     populated for a fixed scene luminance condition with different
>>     profiles created for different scene luminances.
>>
>>     Regards,
>>
>>     Max Derhak (PhD)
>>     Principal Scientist
>>
>>     From: Simon Thompson-NM [mailto:Simon.Thompson2@bbc.co.uk]
>>     Sent: Tuesday, May 29, 2018 8:52 AM
>>     To: public-colorweb@w3.org
>>     Subject: Re: Help with HLG profile
>>
>>
>>     Hi Chris, all,
>>
>>     I’ve already sent a code correction to Max, so hopefully that
>>     will fix the issue he was seeing.
>>
>>     I’ll comment further inline, below.
>>
>>     From: Chris Lilley [mailto:chris@w3.org]
>>     On 17-May-18 23:25, Max Derhak wrote:
>>     Hi,
>>
>>     I have an action item in the ICC Display Working group to develop
>>     an HLG based iccMAX display profile.  In order to understand how
>>     to go about it I’ve first prototyped functionality in Matlab/Octave.
>>
>>     The attached zip file has my code along with the BT.2100 document
>>     that I’m using to implement. The problem is that for a display
>>     profile you need to have both device to PCS as well as PCS to
>>     device transforms.
>>
>>     The device to PCS transform is conceptually implemented in
>>     HLG_FullToXYZ.m, and the PCS to device is conceptually
>>     implemented in HLG_XYZToFull.m  (I’m using full 0.0 to 1.0 range
>>     encoding in this case).  Alternatively one could use the
>>     HLG_Narrow12ToXYZ.m and HLG_XYZToNarrow12.m to use the narrow
>>     12-bit integer encoding.
>>
>>     The problem is that the OOTF (implemented in HLG_EOTF.m)  and
>>     inverse OOTF (implemented in HLG_invEOTF) functions are not
>>     logical inverses of each other (from what I can tell from the docs).
>>     I recall hearing that these are not round-trippable.
>>     The equations for HLG are reversible and we have used the reverse
>>     transforms for converting other HDR formats to HLG, converting
>>     SDR camera feeds to HLG and converting HLG to SDR.  An example
>>     use case is given in:
>>     https://www.bbc.co.uk/rd/blog/2018-05-ultra-high-definition-dynamic-range-royal-wedding-uhd-hdr
>>
>>     On the other hand the BBC claim that the transcode looks "identical"
>>     A more complete description of the process of transcoding between
>>     HDR formats is given in ITU-R BT.2390 section 7:
>>     https://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-BT.2390-4-2018-PDF-E.pdf
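>>
>>     As a quick numerical check of the reversibility, here is a minimal
>>     Python sketch using the BT.2100 OOTF and its inverse (illustrative
>>     only, not our production code):
>>
>>     GAMMA, LW = 1.2, 1000.0   # nominal system gamma and peak luminance
>>
>>     def ootf(scene_rgb):
>>         # relative scene light -> display light in cd/m², per BT.2100
>>         ys = 0.2627 * scene_rgb[0] + 0.6780 * scene_rgb[1] + 0.0593 * scene_rgb[2]
>>         return [LW * (ys ** (GAMMA - 1.0)) * e for e in scene_rgb]
>>
>>     def inverse_ootf(display_rgb):
>>         # display light -> relative scene light, per BT.2100
>>         yd = 0.2627 * display_rgb[0] + 0.6780 * display_rgb[1] + 0.0593 * display_rgb[2]
>>         return [(yd / LW) ** ((1.0 - GAMMA) / GAMMA) * f / LW for f in display_rgb]
>>
>>     print(inverse_ootf(ootf([0.20, 0.40, 0.10])))  # recovers the input to rounding
>>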
>>     Also, it would be incredibly helpful to better understand: what is
>>     the purpose of having an HLG ICC profile?
>>     There are a few general use cases that we’ve thought of so far
>>     which apply equally to all HDR variants:
>>     ·       Storage of subtitles and other still images – currently
>>     still formats like PNG can only store a gamma value in the header
>>     files – the only way to store non-gamma encoded images is to
>>     embed an ICC Profile.  Currently for PQ HDR, there’s a draft
>>     proposal that looks for a given ICC Profile filename in the
>>     header which then over-rides both the header and profile.  We
>>     would prefer to have the correct ICC profiles available.
>>     ·       Allowing operating systems to correctly display images on
>>     non-gamma displays.
>>     ·       IP distribution platforms (both PC-based and set-top-box
>>     platforms will need to display video and background images) – e.g. for
>>     a video embedded in a webpage, you may have a video box in HDR and a
>>     surrounding webpage in sRGB, both of which need to be correctly
>>     displayed and need an ICC profile description.
>>
>>     Any transforms from SDR to HDR will need to place the SDR diffuse
>>     white at the correct signal level (given in ITU-R BT.2408) – I’m
>>     not sure how this is encoded into the transforms.
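>>
>>     For reference, the BT.2408 levels can be checked numerically (a small
>>     Python sketch; 203 cd/m² is the HDR reference white on a nominal
>>     1000 cd/m² display):
>>
>>     import math
>>
>>     def hlg_signal_for_white(l_white=203.0, l_peak=1000.0, gamma=1.2):
>>         # achromatic pixel: invert the OOTF, then apply the HLG OETF
>>         a, b, c = 0.17883277, 0.28466892, 0.55991073
>>         e = (l_white / l_peak) ** (1.0 / gamma)   # relative scene light
>>         return math.sqrt(3 * e) if e <= 1 / 12 else a * math.log(12 * e - b) + c
>>
>>     def pq_signal_for_white(l_white=203.0):
>>         m1, m2 = 2610 / 16384, 2523 / 4096 * 128
>>         c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
>>         y = (l_white / 10000.0) ** m1
>>         return ((c1 + c2 * y) / (1 + c3 * y)) ** m2
>>
>>     print(hlg_signal_for_white())  # ~0.75, i.e. 75% HLG
>>     print(pq_signal_for_white())   # ~0.58, i.e. 58% PQ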
>>
>>     Best Regards
>>
>>     Simon
>>
>>     --
>>     Simon Thompson MEng CEng MIET
>>     Project R&D Engineer
>>     BBC Research and Development South Laboratory
>>
>>
>>
>

Received on Monday, 18 June 2018 09:17:41 UTC