Re: Tone Mapping Mentioned in Monday's Meeting

I think the only requirement I would put on any tonemapping operator
candidates anybody would like to offer up would simply be that the operator
shouldn't be too expensive to compute; ideally something that can be done
in a typical fragment shader without much additional work or data.

In my personal efforts with this, I ended up using just a slightly modified
Reinhard operator, where I let the end user tune a handful of knobs and
decide for themselves. My formula was borrowed from a great GDC talk by
Timothy Lottes of AMD:

https://www.gdcvault.com/play/1023512/Advanced-Graphics-Techniques-Tutorial-Day

Around 17:30, he begins with plain Reinhard and then gradually adds in some
tuning. My goal was end-user flexibility/agency, though, not One Set Of
Coefficients To Rule Them All.
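
For concreteness, here is a minimal sketch of that kind of tunable
extended-Reinhard curve. This is not my exact shader code, and the
`exposure` and `white_point` knob names are hypothetical; it's in Python
rather than GLSL just so it's easy to poke at:

```python
def tonemap(x, exposure=1.0, white_point=4.0):
    """Extended Reinhard on a single luminance (or channel) value.

    exposure    -- linear pre-scale; the simplest end-user knob
    white_point -- input level that maps exactly to 1.0; as
                   white_point -> infinity this reduces to plain
                   Reinhard, x / (1 + x)
    """
    x = x * exposure
    return x * (1.0 + x / (white_point * white_point)) / (1.0 + x)
```

The nice property is that the whole curve stays cheap enough for a fragment
shader: a couple of multiplies and one divide per pixel.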

I imagine those on this mailing list who have dedicated their lives to
color theory probably have some pretty outstanding candidates they could
come up with. As long as they aren't miserable to implement on a GPU, I'm
all ears!


On Thu, May 13, 2021, 8:38 PM Lars Borg <borg@adobe.com> wrote:

> Joe,
>
>
>
> Yes, that’s the sad truth.
>
> So maybe we should just provide the most basic tone mapping, as it will
> get messed up anyhow?
>
> Lars
>
>
>
> *From: *Joe Drago <jdrago@netflix.com>
> *Date: *Thursday, May 13, 2021 at 5:29 PM
> *To: *Lars Borg <borg@adobe.com>
> *Cc: *Mike Smith <miksmith@attglobal.net>, Simon Thompson <Simon.Thompson2@bbc.co.uk>, "public-colorweb@w3.org" <public-colorweb@w3.org>
> *Subject: *Re: Tone Mapping Mentioned in Monday's Meeting
>
>
>
>
> Thanks for that explanation of the "unspoken tension"; it was a great
> read! I have noticed a similar disconnect, although I think what I've seen
> is with another group. Perhaps I'll define a (c).
>
> Your explanation of the group of content creators (a) is spot on. The
> phrase I've heard from multiple companies (including my own) is "HDR is
> just a container" (or "look"), meaning that it is something we can store
> our creative intent into, not any kind of hard requirement on how dynamic
> the content contained must actually be. If someone wants to film a
> purposefully drab / partially-desaturated piece of content because it helps
> them tell the story, so be it. In their minds, HDR provides them a means to
> re-create that precisely drab experience in the home-theater-enthusiast
> end user's home. They appreciate the potential accuracy this provides.
>
> My run-in with (b) is more from an end user's perspective, when wrangling
> picture modes on TVs while implementing HDR output in our clients, and
> doing side-by-side tests with unsuspecting coworkers. More on that in a
> minute.
>
> I'd like to categorize a group (c) as "the consumer", and how we sell HDR
> to them. For those on this mailing list who haven't tried this yet, find
> someone in your life who doesn't work with HDR daily but is otherwise
> techy, and ask them to explain what having HDR buys you. You will get a
> range of replies, and I'll bet very few (if any) of them will be what
> Group (a) considers it to be. In fact, our
> side-by-side tests fly directly in the face of that. If you ask someone
> whether they would rather see exactly what the cinematographer intended or
> a modified/enhanced version from their TV's hardware, many people will say
> "I'd love to see the original". However, if you put them in a room with
> two identical high-end TVs, and one is configured for accuracy (Cinema
> Mode, etc) and the other one is oversaturated and overbright (Vivid Mode,
> etc), they almost always pick the vivid TV as the "better" one. They might
> even say "oh this one is clearly HDR, and this one probably has it off."
>
> This perception from group (c) is at odds with group (a), unfortunately.
> If some content happens to look near-identical when HDR is disabled due to
> not really leveraging the ranges HDR provides, it *technically *isn't
> incorrect to still mark it HDR in your content browser, as long as it was
> authored into an HDR container, retains creative intent, etc. However, if
> the consumer watches a handful of titles marked as HDR, then purposefully
> disables HDR and the content doesn't appear to change, what then? Will they
> feel ripped off, or is it simply a question of consumer education?
>
> I think the product folks from group (b) exacerbate this, knowingly. TV
> manufacturers know that they're going to have their flagship TV adjacent to
> their competitors' TVs at Best Buy, and they know that most consumers won't
> research HDR before coming into the store, and will just look for "vibrant"
> or "bright". Unfortunately, it is in the TV maker's financial interests to
> boost the incoming signal so that people will pick their TV, so it is tough
> to blame them for this strategy. All of these things fly in the face of
> "creative intent" / "HDR is just a container".
>
> The reality of all of this is that most end users will *never *see what
> the cinematographer saw, period. For every home theater whose owner went
> out of their way to balance the lighting and sound and buy the best TV
> they could, there are thousands of people who bought the $300 "HDR" TV
> that was on sale, left it configured in Standard or Vivid Mode, and put it
> in a living room with a sliding glass door or large window, and they are
> trying to watch your HDR content during the brightest part of the
> afternoon. Even the most brilliant
> tonemapping operators we could standardize on can't make something that was
> graded on a Dolby Pulsar look remotely similar to a TV that is struggling
> to hit 100% of BT709 @ 150 nits.
>
> Crushing these larger color volumes into the comparatively tiny ones that
> modern displays offer is going to be half science and half art. I'd love
> for group (a) to realize that labeling things "HDR" creates expectations
> for group (c), and betraying that might lose us customers. I'd love group
> (b) to care more about accuracy and perhaps not try to sell a TV as HDR if
> it can't hit some reasonable minimum bar of excellence. I'd love group (c)
> to realize what HDR really means to these other two groups, and use that to
> make informed decisions about how to configure the hardware they bought,
> and perhaps when to draw the blinds in their living room. Unfortunately,
> these are all unlikely to change for us in the near future, and we're just
> going to have to decide for ourselves. Someone is going to take our
> carefully crafted tonemapping operator standardized output and watch it in
> Vivid Mode anyway. [grin]
>
>
>
> On Thu, May 13, 2021 at 8:02 PM Lars Borg <borg@adobe.com> wrote:
>
> Mike,
>
>
>
> Thanks for elaborating on the issues.
>
> As is common with any gamut mapping, all of the discussed methods perform
> poorly for some content.
>
> Most are tuned to diffuse white at 200/300 nits, while the iPhone 12 in some
> tests put it at 800 nits.
>
> And max RGB mapping seems to amplify blue noise more than luminance
> mapping does.
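
The difference between the two norms is easy to see numerically. Below is
a minimal sketch (not taken from any of the papers discussed; the
plain-Reinhard curve and the function names are stand-ins) comparing a
maxRGB norm against a BT.2020 luminance norm. Both are hue-preserving in
the sense that they scale R, G, B by a common factor:

```python
BT2020_LUMA = (0.2627, 0.6780, 0.0593)  # BT.2020 luminance weights

def curve(x):
    """Stand-in tone curve (plain Reinhard); any rolloff would do here."""
    return x / (1.0 + x)

def tonemap_maxrgb(rgb):
    """Drive the curve with max(R, G, B); scale all channels uniformly."""
    norm = max(rgb)
    s = curve(norm) / norm if norm > 0 else 1.0
    return tuple(c * s for c in rgb)

def tonemap_luminance(rgb):
    """Drive the curve with BT.2020 luminance; scale channels uniformly."""
    y = sum(w * c for w, c in zip(BT2020_LUMA, rgb))
    s = curve(y) / y if y > 0 else 1.0
    return tuple(c * s for c in rgb)
```

For a blue-dominant pixel such as (0.05, 0.05, 0.8), the maxRGB norm is
0.8 while the luminance norm is only about 0.094, so the maxRGB path
compresses the pixel far harder and tracks fluctuations in the blue
channel one-for-one, fluctuations that the luminance path barely sees.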
>
>
>
> Given that there is no perfect method, what to do?
> Should we search for a good method, over some set of test cases? (This has
> frustrated me for some time)
>
> Or should we provide a simple, low-cost, basic method (how about none?),
> and advise devs that they can do better?
>
>
>
> Lars
>
>
>
> *From: *Mike Smith <miksmith@attglobal.net>
> *Date: *Thursday, May 13, 2021 at 2:12 PM
> *To: *Lars Borg <borg@adobe.com>, Simon Thompson <Simon.Thompson2@bbc.co.uk>, "public-colorweb@w3.org" <public-colorweb@w3.org>
> *Subject: *Re: Tone Mapping Mentioned in Monday's Meeting
>
>
>
> Unfortunately I missed the last call but it sounds like interesting topics
> were discussed.
>
> Perhaps now is a good time to mention that our SMPTE paper "Color Volume
> and Hue-preservation in HDR Tone Mapping" examined the performance of the
> different BT.2390 EETF tonemapping methods and proposed an additional
> variant based on a MaxRGB norm.  I think the MaxRGB variant will be
> included in a future ITU-R BT report.  SMPTE just granted open-access to
> our paper, here is the link:
>
> https://ieeexplore.ieee.org/document/9086648
>
> My understanding is that the BT.2390 EETF rolloff is a Hermite spline, not
> a Bézier.  I've attached my Excel sheets that perform the basic BT.2390
> EETF and ST2094-10 tonemapping curve computations using the equations from
> those documents.  The BT.2390 EETF takes luminance parameters of the
> mastering display (Lw,Lb) and the target display (Lmax, Lmin).  If the
> MaxCLL value is known to be good and reliable, Lw should be set equal to
> MaxCLL instead of using the ST2086 max luminance value.  This point about
> using MaxCLL is missing from the BT.2390 report.  The EETF also does not
> alter the midtone slope or gain of the output, which could be an important
> basic adjustment when viewing the content outside of a reference viewing
> environment.  Other types of spline curves could be designed to take as
> input a simple slope (contrast) or gain adjustment as well.  ST2094-10
> tonemapping includes content-based parameters (min/average/max) in addition
> to target luminance min/max.  The ST2094-10 framework also includes
> additional "potentially creative" adjustments corresponding to
> min/average/max offset values, tonemapping offset/gain/gamma and midtone
> slope/contrast n.
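
My reading of that parameterization, sketched in Python (variable names
are mine; treat this as an illustration of the BT.2390 EETF structure, not
a verified implementation, and check it against the report before relying
on it):

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Absolute luminance (cd/m^2) -> nonlinear PQ signal in [0, 1]."""
    y = max(nits, 0.0) / 10000.0
    return ((C1 + C2 * y ** M1) / (1.0 + C3 * y ** M1)) ** M2

def pq_decode(signal):
    """Nonlinear PQ signal -> absolute luminance (cd/m^2)."""
    p = signal ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

def bt2390_eetf(nits, lw, lb, lmax, lmin):
    """Map luminance mastered for [lb, lw] onto a display covering [lmin, lmax].

    All arguments are in nits. Works in normalized PQ space: content below
    the knee (ks) passes through unchanged; above it, a Hermite spline
    rolls off toward the target peak, then a black-level lift raises the
    floor toward the target minimum.
    """
    src_black, src_white = pq_encode(lb), pq_encode(lw)
    scale = src_white - src_black
    min_lum = (pq_encode(lmin) - src_black) / scale
    max_lum = (pq_encode(lmax) - src_black) / scale
    e1 = (pq_encode(nits) - src_black) / scale
    ks = 1.5 * max_lum - 0.5  # knee start
    if ks < 1.0 and e1 >= ks:
        t = (e1 - ks) / (1.0 - ks)
        e2 = ((2 * t**3 - 3 * t**2 + 1) * ks
              + (t**3 - 2 * t**2 + t) * (1.0 - ks)
              + (-2 * t**3 + 3 * t**2) * max_lum)
    else:
        e2 = e1
    e3 = e2 + min_lum * (1.0 - e2) ** 4  # black-level lift
    return pq_decode(e3 * scale + src_black)
```

For example, with Lw = 1000 and Lmax = 300, content at 100 nits passes
through essentially untouched while the 1000-nit peak lands at 300 nits.
Per the point above, Lw would be set to MaxCLL when that value is known to
be trustworthy.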
>
> A reference [1] to a display-referred format conversion algorithm
> describes SDR BT.709 to HDR10 conversion steps.  References to
> scene-to-display referred conversions have been defined as the OOTFs in
> BT.2100 in addition to the advanced process defined in ACES CTL code [2].
>
> I also want to provide some background on the topic of recommendation vs
> report vs example vs standard for tonemapping.  There is perhaps an
> unspoken tension between content creators, product designers and engineers
> that may not be obvious to all.  Below is a quick summary of my
> understanding of it from my point of view working with episodic television
> and movie content creators and the consumer electronics industry:
>
> a) Content creators want their content shown as it appeared to them when
> it was created and finished, usually in a dark room on a great display.
> This is the "display-referred" paradigm or the older
> what-you-see-is-what-you-get (WYSIWYG) paradigm.  Acknowledging that the
> content will be shown on worse-performing displays and in less ideal
> viewing environments is an obvious commercial reality but is difficult to
> address for content creators, unless different versions are specifically
> created (dark room great display, dark room bad display, bright room with
> small display, etc.).  There can be a lot of variation across different
> types of content, because content creators also don't always want to use
> the same range of colors and luminance for all shows.  For example, there
> is a
> creative desire to visually differentiate a depressing movie taking place
> in a low contrast desert from an energetic children's cartoon.  This
> variation also exists in SDR, but it is smaller because the palette and
> range are much reduced.
>
> b) Product designers want some flexibility to differentiate the way their
> products look from competitors'.  This is especially true for traditional
> television products and seems to be becoming more important for the
> computer/mobile display markets.  To differentiate their products, the
> designers often want to change the way the image looks on the television to
> make it "look better" or differentiated vs. their competition.  If the
> products follow a standard tonemapping approach, the product designers
> fear their products will look identical to the competitors', and this
> leads to a fear of being marginalized by cheaper but otherwise
> identical-looking products.  During the introduction of HDR Video several
> years ago,
> television companies embraced the "scene-referred" paradigm because it was
> being promoted as "display-independent".  The television companies thought
> the format was designed to give the display flexibility on how the image is
> displayed and which played into their desire to differentiate and control
> their products.  I believe the original goal of introducing the
> "scene-referred" paradigm in HDR Video was to standardize on a high-dynamic
> range log-like camera output format that could be mixed and matched and
> inter-cut across different camera products and camera brands within a
> broadcast facility rather than dealing with the multitude of proprietary
> log formats (LogC, SLog, SLog2, SLog3, RedLog, CLog, VLog, FLog, etc.),
> which seemed like a good goal to me.  But that message seems to have been
> lost somehow, and HLG promotion instead turned into a "display
> independent" message.  I don't know if today's cameras that output HLG can
> be mixed and matched and inter-cut with other brands and products that also
> output scene-referred HLG.
>
> c) Engineers that aren't attempting to differentiate their products
> basically want to be told what they should do, then do it, and then move on
> to the next engineering project; but agreement between (a) and (b) about
> tonemapping was never reached, so here we are.  Display or format logo
> licensing programs often have some certification requirements that must be
> followed within some tolerance but those are usually private programs that
> don't openly share implementation details and performance tolerances.
> Companies offering logo/format licensing programs also have flexibility in
> what they require on a case by case basis and who they want to do business
> with.  In this logo/format licensing case, implementors are generally
> guided about what to do based on what the certification program requires.
> The openly available reports and examples also can help guide an
> implementor into doing something that makes sense in different situations
> when logo/format licensing applications are not in play.
>
> Thanks,
> Mike
>
> [1]
> https://movielabs.com/ngvideo/MovieLabs_Mapping_BT.709_to_HDR10_v1.0.pdf
>
> [2]
> https://github.com/ampas/aces-dev/blob/v1.3/transforms/ctl/outputTransform/rec2020/RRTODT.Academy.Rec2020_1000nits_15nits_HLG.ctl
>
> https://github.com/ampas/aces-dev/blob/v1.3/transforms/ctl/outputTransform/rec2020/RRTODT.Academy.Rec2020_1000nits_15nits_ST2084.ctl
>
> On 5/12/2021 1:02 PM, Lars Borg wrote:
>
> Indeed.
>
> Note that BT.2390 is not an ITU recommendation, but a report.
>
> The mapping in 5.4.1 is stated as an example.
>
> This example does not desaturate colors to the display primaries, so it is
> not practically usable.
>
> I’ve implemented the BT.2390 5.4.1 example, and found that the Bézier was
> not easily adjustable for different roll-offs.
>
> BT.2408 is also a report, not a recommendation.
>
> The standard SMPTE ST 2094 parts 10, 20, 40 provide other tone mapping
> functions for PQ content. Yet again, only examples, but they include
> desaturation as well.
>
> It seems the industry is still searching for a best practice for tone
> mapping.
>
> Let’s hope for success.
>
>
>
> Lars
>
>
>
> *From: *Simon Thompson <Simon.Thompson2@bbc.co.uk>
> *Date: *Wednesday, May 12, 2021 at 3:02 AM
> *To: *"public-colorweb@w3.org" <public-colorweb@w3.org>
> *Subject: *Tone Mapping Mentioned in Monday's Meeting
> *Resent-From: *<public-colorweb@w3.org>
> *Resent-Date: *Wednesday, May 12, 2021 at 2:58 AM
>
>
>
>
>
>
>
> Hi Lars, all,
>
>
>
> In Monday's meeting a tone-mapping for PQ was discussed.
>
>
>
> It is available in
> https://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-BT.2390-8-2020-PDF-E.pdf
> section 5.4.1
>
>
>
> Note that the ITU updates from the Feb Meetings are appearing and I think
> in the newest version, this may have moved to BT.2408, but that is not
> available on the website yet.
>
> Best Regards
>
>
>
> Simon
>
>
>
> --
>
> *Simon Thompson MEng CEng MIET*
> Senior R&D Engineer
>
> *BBC Research and Development South Laboratory*
>
>
>
>
>
>

Received on Friday, 14 May 2021 04:11:28 UTC