- From: Seeger, Chris (NBCUniversal) <Chris.Seeger@nbcuni.com>
- Date: Wed, 14 Jun 2023 12:20:12 +0000
- To: Pierre-Anthony Lemieux <pal@sandflow.com>, "public-colorweb@w3.org" <public-colorweb@w3.org>
- Message-ID: <BL0PR14MB3795F7C271D6F58FA68B37D9E65AA@BL0PR14MB3795.namprd14.prod.outlook.com>
The only conflict I see between broadcast and computing is with sRGB gamma 2.2, which conflicts with broadcast's BT.1886 display gamma of 2.4. But as Pierre notes, the convergence between the two industries has produced many desktop displays that support BT.1886. Can we converge around BT.1886 in desktop computing and on the canvas?

In broadcasting we use reference white as an anchor in a single-master workflow. Graphics white is usually the same as, or very close to, reference white. SDR consumer displays use approximately the same peak white as HDR reference white (see our display luminance survey here):
https://www.dropbox.com/s/8f8ceh2a8f3zhqv/Summaries%20-%20MovieLabs-NABA-NBCU%20SDR%20Consumer%20TV%20Luminance%20Survey.pdf?dl=0

An “Effective Gamma” can be calculated by measuring mid-gray against reference white, and it should be consistent in a reference environment between both computing and broadcasting. The effective gamma can be calculated in both SDR and HDR. In HDR, mid-gray = 26 nits and reference white = 203 nits per BT.2408, which results in an “Effective Gamma” of 2.4. (A worked sketch of this calculation appears after the quoted message below.)

Best,
Chris

From: Pierre-Anthony Lemieux <pal@sandflow.com>
Date: Wednesday, June 14, 2023 at 12:33 AM
To: public-colorweb@w3.org <public-colorweb@w3.org>
Subject: [EXTERNAL] Feedback re: Broadcast vs Computing assumption

Good morning/evening,

Before delving into solution space, e.g. gain maps, I would like to challenge the assumption, laid out in Chris Cameron's presentation, that "broadcast" and "computing" are fundamentally in conflict.

There has in fact been a convergence in the devices and viewing environments used for "broadcast" and "computing": mobile devices are used to enjoy linear content, play games and balance checkbooks. Graphics content regularly shows up in "broadcast" content in the form of UIs and subtitles/captions. Conversely, "broadcast" content regularly shows up in computing, e.g. galleries of scene stills, lens flares, etc. A significant part of the expansion of the web platform in the past decade has been closing the gap between balancing checkbooks and watching movies passively.

So I do not see a fundamental conflict between "broadcast" and "computing". Maybe the terms could be improved? Perhaps there is a difference between how linear experiences and interactive experiences are authored/rendered? We need to be more specific before trying to find solutions.

Below are specific points/questions.

Best,

-- Pierre

> "Relatively static, controlled environment" vs "Dynamic, uncontrolled environment."

The viewing environments have become extremely similar, since consumers watch TV and perform computing tasks on largely the same devices and in the same environments, e.g. mobile devices. Do you mean that there are two classes of content and/or experiences: one interactive and one linear?

> "Usually the only content on-screen" vs "Shares screen with other applications (SDR, HDR, and other)."

TV content is often overlaid with subtitles and captions, and is often mixed with other content, e.g. episode/title/scene galleries.

> "Graphics white is largely irrelevant." vs "Graphics white is a critical anchor point, especially for UI and text."

All content needs to be comfortable to watch, which means neither too bright nor too dark.

> "A display is HDR if it can display HLG and PQ content (with reasonable brightness & contrast)" vs "A display is HDR if it can go brighter than graphics white (HDR headroom > 1)"

An HDR display is defined by its physical characteristics, e.g. intra-frame min/max luminance, monotonicity of the gray scale, etc.
How is it related to HLG/PQ encoding and/or brighter than graphics white?
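For concreteness, the “Effective Gamma” figure quoted earlier can be reproduced with a short calculation. The message does not spell out the formula, so the sketch below assumes a simple power-law reading, L / L_white = (V / V_white)^gamma, fitted through the measured mid-gray and reference-white luminances. The 0.425 relative signal level assumed for mid-gray is an illustrative value, not taken from the message or from BT.2408, chosen so that the quoted anchors of 26 nits and 203 nits land near 2.4.

    import math

    def effective_gamma(l_mid, l_white, v_mid, v_white=1.0):
        """Gamma implied by a power-law fit L / l_white = (V / v_white) ** gamma.

        l_mid, l_white -- measured luminances of mid-gray and reference white (nits)
        v_mid, v_white -- relative signal levels of mid-gray and reference white
        """
        return math.log(l_mid / l_white) / math.log(v_mid / v_white)

    # BT.2408 HDR anchors quoted above: mid-gray 26 nits, reference white 203 nits.
    # 0.425 is an assumed relative mid-gray signal level (illustrative only).
    print(round(effective_gamma(26.0, 203.0, 0.425), 2))  # -> 2.4

The same function applies to SDR; only the measured luminances and the assumed signal levels change.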
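On the last quoted point, “HDR headroom > 1” is often used to mean the linear ratio of a display's peak luminance to graphics (reference) white, though whether that is the intended definition here is exactly what the question above asks. A minimal sketch of that reading, with illustrative values:

    def hdr_headroom(peak_white_nits: float, graphics_white_nits: float) -> float:
        # Assumed reading: headroom is the linear ratio of peak white to graphics white.
        # A value > 1 means the display can render highlights brighter than graphics white.
        return peak_white_nits / graphics_white_nits

    # Illustrative values only: a 1000-nit display with graphics white at 203 nits.
    print(hdr_headroom(1000.0, 203.0))  # ~4.93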
Received on Wednesday, 14 June 2023 12:20:48 UTC