- From: David Woolley <forums@david-woolley.me.uk>
- Date: Sun, 31 Aug 2008 12:22:48 +0100
- To: www-style@w3.org
Chris Murphy wrote:
> OK first we need to get the terminology correct. Yes few people actually
> understand a gamma function. Gamma by itself makes no sense, it is a

I accept that true display responses are different from any mathematical
idealisation.

> gamma function. And a gamma function is used to define tone reproduction
> curves. It's not correct to refer to a tone reproduction curve as gamma.

This would make more sense if gamma correction actually used a mathematical
gamma function. It wouldn't surprise me if the etymology actually refers to
the response curve rather than a particular approximation. The function used
in typical software is actually a power function. I think the term dates
back to silver halide media, when one wouldn't have been applying numerical
corrections. (The sRGB function seems to be a power function for all except
very small values, where it is linear.)

> Now perhaps you're talking about digital cameras that shoot only into
> JPEG, and tone mapping has already been performed. This is not a linear

Yes. That is the type of camera used for many web site images.

>> In the simple case of a CRT display, the output is approximately
>> proportional to the 2.2th power of the input, which means that a 50%
>> absolute brightness on the display corresponds to a 72% input value,
>> or a 50% input value produces only a 21% output value, i.e. if you
>> feed a linear image into a CRT (sRGB colour space) it will look dark.
>
> Yes but this is not how a digital camera works.

I wasn't talking about cameras. I was assuming a camera with an ideal linear
energy response, and then looking at how it would display on a typical
display (which is excessively dark). Camera correction needs to be the
inverse of the display response.

> L* defines the perceptual tone response of human vision, it has no exact
> gamma function for defining it, but could be approximated with gamma
> 2.4. When you account for display flare however, a display tone

Interesting. Strange that a standard chosen on the basis of minimum
manufacturing cost should coincidentally be optimum. Maybe I should see if
there is any open (i.e. not on pay per view sites) literature on the
underlying research.

> reproduction curve defined by gamma 2.2 makes sense. There are various
> stories as to how Apple arrived at a TRC defined by gamma 1.8, but all
> are related to approximating the tone response of output devices (in
> particular, maybe, the original Apple Laserwriter).
>
>> However, this is a red herring in terms of making the display
>> brightness match the scene brightness, as the same eye companding
>> curve is used in both cases, and one can't directly measure the
>> brain's brightness value, and it wouldn't be useful to do so.
>
> It is certainly possible to measure the stimulus even if we cannot
> measure the response. Clearly we cannot, in the vast majority of cases,
> compel display white luminance to match scene white luminance. They're
> several orders of magnitude in difference at this point.

I wasn't talking about ultimate dynamic range, only about the behaviour
within the common dynamic range. Within that range, you want the overall
scene-to-display transfer function to be linear, i.e. net gamma 1.0, which,
given that sensors tend to be linear anyway, means that you need to apply
the inverse of the display response in the camera, or in post-processing.
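To make that concrete, here is a small Python sketch of the point (the sRGB
constants are the published ones; the plain 2.2 power law is only the usual
CRT approximation, and the function names are my own):

    def srgb_encode(linear):
        # Linear light (0..1) -> sRGB-encoded value: a short linear
        # segment near black, a power function (exponent 1/2.4) elsewhere.
        if linear <= 0.0031308:
            return 12.92 * linear
        return 1.055 * linear ** (1 / 2.4) - 0.055

    def crt_output(encoded, gamma=2.2):
        # Approximate CRT response: output luminance ~ input ** gamma.
        return encoded ** gamma

    # A linear (net gamma 1.0) mid-grey sent straight to the display comes
    # out dark; encoding it first (the inverse of the display response)
    # restores it:
    print(crt_output(0.5))               # ~0.22
    print(crt_output(srgb_encode(0.5)))  # ~0.51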
> If the exposure is wrong, white point is wrong, and thus tone mapping
> may be too aggressive. Yes you would need to adjust both midtone and
> white point in order to correct the image.

Actually, for the pure power law correction used by typical image
manipulation programs, the maths turns out to result in white level and
"gamma" corrections being orthogonal, when one ignores quantisation
considerations. (This is not true of black level errors.) There is a short
numerical sketch of this after my signature.

> The assertion that camera manufacturers are routinely improperly tone
> mapping captures is interesting but unproven in your posts. When

The assertion was that camera manufacturers were routinely not tone mapping
at all, not that they were doing it wrongly. You more or less accept this
yourself, by saying that professional cameras capture the raw readings. The
problem is that a lot of users don't realise that they then need to apply
this correction before putting the image on a web site. They assume the
camera produces an image that already complies with the web standards. It
may well turn out that the problem images come from would-be professionals
who are using cameras that are too sophisticated for their level of
knowledge, but such images are quite common.

> adjusting either white point, or midtone in the examples you supplied,
> the shadows are very noisy which is a classic case of underexposure, not
> improper tone mapping.
>
> Further the suggestion that the "example of images that are clearly
> gamma 1.0" makes zero sense. Tone mapping that would result in an image
> TRC defined by gamma 1.0 would not make it out of R&D let alone for

But you said that that is exactly what professional cameras do. Given that
sensors count photons, the natural response of a camera is gamma 1.0.

> prototype manufacture, let alone for mass production, let alone for sale
> among consumers. It simply would not happen, and you're saying it's
> common. It's just ridiculous.

-- 
David Woolley
Emails are not formal business letters, whatever businesses may want.
RFC1855 says there should be an address here, but, in a world of spam,
that is no longer good advice, as archive address hiding may not work.
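P.S. The promised sketch of the white level / gamma orthogonality point, in
Python (the pixel value and the error factors are made up; only the algebra
matters):

    def gamma_adjust(x, g):
        # Pure power-law "gamma" correction, as in typical image editors.
        return x ** (1.0 / g)

    def white_adjust(x, white):
        # White level correction: rescale so that `white` maps to 1.0.
        return x / white

    x = 0.3                # an arbitrary pixel value
    white, g = 0.8, 1.4    # hypothetical white level and gamma errors

    # Scale and power-law corrections factor apart, so they can be set
    # independently and applied in either order:
    a = gamma_adjust(white_adjust(x, white), g)
    b = white_adjust(gamma_adjust(x, g), gamma_adjust(white, g))
    print(abs(a - b) < 1e-12)   # True

    # A black level (offset) error does not factor out the same way:
    black = 0.05
    c = gamma_adjust(x - black, g)
    d = gamma_adjust(x, g) - gamma_adjust(black, g)
    print(abs(c - d) < 1e-12)   # False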
Received on Sunday, 31 August 2008 11:23:25 UTC