Re: [csswg-drafts] [css-color-4] Channel clipping breaks author expectations, especially when using 'perceptually uniform' spaces (#9449)

I've had a little more time to evaluate Scale LH, and I think I understand my reservations with it better now. I also propose some adjustments that I believe correct my issues with the approach and provide better results (the results would need more extensive testing to ensure other issues don't exist).

It is stated that Scale LH does better in many cases because ∆h is smaller, and while ∆h is indeed smaller, I'm not
sure that alone indicates a better, more correct color. Some of the Scale LH cases have a closer ∆h but sacrifice so
much lightness at the same time that the gamut-mapped result does not retain the intent of the original color very
well.
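
To make the kind of comparison I'm talking about concrete, here is a rough sketch (using ColorAide) of how hue and lightness drift can be measured; the `oklch-chroma` mapper below is only a stand-in for whichever algorithm is being evaluated:

```py
from coloraide import Color


def drift(original: Color, mapped: Color) -> tuple:
    """Return (∆h, ∆L) in OkLCh between an original color and its mapped result."""

    o = original.convert('oklch')
    m = mapped.convert('oklch')
    dh = abs(o['hue'] - m['hue']) % 360
    return min(dh, 360 - dh), abs(o['lightness'] - m['lightness'])


# Example: an out-of-gamut Rec. 2020 blue mapped to Display P3 with ColorAide's
# 'oklch-chroma' method, used here purely as a placeholder mapper.
c = Color('color(rec2020 0 0 1)')
print(drift(c, c.clone().fit('display-p3', method='oklch-chroma')))
```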

![scaled_orig](https://github.com/w3c/csswg-drafts/assets/1055125/88e55b91-91d4-47d7-bf1f-6906db4e264b)

This is why we end up with bright clumps of blue in the image below when we gamut map with Scale LH.

![flowers-scale](https://github.com/w3c/csswg-drafts/assets/1055125/08212e40-5980-4951-8781-11dc5aee37ed)

In the original algorithm the hue does deviate more, there is no denying that, but it holds the hue constant until we
are within the target JND, so when we finally clip and allow the hue to shift, we get a larger ∆h, yet the difference
between that result and the ideal color with the perfect hue is not very perceptible to the average eye. The hue can be further off, but the color looks better.

![flowers-css](https://github.com/w3c/csswg-drafts/assets/1055125/93334901-bbe0-4651-9072-8b11419094a1)
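
For reference, this is roughly the behavior being described, sketched with ColorAide calls. It is a simplified illustration of the CSS Color 4 approach, not a drop-in implementation; ColorAide ships a comparable method as `oklch-chroma`.

```py
from coloraide import Color

JND = 0.02
EPSILON = 0.0001


def css_gamut_map(color: Color, space: str) -> Color:
    """Reduce OkLCh chroma, holding L and h, until the clipped color is within the JND."""

    mapcolor = color.convert('oklch')
    if mapcolor.in_gamut(space):
        return mapcolor.convert(space)

    low, high = 0.0, mapcolor['chroma']
    clipped = mapcolor.clone().clip(space)
    while high - low > EPSILON:
        mapcolor['chroma'] = (low + high) / 2
        clipped = mapcolor.clone().clip(space)
        if mapcolor.in_gamut(space):
            low = mapcolor['chroma']
        elif mapcolor.delta_e(clipped, method='ok') < JND:
            # Close enough to the clipped color: accept the small hue/lightness shift.
            break
        else:
            high = mapcolor['chroma']
    return clipped.convert(space)
```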

I realize the point of Scale LH is to provide gamut mapped colors faster, but the results don't seem better. So can
we make the algorithm faster and still provide better results?

I thought about why the colors are deviating and tweaked the algorithm. The current scaled algorithm takes two passes:
the first essentially maps in linear Display P3 and then restores L and h so that only the chroma changes are kept,
while the second pass allows all of the OkLCh properties to deviate. That second pass is what allows such a large
shift in lightness.

I wanted to somehow use this approach to just scale chroma and only chroma (as much as possible).

- Take the color to be mapped and create two colors: its OkLCh representation and an achromatic version of it with
  the chroma set to zero.
- Convert both colors to the target RGB gamut and, for each out-of-gamut channel, use inverse interpolation to
  calculate the factor needed to bring that channel back in gamut when interpolating between the original color and
  its achromatic version (see the short sketch after this list).
- Take the largest of those factors and use it to interpolate the chroma between the original OkLCh color and the
  achromatic version.
- Do two passes of this.
- Clip at the end to make sure we are in gamut.
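
For clarity, the inverse interpolation factor for a single channel is simply the `t` that would make a plain linear interpolation land exactly on the gamut boundary; the hypothetical helper below mirrors how `alg.ilerp` is used in the code that follows:

```py
def ilerp(a: float, b: float, target: float) -> float:
    """Factor t such that lerp(a, b, t) == target, i.e. t = (target - a) / (b - a)."""

    return (target - a) / (b - a)
```
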
<details>
<summary>Code</summary>

```py
# Imports assume ColorAide's public module layout.
from typing import Any

from coloraide import Color, algebra as alg
from coloraide.gamut import Fit
from coloraide.spaces import RGBish


class OkLChScale2(Fit):
    """
    Gamut mapping by scaling.

    Expected gamut mapping spaces are RGB type spaces.
    For best results, linear light RGB spaces are preferred.
    """

    NAME = "oklch-scale2"
    SPACE = "oklch"
    ITERATIONS = 2

    def fit(self, color: Color, space: str, **kwargs: Any) -> None:
        """Scale the color within its gamut but preserve L and h as much as possible."""

        # Requires an RGB-ish space, preferably a linear space.
        if not isinstance(color.CS_MAP[space], RGBish):
            raise ValueError("Scaling only works in an RGBish color space, not {}".format(type(color.CS_MAP[space])))

        # Get the LCh form of the color and the achromatic LCh (fully reduced chroma) of the same color
        orig = color.space()
        mapcolor = color.convert(self.SPACE, norm=False) if orig != self.SPACE else color.clone().normalize(nans=False)
        achroma = mapcolor.clone().set('c', 0)

        # Scale the chroma based on the channel that is furthest out of gamut.
        # Perform this twice.
        for x in range(self.ITERATIONS):
            self.scale(mapcolor, achroma, space)

        # Clip in the target gamut in case we are still out of gamut, and once
        # more after updating the original to catch any conversion round-trip error.
        color.update(mapcolor.clip(space)).clip(space)

    def scale(self, mapcolor: Color, achroma: Color, space: str) -> None:
        """
        Scale the chroma based on the channel that is furthest out of gamut.

        If the channel is out of gamut, use inverse interpolation to see what factor
        would be needed to get the channel in gamut when interpolating between itself
        and an achromatic version of itself. This is used as a rough approximation to
        then scale the chroma between itself and the achromatic version of itself.
        """

        deltas = []
        for a, b in zip(mapcolor.convert(space).coords(), achroma.convert(space).coords()):
            if a > 1:
                deltas.append(alg.ilerp(a, b, 1))
            elif a < 0:
                deltas.append(alg.ilerp(a, b, 0))

        if deltas:
            mapcolor['c'] = alg.lerp(mapcolor['c'], achroma['c'], max(deltas))
```

</details>
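
For completeness, here is a hypothetical usage sketch showing how the plugin above could be registered and exercised with ColorAide (the `oklch-scale2` method name comes from the class's `NAME` attribute):

```py
from coloraide import Color as Base


# Register the plugin on a subclass so the base Color class stays untouched.
class Color(Base): ...


Color.register(OkLChScale2())

# Fit an out-of-gamut Rec. 2020 blue into Display P3 with the new method.
c = Color('color(rec2020 0 0 1)')
print(c.fit('display-p3', method='oklch-scale2'))
```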

This gives us results closer to the original CSS gamut mapping algorithm, but in only two passes. Because we aren't
using MINDE, colors will at times be mapped to the OkLCh gamut geometry with an effectively smaller JND.

![scaled_alt](https://github.com/w3c/csswg-drafts/assets/1055125/18b81a55-2a98-4d32-8063-03704970500f)

We can also see that when gamut mapping the same image as earlier, we no longer get the blue clumps, but instead more natural results similar to the bisect approach.

![flowers-scale2](https://github.com/w3c/csswg-drafts/assets/1055125/e1737acf-6962-4032-bb1c-3313dc62cb9a)

I ran a comparison of the two approaches: I randomly selected Rec2020 colors and, for those that were out of gamut,
gamut mapped them with both Scale LH and my variant and compared ∆h. After more than 3 million colors, ∆h was always
smaller in the alternate version, and lightness was better preserved.
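
A minimal sketch of that kind of comparison harness is shown below. It assumes both mappers are registered as ColorAide fit plugins; `oklch-scalelh` is a hypothetical name standing in for a Scale LH implementation (ColorAide does not ship one), while `oklch-scale2` is the plugin shown earlier.

```py
import random

from coloraide import Color


def hue_delta(a: Color, b: Color) -> float:
    """Smallest absolute OkLCh hue difference between two colors."""

    d = abs(a.convert('oklch')['hue'] - b.convert('oklch')['hue']) % 360
    return min(d, 360 - d)


wins = total = 0
while total < 1000:  # increase for a serious run
    c = Color('rec2020', [random.random() for _ in range(3)])
    if c.in_gamut('display-p3'):
        continue
    total += 1
    scale_lh = c.clone().fit('display-p3', method='oklch-scalelh')  # hypothetical plugin name
    scale2 = c.clone().fit('display-p3', method='oklch-scale2')
    if hue_delta(c, scale2) <= hue_delta(c, scale_lh):
        wins += 1

print('{}/{} cases where the variant had the smaller (or equal) ∆h'.format(wins, total))
```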

[Here](https://facelessuser.github.io/coloraide/playground/?source=https%3A%2F%2Fgist.githubusercontent.com%2Ffacelessuser%2F170f02fb17310e4f85615ea330769b78%2Fraw%2F08c6582304ec5387cf5ee5a95573dcd86bfd0991%2Foklch-scale2.py) I compare the two approaches on out-of-gamut gradients across hues. Notice that the original Scale LH
has more trouble producing consistent colors when lightness is lower; I imagine this contributes to some of the
non-ideal gamut mapping results. The alternate approach, on the other hand, matches the original CSS algorithm more
closely, with smoother transitions.

In each example the algorithms are shown in this order: original CSS, alternate, and Scale LH. This was done in Chrome with a Display P3 monitor; results are similar in Safari. In Firefox the results were mapped in sRGB, but they were likewise similar.

<img width="1344" alt="Screenshot 2024-02-15 at 6 36 12 AM" src="https://github.com/w3c/csswg-drafts/assets/1055125/2d35f06a-75d6-4448-979b-c01c9e4eb430">

Obviously, such an alternate approach would need to be tested more fully, but I think it helps illustrate what I see
as the issues with Scale LH and offers a possible alternative that may correct some of those issues while still being
faster than the original algorithm.

-- 
GitHub Notification of comment by facelessuser
Please view or discuss this issue at https://github.com/w3c/csswg-drafts/issues/9449#issuecomment-1946144137 using your GitHub account

