Re: [mediaqueries] Making the 'color' query static, like CSSOM colorDepth?

On Fri, Aug 2, 2013 at 7:19 AM, Brad Kemper <brad.kemper@gmail.com> wrote:
> On Aug 1, 2013, at 8:13 PM, "Tab Atkins Jr." <jackalmage@gmail.com> wrote:
>
>> On Thu, Aug 1, 2013 at 7:54 PM, Zack Weinberg <zackw@panix.com> wrote:
>>> Do we need to be able to distinguish a medium capable of grayscale from a
>>> medium capable only of black and white? I can see this being relevant for
>>> print if nothing else, and it's not clear to me which you mean by
>>> "monochrome".
>>
>> I doubt it.  Much like the actual color depth is fairly unimportant
>> (at worst, you can dither to get close to what you specified), whether
>> you're monochrome or grayscale is fairly unimportant (again, you can
>> dither if necessary).
>
> Dithering to get grayscale can look pretty bad. If you knew there was a high probability of hitting such a device, that would be a good reason to query its capabilities, so you could deliver a very high-contrast version of the design instead. You probably wouldn't use that for pages on the general Web, but it could matter if you're using Web technologies in some other vertical application, like equipment displays in a factory.

If you're dealing with a device low-tech enough to have white/black
pixels only, it's likely also got an extremely low resolution, and
probably a weak processor underneath it too. You're not going to be
putting content on that without explicitly designing for the device,
which means you don't need to be able to discriminate in CSS.

And realize that you're talking about *really* low-tech displays.
E-ink screens are true monochrome, for example (I think), but they
dither to achieve grayscale just fine.
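
(For readers following along: the capability Brad is describing is already expressible with the existing 'monochrome' media feature from Media Queries Level 3, which reports bits per pixel in the device's monochrome frame buffer, 0 on color devices. A sketch of how a high-contrast fallback could be served; the selectors and styles are illustrative, not from any spec:

```css
/* Serve a stark black-on-white design to monochrome output devices. */
@media (monochrome) {
  body { background: #fff; color: #000; }
  a    { text-decoration: underline; }
}

/* Device reports multiple gray levels (2+ bits per pixel),
   so subtler shading is safe. */
@media (min-monochrome: 2) {
  body { color: #333; }
}
```

Whether UAs report this accurately for the final output device is exactly the problem under discussion.)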

> If current browsers are ignoring color depth, it doesn't mean it wouldn't be useful for them to do the right thing, when they can.

The issue is that they really can't, as the final output device's color
depth is hard to know.  Plus, it just doesn't really matter - a lot of
screens today dither down to 6-bit channels, and you don't even notice.
Plus plus, it's not clear whether it's the bit depth used for
calculations throughout the system or the bit depth of the output
device that matters, and I don't think people can usefully distinguish
between them anyway.

It's like the luminosity query - we *could* expose the actual light
level in lux, but it wouldn't help, and would more likely just get in
the way of doing the right thing.  It's better to chunk the
information into something useful.
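
(Concretely, the chunked design being drafted exposed three named buckets rather than a raw sensor value - roughly, per the draft syntax at the time, a 'light-level' feature with dim | normal | washed. A sketch, with illustrative styles:

```css
/* Dim environment (e.g. night): drop brightness, raise contrast. */
@media (light-level: dim) {
  body { background: #111; color: #ccc; }
}

/* Washed-out environment (e.g. direct sunlight): maximize contrast. */
@media (light-level: washed) {
  body { background: #fff; color: #000; }
}
```

Authors can act usefully on "dim vs. washed"; a raw lux number would just push the bucketing work onto every stylesheet.)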

~TJ

Received on Friday, 2 August 2013 16:39:11 UTC