W3C home > Mailing lists > Public > public-html@w3.org > September 2013

Re: The picture element: complexity

From: Reinier Kaper <rp.kaper@gmail.com>
Date: Thu, 12 Sep 2013 15:35:56 -0400
Message-ID: <CAAz96OtB7XMw9=yFtRg4n_kjL6Nc65Vp16-zt9OEqjcm5eJVDQ@mail.gmail.com>
To: Anselm Hannemann <info@anselm-hannemann.com>
Cc: Simon Pieters <simonp@opera.com>, "public-html@w3.org" <public-html@w3.org>
On 12 September 2013 14:58, Anselm Hannemann <info@anselm-hannemann.com> wrote:

> On 12.09.2013, at 18:07, Reinier Kaper <rp.kaper@gmail.com> wrote:
> On 12 September 2013 11:52, Simon Pieters <simonp@opera.com> wrote:
>> On Thu, 12 Sep 2013 17:45:13 +0200, Reinier Kaper <rp.kaper@gmail.com>
>> wrote:
>>  And there's no way for a browser to detect *if* CSS will be parsed?
>> Unless CSS has been disabled altogether, it has to assume that there will
>> be CSS to be downloaded and parsed.
> Okay, so no issue here then, correct?
>>  So you'd end up with two scenarios:
>>> 1. No CSS will be parsed on this page, therefore load src[0];
>>> 2. CSS *will* be parsed on this page, therefore don't start downloading
>>> image resources yet;
>> The delay in downloading the image is not acceptable.
> I'm confused by this. What exactly is unacceptable about this?
> I can only imagine that the "delay" you're talking about would be the
> delay between the server's response and the actual rendering of the CSS,
> which is dependent on the device and connection speed, right?
> So if a device is slow in rendering CSS (either a slow device or a poor
> connection), how would it benefit from downloading image sources in the
> meantime?
> Also, is there any evidence that this is "unacceptable"? As in: use cases
> that show websites become unusable / inaccessible because image sources
> aren't being downloaded as fast as possible?
> Wouldn't the "cost" of a solution like this greatly outweigh any
> "solution" where HTML and CSS become tied-in again?
> Again, I'm sorry if it's my limited knowledge or if this has been discussed
> before, but it just doesn't seem logical to me that an image source should
> be downloaded as fast as possible without considering any kind of context.
> Hi, no worries about your knowledge. This is why we are here, and we're all
> happy to help each other.
> But in this case you'll need to trust me when I say I have to agree with
> Simon. What you propose works against the lookahead prefetcher that
> browsers implement today, which establishes the HTTP connection and sends
> the HTTP request before anything else is interpreted.
> This guarantees that images can be served quite quickly while the CSS is
> parsed in parallel. You can't think of a browser as executing only one task
> at a time. It's much more complex, and the way the system works today is
> basically why IE6 is so slow at showing a website and why WebKit, for
> example, is so fast at that task. Just trust me when I say we had several
> approaches and discussions about all this in the RICG, but came to the
> reasonable conclusion, along with browser vendors, that we can't do what
> you request.
> Fortunately, an inline media query is parseable, thanks to a little trick
> some guy found when he created a Blink-based picture-supporting build that
> works with all the performance tricks a browser does.
> -Anselm
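For reference, the inline media queries Anselm mentions live directly on the source elements, so a lookahead prefetcher can evaluate them without waiting for any external stylesheet. A rough sketch of the markup under discussion (attribute details follow the RICG picture proposal of the time and may differ from what was finally specified):

```html
<!-- Sketch of the proposed picture element: the media attribute carries
     the query inline in the markup, so the prefetcher can pick the right
     source before any external CSS has been downloaded or parsed. -->
<picture>
  <source media="(min-width: 45em)" src="large.jpg">
  <source media="(min-width: 18em)" src="medium.jpg">
  <source src="small.jpg">
  <img src="small.jpg" alt="Fallback for browsers without picture support">
</picture>
```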

Okay, thanks for being patient with me and explaining why it isn't a good
idea. I understand how the prefetching works and why it's essential for
(amongst other things) speed.

The tie-in between HTML and CSS still doesn't sound like a solution to me
though. Although media queries might not be limited to CSS, it does create a
dependency between mark-up and styling, which should be avoided.
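To illustrate the tie-in being described here: with inline media queries, the same breakpoint ends up encoded in both the markup and the stylesheet, so changing a layout breakpoint means editing both by hand. A hypothetical sketch (the 45em breakpoint is made up for illustration):

```html
<!-- The breakpoint appears once in the markup... -->
<picture>
  <source media="(min-width: 45em)" src="large.jpg">
  <img src="small.jpg" alt="Example image">
</picture>

<!-- ...and again in the stylesheet; the two copies must be kept in sync
     manually, which is the mark-up/styling dependency in question. -->
<style>
  @media (min-width: 45em) {
    img { float: left; margin-right: 1em; }
  }
</style>
```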
Received on Thursday, 12 September 2013 19:36:23 UTC
