Re: ReSpec toolchain...

On Tue, Jul 15, 2014 at 10:38 AM, Marcos Caceres <w3c@marcosc.com> wrote:
> On July 15, 2014 at 11:45:52 AM, Shane McCarron (shane@aptest.com) wrote:
>> I very much doubt that search engines are taking the many, many seconds
>> to run ReSpec on a page so that they can extract the content. Regardless,
>> there is no need to place that kind of burden on the network or our
>> constituents. Recommendations are NOT living standards. Recommendations
>> are stable documents and must remain so.
>
> Go look at the XML spec and how many revisions it went through - pretending that those 5 (five!!!) recs revisions are not indicative of a living model is... well, delusional. So "stable"(tm), right :P
>
> Also, HTML is a living standard. The W3C can pretend that it's not all it wants by copy/pasting from the WHATWG one, but that still doesn't change the fact that it's a living standard.

While this is a lovely conversation, I'm pretty sure "specs are living
documents/no they're stable snapshots" is irrelevant to the main
discussion, which is "we need stable processing for a given spec
snapshot/no it's okay if it changes; people will let us know if it
breaks and they care".

For example, as I said in an earlier response, Bikeshed (the spec
preprocessor CSS and a few others are now using) relies on Shepherd (a
spec-parsing tool written and maintained by Peter Linss) to generate
its cross-spec referencing database.  Shepherd just parses the HTML
and does not run JS, so anything that happens in ReSpec that creates
or influences <dfn> elements will be invisible to Shepherd, and thus
to Bikeshed.  So, having a "stable" version that does not require JS
to render is valuable for that purpose.
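To illustrate the point (a minimal sketch, not Shepherd's actual code): a
static HTML parser only ever sees markup present in the source, so any
<dfn> that a script like ReSpec would inject at render time simply does
not exist from its point of view.

```python
# Hypothetical sketch of static <dfn> extraction using Python's stdlib
# html.parser. Script-generated definitions are invisible to it.
from html.parser import HTMLParser

class DfnCollector(HTMLParser):
    """Collect the text content of every <dfn> present in the raw HTML."""
    def __init__(self):
        super().__init__()
        self.dfns = []
        self._in_dfn = False

    def handle_starttag(self, tag, attrs):
        if tag == "dfn":
            self._in_dfn = True
            self.dfns.append("")

    def handle_endtag(self, tag):
        if tag == "dfn":
            self._in_dfn = False

    def handle_data(self, data):
        if self._in_dfn:
            self.dfns[-1] += data

source = """
<p><dfn>widget</dfn> is defined in the static markup.</p>
<script>
  // A definition added by script never reaches a static parser:
  document.body.innerHTML += '<p><dfn>gadget</dfn></p>';
</script>
"""

collector = DfnCollector()
collector.feed(source)
print(collector.dfns)  # only the definition in the static source is found
```

Running this prints only the statically authored definition; the
script-generated one is just opaque text inside a <script> element.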

(Not to mention, I find the formatting flash frustrating, and it's not
something you can really get rid of.)

~TJ

Received on Tuesday, 15 July 2014 17:46:12 UTC