- From: Sander Tekelenburg <tekelenb@euronet.nl>
- Date: Fri, 8 Dec 2006 19:39:12 +0100
At 02:42 +1100 UTC, on 2006-12-09, Lachlan Hunt wrote:

> Sander Tekelenburg wrote:
>> [...] errors that result in 'good' looking pages in Explorer, and
>> 'bad' in HTML5 browsers. Simply by producing code that they know will result
>> in 'bad' pages when parsed in accordance with the HTML5 parsing rules.
>
> That might be theoretically possible, but the algorithm in the spec has
> been designed to be as compatible with the existing web as physically
> possible. I suspect that it would be quite difficult to find such a
> hack on purpose that wouldn't also break compatibility with the existing
> web.

OK. That's a pretty convincing argument, although it still leaves room for the
question of what your suspicion is based on. Is a conscious effort being made
to define the HTML5 parse error spec such that it leaves as little room as
possible for such a hack?

> Besides, Microsoft aren't out to attack like that, they appear to be
> trying to win back the trust of web developers.

Maybe. But with things like laws and specs you shouldn't only look at today's
situation. It takes years to define and then implement such things, so they
need to aim for usability in a not entirely foreseeable future. I name
Microsoft as today's obvious example, but no more than that. There will be
others. As long as a party can gain something, it will try -- especially when
it stands a chance of succeeding. It is how to limit that chance that I'm
thinking about.

> I'm sure they realise
> that such a move would only be detrimental to the web, not to mention
> themselves, and not beneficial in any way.

Why wouldn't it be beneficial? When you control the market, you can ask any
price you like; you can operate in much more comfortable stability than when
you're just one of many competitors.

-- 
Sander Tekelenburg
The Web Repair Initiative: <http://webrepair.org/>
Received on Friday, 8 December 2006 10:39:12 UTC