- From: Erik Reppen <erik.reppen@gmail.com>
- Date: Fri, 10 Aug 2012 19:06:05 -0500
- To: "Tab Atkins Jr." <jackalmage@gmail.com>
- Cc: whatwg@lists.whatwg.org
Sorry if this double-posted, but I think I forgot to CC the list.

Browser vendor politics I can understand, but if we're going to talk about what "history shows" about people like myself suggesting features we can't actually support, I'd like to see studies that contradict the experience I've had as a web UI developer for the last five years.

Everybody seems on board with providing a JavaScript strict mode. How is this any different? Do people blame the vendors when vars they try to define without a var keyword break their strict-mode code? Do we fret about all the JS out there that's not written in strict mode? (A small sketch of that analogy is at the end of this message.)

And HTML5 has found the key to eliminating the political issue, I should think: don't just worry about the rules for when authors get it right; explicitly spell out the rules for how to handle it when they get it wrong. How can you blame the browser for a strict-mode face plant when every modern browser, including IE, goes about face-planting in exactly the same way?

Sure, I could integrate in-editor validation into my process, but why add bloat to any number of tools I might be using for any number of different stacks when we had something I know worked for a lot of developers, all of whom were as confused as I was when people inexplicably started shouting about XHTML Strict's "failure" from the rooftops?

Is there some unspoken concern here? If there is, I'll shut up and try to find out what it is through other means, but I really don't see the logic in not having some strict provision for authors who want it. How hard is it to plug in an XML validator, rip out the namespace bits if that's not something we want to deal with just yet, and propose a set of behaviors for when your HTML5 isn't compliant with a stricter syntax? (A sketch of that kind of well-formedness check is below as well.) Because yes, these bugs can be nasty when you don't think to check that your HTML is well-formed, and it's the kind of thing that easily slides into production as difficult-to-diagnose edge cases. Believe me. Front-liner here. It's an issue.

Markup is where presentation, behavior, content, client-side, and server-side meet. I'm comfortable with letting people embrace their own philosophies, but I like my markup to be done right in the first place, and visible breakage, or at least a browser console error message, is the easiest and most obvious way to discover that it isn't. I developed that philosophy from my experience moving from less strict to strict markup, not from toeing some weird technorati political line or zeitgeist.
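A minimal sketch of the strict-mode analogy above, in plain JavaScript (the function and variable names are arbitrary, not from any spec): the missing `var` is silently papered over in sloppy mode, while under `"use strict"` the same mistake throws, and throws the same way in every modern engine.

```js
function sloppyMode() {
  looseTotal = 1;        // no `var`: silently becomes a global
  return looseTotal;
}

function strictMode() {
  "use strict";
  strictTotal = 1;       // ReferenceError: strictTotal is not defined
  return strictTotal;
}

sloppyMode();            // 1 — no browser complains
strictMode();            // throws — and every modern engine throws here
```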
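And a sketch of the kind of well-formedness check I mean, using the DOMParser browsers already ship (the helper name is mine; engines word the error differently, but malformed XML always yields a document containing a parsererror element):

```js
function isWellFormed(markup) {
  // Parse the markup with XML rules rather than tag-soup recovery.
  var doc = new DOMParser().parseFromString(markup, "application/xml");
  // Any well-formedness violation shows up as a <parsererror> element.
  return doc.getElementsByTagName("parsererror").length === 0;
}

isWellFormed("<ul><li>fine</li></ul>");     // true
isWellFormed("<ul><li>never closed</ul>");  // false — the kind of nesting error
                                            // tag-soup parsing silently papers over
```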
On Fri, Aug 10, 2012 at 5:44 PM, Tab Atkins Jr. <jackalmage@gmail.com> wrote:

> On Fri, Aug 10, 2012 at 3:29 PM, Erik Reppen <erik.reppen@gmail.com> wrote:
> > This confuses me. Why does it matter that other documents wouldn't work
> > if you changed the parsing rules they were defined with to stricter
> > versions? As far as backwards compatibility, if a strict-defined set of
> > HTML would also work in a less strict context, what could it possibly
> > matter? It's only the author's problem to maintain (or switch to a more
> > forgiving mode) and backwards compatibility isn't broken if the same
> > client 500 years from now uses the same general HTML mode for both.
> >
> > I think there's a legit need for a version or some kind of mode for
> > HTML5 that assumes you're a pro and breaks visibly or throws an error
> > when you've done something wrong. Back in the day nobody ever forced
> > authors who didn't know what they're doing to use doctypes they were
> > too sloppy to handle. I wasn't aware of any plan to discontinue
> > non-XHTML doctypes. How everybody started thinking of it as a battle
> > for one doctype to rule them all makes no sense to me but I'm fine with
> > one doctype. I just want something that works in regular HTML5 but that
> > will break in some kind of a strict mode when XML-formatting rules
> > aren't adhered to. You pick degrees of strictness based on what works
> > for you. I don't really see a dealbreaking issue here. Why can't we all
> > have it the way we want it?
> >
> > As somebody who deals with some pretty complex UI where the HTML and
> > CSS are concerned it's a problem when things in the rendering context
> > give no indication of breakage, while in the DOM they are in fact
> > getting tripped up. Sure, I can validate and swap out doctypes or just
> > keep running stuff in IE8 to see if it breaks until I actually start
> > using HTML5-only tags but this is kind of awkward and suggests
> > something forward-thinking design could address don't you think?
>
> As I said, years of experience have provided strong evidence that a
> large majority of authors cannot guarantee that their pages are valid
> all of the time. This covers both authoring-time validity and validity
> after including user comments or the like.
>
> If you want a mode that guarantees validity, that already exists - it's
> called "put a validator into your workflow". Many popular text editors
> offer plugins that validate your markup as you go, as well.
>
> The problem with breaking visibly is that it doesn't punish authors, it
> punishes *users*, who overwhelmingly blame the browser rather than the
> site author when the site won't display for whatever reason. There's no
> *benefit* to a browser for doing this; it's much more in their interest
> to continue doing error-recovery, because, again, history suggests very
> strongly that most authors *who theoretically want strict parsing* can't
> actually satisfy the constraints they ask for. It's simply better for
> users to always do soft error-recovery, no matter what the author claims
> they want.
>
> ~TJ
Received on Saturday, 11 August 2012 00:06:54 UTC