- From: Eric Gauthier <eric@gauthier.centre.edu>
- Date: Mon, 12 Feb 1996 23:17:31 -0500 (EST)
- To: www-html@w3.org
- Cc: eric@gauthier.centre.edu (Eric Gauthier)
Well, first I'd like to clear up something I said in my original posting about bandwidth. I messed up. I meant that it might put an unnecessary drain on their system/processor load. Not that parsing a short HTML file is difficult, but I'm sure they are constantly trying to speed up their database searches.

Second, there has been a lot of discussion about what "errors" should pass and what "extra" stuff should pass (like Netscape-isms). I believe the whole point of this was to give people an incentive to conform to the HTML standards. All we would have to do is say "Yes, this page has no errors according to RFCxx". Now, if the page also contains Java links, extensions, or non-standard tags -- great! But I was under the impression that we just wanted a way to cut down on pages with blatant errors in them, and on pages which, although a particular browser can handle them, do not conform and cannot be viewed elsewhere.

Now, you might add a second category like:

    Netscape Enhanced <yes/no>
    Java/VRML <yes/no>

or something like that, but it should be EXTRA. The rating should merely reflect the standard. Netscape, although it adds a bunch of extra stuff, should still be able to view the original.

Oh well, this seems to have sparked MUCH debate. Maybe this is a problem which needs to be tested in practice instead of just debated in theory. I don't have an HTML parser, but it shouldn't be too difficult to write. Does anyone want to just do this and see where it takes us?

Eric :)

PS: Sorry about talking about Netscape so much; it's the browser I'm most familiar with. No bias intended... :)
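PPS: For anyone who wants to take a stab at it, here is a minimal sketch of what such a checker could look like, written in modern Python purely for illustration. It only checks that opening and closing tags balance -- nothing like a full RFC-conformant validator -- and the VOID_TAGS set and the tag regex below are simplifying assumptions of mine, not taken from any standard.

    # Rough sketch of a "blatant error" detector: it only checks that
    # opening and closing tags balance. A real validator would have to
    # check the document against the DTD in the standard itself.
    import re
    import sys

    # Illustrative subset of tags whose closing tag is optional or
    # absent in early HTML; the balance check skips these entirely.
    VOID_TAGS = {"br", "hr", "img", "input", "meta", "link",
                 "p", "li", "dt", "dd"}

    # Matches a tag, capturing the optional "/" and the tag name.
    TAG_RE = re.compile(r"<(/?)([a-zA-Z][a-zA-Z0-9]*)[^>]*>")

    def check(html):
        """Return a list of tag-balance errors found in the document."""
        errors = []
        stack = []  # open tags still waiting for their closing tag
        for match in TAG_RE.finditer(html):
            closing, name = match.group(1), match.group(2).lower()
            if name in VOID_TAGS:
                continue
            if not closing:
                stack.append(name)
            elif stack and stack[-1] == name:
                stack.pop()
            else:
                errors.append("unexpected </%s>" % name)
        errors.extend("unclosed <%s>" % tag for tag in stack)
        return errors

    if __name__ == "__main__":
        problems = check(sys.stdin.read())
        for p in problems:
            print("error:", p)
        if not problems:
            print("no tag-balance errors found")

Run it as "python check.py < page.html"; a page would "pass" only if nothing is printed as an error. Note that extensions like Netscape-only tags would show up as unknown but balanced tags and still pass, which fits the idea above of keeping the rating about the standard and treating the extras as EXTRA.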
Received on Monday, 12 February 1996 23:18:02 UTC