- From: Andi Hindle <andih@harlequin.co.uk>
- Date: Wed, 11 Feb 1998 06:12:21 -0500 (EST)
- To: George Olsen <golsen@2lm.com>
- cc: www-style@w3.org
Hi

On 10/02/98 22:49:02, George Olsen <golsen@2lm.com> wrote:

[snip]
> Yes, the Web is *not* print, and I do believe pages should be scalable
> to multiple browsers, monitors and platforms. However, print has had a
> long time to experiment and develop principles for how to present
> content effectively. For example, the guideline that leading needs to
> be increased as line length increases to maintain legibility is a
> principle that works regardless of whether it's a book, a billboard, or
> a computer monitor. While admittedly most users weren't aware of this
> at a conscious level (since HTML was like the DTP revolution all over
> again), those of us who did found HTML to be painful to work with.
[snip]
> While I realize that many here find using tables for layout abhorrent,
> not providing some sort of <NOCSS-P> capability to author a combined
> CSS-P/HTML 3.2 document makes it difficult for me to transition to
> CSS-P, thus slowing adoption of the standard. Clients simply aren't
> going to pay me double so that I can do a CSS-P version and an HTML 3.2
> tables version. And presentation is important enough to them that they
> won't accept what happens when an older browser tries to interpret a
> CSS-P page. So I kludge yet again and use CSS-1 and tables.

George makes some excellent points (thanks, George). Moving to HTML + CSS, or even to XML, is going to be a painful process. Indeed, with the current level of user agent support (I'm thinking especially of Internet Explorer and Netscape here), it's practically impossible even to start the transition -- not because the support isn't there, but because the support is so buggy that you can't reliably predict the outcome of using any given CSS property. Today it's actually better for me to compromise a design by using HTML 3.2 with no CSS at all.

This isn't to say (I hasten to add) that the work on CSS and XML is not important -- far from it! But we are soon going to be presented with a real problem. Even if we allow that there will be a plethora of tools for generating and designing web pages in many different formats, so that backwards compatibility doesn't force us to over-compromise on design, we are still stuck with the problem of how to deliver those pages seamlessly to the end user based on what support their user agent has for a given markup or other technology.

OK, I could ask the user, as many sites do, whether they want the XML version or an old HTML 2.0-compliant version. This is partially effective, but it does require intervention on the part of the user. The only other way to do it is to have a script that recognizes the incoming HTTP request and sends out the appropriate level of markup. Which is wonderful _except_ that caches are stupid. By which I mean that a cache will treat a UA request for <http://www.foo.com/> as a request for the same page _regardless of the fact that a different page is generated for different user agents_. I've seen this myself: access a page with a recent UA like NS4.0, then access the same URL with Lynx. Oh dear :-(

I appreciate that this may seem off-topic for this list, and I apologize if anyone feels that way; but I actually think it is very important to getting the web to grow the way I'm sure we would all like to see it grow. (I would have sent this to www-proxies, but that list seems to be inactive.) I wonder if anyone knows whether this problem is likely to be addressed.
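To make the cache problem concrete, here is roughly the kind of server-side script I have in mind. It is only a sketch (Python, run as CGI); the user agent tests and file names are made up, and a real script would need to be much more careful about which UAs get which variant:

    #!/usr/bin/env python
    # Hypothetical CGI sketch: serve a different page variant depending on
    # the requesting user agent. The UA tests and file names are made up.
    import os, sys

    def choose_variant(user_agent):
        ua = user_agent.lower()
        # Send the CSS-P version only to browsers we believe can cope;
        # everything else gets the HTML 3.2 tables version.
        if "mozilla/4" in ua or "msie 4" in ua:
            return "index-cssp.html"
        return "index-html32.html"

    def main():
        variant = choose_variant(os.environ.get("HTTP_USER_AGENT", ""))
        body = open(variant).read()
        # Both variants live behind the same URL, so a shared cache that
        # keys its entries on the URL alone will hand whichever variant it
        # cached first to every subsequent client, Lynx included.
        sys.stdout.write("Content-Type: text/html\r\n\r\n")
        sys.stdout.write(body)

    if __name__ == "__main__":
        main()

(HTTP/1.1 does define a Vary response header -- e.g. "Vary: User-Agent" -- that is meant to tell a cache about exactly this kind of variation, but I don't know how reliably the caches actually deployed respect it.)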
If nobody is addressing it, maybe we should try to figure out what to do about it! ;-) --&e
Received on Wednesday, 18 February 1998 04:37:34 UTC