
Re: backburner question: Checking HTTP Headers (was: Sites to see headers)

From: Martin Duerst <duerst@w3.org>
Date: Tue, 17 Jun 2003 12:36:45 -0400
Message-Id: <4.2.0.58.J.20030617115451.03f70bf0@localhost>
To: Tex Texin <tex@i18nguy.com>
Cc: public-i18n-geo@w3.org, Olivier Thereaux <ot@w3.org>

Hello Tex,

Many thanks for your comments.

At 22:45 03/06/16 -0400, Tex Texin wrote:

>Very good.
>Here are some comments:
>
>1) The notation (X)HTML is confusing. Let's not use it. Earlier, I thought
>this referred to any of XML, XHTML, HTML. I have since learned that it is used
>to refer to only XHTML and HTML and excludes XML.
>Since it is non-standard and not well known, the Q&A should be explicit and
>reference HTML and XHTML.

Done. I reinstated the parentheses around it.

Maybe we need a general policy here.


>2) The question is really about checking charset, so we should make it more
>precise. Change
>
>How can I check the HTTP headers with which my Web documents are served?
>to something like:
>
>How can I verify the value of CHARSET in the HTTP headers served with my web
>documents?

Fixed, according to Richard's proposal. Please check.


>3) When describing that the reader should look for "charset=", it should be
>mentioned that it may not be provided.
>They should look for Content-type and confirm that charset is provided and
>then confirm the value associated with it.
>Otherwise people may look for charset and not realize they are looking at a
>content-type with no charset setting.

I added the following note:

Note: The charset parameter may not be present. This is okay if your 
document itself indicates its character encoding.

I don't want to go into too much detail.
I hope this helps.
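The check Tex describes — confirm that a Content-Type header is present, that it carries a charset parameter, and only then look at its value — can be sketched in Python. The function name is illustrative, not part of the Q&A:

```python
def charset_of(content_type):
    """Extract the charset parameter from a Content-Type header value.

    Returns None when no charset is declared, so callers can tell
    "no charset" apart from a wrong charset (Tex's point in item 3).
    """
    # Parameters follow the media type, separated by semicolons,
    # e.g. "text/html; charset=utf-8".
    for part in content_type.split(";")[1:]:
        name, _, value = part.strip().partition("=")
        if name.lower() == "charset":
            return value.strip('"').lower() or None
    return None
```

For example, `charset_of("text/html")` returns None rather than raising, which mirrors the note above: a missing charset parameter is acceptable if the document declares its own encoding.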


>4) When I use the validator with xhtml it gives me some warning about not
>detecting or using charsets and using some default, if I recall. I can go
>retry it if needed.

Please do so.


>Before we document using the validator, does it work
>properly with xhtml and encodings now?

I think it does. There may be one or two exotic cases that are
not covered yet, and I don't know exactly what it does with
a BOM in UTF-8 (it should give a warning).
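The UTF-8 byte order mark mentioned above is the fixed three-byte sequence EF BB BF at the start of the document, so checking for it (as a validator might before warning) is a one-liner; this helper is only an illustration:

```python
# The UTF-8 encoding of U+FEFF, the byte order mark.
UTF8_BOM = b"\xef\xbb\xbf"

def has_utf8_bom(raw_bytes):
    """True when the raw document bytes start with the UTF-8 BOM."""
    return raw_bytes.startswith(UTF8_BOM)
```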



>5) in the paragraph on the extended interface, the reference to conversions is
>perhaps confusing. Also the specific use of UTF-8. Perhaps, the text should
>mention that pages should be served with the correct charset for the content
>of the page. The visual check (assuming the right fonts and browser
>configuration) should show the correct characters if the charset encoding is
>correct for the page being served.

Okay, I changed it to "visually check that the source is correctly interpreted",
removing UTF-8. Do you think that's enough?


>6) on transcoding servers- Is there a way to detect if this is going on?
>Something we can tell the reader to evaluate?

I changed the last sentence to:

     This requires special care, because your browser, running e.g. on a
     Mac or on a Windows system, may report a different character
     encoding than the one given to you by a Web-based service or the
     W3C Markup Validation Service (which are mostly based on UNIX systems).

I hope this helps. I don't want to go too much into detail.
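One rough way to evaluate item 6 (this is my own illustration, not part of the Q&A text): compare the charset from the HTTP header against the first charset declared inside the document itself. A mismatch does not prove anything, but it can hint that a transcoding proxy rewrote the bytes in transit:

```python
import re

# Matches charset=... in raw bytes, e.g. inside a <meta> tag.
META_CHARSET = re.compile(rb'charset\s*=\s*["\']?([A-Za-z0-9_-]+)',
                          re.IGNORECASE)

def declarations_agree(header_charset, raw_html):
    """Compare the HTTP-header charset with the first charset found in
    the raw document bytes. Returns True when they match, or when the
    document declares nothing that could conflict."""
    match = META_CHARSET.search(raw_html)
    if match is None:
        return True
    return match.group(1).decode("ascii").lower() == header_charset.lower()
```

The helper name and the heuristic itself are assumptions for illustration; a real check would also need to account for charset aliases (e.g. "utf8" vs "utf-8").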

Regards,    Martin.


>hth
>tex
>
>Martin Duerst wrote:
> >
> > I just made a question out of the list of sites to see
> > HTTP headers. Thanks to Andrew and Tex for their help.
> >
> > This is for my next round, so we don't have to
> > discuss this on this Wednesday. Please see
> > http://www.w3.org/International/questions/qa-headers-charset.html
> >
> > Olivier, I have copied you because this mentions the validator.
> >
> > Richard, I'm not sure I got the
> > <div class="content"> markup right. It's not clear
> > what it is for, but it seems to affect styling in
> > somewhat strange ways. In general, to reduce overhead,
> > it is easiest to mark up all the other parts and leave
> > the unmarked parts as simple content, but maybe I got
> > something wrong.
> >
> > Regards,     Martin.
> >
> > At 09:52 03/06/12 +1000, Andrew Cunningham wrote:
> >
> > >Martin Duerst wrote:
> > >>There are the sites I know to check
> > >>http://webtools.mozilla.org/web-sniffer/
> > >>http://www.delorie.com/web/headers.html
> > >
> > >http://www.rexswain.com/httpview.html
> > >
> > >
> > >Likewise, if you're interested in the HTTP request:
> > >
> > >http://www.delorie.com:81/some/url.html
> > >http://www.i18ngurus.com/cgi-bin/TestLang.pl
> > >
> > >
> > >
> > >--
> > >Andrew Cunningham
> > >Multilingual Technical Officer
> > >Online Projects Team, Vicnet
> > >State Library of Victoria
> > >328 Swanston Street
> > >Melbourne  VIC  3000
> > >Australia
> > >
> > >andrewc@vicnet.net.au
> > >
> > >Ph. +61-3-8664-7430
> > >Fax: +61-3-9639-2175
> > >
> > >http://www.openroad.net.au/
> > >http://www.libraries.vic.gov.au/
> > >http://www.vicnet.net.au/
>
>--
>-------------------------------------------------------------
>Tex Texin   cell: +1 781 789 1898   mailto:Tex@XenCraft.com
>Xen Master                          http://www.i18nGuy.com
>
>XenCraft                            http://www.XenCraft.com
>Making e-Business Work Around the World
>-------------------------------------------------------------
Received on Tuesday, 17 June 2003 12:37:21 GMT
