- From: Luca Passani <passani@eunet.no>
- Date: Thu, 08 Jan 2009 16:57:51 +0100
- To: public-bpwg-ct@w3.org
I think we are forgetting one basic requirement here. Transcoders must
err on the side of not transcoding whenever in doubt. I still think that
application/xhtml+xml is a strong indicator that the content is probably
already suitable for mobile. If W3C does not want to take it as an
absolute indicator, then you may want to compromise on something like:
If MIME-Type=application/xhtml+xml
AND
SIZE <= 20kb
AND
No Web-only tags such as iframe nor complex HTML/Javascript
THEN => MOBILE CONTENT (Do Not Touch)
This would still be a much better heuristic than disregarding
application/xhtml+xml altogether.
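For concreteness, the rule above could be sketched roughly as follows. This is only an illustration of the proposed compromise, not anything normative: the function name is made up, the 20 kB threshold is the one from the rule, and the exact list of "Web-only" tags is my own guess at what "iframe nor complex HTML/Javascript" would mean in practice.

```python
import re

# Hypothetical markers of desktop-only content; the email names
# iframe explicitly, the rest are illustrative assumptions.
WEB_ONLY_TAGS = ("iframe", "frameset", "object", "script")

def looks_mobile_optimised(mime_type: str, body: bytes) -> bool:
    """Return True if a transcoder should leave the content untouched."""
    if mime_type != "application/xhtml+xml":
        return False
    if len(body) > 20 * 1024:  # SIZE <= 20kb
        return False
    text = body.decode("utf-8", errors="replace").lower()
    # No Web-only tags => assume already mobile-optimised.
    return not any(re.search(r"<\s*%s\b" % tag, text)
                   for tag in WEB_ONLY_TAGS)
```

The point of the sketch is simply that all three conditions must hold before the "Do Not Touch" verdict applies.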
I suspect someone here is forgetting that transcoders are essentially a
bunch of hacks. They are not standard by any stretch of the
imagination. Transcoders compete with one another on who has the best
hacks. As a developer, one has no way to know what will happen to one's
content once it goes through a transcoder. In this situation there is no
alternative but to demand that those hacks refrain from interfering with
the legitimate intentions of content owners whenever there is even a
trace of doubt that the content is already mobile-optimised.
wrt the 800 URLs, I think we can still make some sense of them without
going through each and every one.
I also want to reiterate (in case someone forgets) that this discussion
is about a minor point (one heuristic), compared to the major one (UA
spoofing) which is the difference between life and death in the whole
transcoding matter. IMO, CTG does not offer enough protection against UA
spoofing. For me, the situation at the moment is that the Manifesto
puts the responsibility of protecting mobile content completely on
transcoders (which is the only reasonable thing to do), while CTG tries
to move (at least part of) the responsibility onto content providers
(which is unacceptable).
Luca
Eduardo Casais wrote:
>> I argue that a lot of those 800 "not-unambiguously mobile"
>> sites are actually OK for mobile users.
>>
>
> Perhaps, but that is not really the point.
>
> The argument applies to HTML sites as well: they might be
> suitable for mobile devices -- but that might be a conscious
> result of application design, or just a coincidence. The
> issue is inferring the intended target device class from
> explicit declarations associated with the content a priori.
>
> The MIME type application/xhtml+xml is ambiguous, since
> at least in the desktop Web, the associated content's
> doctypes are not incontrovertibly intended for mobile
> devices: some might correspond to XHTML basic
> (intended for mobile), some even to XHTML mobile profile
> (intended for mobile), many to traditional W3C XHTML
> (intended as a replacement of HTML 4.0 for desktop, not
> for mobile).
>
> What you are now trying to put forth is that XHTML 1.0/1.1
> itself is intended for mobile devices at a rate so high
> (near 100%) that when application/xhtml+xml is present,
> one can simply assume that it is for mobile and eschew
> inspecting the DOCTYPE declaration entirely. I doubt very
> much this inference chain holds. I see no evidence that
> standard W3C XHTML 1.0/1.1 documents are
> overwhelmingly produced for mobile content. In fact, I am
> convinced that upon encountering application/xhtml+xml,
> one must check the DOCTYPE, and if this is neither
> XHTML basic nor mobile profile (nor one of the i-mode or
> Softbank or Openwave variants) but rather the traditional,
> original W3C XHTML, and in the absence of further
> indications, then one should assume desktop-orientated
> content.
>
>
>> Would it be possible to get hold of those 800 urls so that
>> we can take a proper look?
>>
>
> Well, you have to contact the MAMA project leader at
> Opera for that.
>
> Even then, this is quite an endeavour: do you really
> have the capacity or the tools to inspect 830 or 935
> pages to check their applicability for mobile terminals?
>
>
> E.Casais
>
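The DOCTYPE-based classification Eduardo describes above can be sketched as follows. The public identifiers listed are the well-known ones for XHTML Basic, XHTML Mobile Profile, and traditional W3C XHTML 1.0/1.1; the function name is illustrative, and the i-mode/Softbank/Openwave vendor variants he mentions are omitted for brevity:

```python
# Sketch of the check described above: given an application/xhtml+xml
# response, classify it from the DOCTYPE's public identifier.
MOBILE_DOCTYPES = (
    "-//W3C//DTD XHTML Basic 1.0//EN",
    "-//W3C//DTD XHTML Basic 1.1//EN",
    "-//WAPFORUM//DTD XHTML Mobile 1.0//EN",
    "-//WAPFORUM//DTD XHTML Mobile 1.1//EN",
    "-//WAPFORUM//DTD XHTML Mobile 1.2//EN",
)

DESKTOP_DOCTYPES = (
    "-//W3C//DTD XHTML 1.0 Strict//EN",
    "-//W3C//DTD XHTML 1.0 Transitional//EN",
    "-//W3C//DTD XHTML 1.1//EN",
)

def classify_xhtml(doctype_public_id: str) -> str:
    """Classify an XHTML document as 'mobile' or 'desktop'."""
    if doctype_public_id in MOBILE_DOCTYPES:
        return "mobile"    # XHTML Basic / Mobile Profile
    if doctype_public_id in DESKTOP_DOCTYPES:
        return "desktop"   # traditional W3C XHTML
    # Absent further indications, assume desktop-oriented content.
    return "desktop"
```

The fallback branch encodes the conclusion of the argument above: when the DOCTYPE is neither a mobile profile nor otherwise indicative, assume desktop.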
Received on Thursday, 8 January 2009 15:58:33 UTC