- From: Rotan Hanrahan <rotan.hanrahan@mobileaware.com>
- Date: Sun, 22 Mar 2009 18:30:23 +0000
- To: <public-bpwg@w3.org>
- Message-ID: <8BE85C26-8526-4FB5-A40C-29956F81A798@mimectl>
Just adding to the voices...

Luca (responding to Chaals):

>> The point is that UA-spoofing is a specific issue that is not the same as transcoding in general. And one that I think is actually worth the time to discuss.
>
> yes. So, my request to this group is that CTG demands that no UA spoofing is performed by the transcoder. Compromise can be found on allowing that a second spoofed HTTP request is made if the first request to a given site does not return mobile content.

So, despite the potential for interfering with content that Luca has been warning about, this proposed compromise suggests that in the use case where mobile content is not returned it is legitimate to spoof the UA. Presumably such spoofing is for the purpose of retrieving alternative content, with the strong likelihood of it being transcoded subsequently. So this means: there are times when transcoding is OK.

Correct me if I'm wrong, Luca, but you also have less of a problem with Mini, mainly because it is a distributed browser (comprising a transcoder and in-device renderer) that openly declares itself. But anyone using the same transcoding technology without declaring their presence is not OK. Right?

Thinking about Luca's other concerns, and in light of the above, the issue he is exploring is the degree of transcoding that may take place. Major restructuring of the original content (without any input from the author) does not show respect for the content/author, but lesser adjustments to make the content compatible with the end-user device are acceptable. In this case, the motivation is to "bridge the capability gap", not marketing, censorship, etc. Also, transcoding of the content within the device to make it compatible with the user (who may have some modal disability) is acceptable. Very acceptable, actually, as we are all compassionate people.

With this understanding, Mini might be in the grey area of acceptability (from Luca's standpoint) because although it openly declares its presence, the significant restructuring of the original content could be trouble. However, Mini doesn't remove anything without good reason, and doesn't inject any significant material, so perhaps this is OK. Hopefully I am interpreting Luca correctly here.

Having established that some kinds of transcoding are OK, even if they make some major adjustments (for good reasons), the question is: would this proposed "double request" compromise work in practice? I'm not so sure. I accept that it might work for a lot of existing sites, but only the existing mobile aware sites will benefit. (Pun intended.) The mobile aware sites would respond with mobile content, and the transcoders would (hopefully) not interfere. Non-mobile-aware sites would see their HTTP traffic increase: where before they were getting just one request, now they get two. In addition to the performance hit, their statistics would now be skewed. Maybe the transcoder would remember non-mobile sites, to avoid future double-request behaviour, but what happens when such a site installs a product (e.g. ours) in order to "go mobile"? In that scenario, the transcoders would still be spoofing because they remember the site as non-mobile. We'd need a mechanism to signal to transcoders that the site had gone mobile (a "Vary" header?), or perhaps they could automatically refresh their "mobile status" so that newly mobilized sites are detected.
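Purely to illustrate the heuristic, here is a rough sketch of how such a double-request proxy might behave. This is Python-flavoured pseudocode of my own; the function names, the "looks mobile" test (mobile media types, or a Cache-Control: no-transform directive from the origin server) and the caching policy are assumptions made for the sake of the example, not anything the CTG actually specifies, and the response object is assumed to expose its headers as a simple dictionary.

    MOBILE_TYPES = {"application/vnd.wap.xhtml+xml", "text/vnd.wap.wml"}

    def looks_mobile(response):
        # Heuristic only: a response using a mobile media type, or marked
        # Cache-Control: no-transform by the origin server, is left alone.
        ctype = response.headers.get("Content-Type", "").split(";")[0].strip()
        cache_control = response.headers.get("Cache-Control", "")
        return ctype in MOBILE_TYPES or "no-transform" in cache_control

    def double_request_fetch(url, device_ua, desktop_ua, http_get, transcode, cache):
        # http_get, transcode and cache stand in for the proxy's own
        # machinery; they are placeholders, not real APIs.
        if cache.get(url) != "not-mobile":
            first = http_get(url, headers={"User-Agent": device_ua})
            if looks_mobile(first):
                cache[url] = "mobile"   # hands off this site from now on
                return first
        # No mobile content came back: the proposed compromise permits a
        # second, spoofed request, whose result is then transcoded.
        second = http_get(url, headers={"User-Agent": desktop_ua})
        cache[url] = "not-mobile"       # unless this entry expires or is
                                        # refreshed, a site that later "goes
                                        # mobile" will be spoofed forever
        return transcode(second)

The final comment is really the point above: unless that "not-mobile" entry expires, or the transcoder periodically re-checks, a site that subsequently goes mobile keeps being spoofed.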
What this shows is that while the proposed compromise looks good at first, there are still some issues that need to be considered, and other techniques/heuristics will still have a role to play.

The point about search engines distorting content is also rather interesting. The question to ask is: who benefits? Users, for one, as it helps them find (mobile) content. Authors, because it increases their audience. SEs, because their search results are often ordered or accompanied by advertising, but only during the search phase. That sounds like a win-win-win. Furthermore, as robots.txt is well established, most (knowledgeable) Web sites are effectively signalling their willingness (or lack thereof) for their content to be included (and therefore transcoded) in search results. So this is a potentially beneficial activity conducted in a manner that respects author rights(*), and it stretches back through a lot of the Web legacy.

Regarding the "pixel perfect" point raised in the original email, the Web wasn't intended to be pixel perfect, though the original small number of browsers encouraged many authors to think of the Web in that way. Word processing and DTP habits compounded the problem. With the diversity of the mobile Web, pixel perfection is near impossible. However, taking our own technology as an example, you can still author in a more abstract (less pixel-perfect) manner yet deliver perfection thanks to clever origin-server adaptation. It is very annoying, then, if something in the delivery path steps in and interferes with the perfect response. When an author goes to a lot of trouble to provide a contextually great response, probably via clever adaptation, there had better be a very good reason to interfere. (Subsequent adaptation for the accessibility needs of a particular user would be a good reason, to give one example.)

The Web is about negotiated experience of distributed content and services. Everyone should have a say: the author, the user, the carrier of the data, the builder of the devices, the legal framework of the participants, the cultural norms to which we subscribe. All of these (and probably more) should have the means to influence the Web experience to varying degrees, and it should not be acceptable for one party to ignore the effort/rights/needs of others.

---Rotan.

(*) I accept that for some authors it is not possible to influence the robots.txt data, and their only option would be to move to a different host. However, I suspect these authors would generally prefer their content to be indexed by SEs.

From: Luca Passani
Sent: Sun 22/03/2009 11:50
To: public-bpwg@w3.org
Subject: Re: The Web has flexible presentation Re: (Nielsen article)

comments in-line

Charles McCathieNevile wrote:

>> I am not sure how you can call what I say nonsense. I find this rather insulting.
>
> Sorry. I did not mean to be insulting. However, the way you chose to express your original statement (which isn't how you express it below), was a statement that I still believe is "plain wrong".

no. I did not restate anything. I stated: "The success of the web was based on the basic assumption that whoever could publish web content and they would know what end-users would see"

This is what you called nonsense. I wouldn't have objected if you had stated that some publishers may be OK with letting their content be presented in different ways. But your saying "nonsense" is nonsense.

>> Muddy the waters as much as you like, Chaals.
>> Web production is about delivering content which is as much as possible close to the visuals that a communication company has created for their customer. 99.99% of the site visitors will get exactly that, no matter how much fiddling with CSS settings an Opera user may (but won't) do.
>
> *SOME* production is about that. A large (and probably increasing) part of production is about getting content and services to users so they will read and use them, and an important trend has been to reduce the reliance on pixel-perfect layout (since that is known to only work sometimes) and increase the understanding of what the technology actually does well, and how to take advantage of that.

what you call SOME is 99% of companies creating professional websites. Anyway, by controlling the view, I did not mean "exactly pixel-perfect". Content is still allowed to flow within small ranges inside a page. But still that has nothing to do with the transcoding that OperaMini, Novarra and other transcoders are promoting as the best thing since sliced bread.

> While some clients will accept a piece of design that simply annoys users who are different from the average, others will work to make one that tolerates such differences. In countries like the United Kingdom, failing to anticipate the needs of a diverse audience, such as the requirement to increase font sizes, is now actually illegal in extreme cases, but in the field of Web Design in general it is simply recognised as the sort of slap-dash approach one would now only expect of an amateur.

So, are we discussing the fact that some users may want to change the font size because they can't read properly with that stylish but tiny font, OR are we discussing transcoding, which takes a website, cuts it in bits, removes parts and reshuffles the remaining parts according to logic which is totally outside of author control? I think we are talking about the second. I don't think that the possibility of changing fonts for end users can ever be given enough steroids to make it grow and justify transcoding.

> In their default setup, which one expects a large majority of users to simply accept the default for whatever they have. With a clear understanding that in some number of cases, if the site cannot adapt to the needs of users they will walk away from it.
>
> Which in turn leads to a "pragmatic" decision about how important those users are and how hard it is to do web design that takes account of them.

sure, but if users walk away, it's the content owner who gets the hit and it's the content owner who needs to do the maths and understand if they need to create support for extra users. This cannot be used as an excuse for a third-party company to build a business model on creating derivative work of third-party content.

>> I have already repeated ad nauseam that users are not in the position to decide that the right of content owners can be ignored.
>
> And accepted that in various cases that is not true, as far as I can see, in the case of your purported right of the content owner to determine presentation.
>
> The web has never properly supported that use case, and has been designed with the active intention to allow more detailed expression of what content owners would like the user to see,

again, yes, it was originally designed to leave some "freedom" of presentation, but has rapidly gone in the direction of giving authors more control. And this is what the web is mostly about today.
> along with simpler ways for users not to see it thus.

again, a knowledgeable user may decide to do things in their local configuration which the author has no economic interest in preventing, but this is not a justification for a third-party company to go in and "industrialize" the method of creating derivative work of content without right.

>> Did Opera ask those sites "is it ok if I transform your content?". Was the answer "yes, it is OK. Please go ahead"?
>
> No, but Opera can be assumed to have asked the sites "will you provide your HTML source to our browser to render for a user?" And the answer was yes.

I am OK with this, since OperaMini declares itself in the UA string, but this is not consistent with Opera's licensing its engine to ByteMobile and allowing ByteMobile to spoof every device UA string using the OperaMini UA string.

> In some cases (like Barclays), without asking anything, they said to the people they expect as users, "We recommend you use Opera Mobile or Opera Mini".

You have a point here. Why a bank is OK with forfeiting its end-to-end secure connection between itself and its customers remains a mystery to me.

>> you see, the burden of demonstrating that copyright owners have not been cheated is on transcoders and whoever deploys them.
>
> For your argument to hold water, it should demonstrate that someone has been cheated.

read the comments next to the signatures here: http://wurfl.sourceforge.net/vodafonerant/

you may also want to read this: http://uk.techcrunch.com/2007/09/21/vodafone-in-mobile-web-storm/

> Is the basic point "users don't have the right to determine how they see the web, they are required to accept something expressed by content owners"? I am not avoiding it, I have addressed it a number of times. I think that assertion is simply wrong.

OK. Let's agree to disagree. You think that users are almighty and have a right to change the way they see content presented. I think content authors also have rights: they have a right to decide what their content should look like (particularly because presentation is often part of the message) and they have the right to see their content protected from transcoders.

>> so, two questions:
>>
>> 1) where do the rights of content owners fit in this?
>
> As far as I can tell, you have failed to demonstrate that content owners have a right to a particular default presentation being rendered by all the tools that they allow to view their content.

The great majority (90% or more) of those who create content expect it to be shown to users the way they have created it (maybe not pixel perfect, but close). This is a basic fact. I am not sure how you can say I have failed to demonstrate this. There is nothing to demonstrate. It's a fact. It's what publishing content is all about.

> Further, you have failed to demonstrate that they even believe that this is something that will happen. Instead you have accepted that in a known set of circumstances it won't happen, and that in at least some of those circumstances, this is fine. And you have failed to explain the distinction that makes it fine in some circumstances, but not others.

I am not sure what you are referring to. I think it's OK when users with disabilities get away with using tools for their personal use to access content that would otherwise not be accessible.
I think it's unethical of Opera to use this as an excuse to justify abusive business practices (breaking HTTPS without content owner consent).

>> 2) why have you licensed your technology to ByteMobile?
>
> Because we are a software company, and we license technology to people who think they have a good use case for it and want to buy it.

right. I don't have a problem with that. I have a problem with a company that gets its hands dirty with dubious business practices coming here and trying to get W3C to justify those abusive practices as if they were legitimate. Opera could have licensed the rendering engine to BM under the condition that the UA string of the real device is not spoofed. Or at least, under the condition that the OperaMini UA string was not used for spoofing. To the best of my knowledge, Opera didn't. Which means that Opera was not so interested in preserving the ecosystem of mobile developers.

Opera needs to make a choice here. Is Opera the ethical company that works with W3C to make the web (and the mobile web) a better place for everyone, OR is Opera a corporation just like Microsoft, Novarra and endless other corporations which consider profit the main standard to abide by? The latter is a perfectly legitimate choice, but it undermines Opera's credibility when, within W3C, they push a vision of the mobile web that serves their economic interests exclusively (or at least primarily).

>> they are using the UA string of Opera-Mini to disguise the real device UA and fool websites which may have a carefully built mobile site (ByteMobile provides OperaMini headers which do not represent the ones in the real device).
>
> Are they providing this to end users, or is this a service they were contracted to provide?

Two aspects here: ByteMobile goes to operators and tells them that users are the operator's users, and the operator has the right to do what they want with them (and screw those thousands of content providers who want to monetize the operators' users). Of course, once this vision is sold to the operator (mind you, a lot of operators already understand this vision is wrong), spoofing and hacking is what is expected of BM, but this is the abuse which should not be perpetrated (at least, not in W3C's name).

> If the latter case, do you know why they are fulfilling the terms of the contract?
>
> (Without that information, this discussion can go nowhere as far as I can see).

you are muddying the waters, Chaals

>> I disagree. OperaMini (and other transcoders) attempts to render web content. Sometimes the results are good. Sometimes they are really bad. Users will go back to sites that transcode well and ignore those which don't. This is not a proof that transcoding is always good for users either. A lot of times it isn't.
>
> Without real analysis, the difference between "some" and "a lot" is meaningless. Just saying...

it is not meaningless. What I have been saying multiple times is that HTTP is the platform that MUST be preserved. Transcoders have the responsibility of doing whatever they can to preserve mobile content and to err on the side of not transcoding. So, even without an analysis of the difference between some and a lot, my point still stands. Having said this, roughly 33% of transcoded web content is usable on high-end devices, 33% is almost unusable, and 33% is total rubbish. This varies wildly depending on transcoder and depending on device.

> I have not argued that Opera Mini is somehow universally good.
> I am arguing in this case that a number of people have seen fit to use it, and to keep using it and increase their usage of it, and that this demonstrates that it does something they consider beneficial to them.
>
> In particular, I have argued that what it seems Opera Mini does which they consider a benefit, is present web sites in the way the original developers hoped they would be presented.

There is a moving target here which makes the discussion more complicated. I personally dislike those abusively deployed transcoders, but I never considered OperaMini one of them. Things changed when 1) I discovered that ByteMobile was spoofing as Opera Mini and 2) I saw you and others from Opera strongly defend abusive business practices here. So, some of my comments are directed at transcoders in general. If you replace transcoders with OperaMini, they do not necessarily apply anymore.

>>> Are you telling me those people are *wrong* because they have chosen a service that suits them? (We don't force anyone to use Mini. We offer it, in a free market, and millions choose to take up the offer by actively installing and actively opening and using it).
>>
>> Again, the fact that users may like it does not mean that you can transcode legitimately.
>
> Agreed. Nor does the fact that people like it mean you can make ice-cream legitimately. (I think the arguments are a reasonable parallel).

no, it's not. A reasonable parallel is that you make ice-cream with eggs, chocolate, pistachios, strawberries, vanilla, milk and sugar which you have just stolen from the supermarket. Have you tried paying for the ingredients before you start making and selling ice-cream?

> The basic discussion here has been whether it is legitimate to provide a distributed rendering service. I don't see any convincing argument that it isn't.

Italian wisdom: the hardest deaf people to cure are those who don't want to listen.

>> Open and free standards which steamroll the requests from a whole ecosystem to serve the purpose of a few commercial companies that paid the ticket to sit at the W3C table. Pardon me, but this is not my definition of open standards.
>
> Would you suggest, perhaps, that the process by which WML was developed is somehow more open?

WML was created before any WML content existed. WAPForum created a proposition and content owners were invited to create content based on the proposition created by WAPForum. Here the situation is different. Thousands of content owners have created content based on a certain assumption of how HTTP works (i.e. the network won't interfere with my content), transcoders have unilaterally decided to break that, and W3C is close to ratifying this mess.

> I don't think W3C is a perfect organisation, but I think its record on responding to public comment, even ill-informed and deliberately vexatious public comment, is extremely good by comparison to many similar organisations.

rather different context, I think.

> I think characterising the patience of the people reading this thread as "steamrolling the requests from a whole ecosystem" is to seriously misunderstand what is happening, to underestimate the intelligence of the audience, to insult their motives and the work that they do, and to confuse the idea of commenting with the idea of representation.

Again, http://wurfl.sourceforge.net/manifesto/ -- why isn't this being taken into proper consideration by CTWG?
>> the way the web works today is that virtually every content owner goes out of their way to enforce a particular presentation. You can ignore this only if you are in bad faith.
>
> Or if you are trying to present something that won't work "as-is" to the user, who has gone out of their way to look for it. Google, for example, massively distorts the presentation it provides to users, and yet you seem to accept that in this case it is a service.

two main differences:

- robots.txt allows content authors to opt out of search engines
- search engines benefit content owners. Transcoding done the Novarra way damages them.

> I think that you can, in good faith, change presentation in many ways, although there are some specific things that you should not do. I have tried to get from general ranting to a focused discussion of the specific real issues.

question for you: do you think that the way Novarra does transcoding (UA spoofing and all) is OK? I would like to see you separate OperaMini from transcoding in general to understand where you really stand.

>> I suspect your thoughtful web developer will soon be an unemployed web developer.
>
> Maybe. But many of the developers I meet who are regarded as leaders in their field seem to follow my reasoning (more accurately, it is a restatement of what they say and do).

can you provide some names? Not someone too close to Opera, preferably.

> The point is that UA-spoofing is a specific issue that is not the same as transcoding in general. And one that I think is actually worth the time to discuss.

yes. So, my request to this group is that CTG demands that no UA spoofing is performed by the transcoder. Compromise can be found on allowing that a second spoofed HTTP request is made if the first request to a given site does not return mobile content.

Luca
Received on Sunday, 22 March 2009 18:32:40 UTC