- From: Ian Hickson <ian@hixie.ch>
- Date: Fri, 13 May 2011 03:34:13 +0000 (UTC)
On Fri, 11 Feb 2011, Glenn Maynard wrote:
> On Fri, Feb 11, 2011 at 3:24 PM, Ian Hickson <ian at hixie.ch> wrote:
> > On Wed, 29 Dec 2010, Glenn Maynard wrote:
> > > I hit this problem in a UI I worked on. It rendered into a canvas
> > > the size of the window, which can be zoomed and scrolled around.
> > > At 100% full-page zoom this works well. At 120% zoom, it creates a
> > > canvas smaller than the window, which is then scaled back up by the
> > > browser, resulting in a blurry image. Full-page zoom should work on
> > > the UI around it--I didn't want to disable it entirely--but the
> > > canvas itself should be created in display pixels, rather than CSS
> > > pixels.
> > >
> > > I didn't find any reasonable workaround. All I can do is tell
> > > people not to use full-page zoom. Many users probably see a blurry
> > > image and don't know why, since there's no way to detect full-page
> > > zoom in most browsers to even hint the user about the problem.
> >
> > That's a bug in the browser. If it knows it's going to be zooming up
> > the canvas when it creates the backing store, it should be using a
> > bigger backing store.
>
> It sounds like you're saying that, if the user's full-page zoom level is
> 110% and the page requests a 100x100 canvas, the browser should create a
> 110x110 backing store instead. There are several problems with that:
>
> - The full-zoom level can be changed by the user after the canvas is
>   already rendered. If I load a page at 100%, the canvas renders at that
>   resolution, and then I change the full-zoom level to 110%, there's no
>   way for the browser to know this and use a bigger backing store in
>   advance.

Sure, this is a "best-effort" kind of thing.

> - The data would have to be downscaled to the exposed 100x100 resolution
>   when exported with ImageData.

No, ImageData exposes the underlying data, not the data in CSS pixels.

> Similarly, rendering a 100x100 image into a canvas set to 100x100 would
> upscale the image, blurring it: the developer should be able to expect
> that blitting a 100x100 image into a 100x100 canvas will be a 1:1 copy.

It would make no difference since the canvas is zoomed 110% anyway.

> - If, rather than displaying it in the document at the full-zoom level,
>   the data is sent to the server, the results would be blurry. For
>   example, if I create a 1000x1000 canvas (and the browser's backing
>   store is actually 1100x1100), and I send the finished data to the
>   server (at the exposed 1000x1000), the browser has to resample the
>   final image, blurring it.

Yup. If you want to do graphics and know the resolution of the backing
store, doing it on the client is a poor path. You don't know what
resolution the image you get back will be in.

> > I went to books.google.com, opened up the first book in my library,
> > and zoomed in, and it reflowed and rerendered the text to be quite
> > crisp. I don't see any problem here. Images were similarly handled
> > beautifully.
> >
> > Could you elaborate on the steps to reproduce this problem?
>
> (I tried this, and text was blurry even when I zoomed using only that
> page's built-in zoom mechanism; it seemed to be scaling the rendered
> page and not rerendering text at all. I figured some books might not be
> OCR'd so I tried another couple books, but it still happened; then it
> somehow crashed FF3, so I gave up.)

Weird.

--
Ian Hickson               U+1047E                )\._.,--....,'``.    fL
http://ln.hixie.ch/       U+263A                /,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'
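A minimal sketch of the client-side sizing discussed above, assuming
window.devicePixelRatio is available and reflects the current scale
factor (it may not track full-page zoom changes in every browser, so
this remains best-effort):

    // Sketch: size the canvas backing store in device pixels while
    // laying it out at the requested CSS-pixel size. Assumes
    // window.devicePixelRatio reflects the current scale factor.
    function createHiDpiCanvas(cssWidth, cssHeight) {
      var scale = window.devicePixelRatio || 1;
      var canvas = document.createElement('canvas');
      canvas.width = Math.round(cssWidth * scale);   // backing store: device pixels
      canvas.height = Math.round(cssHeight * scale);
      canvas.style.width = cssWidth + 'px';          // layout size: CSS pixels
      canvas.style.height = cssHeight + 'px';
      var ctx = canvas.getContext('2d');
      ctx.scale(scale, scale);                       // draw in CSS-pixel coordinates
      return canvas;
    }

With this sizing the backing-store resolution is explicit, so data
exported with getImageData() or toDataURL() comes back at the
device-pixel size rather than the CSS-pixel size, consistent with the
point about ImageData above.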
Received on Thursday, 12 May 2011 20:34:13 UTC