
Comments on REC-PNG-20031110

From: Bjoern Hoehrmann <derhoermi@gmx.net>
Date: Fri, 04 Mar 2011 04:29:41 +0100
To: png-group@w3.org
Message-ID: <0jl0n6l0vfgm4lkis4r2462itrectr0voi@hive.bjoern.hoehrmann.de>
Hi,

  Regarding <http://www.w3.org/TR/2003/REC-PNG-20031110>, in the process
of implementing an IDAT recompressor I've implemented parts of the PNG
specification. In doing so I found three things hard to understand. The
first is how IDAT chunks encode the filter type and the scanline data.
This is mentioned in passing, but it is not as clear as, say, an
indented diagram like

  +-------------+------------------------------------+
  | Filter type | Filtered data for this scanline... |
  +-------------+------------------------------------+

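For illustration, the layout above could even be accompanied by a short
sketch like this one (my own Python, not spec text; I'm assuming samples
of eight bits or more, so the stride is simply width times bytes per
pixel, with no sub-byte packing):

```python
def split_scanlines(raw, width, bytes_per_pixel):
    """Split inflated IDAT data into (filter_type, filtered_bytes) pairs.

    Assumes 8-bit-or-wider samples; for sub-byte depths the stride
    would be ceil(width * bits_per_pixel / 8) instead.
    """
    stride = width * bytes_per_pixel      # filtered bytes per scanline
    lines = []
    for off in range(0, len(raw), stride + 1):
        filter_type = raw[off]            # one filter-type byte per line
        lines.append((filter_type, raw[off + 1 : off + 1 + stride]))
    return lines
```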
The second thing is the discussion of filters. Filters map from source
to destination, but section 9.2 isn't quite phrased that way. The
"pixel versus byte" distinction is hard to grasp early on in the
section, and the notation in table 9.1 is hard to follow. I got
confused a couple of times about the order in which things had to be
done, what it really means to have `Filt(x)` in the reconstruction
function, and so on.

One ambiguity, for instance, is that it is not really clear what
`Orig(x)` is, since from the perspective of the scanline for `x` there
are two "original" values for the other variables: before and after
they had their filter applied. The same goes for `Recon(a)`: is that
before or after its scanline filter was removed? I eventually figured
out the answer, but I initially understood these things incorrectly.

(It would have helped a lot had there been some pseudo-code showing
how you get from the colour type and bit depth to the advance width,
and how you resolve the aforementioned ambiguity, or at the least
something about how you can decode top-down but need to encode
bottom-up or with a separate buffer.)
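As a sketch of the resolution I eventually arrived at (my own Python,
covering filter type 1, "Sub", only; `a` and `bpp` are used as in table
9.1):

```python
def recon_sub(filt, bpp):
    """Undo the "Sub" filter for one scanline of filtered bytes.

    The point that confused me: the `a` fed into the reconstruction
    function is the already *reconstructed* left neighbour, not the
    filtered one, which is why decoding proceeds left to right.
    """
    recon = bytearray(len(filt))
    for x in range(len(filt)):
        a = recon[x - bpp] if x >= bpp else 0   # Recon(a), not Filt(a)
        recon[x] = (filt[x] + a) & 0xFF         # arithmetic modulo 256
    return bytes(recon)
```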

The third issue is the CRC calculation. These days many things go by
"CRC", with various polynomials and whatnot. Since I'd rather not code
my own implementation, or use the specification's sample code, I went
looking for the right library, but that's not very easy. It would have
helped if there had been a reference, like saying this is the same CRC
that gzip uses.
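As it turns out, it is indeed the same CRC-32 that gzip and zlib use
(polynomial 0xEDB88320 in reflected form, initial value and final XOR
of 0xFFFFFFFF), so in Python, for instance, the standard `zlib.crc32`
can simply be reused:

```python
import zlib

def png_chunk_crc(chunk_type, chunk_data):
    """CRC as stored in a PNG chunk: plain CRC-32 over type + data."""
    return zlib.crc32(chunk_type + chunk_data) & 0xFFFFFFFF
```

(The empty IEND chunk makes a handy sanity check: its stored CRC is the
well-known constant 0xAE426082.)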

Similarly, in 5.1, "A four-byte CRC (Cyclic Redundancy Code) calculated
on the preceding bytes in the chunk, including the chunk type field and
chunk data fields, but not including the length field." is confusing.
What it says, as far as I understand it, is that you take only the type
field and the data, so saying you "include" the only things you include
is misleading.
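In code the coverage is easier to state than in prose; a sketch of
reading and checking one chunk (again my own Python, not spec code):

```python
import struct
import zlib

def read_chunk(buf, off):
    """Read one PNG chunk starting at `off`; verify its CRC.

    The stored CRC is computed over the type and data fields only;
    the four-byte length field is excluded.
    """
    length, = struct.unpack_from(">I", buf, off)   # not CRC-covered
    ctype = buf[off + 4 : off + 8]                 # CRC-covered
    data = buf[off + 8 : off + 8 + length]         # CRC-covered
    stored, = struct.unpack_from(">I", buf, off + 8 + length)
    ok = stored == (zlib.crc32(ctype + data) & 0xFFFFFFFF)
    return ctype, data, ok
```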

(As an aside, the second to last paragraph in 9.2 seems malformed and
there are a couple of " ." instances that should not be there.)

regards,
-- 
Björn Höhrmann · mailto:bjoern@hoehrmann.de · http://bjoern.hoehrmann.de
Am Badedeich 7 · Telefon: +49(0)160/4415681 · http://www.bjoernsworld.de
25899 Dagebüll · PGP Pub. KeyID: 0xA4357E78 · http://www.websitedev.de/ 
