- From: Geoffrey Sneddon <me@gsnedders.com>
- Date: Thu, 29 Dec 2016 16:31:35 +0000
- To: public-test-infra <public-test-infra@w3.org>
- Cc: Nikos Andronikos <Nikos.Andronikos@cisra.canon.com.au>, www-svg <www-svg@w3.org>
There was a discussion a month ago on www-svg about anti-aliasing in reftests and the problems of creating exact matches with SVG (especially with anything related to curves); the complete original email is included below. Mozilla's fuzzy reftests were pointed out (we have no equivalent in wpt), along with dbaron's observation that "there are many cases where this can be avoided by structuring the tests so they go through the same codepaths, so the test and reference only differ by the thing that is being tested" (though obviously how well that works relies on implementations doing the same thing to some degree!).

I know Servo has had problems where, with WebRender on certain hardware (i.e., a given platform / GPU / driver combination), there are sometimes differences between <span>AB</span> and <span>A</span><span>B</span>. For example (slightly rephrased from what Glenn Watson said to me): with the way WebRender works, the first case gets batched together into a single draw call, while the second case ends up as two draw calls. Even with the exact same input data going to the GPU in both cases (down to the exact floating-point values for vertices, etc.), some GPUs would show an extremely subtle difference (in this case, fewer than 5 pixels in the image off by a value of 1.0/255.0), since the two cases end up going through different shaders.

That's an example of inexact renderings *not* down to subpixel AA or anything similar, just different GPU output from a slightly different order of operations, so the questions about fuzzy reftests are really more general than just subtly different rendering of curves and anti-aliasing. With ever more work happening on the GPU, this is likely to become a larger problem for all vendors, although as far as I know everyone is essentially avoiding it for now by using software renderers, which don't have such oddities.
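For reference, the kind of fuzzy check Mozilla's reftests allow bounds two things at once: the maximum per-channel difference and the number of pixels allowed to differ at all. A minimal sketch of such a comparison (the image representation here is an assumption for illustration, just rows of RGB tuples, not any harness's actual API):

```python
def fuzzy_match(test, ref, max_channel_diff, max_pixel_count):
    """Compare two same-sized images with a bounded fuzz.

    Images are rows of (r, g, b) tuples with 0-255 channel values.
    The match fails if any single channel differs by more than
    max_channel_diff, or if more than max_pixel_count pixels differ
    at all.
    """
    differing = 0
    for row_t, row_r in zip(test, ref):
        for px_t, px_r in zip(row_t, row_r):
            # Largest per-channel difference for this pixel.
            diff = max(abs(a - b) for a, b in zip(px_t, px_r))
            if diff > max_channel_diff:
                return False
            if diff > 0:
                differing += 1
    return differing <= max_pixel_count
```

Under that model, the WebRender case above (fewer than 5 pixels off by 1/255) would pass with `max_channel_diff=1, max_pixel_count=5` while an exact comparison would fail.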
Obviously, though, not testing on real hardware is a frustrating limitation, especially given the relative bugginess of many drivers!

/gsnedders

---------- Forwarded message ----------
From: Nikos Andronikos <Nikos.Andronikos@cisra.canon.com.au>
Date: Mon, Nov 28, 2016 at 11:31 PM
Subject: Handling slight differences in anti aliasing for SVG 2 test suite
To: www-svg <www-svg@w3.org>
Cc: Geoffrey Sneddon <me@gsnedders.com>

Hi all,

I'm looking to gather some wisdom and advice from people who have more experience with reftests than I do. It's a common problem that reference images may differ very slightly from the renderings of tests, and this seems to be compounded by the reftest model for SVG. To give an example (ignore 2b): http://andronikos.id.au/layout-test-results/

These are some marker tests from SVG 1.1, converted to Web Platform Tests format. They fail under WebKit. Because of the various transforms that markers go through, it becomes very difficult to create a test that produces an output that is easy to reproduce in the reference. We end up with very slight differences in the user-space coordinate values, and this results in anti-aliasing differences. My gut feeling is that this is going to be very hard to avoid in the SVG 2 test suite for anything other than very simple tests. But browsers must have come up against this issue with SVG before, so what do you do? I can see a few possible options:

1. A dumb tolerance where x% of pixels may differ (yuk)
2. A smarter image diff that allows tolerance only within 1px of object boundaries
3. When testing multiple levels of transform, apply a filter to each SVG test to make it 1bpp, with any value below one going to zero
4. Try harder to craft tests that always match 100% (as I said above, I think this is going to be very difficult)

What do others think? I'd love some input on this.

Nikos.
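Option 2 above could be sketched roughly as follows. This is a toy single-channel diff of my own construction, not anything an engine actually ships: it demands an exact match in flat regions but forgives mismatches that fall within one pixel of an object boundary in the reference (a neighbouring pixel with a different value), which is where anti-aliasing differences cluster.

```python
def near_edge(img, x, y):
    """True if pixel (x, y) has a neighbour with a different value,
    i.e. it sits on or next to an object boundary."""
    h, w = len(img), len(img[0])
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h and img[ny][nx] != img[y][x]:
                return True
    return False

def edge_tolerant_match(test, ref):
    """Match exactly in flat regions; ignore any difference that
    falls within one pixel of a boundary in the reference image.

    Images are rows of 0-255 grayscale values.
    """
    for y, (row_t, row_r) in enumerate(zip(test, ref)):
        for x, (px_t, px_r) in enumerate(zip(row_t, row_r)):
            if px_t != px_r and not near_edge(ref, x, y):
                return False
    return True
```

A check like this would still catch a mis-positioned or mis-coloured shape while tolerating the 1px fringe that different anti-aliasing produces; its obvious weakness is that it is blind to any error that happens to land exactly on a boundary.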
Received on Thursday, 29 December 2016 16:32:08 UTC