Re: Use of Python Imaging Library in tests

On 05/06/15 09:26, Kristijan Burnik wrote:

> On Thu, Jun 4, 2015 at 10:56 AM, James Graham <james@hoppipolla.co.uk
> <mailto:james@hoppipolla.co.uk>> wrote:
>
>     It looks like a number of the new referrer-policy tests are trying
>     to use the Python Image module, which isn't part of the stdlib. As a
>     result these tests aren't possible to run out of the box, which is
>     new and problematic. Even if we shipped PIL (or Pillow) as part of
>     the repository I believe that it has a build step, so tests would
>     continue to not work out of the box.
>
>     Can we find some alternative solution like checking in the generated
>     files needed rather than trying to generate images at runtime?
>     Otherwise I will end up disabling these tests on the Gecko
>     infrastructure, which won't help anyone.
>
>
> I have suppressed and removed the image tests for the time being. A
> PR is open and awaiting review.

Great, thanks!

> The reason why we're not serving static images is
> that we are encoding data into the image itself (e.g. the
> headers we receive in the test) and then returning that data encoded
> as color, which we read back from the canvas.

Right, that makes sense. I guess something like an inline encoder that 
only knows how to produce the specific images we want sounds reasonable.

>     On a somewhat related note, there seem to be 1500 new files
>     containing referrer-policy tests. This is somewhat slow to run. Are
>     we sure that there isn't some way to set up these tests that's
>     faster? I understand that there might not be, but I want to check.
>
>
> I believe individual tests shouldn't be slow to run, only the sheer
> number might be problematic in that sense. We're also looking into ways
> to reduce the number of tests, e.g. we can suppress some redundancy in
> the near future.

Yeah, it's only the number that was a concern. Of course if they are all 
really testing different and unique things, and each has to be a 
separate file, that's not a problem; I'm just checking that there isn't 
redundancy, or the possibility of having multiple tests per file. (This 
was particularly bad when they were first imported into Gecko, because 
the harness restarted Firefox after each file with an unexpectedly 
failing test, so we blew through our time budget. Now that we have 
recorded those tests as expected to fail, those restarts don't happen 
and the suite is more manageable.)

Received on Friday, 5 June 2015 09:45:19 UTC