- From: James Graham <james@hoppipolla.co.uk>
- Date: Mon, 16 Dec 2013 16:10:24 +0000
- To: "public-test-infra@w3.org" <public-test-infra@w3.org>
On 12/12/13 13:30, James Graham wrote:
> Redirecting this conversation to public-test-infra.
>
> On 12/12/13 13:01, Arthur Barstow wrote:
>> On 12/12/13 7:31 AM, ext Simon Pieters wrote:
>
>>> First I ran the tests using
>>> https://bitbucket.org/ms2ger/test-runner/src on a local server, but
>>> then I couldn't think of a straightforward way to put the results in
>>> the wiki, so I just ran the tests manually, too. :-( Since most tests
>>> are automated, it's silly to run them manually and edit a wiki page. Is
>>> there a better way?
>>
>> Re automated running, there is <http://w3c-test.org/framework/app/suite>
>> but I think it is considered obsolete (and isn't maintained). Test
>> automation is/was on Tobie's ToDo list. I'll follow up separately about
>> the status on public-test-infra.
>
> Ms2ger has a simple in-browser runner which we could adapt to use a
> top-level browsing context rather than an iframe, and to use the
> manifest file generated by the script in review at [1].
I have now started this work. The code is in the jgraham/runner branch
of the web-platform-tests repository. It is chronically in need of some
love from someone who enjoys design work.
> Yeah, so I foresee this taking longer to output than to actually do the
> run (which I can fully automate for gecko). We should agree on a simple
> format that can be produced by any kind of automated runner and make a
> tool that can turn that format into an implementation report. Something
> like
>
> [{test_id:string|list, status:string, subtests:[{name:string,
> status:string}]}]
This is more or less the format I used. See [1] for examples of what I
finally adopted.
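For concreteness, here is a minimal sketch (in Python, printing the JSON
form) of what a single run's results could look like under that format.
The test id, subtest names, and statuses are invented for illustration:

    import json

    # Hypothetical results document following the proposed format: a list
    # of entries, each with a test id, an overall harness status, and a
    # list of per-subtest name/status pairs.
    results = [
        {
            "test_id": "/dom/nodes/Node-appendChild.html",
            "status": "OK",
            "subtests": [
                {"name": "Appending a node", "status": "PASS"},
                {"name": "Appending a detached node", "status": "FAIL"},
            ],
        },
    ]

    print(json.dumps(results, indent=2))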
> If we do something like this I can likely organise for such output to be
> automatically generated for every test run on gecko, so producing an
> implementation report for any feature would just be a matter of
> importing the data from the latest nightly build.
Once the dependencies are checked in to the web-platform-tests repo
(basically the self-hosting tests stuff), I will create a review for this
and we can check it in. Then we can set up some sort of web service that
will allow people to upload the results and get fully-automated reports
on who passes which tests.
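To give a rough idea of the upload step, here is a sketch in Python using
only the standard library. The endpoint URL and the shape of the reply are
entirely hypothetical; no such service exists yet:

    import json
    import urllib.request

    # Hypothetical endpoint for the not-yet-written results service.
    UPLOAD_URL = "https://results.example.org/upload"

    def upload_results(results):
        # POST the results document as JSON; the reply might be, say, a
        # URL for the generated implementation report.
        body = json.dumps(results).encode("utf-8")
        request = urllib.request.Request(
            UPLOAD_URL,
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request) as response:
            return response.read().decode("utf-8")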