- From: Brad Hill <hillbrad@gmail.com>
- Date: Mon, 23 Jun 2014 16:14:50 -0700
- To: public-test-infra <public-test-infra@w3.org>, "public-webappsec-testsuite@w3.org" <public-webappsec-testsuite@w3.org>
- Cc: James Graham <james@hoppipolla.co.uk>, Odin Hørthe Omdal <odinho@opera.com>, "jgraham@hoppipolla.co.uk" <jgraham@hoppipolla.co.uk>
- Message-ID: <CAEeYn8jX8+WaJUTWNwq7BGdYxJJq5iZ_E044H5tM2ROgFccUKg@mail.gmail.com>
OK, I'm working through the directions from the web-platform-tests repo README.md in preparation for a TTWF at which it was strongly suggested that I move all existing CSP tests from PHP to the new wptrunner format. I am a Python novice, and I assume TTWF participants will be, too. I:

- Forked the repo.
- Cloned it.
- $ cd web-platform-tests
- $ git submodule update --init --recursive
- Edited /etc/hosts
- $ python serve.py

Now I'm trying to figure out how to find/use the test runner. README.md says:

"There is a test runner in tools/runner that is designed to provide a convenient way to run the web-platform tests in-browser. It will run testharness.js tests automatically but requires manual work for reftests and manual tests."

When I go into directories with a MANIFEST, there's no clear indication which file actually runs them. So instead I try to go to localhost:8000/tools or /tools/runner, as suggested in the README, and I get:

{"error": {"message": "", "code": 404}}

I keep digging in the directory on the command line and eventually force-browse to:

http://localhost:8000/tools/runner/index.html

That gets me a test runner interface, but if I enter /html/syntax/parsing (or any other suggested path) into the path input and click "Start", nothing happens. (Tried Safari and Chrome.) Debug output on the console shows it is looking for a MANIFEST file in / and not finding it, because there isn't one in the Git repo. It still does this even if I add "?path=/html/syntax/parsing" to the URL.

So far, this isn't very beginner friendly, or even moderate-to-expert friendly. (The expert knows that he is only entering the rabbit hole once he starts debugging a tool with out-of-date documentation, and despairs at the unknown depths it may contain versus the time available to actually get work done.)

I will provide the "beginner's mind" and send the pull request for an updated README if someone who knows how this actually works can help me walk through it successfully.
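(For anyone following the same steps: the /etc/hosts edit the README asks for looks roughly like the lines below. The exact host names are taken from the README as I remember it and may have changed, so check the copy in your own checkout.)

```
127.0.0.1    web-platform.test
127.0.0.1    www.web-platform.test
127.0.0.1    www1.web-platform.test
127.0.0.1    www2.web-platform.test
127.0.0.1    xn--n8j6ds53lwwkrqhv28a.web-platform.test
127.0.0.1    xn--lve-6lad.web-platform.test
```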
-Brad

On Wed, Jun 4, 2014 at 5:14 PM, Rebecca Hauck <rhauck@adobe.com> wrote:
>
> On 6/4/14, 3:27 PM, "James Graham" <james@hoppipolla.co.uk> wrote:
>
> >(note: I am not on public-webappsec so please CC me explicitly)
> >
> >On 04/06/14 22:39, Odin Hørthe Omdal wrote:
> >> On Wed, Jun 4, 2014, at 22:53, Hill, Brad wrote:
> >>> Can you point me at how users can see/test changes in real-time? That
> >>> was the other issue I had with using the "canonical" repo - with only
> >>> me contributing, there was nobody who really had it on their TODO list
> >>> to review and approve my submissions, so merge times when I was working
> >>> on CORS tests were on the order of 6-8 weeks, and only after I jumped
> >>> up and down and waved a big flag. That's just not workable to have
> >>> multiple people contributing and be so out of sync.
> >>
> >> The review time is indeed a problem. But outstanding reviews are much
> >> more visible now than before, and we are reviewing more than before.
> >> It's still a problem, though. More people should review, especially
> >> people who know the specs.
> >>
> >> For events such as TTWF it is quite crucial to have people reviewing
> >> tests just as they come in. It can be the experts present doing it on
> >> GitHub at the venue, or people contributing remotely. I also think it
> >> would be very helpful if some of the best people reviewed others' tests
> >> at the event.
> >
> >Yes, this should be an important function of experts at events. Indeed
> >the best way to work would be to have the expert review the test
> >alongside the author in real time so they can easily understand how
> >review works, what is being looked for, and what the issues are with
> >their test.
>
> Sort of a side point to this is that experts & attendees are encouraged
> to continue on after the event. It's very common to have PRs in progress
> at the end of the day, and even people just submitting initial PRs under
> the wire.
> We always wish we had more time. Admittedly, we haven't done the
> best job at staying engaged with new contributors after the event, but
> that was indeed one of the original goals of this movement. I've also
> made open calls to mailing lists in the days after events asking for
> reviews on remaining open TestTWF PRs, with some moderate success.
>
> >> We had an automatic pull request viewing system before, that was synced
> >> to w3c-test.org -- that seems to currently be down (due to changes?).
> >> Having pr-<number>.w3c-test.org would be quite cool. I'd like to see
> >> something like that turn up again. I remember seeing many reviews about
> >> syncing and server setup, so I hope that's what those reviews were for.
> >
> >That still happens; submissions go under
> >w3c-test.org/submissions/<pr-number>. If the submitter is not a
> >"collaborator" on the w-p-t repo, someone who is needs to add a comment
> >"w3c-test:mirror" to cause the mirroring to occur.
> >
> >Regarding the earlier discussion about using web-platform-tests vs. a
> >custom repository, there are a number of reasons to prefer using
> >web-platform-tests even if some tests require unusual server-side setups
> >that cannot be provided in all environments yet.
>
> The only thing I have to add here is to be careful about using custom
> repositories because of CLA issues. Tobie set it up so that making the
> pull request is equivalent to the Grant of License, as defined in the
> root of the repo [1]. If for some reason you decide to use another repo
> that does not have this file (I think a fork is fine), people will need
> to fill out the old form [2].
>
> >As Odin pointed out, wptserve provides the same level of control as PHP
> >for inspecting the request and sending the response. You have absolute
> >control over which bytes are sent over the wire and when. There are also
> >a number of features specifically optimised for writing tests.
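[For concreteness, here is a minimal sketch of what replaces a PHP test file under wptserve: a .py file next to the test that exposes main(request, response). The (status, headers, body) return form and request.GET.first are my reading of the wptserve documentation; the "policy" query parameter, the file name, and the page body are made-up placeholders, not an existing test.]

```python
# Hypothetical wptserve handler sketch (not an existing test file).
# wptserve maps a URL like /content-security-policy/echo-policy.py to this
# file and calls its main(request, response) entry point.

def main(request, response):
    # Inspect the request: read a (hypothetical) "policy" query parameter,
    # falling back to a default policy if it is absent.
    policy = request.GET.first(b"policy", b"default-src 'self'")

    # Full control over the response: status code, header list, body bytes.
    headers = [(b"Content-Type", b"text/html"),
               (b"Content-Security-Policy", policy)]
    body = b"<!doctype html><title>CSP test page</title>"
    return 200, headers, body
```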
> >
> >Using web-platform-tests makes it possible to reuse almost all the
> >documentation that exists at testthewebforward.org. People who are
> >comfortable with writing tests for some other spec can dive right in
> >without having to learn anything new about where the tests are hosted,
> >how they are written, or whatever.
> >
> >Self-hosted tests won't actually get used in browser CI systems, since
> >those are generally forbidden from accessing external hosts for reasons
> >of reliability. Therefore such an approach is of limited usefulness. On
> >the other hand, at least at Mozilla, we have made big progress in
> >running web-platform-tests on CI and, with a little more work on
> >stability, should be running them on each commit. At present encrypted
> >connections are not supported, simply because it's a difficult problem
> >and it's a higher priority to get things working at all. But other
> >Mozilla test suites such as mochitest do support this, and we should be
> >able to leverage the techniques they use for
> >web-platform-tests-on-Mozilla-infrastructure. I imagine other vendors
> >have similar solutions. So once webappsec tests are in
> >web-platform-tests, it's much more likely that we will do any extra
> >work needed to get them running on our CI system, because it will fit
> >with our general future plans in that area.
>
> [1] https://github.com/w3c/web-platform-tests/blob/master/CONTRIBUTING.md
> [2] https://www.w3.org/2002/09/wbs/1/testgrants2-200409/
Received on Monday, 23 June 2014 23:15:20 UTC