- From: Mihai Balan <mibalan@adobe.com>
- Date: Fri, 13 Sep 2013 15:03:30 +0100
- To: "Public CSS Test suite mailing list (public-css-testsuite@w3.org)" <public-css-testsuite@w3.org>
- Message-ID: <32E5AE8565CC6142BA0AAFA5A95697E04D2CE8DA9F@eurmbx01.eur.adobe.com>
Hello everybody,

Recently I have participated in discussions, in different contexts, about the usefulness of recording test suggestions for W3C tests when implementing them on the spot is just not feasible because of resource/priority constraints. This becomes even more interesting/useful in the context of events like Test the Web Forward, where simply pointing to a specification might prove discouraging for a person looking to start writing a test, whereas pointing to a list of proposed/requested test cases can be a lot more approachable.

I initially had some off-list exchanges with Peter about a GitHub-centric process that used issues on the csswg-test repository [1] to do that, but it proved unfeasible for a number of reasons (the most important being access control). The proposed solution would be adding a new feature in Shepherd to create/track/edit test requests, which would eventually sync with the GitHub repository (but without the access-control mess).

I put together a rough draft of the proposed workflows and needed features over at [2], and I'm interested in your opinion about it:

- Does the whole idea of test requests/test suggestions make sense?
- Do the proposed workflows/features make sense, and are they enough?

Let me know what you think,
m.

[1] https://github.com/w3c/csswg-test/
[2] https://github.com/mibalan/csswg-test/wiki/Tracking-tests-to-be-written-(test-requests)

Mihai Balan | Quality Engineer @ Web Engine team | mibalan@adobe.com | +4-031.413.3653 / x83653 | Adobe Systems Romania
Received on Friday, 13 September 2013 14:04:02 UTC