Re: WebTV Help for Getting Engaged in W3C Test Effort

James,
thanks for your reply. It would be good if you could keep the TV list (and
myself) in CC, as we are not all on public-test-infra.

I'm quoting your answer below in its entirety for the benefit of the TV
folks, and adding some more questions/comments.

From: James Graham <james@hoppipolla.co.uk>
> Date: Tue, 22 Apr 2014 17:56:52 +0100
> On 22/04/14 17:22, Giuseppe Pascale wrote:
> > In no particular order, here is a set of questions I've heard from various
> > people, plus some comments from me. Can you help address them? I would
> > like to invite other IG participants to chime in if I forgot something:
> >
> > 1. The first question was about "where to find information on the W3C
> > testing setup and material". Bryan tried to answer with the mail below. In
> > short, it seems to me that the starting point is
> > http://testthewebforward.org/docs/. Please chime in if anything needs to
> > be added.
> Yes, that's the right site. The idea is to centralise all the useful
> information there. Since it is documenting an ongoing software
> development process, I have no doubt that the documentation could be
> improved. One slight problem with our current setup is that the TestTWF
> docs are in a separate repo, so it's easy to forget to update those docs
> when making changes to e.g. testharness.js.
>

Are there plans to coordinate this even further and have everything
documented on this central page?


> > 2. The TTWF website points to this post (
> > http://testthewebforward.org/blog/2013/02/20/testing-the-open-web-platform.html
> > ) under "Want to learn more about our plans?". Is that post still 100%
> > accurate? If not, it would be good to get an updated post/page about what
> > TTWF is and what the plans around testing are.
> I agree an update would be useful. We have come a long way in the last
> year, and achieved many of the goals that Tobie set out.
>

good, thanks.


> > 3. One other question raised was about process: how you submit tests, how
> > they get reviewed and approved, etc. AFAICS this is answered here:
> > http://testthewebforward.org/docs/review-process.html. One thing that is
> > not clear, though, is timing information and who does what. If an
> > organization/company were to submit hundreds of tests, who would be
> > reviewing them? Is there any guarantee those would be reviewed at all? Or
> > would you recommend that whoever intends to contribute tests also
> > contribute reviews?
> The situation is basically that no one is paid specifically to review
> tests. However some people have jobs that allow them to spend some time
> doing test review, and other people have been doing reviews in their
> spare time. This can make it hard to cope with large influxes of review
> items, particularly after TestTWF events. The data at [1] (look at the
> red area) shows we were making headway in reducing the backlog until the
> recent events. Obviously we would like to do better here, but the main
> problem is lack of people who are both qualified and inclined to do the
> work.
> For companies looking to contribute to web-platform-tests, putting
> effort into review is just as valuable as putting effort into writing
> tests. If people are going to submit hundreds of tests, it is worth
> knowing that there's a rule that internal review in a public location
> may be carried forward to the test repository. For example, if a Mozilla
> developer makes a patch that includes some code changes and adds some
> web-platform-tests, a r+ in Bugzilla for the code+test changes would be
> enough to land the test changes in web-platform-tests without requiring
> a second round of review.
> Obviously if this doesn't work well for some entities (i.e. if people
> start landing low-quality tests on the basis of such "review") we will
> start a blacklist. That hasn't been a problem to date.
>

OK, clear.


> > 4. Let's assume some organizations/companies decide to contribute to the
> > W3C effort. What are the plans when it comes to maintaining the tests that
> > get submitted? Are there processes in place to make sure that if the spec
> > changes, tests are "invalidated"? In other words, how can I know, at any
> > given time, if a test suite for a given spec is still valid? And who is in
> > charge of checking that tests are still valid when a spec gets updated?
> > Also, are there ways to "challenge" a test, i.e. to say that a given
> > (approved) test is in fact invalid?
> It's very hard to automatically invalidate tests when the spec changes.
> Even if we had lots of metadata linking tests to spec sections — which
> we don't — it is quite common for a test to depend on many things other
> than that which it claims to be testing. And requiring a lot of metadata
> adds an unacceptable overhead to the test authoring process (I have seen
> cases where people have had testsuites, but have refused to submit them
> to common testsuites due to metadata overheads).
>

Agree that a lot of metadata may be overhead. But there is probably a
middle ground between no metadata and a lot of it. For example, even
though you may not be able to automatically track whether changes in the
spec imply changes in the tests, it would be valuable to know against which
version of the spec a given test was written. Later on, if the spec
changes, people running the tests should be able to update that information
to indicate that the tests are still valid for a given spec version.

This would be relatively small overhead and would give you at least an idea
of how recently the test was checked.
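For what it's worth, testharness.js-based tests can already carry this kind of lightweight metadata as `<link>` elements in the test file itself. A sketch of how a versioned spec reference might look (the spec URL, section, and author below are made up for illustration):

```html
<!doctype html>
<title>Example test with spec-version metadata</title>
<!-- rel="help" ties the test to the spec section it targets; using a
     dated/versioned URL (hypothetical here) records which draft the
     test was written against. -->
<link rel="help" href="http://www.w3.org/TR/2014/WD-example-20140401/#some-section">
<link rel="author" title="A. Tester" href="mailto:tester@example.org">
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script>
test(function() {
  assert_true(true);
}, "placeholder assertion");
</script>
```

Updating only the `rel="help"` URL on re-review would be a fairly low-overhead way to record "last checked against draft X".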

> In practice the way we expect to deal with these things is to have
> implementations actually run the tests and see what breaks when they are
> updated to match the new spec.
> > 5. IIRC not all WGs are using the process/tools from TTWF. Is this
> > documented somewhere? Will these other groups continue with their tools
> > for the time being, or is there any plan to merge the various efforts at
> > some point?
> CSS, at least, currently use different repositories. I think there is a
> considerable advantage to everyone sharing the same infrastructure, but
> at the moment there are not concrete plans to merge the repositories.
> > 6. Do the tests include metadata that easily allows one to (at the very
> > least) extract the relevant tests for a given spec? Is such metadata
> > mandatory/checked/maintained?
> The directory structure reflects the structure of the specs; each spec
> has its own top level directory and subdirectories within that
> correspond to sections of the spec.


So when spec sections change (and we have seen this happen with HTML5),
will you move tests around? Maybe this is not that common?


> Some tests include more metadata,
> but this is not required.


Can you give us an indication of what kind of metadata (if any) is
required for each test? Or is there no requirement for metadata at all,
and it's always optional?


> Where it has been added I expect it is often
> wrong. I am much more interested in finding ways to automatically
> associate tests and parts of specs e.g. by instrumenting browsers to
> report which apis are called by each test, or by looking at code coverage.
>

Is there anything happening on this, or is it just something to look at at
some point?
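For concreteness, the instrumentation James describes could be sketched in miniature by wrapping an API object in a Proxy that records every method call a test makes. This is only an illustration, not any existing W3C tooling; the names (`instrument`, `fakeStorage`, `callLog`) are made up:

```javascript
// Sketch: wrap an API object so every method call is logged, giving a
// rough record of which APIs a test exercises.
function instrument(api, callLog) {
  return new Proxy(api, {
    get(target, prop) {
      const value = target[prop];
      if (typeof value === "function") {
        return (...args) => {
          callLog.push(String(prop)); // record which API was invoked
          return value.apply(target, args);
        };
      }
      return value;
    },
  });
}

// A stand-in for a platform API under test.
const fakeStorage = {
  items: new Map(),
  setItem(k, v) { this.items.set(k, v); },
  getItem(k) { return this.items.get(k); },
};

const callLog = [];
const storage = instrument(fakeStorage, callLog);
storage.setItem("a", 1);
storage.getItem("a");
console.log(callLog); // the APIs this "test" touched
```

In a browser the same idea would be applied to real platform interfaces (or done via engine-level code coverage), and the resulting call log could be matched against the spec sections that define those APIs.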


> > 7. Some people expressed an interest in organizing a TTWF event (or
> > something similar) dedicated to TV, to better discuss and understand the
> > needs of the TV industry. Do you think this would be doable? Is there
> > already a calendar of next events and/or open dates? How would you
> > recommend we go about this? (And do you think it would be useful in the
> > first place?)
> I am not the best person to answer this, but typically TestTWF events
> have been about actually writing tests rather than discussions around
> test writing. Having said that, some kind of event sounds like it could
> be sensible. Previously the events have often been organised around some
> existing W3C meeting so there is a high concentration of people to act
> as "experts" already in the area. However it sounds like you might want
> a slightly different kind of event, so this might not be the right
> arrangement.
> Feel free to chat on irc if you have any more questions.
> [1] http://testthewebforward.org/dashboard/?#all


I think you are right; thanks for the info.

/g



On Tue, Apr 22, 2014 at 6:22 PM, Giuseppe Pascale <giuseppep@opera.com> wrote:

> Bryan,
> thanks for starting this.
>
> Robin,Tobie, testing folks,
> during the workshop that Bryan mentioned, we discussed testing of web
> specs, and various organizations and companies, while agreeing that having
> a common pool of test cases would be ideal, wanted to first get a better
> understanding of the W3C testing effort, to make sure it meets the
> requirements of those organizations/companies and that any effort they
> would put into contributing to it wouldn't be wasted.
>
>
> In no particular order, here is a set of questions I've heard from various
> people, plus some comments from me. Can you help address them? I would
> like to invite other IG participants to chime in if I forgot something:
>
> 1. The first question was about "where to find information on the W3C
> testing setup and material". Bryan tried to answer with the mail below. In
> short, it seems to me that the starting point is
> http://testthewebforward.org/docs/. Please chime in if anything needs to
> be added.
>
> 2. The TTWF website points to this post (
> http://testthewebforward.org/blog/2013/02/20/testing-the-open-web-platform.html)
> under "Want to learn more about our plans?". Is that post still 100%
> accurate? If not, it would be good to get an updated post/page about what
> TTWF is and what the plans around testing are.
>
> 3. One other question raised was about process: how you submit tests, how
> they get reviewed and approved, etc. AFAICS this is answered here:
> http://testthewebforward.org/docs/review-process.html. One thing that is
> not clear, though, is timing information and who does what. If an
> organization/company were to submit hundreds of tests, who would be
> reviewing them? Is there any guarantee those would be reviewed at all? Or
> would you recommend that whoever intends to contribute tests also
> contribute reviews?
>
> 4. Let's assume some organizations/companies decide to contribute to the
> W3C effort. What are the plans when it comes to maintaining the tests that
> get submitted? Are there processes in place to make sure that if the spec
> changes, tests are "invalidated"? In other words, how can I know, at any
> given time, if a test suite for a given spec is still valid? And who is in
> charge of checking that tests are still valid when a spec gets updated?
> Also, are there ways to "challenge" a test, i.e. to say that a given
> (approved) test is in fact invalid?
>
> 5. IIRC not all WGs are using the process/tools from TTWF. Is this
> documented somewhere? Will these other groups continue with their tools
> for the time being, or is there any plan to merge the various efforts at
> some point?
>
> 6. Do the tests include metadata that easily allows one to (at the very
> least) extract the relevant tests for a given spec? Is such metadata
> mandatory/checked/maintained?
>
> 7. Some people expressed an interest in organizing a TTWF event (or
> something similar) dedicated to TV, to better discuss and understand the
> needs of the TV industry. Do you think this would be doable? Is there
> already a calendar of next events and/or open dates? How would you
> recommend we go about this? (And do you think it would be useful in the
> first place?)
>
> /g
>
>
> On Wed, Apr 16, 2014 at 4:23 PM, SULLIVAN, BRYAN L <bs3131@att.com> wrote:
>
>>  Hi all,
>>
>>
>>
>> In the 4th Web & TV Workshop, one of the action items that I said I
>> would take was to help disseminate useful information to WebTV members that
>> want to get engaged in the testing effort at W3C, as a first step toward
>> closing the gaps in WebTV-usecase test assets and methodology noted in the
>> workshop. This is addressing a specific need of the WebTV IG, but should be
>> useful for any community of interest (e.g. WebMob) or new contributors at
>> large.
>>
>>
>>
>> This information will be provided on the test-infra list (
>> http://lists.w3.org/Archives/Public/public-test-infra/), and for more
>> convenient/long-term use on one or more of:
>>
>> ·         the Testing Wiki (https://www.w3.org/wiki/Testing) – note the
>> “not maintained” banner for now, but I expect that it will get maintained
>> soon as part of this effort, unless the places below take precedence
>>
>> ·         the TTWF site (http://testthewebforward.org/) if/once I figure
>> out how we can add communities of interest (e.g. WebTV, Mobile, etc) to
>> TTWF. TTWF should be the preferred public home for such info I think.
>>
>> ·         The W3C github repository (
>> https://github.com/w3c/web-platform-tests), in particular the readme.md (
>> https://github.com/w3c/web-platform-tests/blob/master/README.md) or some
>> other place on the github site, for more detailed help in getting engaged.
>>
>>
>>
>> The point of putting this info in one of these more permanent/usable
>> places is to prevent people who want to get engaged from having to
>> rediscover this info each time (similar to forum FAQ/sticky posts) and
>> from having to bug the list etc. with newbie questions. We can also add a
>> “who to ask” list if questions aren’t answered there, and direct people to
>> the test-infra list as a backup.
>>
>>
>>
>> Our intent in doing this is not to put a management layer on the testing
>> effort, but rather to make it easier for people to get engaged and to
>> organize as a community of interest around closing the gaps that matter to
>> them. This may include reviewing existing tests, adding new ones, building
>> metadata (e.g. test runner scripts that focus test runs on specific
>> features, or that link tests to specific features/assertions where that
>> info is missing from the tests), sharing test results, etc.
>>
>>
>>
>> Any other info that people have on “how to get engaged” and where it is
>> documented is requested as a response to this post.
>>
>>
>>
>> Thanks,
>>
>> Bryan Sullivan | Service Standards | AT&T
>>
>>
>>
>
>

Received on Wednesday, 23 April 2014 06:32:05 UTC