Draft minutes QAWG telcon 20030922

QA Working Group Teleconference
Monday, 22-September-2003
--
Scribe: Lynne Rosenthal

Attendees:
(PC) Patrick Curran (Sun Microsystems)
(KD) Karl Dubost (W3C, WG co-chair)
(DH) Dominique Hazaël-Massieux (W3C)
(LR) Lynne Rosenthal (NIST - IG co-chair)
(VV) Vanitha Venkatraman (Sun Microsystems)
(SM) Sandra Martinez (NIST)
(DM) David Marston (IBM)

Regrets:
(PF) Peter Fawcett (RealNetworks)
(LH) Lofton Henderson (CGMO - WG co-chair)
(MS) Mark Skall (NIST)

Absent:
(dd) Dimitris Dimitriadis (Ontologicon)
(KG) Kirill Gavrylyuk (Microsoft)
(AT) Andrew Thackrah (Open Group)

Summary of New Action Items:
AI-20030922-1:  DH: to modify OpsGL Entrance criteria to include P3: Sept 23
AI-20030922-2:  KD: to organize the review of WGs against OpsGL: Sept 29

Agenda: http://lists.w3.org/Archives/Public/www-qa-wg/2003Sep/0076.html
Previous Telcon Minutes: To be filled in.


1.) Roll call 11am EDT, membership

2.) QA OpsGL CR
Entrance Criteria for PR: KD explains what is happening. OpsGL has been 
taken off the TR page, temporarily. The entrance criteria for PR now cover 
the P3 checkpoints. We need to clarify the P3 checkpoints: we can either 
flag them as "at risk" (i.e., if they are not implemented, they will be 
dropped) or seek implementations of them. DH: there are only 3 P3 
checkpoints, and it should not be hard to get implementations of them. KD: 
if we find that the P3 checkpoints are not implemented, we can drop them in 
the next version. PC: doesn't want to drop them. KD: modify the 
implementation plan to cover the P3 checkpoints.


Selling Points of QA Ops Document: Need a document that explains the 
benefits of implementing OpsGL, i.e., its selling points. If people have 
ideas, send them via the mailing list.

Organizing reviews of WGs to find implementations
KD: Finding groups to implement OpsGL will be difficult. If you are 
participating in other groups, try to be proactive in getting them to 
participate. To get started, KD proposes that we review other WGs' 
documents again to see which WGs are close to passing OpsGL and, where one 
is close, work with that WG. Karl will organize this, selecting volunteers 
and establishing a timetable, aiming to be done by the next F2F.


3.) TestGL topics continued

Guideline 3: process of managing test materials
Metadata (CP3.2)
PC: Should process-related data (the status of the test material) be 
included as P1? KD: agrees that this is important. DM: agreed; we may want 
to revisit this in the future. PC: we may want to include another CP on 
reviewing tests. There is a related OpsGL CP (define a review process). 
Decided that defining the review process is out of bounds, but TestGL 
could require the status. The fundamental states are: under review, 
accepted, and rejected. DH: suggests that we provide these as examples and 
let WGs define the ones they use. DM: wants to mandate at least these 
three states, or their equivalents. SM: What about ‘type of test’? DM: 
type information could be captured in the scenarios.
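For illustration only, a minimal sketch of how a test management system 
might record these fundamental states; the names are hypothetical, and WGs 
would remain free to define equivalents:

    from enum import Enum

    class TestStatus(Enum):
        # The three fundamental states discussed above; WGs may
        # substitute equivalent states of their own.
        UNDER_REVIEW = "under review"
        ACCEPTED = "accepted"
        REJECTED = "rejected"

    # Example: a newly submitted test starts out under review.
    status = TestStatus.UNDER_REVIEW
    print(status.value)   # -> "under review"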

Coverage information (CP3.3)
What do people think about coverage information? DM: this may impact 
SpecGL, since you need a way to tag assertions in the spec. How would 
someone publish a statement of coverage? DM: you can say that there is at 
least one test case for each assertion. PC: that is a breadth measurement, 
since it says nothing about the extent to which each assertion is tested. 
It is, however, the only measurement that we would be able to formalize 
and test. Have people publish the percentage of test assertions that have 
at least one test; this shows how the tests map to the spec. Discussion on 
whether we should specify a minimum level of coverage that must be met; 
mixed opinions. PC: focus on the importance of measuring, explain how to 
do it, and publish some numbers.
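A rough sketch of the breadth measurement described above, assuming a 
hypothetical mapping from assertion IDs to their test cases:

    def breadth_coverage(assertion_ids, tests_by_assertion):
        # Percentage of test assertions that have at least one test case.
        # Says nothing about how thoroughly each assertion is tested.
        covered = sum(1 for a in assertion_ids if tests_by_assertion.get(a))
        return 100.0 * covered / len(assertion_ids)

    # Example: 3 of 4 assertions have at least one test -> 75.0
    print(breadth_coverage(["a1", "a2", "a3", "a4"],
                           {"a1": ["t1"], "a2": ["t2", "t3"], "a3": ["t4"]}))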

Bug Tracking (CP3.4)
PC: We keep track of buggy tests and why they are buggy. Need to keep 
track of and publish this information. Tests could be flagged in the test 
management system. DM: would like a test case to be able to point to a 
list of problems, with all problems needing to be resolved before the test 
is considered good. PC: that may be an implementation detail and could go 
in ExTech. CVS by itself wouldn’t be a bug tracking system, but an issues 
list could be. KD: does not recommend we specify a tool. Agreed.
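Purely as a sketch of DM's suggestion (no tool is being mandated), a test 
case pointing to its list of problems and considered good only once all 
are resolved; all names here are illustrative:

    from dataclasses import dataclass, field

    @dataclass
    class Problem:
        description: str
        resolved: bool = False

    @dataclass
    class TestCase:
        test_id: str
        problems: list = field(default_factory=list)

        def is_good(self) -> bool:
            # A test is good only when every recorded problem is resolved.
            return all(p.resolved for p in self.problems)

    # Example: one open problem keeps the test from being good.
    tc = TestCase("t1", [Problem("wrong expected output")])
    print(tc.is_good())   # -> False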

Should there be additional checkpoints in GL3?
