
XML Signature interop inventory

From: Frederick Hirsch <frederick.hirsch@nokia.com>
Date: Tue, 15 Jan 2008 10:02:20 -0500
Message-Id: <DC5E0989-393F-495E-9064-EF86F9260C66@nokia.com>
Cc: Frederick Hirsch <frederick.hirsch@nokia.com>
To: XMLSec XMLSec <public-xmlsec-maintwg@w3.org>

As a first step toward producing an implementation report for XML  
Signature, I think we need to agree which tests will go into the report.

(1) I believe there is no question that the xpointer tests should
be in the report. All implementors have performed these tests and
have checked in files for them. (A rough, illustrative sketch of
the kind of XPointer Reference these tests exercise follows the
list below.)

xpointer-1
xpointer-2
xpointer-3
xpointer-4
xpointer-5
xpointer-6
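
For context, these tests exercise XPointer dereferencing in
Reference URIs (same-document references such as #xpointer(/) and
#xpointer(id('...'))). Purely as an illustration, and not taken
from the test suite itself, here is a minimal sketch of how a Java
implementation using the standard javax.xml.crypto.dsig API might
construct such References; the specific XPointer expressions,
digest, and canonicalization algorithms below are assumptions for
the example, not the exact ones used in xpointer-1 through
xpointer-6.

import java.util.Collections;
import javax.xml.crypto.dsig.CanonicalizationMethod;
import javax.xml.crypto.dsig.DigestMethod;
import javax.xml.crypto.dsig.Reference;
import javax.xml.crypto.dsig.XMLSignatureFactory;
import javax.xml.crypto.dsig.spec.TransformParameterSpec;

public class XPointerReferenceSketch {
    public static void main(String[] args) throws Exception {
        XMLSignatureFactory fac = XMLSignatureFactory.getInstance("DOM");

        // Same-document reference over the whole document; combined
        // with an inclusive-with-comments C14N transform so comments
        // are retained in the digested data.
        Reference wholeDoc = fac.newReference(
                "#xpointer(/)",
                fac.newDigestMethod(DigestMethod.SHA1, null),
                Collections.singletonList(fac.newTransform(
                        CanonicalizationMethod.INCLUSIVE_WITH_COMMENTS,
                        (TransformParameterSpec) null)),
                null, null);

        // Same-document reference selecting an element by ID via
        // xpointer(id(...)); the ID value 'object' is a placeholder,
        // not one taken from the test files.
        Reference byId = fac.newReference(
                "#xpointer(id('object'))",
                fac.newDigestMethod(DigestMethod.SHA1, null),
                Collections.singletonList(fac.newTransform(
                        CanonicalizationMethod.INCLUSIVE,
                        (TransformParameterSpec) null)),
                null, null);

        System.out.println(wholeDoc.getURI());
        System.out.println(byId.getURI());
    }
}

(A bare-name URI such as "#object" would be the simpler
same-document alternative; if I recall correctly, the spec only
recommends the xpointer() forms rather than requiring them, which
is presumably why they merit dedicated interop tests.)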

(2) Include the dnString tests. All but one implementor tested
these; that one shares the same underlying implementation as
another implementor that did test.

dnString-4
dnString-6
dnString-8

How should we report the one implementation that did not test
because it shares the same underlying implementation as another,
given that the report will be made public? (Mark it as passed,
with a footnote that it is the same as the other passing
implementation?)

(3) Should we include the diffRFCs tests?

diffRFCs-1
diffRFCs-2
diffRFCs-3
diffRFCs-4
diffRFCs-5

(4) Do not include the defCan-2 and defCan-3 tests, since they
were optional. [member minutes
http://www.w3.org/2007/12/04-xmlsec-minutes.html#item04 ]

defCan-2
defCan-3

(5) Should we include defCan-1? Implementations of it are limited
because it requires specific interfaces to be available in the
implementation.

defCan-1

Should we list it as an optional test and only show who performed
it (i.e., not show PASS or FAIL)?

(6) Are there any other tests to consider?

Other thoughts on XML Signature implementation report contents?

Thanks

regards, Frederick

Frederick Hirsch
Nokia
Received on Tuesday, 15 January 2008 15:04:00 GMT
