- From: C. M. Sperberg-McQueen <cmsmcq@blackmesatech.com>
- Date: Tue, 07 Jun 2022 08:10:21 -0600
- To: John Lumley <john@saxonica.com>
- Cc: public-ixml@w3.org
John's report inspires me to offer a similar report, for the record.
As of yesterday, Aparecium's test report reads in part:
<tc:description>
  <tc:p>Tests run / passed / failed / not-run / other:</tc:p>
  <tc:p>Grammar tests (73): pass: 73. </tc:p>
  <tc:p>Test cases (229): fail: 16. pass: 213. </tc:p>
</tc:description>
The discrepancies from John's numbers tell me that I am missing some
recent updates to the tests.
Aparecium's test failures all involve recently added tests for prologs,
dynamic errors, and insertions. Aparecium accepts version declarations
but does not provide ixml:state="version-mismatch" attributes when it
should; it fails outright on some dynamic errors instead of catching and
reporting them; and it does not yet support insertions.
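For reference, the ixml specification signals an unsupported version by adding an ixml:state attribute to the root element of the output rather than rejecting the input. A sketch of the kind of output Aparecium should be producing (the root element name here is purely illustrative; only the attribute and, as I understand it, the namespace URI are fixed by the spec):

```xml
<!-- Illustrative sketch: "expr" stands in for whatever root element
     the grammar under test produces; the ixml:state attribute marks
     that the declared ixml version was not the one supported. -->
<expr xmlns:ixml="http://invisiblexml.org/NS"
      ixml:state="version-mismatch">
  ...
</expr>
```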
On the plus side, my test harness and test driver now work with Saxon PE
and HE, so I can run the test suite using both BaseX and Saxon as XQuery
implementations. (Support for other XQuery engines is planned but not
imminent.)
Michael
John Lumley writes:
> On 02/06/2022 11:14, John Lumley wrote:
>> As of the test suite sometime on June 1st my processor (must really
>> give it a catchy name) is passing 308 test cases with no
>> failures... Note that each and every grammar test (90) and parse test
>> (218) is counted, though a number of the assert-not-a-grammar
>> declarations are duplicated both in the test set and individual child
>> test cases. Today I'll try to update my version of the test-suite
>> I'm using, and produce some less 'development-focussed' reporting.
>
> As of 7th June, using the current test suite (last modified 4th June)
> my processor executes 163 test sets with 252 test cases and passes 231
> parse tests and 94 grammar tests - no skips, no failures.
--
C. M. Sperberg-McQueen
Black Mesa Technologies LLC
http://blackmesatech.com
Received on Tuesday, 7 June 2022 14:10:41 UTC