Re: shaping up XProc unit test thoughts

James Fuller wrote:
> shaping up some ideas on unit testing with XProc;
> 
> I would place this under a different, optional library... I have
> scratched out the following as a starting point for discussion:
> 
> <t:test-suite>
> 
> <t:test name="check-ex-pipeline" msg="checking pipeline output">
> <p:input port="test-source">
>  <p:pipe step="xform" port="result"/>
> </p:input>
> <p:input port="expected">
>  <p:document href="test.xml"/>
> </p:input>
> 
> <t:assert msg="checking result has title element">
> </t:assert>
> 
> <t:assert msg="checking result has body element">
> </t:assert>
> 
> <t:assert msg="checking result has meta tags">
> </t:assert>
> 
> </t:test>
> 
> </t:test-suite>

Sorry, I'm missing some context. Is the idea that this is an XML 
document that you would programmatically convert into a pipeline and 
then run? If you have separate <t:assert> steps, what's the role of the 
'expected' input?

> now a few questions and thoughts;
> 
> * I guess a t:test-type step could be considered very similar to
> p:viewport
> 
> * I would like to append the output from multiple tests to a single
> p:output; I'm unsure how this is achievable with the current p:output
> definition and the sequences it allows

You generate a *sequence* of documents on a single output and then wrap 
them into a single document in a separate step, with <p:wrap-sequence> 
for example.
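
Something along these lines (just a sketch; the step name 'run-tests'
and the t:test-report wrapper element are made-up names):

  <p:wrap-sequence wrapper="t:test-report">
    <p:input port="source">
      <!-- 'run-tests' stands for whatever step produces the sequence
           of per-test result documents -->
      <p:pipe step="run-tests" port="result"/>
    </p:input>
  </p:wrap-sequence>

That gives you one report document containing all the individual test
results.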

> * I have left the t:assert elements empty for now, but one can imagine
> assertions testing for true, false, xml-equals, xml-not-equals,
> xpath-exists, xpath-not-exists, etc. ... not quite sure if it's a
> parameter or an option ... probably being silly here

If you (as the pipeline author) know the names you want to use then 
they're options. If you don't (and the user of the step gets to choose 
the names) then they're parameters.
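
So if the set of assertion kinds is fixed by whoever defines t:assert,
I'd sketch it with options, something like this (the names are only
illustrative, and the declaration syntax is whatever the current draft
settles on):

  <p:declare-step type="t:assert">
    <p:input port="source"/>
    <p:output port="result"/>
    <!-- the kind of assertion is chosen from a known list, so it's
         an option with a default -->
    <p:option name="type" select="'xpath-exists'"/>
    <!-- the XPath expression to evaluate against the source document -->
    <p:option name="test" required="true"/>
    <p:option name="msg" select="''"/>
  </p:declare-step>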

> * I see such unit tests as valuable part of documentation of code, so
> I would advocate for them living inside p:pipeline

I think that would work, if you add the test namespace to the list of 
ignored namespaces (otherwise the test elements would be interpreted as 
steps).

(I used a similar approach in my XSLT unit testing framework, but it 
seems lots of people like to have the tests in a separate document 
instead, so I'd suggest making both options viable.)
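
To make the embedded approach concrete, it might look roughly like this
(assuming the t: namespace does end up on the ignorable list; the
namespace URI and the pipeline content are only illustrative):

  <p:pipeline name="xform-pipeline"
              xmlns:t="http://example.org/ns/xproc-test">

    <p:xslt name="xform">
      <p:input port="stylesheet">
        <p:document href="style.xsl"/>
      </p:input>
    </p:xslt>

    <!-- ignored when the pipeline runs normally,
         extracted and run by the test harness -->
    <t:test name="check-ex-pipeline" msg="checking pipeline output">
      <p:input port="test-source">
        <p:pipe step="xform" port="result"/>
      </p:input>
      <t:assert msg="checking result has title element"/>
    </t:test>

  </p:pipeline>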

> * should such tests be applied to steps, compound steps, pipelines and
> subpipelines or should I make some differences now in the t:test
> element

I'd invoke tests on entire pipelines. You need ways to specify all the 
inputs, options and parameters and to test all the outputs.
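
In other words, a test would supply the whole environment the pipeline
needs and then check what comes out, something like this (all of the t:
markup here is hypothetical, just to show the shape):

  <t:test pipeline="xform.xpl" msg="end-to-end check of the transform">
    <t:input port="source">
      <p:document href="test-input.xml"/>
    </t:input>
    <t:option name="format" value="html"/>
    <t:output port="result">
      <t:assert type="xpath-exists" test="/html/head/title"
                msg="result has a title element"/>
    </t:output>
  </t:test>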

> * does anyone know of test result microformats out there? ...over the
> years I have seen a few come and go with no adoption. JUnit XML output
> is useful as a starting point
> 
> * test failure: what does it mean? ...I know that when a test fails,
> this is indicated in the output

Don't forget methods of testing whether the pipeline throws an error, 
and whether the error is the expected one.
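
In XProc terms that probably means the test harness wrapping the
pipeline under test in p:try/p:catch, roughly like this (the t: steps
and the error code are placeholders, and the exact shape of the error
documents depends on the current draft):

  <p:try>
    <p:group>
      <!-- run the pipeline under test -->
      <t:run-pipeline href="xform.xpl"/>
    </p:group>
    <p:catch>
      <!-- the catch branch sees the error document(s); check that the
           error raised is the one the test expects -->
      <t:assert type="xpath-exists"
                test="/c:errors/c:error[@code = 'my:expected-error']"
                msg="pipeline fails with the expected error"/>
    </p:catch>
  </p:try>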

> * must think a bit more about issues with context and the inherited
> environment when testing

This is why testing at the pipeline level is good: there's very little 
context that you can't pass explicitly into the pipeline.

> * an implementation detail: a switch to turn understanding of t:
> namespace elements on or off

If they're ignored in the pipeline then that effectively turns them 'off' 
as far as running the pipeline normally goes. Otherwise, you'll extract 
them in order to run them. So I don't see a need for a switch to turn 
understanding on or off, but I might have misinterpreted the method 
you're envisaging using to run the tests.

Cheers,

Jeni
-- 
Jeni Tennison
http://www.jenitennison.com

Received on Tuesday, 12 June 2007 07:50:21 UTC