W3C home > Mailing lists > Public > w3c-ietf-xmldsig@w3.org > April to June 2002

RE: A simple test of XPath filter performance

From: John Boyer <JBoyer@PureEdge.com>
Date: Wed, 22 May 2002 14:37:00 -0700
Message-ID: <7874BFCCD289A645B5CE3935769F0B52328759@tigger.PureEdge.com>
To: <reagle@w3.org>, "merlin" <merlin@baltimore.ie>
Cc: <w3c-ietf-xmldsig@w3.org>

Hi Joseph,

Regarding the interop matrix row, yes, we should add information on the
type of machine that is expected to achieve a certain time performance,
so that those with more powerful machines can adjust the time
accordingly.

Someone a while back wrote in that my time of 500ms for the simple form
was interesting, but that such signatures were unlikely to be used in
practice because you cannot take half a second per transaction on the
server side.  I agree with the 'because' part, but I disagree with the
assertion that such signatures are unlikely to be used in practice.

In the 500ms time I reported, I clearly stated that I did a full
signature generation followed by a full signature verification on a
'typical' client-side computer.  Firstly, the server would be more
likely to validate only, so that's 250ms.  Then, I would expect a server
machine to have four to ten times the 'jam' of a client-side machine
(heck, my home computer is four times more powerful than the 500MHz
CPU/133MHz bus computer for which I cited the time).  This would bring
the expected performance profile of a server-side validate over the
XFDL form I provided to a more respectable 50ms.
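The arithmetic above can be sketched as a quick back-of-the-envelope
calculation.  The two factors (validate-only being roughly half the
work, and a 5x server speedup picked from the four-to-ten range) are
John's estimates from the message, not measurements:

```python
# Back-of-the-envelope estimate of server-side validation time,
# following the reasoning above.  Factors are rough estimates from
# the message, not benchmark results.

client_full_ms = 500      # full generation + validation on the client
verify_fraction = 0.5     # server validates only: roughly half the work
server_speedup = 5        # server assumed 4-10x faster; pick 5

server_validate_ms = client_full_ms * verify_fraction / server_speedup
print(server_validate_ms)  # -> 50.0
```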

In conclusion, the most important thing to keep in mind for the interop
matrix is that the times being reported are for FULL SIGNATURE
GENERATION AND VALIDATION, not just for the document filtering part of
the operation.  The signature generation and validation times should be
specified so that they are acceptable on both client and server
machines, and it would be 'cheating' to use a client-side time on a
server-class machine.
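One way such a measurement might be taken is to time the generation and
validation phases separately and report both.  This is only a sketch:
the generate_signature and validate_signature functions below are
hypothetical placeholders, not a real XMLDSig implementation.

```python
import time

def time_phase(fn, *args):
    """Run one phase (generation or validation) and return its
    result together with the elapsed wall-clock time in ms."""
    start = time.perf_counter()
    result = fn(*args)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    return result, elapsed_ms

# Hypothetical stand-ins for a real toolkit's sign/verify calls.
def generate_signature(document):
    return "<Signature/>"   # placeholder

def validate_signature(document, signature):
    return True             # placeholder

doc = "<XFDL-form/>"
sig, gen_ms = time_phase(generate_signature, doc)
ok, val_ms = time_phase(validate_signature, doc, sig)
print(f"generate: {gen_ms:.1f} ms, validate: {val_ms:.1f} ms, "
      f"total: {gen_ms + val_ms:.1f} ms")
```

Reporting the phases separately makes it possible to quote a
validate-only figure for the server-side case while still publishing
the full generate-plus-validate number for the interop matrix.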

John Boyer

-----Original Message-----
From: Joseph Reagle [mailto:reagle@w3.org]
Sent: Monday, May 20, 2002 3:06 PM
To: John Boyer; merlin
Cc: w3c-ietf-xmldsig@w3.org
Subject: Re: A simple test of XPath filter performance

On Tuesday 14 May 2002 16:19, John Boyer wrote:
> The point I'm driving for here is whether we can figure out something
> RECOMMEND or REQUIRE (in the RFC 2119 sense) which will ensure that
> compliant implementations have a reasonable performance profile.

Honestly, I'm not sure how. Haven't really seen anything like this to 
borrow from. And given what I've learned here I've tried to be more 
specific in xenc without much result for good reasons...

However! I am willing to add a row in the interop matrix if someone can 
provide me two things (they might already exist; just point them out to 
me):
1. A specific instance/example that we want to test over. (I'm sure 
there's an instance of a signature over John's test and expression on 
the list; I lost track of all the examples while I was gone!)
2. A decent metric for evaluation. Is it adequate to say something like 
"Validate this signature on an expected target machine in under 500ms", 
or do we need to specify the type of target machine? I think we're 
talking something around 600 bogomips [1]...

[1] http://www.tldp.org/HOWTO/mini/BogoMips-2.html#ss2.1
Received on Wednesday, 22 May 2002 17:37:35 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 6 January 2015 21:21:37 UTC