

From: Greg Stein <gstein@lyra.org>
Date: Fri, 4 May 2001 18:05:39 -0700
To: w3c-dist-auth@w3.org
Message-ID: <20010504180539.B1374@lyra.org>
On Fri, May 04, 2001 at 12:45:49PM -0400, Jason Crawford wrote:
> <<
> I think the key part of Geoff's post is "the subset that it can use".  The
> problem with allprop is that it will return all the live properties
> irrespective of whether the client is aware of the properties' semantics.
> Sometimes this is what the client wants, say if it is naively displaying a
> property sheet; but most likely it is not since there is no way for the
> client to interpret the values or know if/how they can be changed.
> >>
> Actually I'd think that property sheet case would be pretty common.  And
> removing allprop isn't going to prevent people from doing the same thing...
> now with two requests rather than one.

That is my point exactly. Tossing allprop is not going to affect Geoff's
issue with computed properties.

> And sometimes for wide directories
> it will be difficult to avoid this (suspected) problem even without
> ALLPROP.  If this is actually the pivotal concern, I think the best you can
> do is just warn people of the potential cost we see in using allprop.
> From there on, let time tell if it's really a problem.  If we discover
> it is, let's *really* solve the problem then.  I'm willing to remove
> ALLPROP, but it doesn't sound like doing that would really solve the
> problem, and it's not clear there is a problem.


> > It's the old NRA argument -- allprop doesn't kill servers, clients kill
> > servers ;-)
> At first I thought that analogy was flawed, but as I think about it, I
> think that the situation is analogous.  This discussion seems to have all
> of the same aspects.  The differences I see are...
> 1) I don't think it's clear that there actually is a problem in 2518 that
> we need to solve.
> 2) In 2518, the people that would be victimized by the concern are
> actually in (more) control over whether they are vulnerable.  (Client
> programmers can discover that they don't really want all those random
> properties and perhaps conclude that it's slowing their response time and
> stop using ALLPROP.  I think clients can disconnect if a response is too
> long.  And I think servers (with guidance from us) can choose to reject
> certain requests if they really feel that they are too expensive.)


I'd prefer the guidance approach. mod_dav (in its default configuration)
will simply refuse Depth:infinity PROPFIND requests (of any variety). Not
necessarily nice, but that was the approach I took. I'd much rather have a
"recommendation" from the Working Group on how to do that.

> <<
> I have to agree that it is a stealth action to undermine (the effects of)
> allprop.
> >>
> I'm guessing you're joking, but I'd like to hear why that was put in that
> spec.  Was there some issue involved that we haven't mentioned here?

I doubt he was joking. It truly was a stealth action; I'd never noticed it
before. It looks like it's in sections 14.1 and 15, at least.


In Lisa's post, she posits some possible scenarios. Here is an *actual*
scenario, today:

Subversion (SVN) (the version control system, which uses DeltaV for its
network protocol) exposes the notion of properties on the resources. The
client can get/set them at will, and they are committed along with content
changes to the repository.

SVN uses a CVS-like model, or "client-side" if you will. The content is
pulled down to the client, edits are made, and at commit time, everything is
pushed back up to the server. (blah blah blah about conflict resolution)

So... when the checkout is performed and content is pulled down to the
client, we need to do a PROPFIND/allprop to fetch all the properties that
may have been attached to the resource. The user can then add/change/delete
the properties, and commit them back.

That allprop is the key, and our actual use scenario.
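The request body for that checkout-time allprop is tiny. Here's a sketch of building it (hypothetical Python using the stdlib ElementTree, not SVN's actual client code):

```python
import xml.etree.ElementTree as ET

def allprop_body():
    # Build the PROPFIND/allprop body an SVN-style checkout would send
    # to fetch every property on a resource in a single round trip.
    ET.register_namespace("D", "DAV:")
    root = ET.Element("{DAV:}propfind")
    ET.SubElement(root, "{DAV:}allprop")
    return ET.tostring(root, encoding="unicode")
```

One request, one response, and the client never needs to know the property names in advance.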


I explained how a propname followed by a fetch was non-trivial for the
client, but Tim said "nah, that's trivial." Sorry, but I have to disagree.
You're talking about a good chunk of code to union all of those names and
their namespaces, assigning prefixes to all of them, generating the new
request with all that data, then parsing through all the results where you
have a mix of 200'd properties and 404'd properties. That is very
non-trivial in my book.
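To make the point concrete, here is a rough sketch of just the request-building half of that dance (hypothetical Python of my own; the name build_prop_request is invented, and parsing the resulting multistatus with its mix of 200 and 404 propstats is still missing on top of this):

```python
import xml.etree.ElementTree as ET

def build_prop_request(qnames):
    # qnames: list of (namespace, localname) pairs unioned from a prior
    # PROPFIND/propname response. Assign a prefix per namespace, then
    # emit a <propfind><prop> body naming every property -- the "good
    # chunk of code" the propname-then-fetch approach forces on clients.
    qnames = list(qnames)
    prefixes = {}
    for ns, _ in qnames:
        if ns not in prefixes:
            prefixes[ns] = "p%d" % len(prefixes)
            ET.register_namespace(prefixes[ns], ns)
    root = ET.Element("{DAV:}propfind")
    prop = ET.SubElement(root, "{DAV:}prop")
    for ns, name in sorted(set(qnames)):
        ET.SubElement(prop, "{%s}%s" % (ns, name))
    return ET.tostring(root, encoding="unicode")
```

And that still leaves the second round trip, plus walking the 207 response to separate the properties that came back 200 from the ones that 404'd.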

[ and yes: I'm both the client and server implementor ]


Greg Stein, http://www.lyra.org/
Received on Friday, 4 May 2001 21:02:27 UTC
