Re: HTML.next and Rechartering

On Mon, Sep 12, 2011 at 3:59 PM, Carr, Wayne <wayne.carr@intel.com> wrote:
> Some organizations have no explicit patent obligations.  Some have obligations that apply only to one's own contributions (not to the whole Work).  For W3C, the obligation is for all WG Members for the whole specification (not just their contributions to it).

Right.

> Since they will be obligated for the contributions of others, there are periods where WG members are permitted to opt out of specific patent claims (to say they won't license them).  Those periods are after the first formal public draft and after Last Call.  That means they get a last chance to opt out of patent claims that apply to new content after the last substantive change to the draft (since Last Call repeats if there are further substantive changes).

I follow this.

> That's why there is a necessary relationship between the technical content of the drafts and the legal phases for opt-out.

Okay, well, clearly there's a necessary relationship between the
technical content of the drafts and the legal phases for opt-out.  The
question is why the *requirements to reach* those legal phases should
be technical at all.  Currently, there's no patent protection for
anything until it reaches REC, and REC requires two interoperable
implementations of every feature.  Why should there be no patent
protection until there are two interoperable implementations of every
feature?  Why not just take an arbitrary snapshot that may or may not
have interoperable implementations, and have the patents licensed to
any implementation of that?  As long as the snapshot is frozen, its
technical quality should make no difference.

> It sounds like some people don't like that.  What you get out of it is that all WG Members agree to license essential patent claims for the whole specification, not just for what they contributed themselves.
>
> Other organizations that either have no patent obligations at all or that restrict obligations to one's own contributions (not the whole spec) have narrower patent grants.  It isn't just arbitrary overhead; it's to enable the broader patent grants.

I understand this, but it doesn't seem to preclude patent licensing
for arbitrary snapshots instead of only for snapshots with two
interoperable implementations.

On Tue, Sep 13, 2011 at 6:27 AM, Robin Berjon <robin@berjon.com> wrote:
> But that's not the point. Nothing in this discussion hinges usefully on the fact that a specification may be final or not. What we collectively need boils down to satisfying two aspects:
>
>    • a specification that is continuously improved in a tight feedback loop with the reality of implementation; and
>    • at regular intervals, anchors of stability in the above flow that have specific characteristics.
>
> Aside from the observation that we might wish to base an evolution of the process on something like git-flow, I think that the point of contention is the set of characteristics that these anchors must have.

Right.

> In my experience patent people like there to be some (attempted) grounding in reality for RF licences that constrain what may be licensed under some form of reality principle so as to avoid a group cramming features into a draft in order to weaken some given intellectual property without anyone really intending to implement the whole thing. That's why the RF licence can be limited to bona fide implementations of the REC. Requiring actual implementations is far from ideal in many ways, but it was deemed an acceptable constraint. Note that the RF licence "may be suspended with respect to any licensee when licensor is sued by licensee for infringement of claims essential to implement *any* W3C Recommendation" (my emphasis). This makes RECs quite powerful in that they have IP implications even for non-participants. This naturally means that any form of snapshot with RF implications will require some form of patent review. I don't personally much care for patents, but if we want to get RF licences we need to strike a workable deal with the people who do.

Since only implementations of the standard are covered, it seems like
there's no reason to require in advance that the standard be
interoperably implemented, right?  If some speculative feature in the
draft never gets implemented, patent holders have lost nothing by
committing to license patents for it, because the license only ever
attaches to actual implementations.  So this suggests that frozen
snapshots of random Editor's Drafts are just as good as RECs.

> Also please note that snapshots aren't used solely for the PP. Contracts often require something real to anchor off of. For this you need some form of version tagging that's considered stable, or even a stable branch if it's widely recognised to be trusted as such. Whether or not such contracts are daft is not really a question here — it's an extremely widespread practice that's unlikely to change inside even of a decade or two.

I grant that the practice exists and makes sense in some cases, but
that need seems served well enough by stable snapshots of random EDs.
It doesn't demand the incredibly long, tedious refinement process of
the current REC track.
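
To make that concrete (a rough sketch borrowing Robin's git-flow
analogy; the tag name is hypothetical), a "stable snapshot" could be
nothing more than a signed tag on whatever the ED happens to be at
that moment:

    # freeze today's Editor's Draft as an immutable anchor
    git tag -s snapshot-2011-09-13 -m "Stable snapshot of the ED"
    git push origin snapshot-2011-09-13

A contract (or a patent-review trigger) could then reference
snapshot-2011-09-13 directly, independently of whether anyone has
implemented its contents yet.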

> Finally please keep in mind that just because RECs have at times been meaningless in the past, they (or whatever they are replaced with) don't have to be in the future. We can sediment to REC only what's stable and tested. Also note that doing so in a timely manner can be facilitated by smaller, modular specifications (just sayin').

This is really the problem.  When REC was meaningless, RECs were
really just arbitrary snapshots of EDs with some minor stabilization
work, and that was fine.  Now that there are real technical
requirements for reaching REC, the patent obligations suddenly don't
kick in until many years later.
