Most folks who are hearing about this haven't directly participated in
a community standards effort, or a more formal standards body. They
think the W3C/IETF/OASIS "covers it".
But I think the sense of folks here is that there needs to be
something lighter weight that's only focused on the minimum needed for
a spec to become widely adoptable. For me, that's IPR hygiene -- almost
everything else can be done *easily* without an org (save, maybe, the
organizational standup of a new org to hold/manage IPR). Having
slogged through this IPR policy stuff several times, I'm really happy
to see this effort to create a reusable framework for community
efforts. I only hope it remains lightweight and facilitates the widest
range of community efforts possible.
-Gabe
Are these criteria for content, or merely for openness?
Is this group trying to be some sort of judge of technical merit, or
of market value?
-Gabe
--
Gabe Wachob / gwa...@wachob.com \ http://blog.wachob.com
The ideas in this email: [ ] I freely license [X] Ask first [ ] May
be subject to patents
I'm not going to speak on behalf of the group of course at this point
but would be interested in your thoughts on this notion of "criteria".
Chris
Sent from an iPhone Classic.
The more this group goes outside just providing the legal/IPR
framework, the more I get nervous.
What exactly is the purpose of being a gatekeeper w/r/t competing
specs? Why *not* let the market decide if two "competing" specs come
out of efforts under the OWF umbrella? This org's purpose is not to
promote a certain spec over another, except as to the "openness", right?
I'm just really worried that once you get into the "this spec is
blessed and this isn't", for any reasons other than IPR openness, you
instantly become un-lightweight, and the purpose gets muddled.
Furthermore, you likely end up turning away potential work that
*could* be useful and would leverage the IPR framework in OWF.
-Gabe
Most standards bodies are tall and skinny...
-Gabe
IMHO even just defining clear and simple processes and documents
(IPR-related mainly) would be a huge contribution to the diffusion of
open standards.
It will be of great value to the big corporations as they'll know that
a spec has been developed using a commonly agreed (legal) framework,
BUT it will be of greater value to the small players and groups that
may be able to come up with interesting specs but may not (and usually
don't) have the experience and ability to go beyond the technical
spec.
In the end, with OWF, we may come out with something akin to an
IPR-commons (or Open-IPR, doing for spec IPR what CC did for the
licensing of creative work).
Also important is that these efforts are grounded in some real prior
work that has succeeded and has been validated "by the market", such as
the process and IPR work around OAuth.
Luca Mearelli
A great quote from Stephen Walli at OSCON this week (which I hope I remember right):
Standards are how companies declare war against the market leader.
(http://en.oreilly.com/oscon2008/public/schedule/detail/2313)
EHL
In fact, if you describe OWF as "Apache for specs", you could describe
OASIS as an "Apache for Enterprisey Specs" ;)
Whether OWF is OASIS-Lite or something else, there are probably a lot
of interesting things to borrow from at:
http://www.oasis-open.org/who/policies_procedures.php
James
--
James Tauber http://jtauber.com/
journeyman of some http://jtauber.com/blog/
I'm worried about the whole filtering process inside ASF - totally
appropriate for that environment, but it feels like friction we don't
need for specs...
-Gabe
--
Point is, the more we exercise criteria like the following ones
[1] from the ASF, the more it looks and feels like a standards body:
Alignment / Synergy
* Use of other ASF subprojects
* Develop synergistic relationship with other ASF subprojects
If it were up to me, any group that met a minimum bar could come
into the org and comply with the IPR rules (and maybe extra rules
about openness, including diversity of participation, transparency,
etc) and produce a spec. And the meaning of OWF's association would be
that the IPR hygiene is clean and the spec was made in a minimally
transparent way.
The Apache meritocracy is about producing good quality code, where
good is defined by "being done by people with good reputation". I just
get really nervous when a group, no matter how experienced and well
respected the leaders/committers are, decides a spec gets a thumbs up
or thumbs down before a spec even gets to market. It dilutes the
purpose of this org, I believe. Call me a free marketer ;)
So I hear you about lightweight and focus on IPR, but I'm trying
to understand the purpose of the Apache process for promoting work
from "candidate" to podling to project and why that's needed here.
Maybe I'm just being too literal here - but why do we need anything
other than "in/out" (and maybe "dead to inactivity or failure to
comply with IPR and/or process")? Once you come and show that your
contributors are good to go with the OWF IPR rules (and that there's a
legitimate community effort -- but thats a low bar I think), what else
should you need?
-Gabe
[1] http://incubator.apache.org/incubation/Incubation_Policy.html#Graduating+from+the+Incubator
Too many specs?
Diluting the OWF brand?
The OWF not being "relevant enough"?
Can we do half as much and be twice as successful?
-Gabe
> My largest concern is time and resources of smart people. The more
> that get involved then the more that we can do. We shouldn't start
> with an open specification for a DSL modem authentication protocol
> as I doubt we have the domain expertise to do a good job.
But following on from the OASIS-Lite meme, would we want to allow a
group of DSL modem auth protocol experts to create a working group
under OWF to do this if they came to us?
If we are going to learn from other organisations, then a second thing
that is needed is process. One of the main reasons the ASF works, IMO,
is because of two pretty simple rules:
1. Meritocracy
2. Three +1s, anyone can veto but must justify.
The IETF lets anyone participate equally. This is broken because WGs
can be stalled by any idiot with time on his hands. The ASF allows
anyone to speak, but only votes from committers are counted.
The W3C lets you buy a voice - and won't give you one unless you pay.
I hope it's obvious why this is broken.
The three +1s with veto allows progress to be made rapidly without
having to pause for formal votes on a regular basis. The justification
requirement seems to pretty effectively prevent hidden agendas and
frivolous vetoes (no-one is going to say "I vetoed because my plan is
better than yours").
The lesson the ASF learned is that you actually do have to require
governance and process or you end up with some very dysfunctional
projects. This is largely why the incubator exists.
Could you expand on what you mean by "the whole filtering process inside ASF"?
I would agree that these should be suggestions rather than requirements.
> If it were up to me, any group that met a minimum bar could come
> into the org and comply with the IPR rules (and maybe extra rules
> about openness, including diversity of participation, transparency,
> etc) and produce a spec. And the meaning of OWF's association would be
> that the IPR hygiene is clean and the spec was made in a minimally
> transparent way.
+1
> The Apache meritocracy is about producing good quality code, where
> good is defined by "being done by people with good reputation". I just
> get really nervous when a group, no matter how experienced and well
> respected the leaders/committers are, decides a spec gets a thumbs up
> or thumbs down before a spec even gets to market. It dilutes the
> purpose of this org, I believe. Call me a free marketer ;)
The meritocracy decides who is trusted to screw up the code base, not
the popularity of the product.
> So I hear you about lightweight and focus on IPR, but I'm trying
> to understand the purpose of the Apache process for promoting work
> from "candidate" to podling to project and why that's needed here.
> Maybe I'm just being too literal here - but why do we need anything
> other than "in/out" (and maybe "dead to inactivity or failure to
> comply with IPR and/or process")? Once you come and show that your
> contributors are good to go with the OWF IPR rules (and that there's a
> legitimate community effort -- but thats a low bar I think), what else
> should you need?
Process that ensures that genuine participants are treated fairly.
I think it is a given that we can only bind participants to IPR
agreements. Patent trolls are out of scope.
This is a great summation - thanks. While the OWF is not a standards
body, I expect that specs that come out of the OWF process with a
clean IPR bill of health will be easier to move through the standards
process, since the IPR issues will have already been dealt with.
--Steve
On Fri, Jul 25, 2008 at 7:59 AM, DeWitt Clinton <dew...@google.com> wrote:
> The end result of project that goes through the OWF incubation process is a
> working specification with clean IP that has demonstrated the ability to
> sustain a diversity of contributors. Nothing more, nothing less.
--
Steve Ivy
http://redmonk.net // http://diso-project.org
This email is: [ ] bloggable [x] ask first [ ] private
Thanks DeWitt, that greatly reassures me.
> Regarding IPR, yes, I think what we're trying to do is a) create
> some commonly agreed upon language around specification licensing,
> a la the CC license for copyright or the Apache license for source
> code, and
I can see that working, with some legal resources; of course, CC had
Larry Lessig to turn a sea of bespoke licenses into pressing a set of
simple radio buttons.
There are existing licenses to reuse (the W3C document license springs
to mind), but IANAL, and I guess for this to work, we're going to need
one. Or a bunch.
> b) ensure that all project contributors have agreed to those terms.
That's something every collaborative effort has to tackle at some
stage. Having a transparent, off the shelf process which scales
horizontally will help many and be invaluable. Is that the intent?
> The OWF governance model should be optimized for participation by
> busy engineers, not full-timers.
Bang on. Don't make me think!
However, I think you are maybe mischaracterizing OASIS a bit and I'd
just like folks to keep OASIS in mind as a model in addition to ASF.
OASIS doesn't make any attempt to "de-dupe" specs. OASIS has a very,
very lightweight process (scales down to a handful of individuals if
you want - except the final OASIS-wide vote, which has a lot of warts).
OASIS, in fact, does no real filtering at all except minimum bars of
transparency and adherence to one of several IPR modes (only one of
which folks here would find acceptable for "open standards"). When I
was saying "filtering" before re: ASF, it was not a dig - but rather a
statement that anyone who wants to be an ASF project cannot just show
up and be a project - in fact, you have to convince disinterested
parties that you belong there. Probably the right answer for OWF is
somewhere in between.
I'm not suggesting we copy OASIS, any more than you guys are
suggesting we copy ASF. Let's just not throw the baby out with the
bathwater....
I think the first step for OWF is a statement of rather detailed goals
and principles because otherwise, this thread will go on forever ;)
-Gabe
I think that, realistically, the way things are today the traditional
standards bodies have stopped adding anything meaningful. Something
light and open, like what has been described here for OWF, is really
all that should ever be necessary - even in the long run.
Really excited to see what comes of this!
--
- Stephen Paul Weber (Singpolyma)
Web: http://singpolyma.net/
Twitter: http://twitter.com/singpolyma
IM: singp...@gmail.com
Thanks Dewitt, this is great.
What about compliance with the standard?
Arguably the most important reason for a standard is to assure
interoperability among different implementations. Contributing one's IP
(whether licensed or through non-assertion) to a standard with minimal
restrictions is fine, but shouldn't that extend only to those compliant with
the standard? I don't mind giving IP to help create a standard, even on a
royalty-free basis, but I'm not too excited about non-assertion clauses that
favor forking and potential incompatibility.
Wouldn't compliance be an obvious and fair quid-pro-quo for IP
contributions? That would make it at least one kind of restriction on
declared patents that makes sense.
-j
--
Joe Andrieu
SwitchBook
http://www.switchbook.com
j...@switchbook.com
+1 (805) 705-8651
It makes sense at first sight, but opens up a hole for gaming.
Measuring "compliance" is really, really hard, and introducing any
kind of dependency for IP grant ("compliance" and "necessary claims"
being examples) immediately renders open source developers unsafe due
to uncertainty.
* Adding "required claims" language (where the grant of rights is
dependent on the only way of implementing your software being to use
the patent) requires an outside expert to help determine eligibility.
* Requiring "compliance" renders the rapidly iterative "use &
improve" approach of open source impossible as only the final,
"compliant" version will be eligible for the grant (and even then only
after following some form of onerous certification process).
I recommend that OWF not allow either "necessary claims" or
"compliance" as predicates to IP grant. A straightforward,
unconditional, sublicensable, non-expiring and ownership-change-
surviving non-assert is the answer in my view. Plenty of dragons to
tame in those words, mind you.
S.
Seems to me the difficulty with compliance depends, in large part, on
whether or not the spec is complete enough to test. As you imply, an
underspecified standard is easy to rev and hard to test for compliance.
But that doesn't mean you can't bake compliance into the spec, with a test
suite.
The last thing I want to do is get into a market-driven standards war with
Microsoft (or any big company) over which variant of a spec is going to
actually be supported by the majority of service/content providers on the
net.
We've seen that mess with HTML, CSS, and JavaScript. There's gotta be a
better way.
It seems that the extra work to properly define compliance more than pays
for itself in interoperability. Smart wording of compliance licensing could
manage a reasonable distinction between development and production code.
Code in development must be able to be iteratively evolved, but, IMO, it
shouldn't be moved to production until it is actually compliant with the
standard.
Isn't it precisely that kind of distinction that OWF is here to figure out?
If all we're here for is to define good IPR = non-assertion, that seems to
miss the point. Mind you, compliance-based IPR policy may not be right for
every project, but seems like finding one way to do it well is the kind of
thing that could be leveraged across a lot of projects.
-j
--
Joe Andrieu
SwitchBook
http://www.switchbook.com
j...@switchbook.com
+1 (805) 705-8651
> -----Original Message-----
> From: open-web...@googlegroups.com [mailto:open-web-
> dis...@googlegroups.com] On Behalf Of Simon Phipps
> Sent: Saturday, July 26, 2008 1:41 AM
> To: open-web...@googlegroups.com
> Subject: Re: Open Web Foundation characterization
Two examples: the Delicious API vs the Ma.gnolia API. The latter was
richer, possibly better and more intentionally designed; the former
simpler and easier to implement. The former took off, and only after
Ma.gnolia mirrored the Delicious API did people start to build against
it. There was no formal compliance testing -- either it worked
or it didn't, and if it didn't you spent more in support costs dealing
with angry or frustrated customers.
Second is the Flickr API, where a number of services have sprung up
that implement it, or portions of it, depending on the purpose of the
application. Again, no formal compliance process there, and yet their
API specification has been both very successful and quite influential
on other similar APIs.
Those are cases informing my thinking here -- as well as cases like
OpenDD or oEmbed, where the specs might be a page or two long and no
more. You typically need compliance testing in systems where
complexity requires more attention than a single developer's. I think
we'd like to enable and encourage an ecosystem of simpler, more direct
technologies and then see where that leads us, through the application
of Darwinian open source survival-of-the-easiest to socialize and
implement!
Chris
If we want this to be true, then we will have to depart significantly
from existing IPR agreements (or wash our hands of the problems they
introduce), since they pretty much universally talk about "Necessary
Claims", which refer to "Compliant Portions" (of implementations),
which, obviously, have to be compliant with the spec.
So, implementations that are not compliant are not covered by any
non-assert or licence.
Now, I'm no fan of the "Necessary Claims" language but it's pretty
obvious that it's going to be hard to entirely eliminate, since that
would mean that participants would be granting a free-for-all on all
their patents that can be in any way read to be relevant to the
specification.
> Now, I'm no fan of the "Necessary Claims" language but it's pretty
> obvious that it's going to be hard to entirely eliminate, since that
> would mean that participants would be granting a free-for-all on all
> their patents that can be in any way read to be relevant to the
> specification.
Sun has done that for ODF. It was cheap and easy for Sun to do (in
terms of legal due diligence), and it declares implementations of ODF
risk-free (at least from Sun) for open source developers.
S.
Pointer?
> Seems to me the difficulty with compliance depends, in large part, on
> whether or not the spec is complete enough to test. As you imply, an
> underspecified standard is easy to rev and hard to test for
> compliance.
I don't think it's to do with "underspecification". Java was and is
very thoroughly specified, yet the initial (significantly complex)
compliance tests published in the late 90s were still holey enough to
drive a coach through. Over the years they have become gargantuan and
thorough, but in some cases it is to the point where they test for now-
rarely-used capabilities (CORBA, anyone?). Compliance is
combinatorial, so building compliance suites for any non-trivial
specification will be costly and slow and those who do it will be
likely to seek compensation. And trust me, we don't want that...
> It seems that the extra work to properly define compliance more than
> pays for itself in interoperability. Smart wording of compliance
> licensing could manage a reasonable distinction between development
> and production code. Code in development must be able to be
> iteratively evolved, but, IMO, it shouldn't be moved to production
> until it is actually compliant with the standard.
I admire your confidence, but experience to date shows what you
describe is not easy. And done imperfectly it creates gameability for
the wily corporation.
I'd assert that compliance testing is contrary to the open market
spirit of Apache. Just as the Apache license has no concern for the
uses to which downstream users put code that uses it, so I suspect OWF
should have no concern about the way its specifications are used. The
very best compliance test is a comprehensive open source
implementation. Once it exists, implementors will base their work on
that implementation and those who fail to interoperate with it will
become isolated.
OWF should focus on timely open source implementation. Maybe the
graduation criterion should be the existence of an open source
implementation created by multiple independent parties? As for
compliance: leave it to the standards body that adopts the spec when
it graduates.
S.
The non-assert for ODF is at:
http://www.oasis-open.org/committees/office/ipr.php
I discuss it at:
http://blogs.sun.com/webmink/entry/ten_reasons_the_world_needs
There's no "essential claims" language in Sun's SAML and OpenID
covenants either, links in that blog posting.
S.
Well, that's indeed a good thing, though I wonder how one would decide
whether some piece of software implements a specification, and whether
that is, in practice, any different from "Essential Claims".
My non-lawyerly assessment is that in some sense it could actually be
worse - if I only implement part of a specification, would I be
covered by Sun's non-assert? "Implements the XXX specification" would
suggest to me that an implementation had to implement all of it to be
covered.
To be clear, I am not questioning Sun's intent with these non-asserts
- but I am wondering if they are perhaps too brief...
I'd also note that it is much easier for a company to make this kind
of agreement after the fact, when it is clear what you are giving
away, than when a spec is in the process of being defined. I am
curious whether Sun always did these in retrospect?
--Steve
--
I'm not saying Sun is bad here. Just that verification costs money.
Danese
Was "safe" the outcome, or "acceptable"? I ask because there are
clearly many IPR agreements out there that many companies have found
acceptable, but we wouldn't be here today if they were "safe".
On Sat, Jul 26, 2008 at 2:02 PM, <chris....@gmail.com> wrote:
>
> My instinct is that compliance is out of scope for OWF.
If we want this to be true, then we will have to depart significantly
from existing IPR agreements (or wash our hands of the problems they
introduce), since they pretty much universally talk about "Necessary
Claims", which refer to "Compliant Portions" (of implementations),
which, obviously, have to be compliant with the spec.
Perhaps a suggestion for new specs would be the development of a
testing suite or validator, and implementations must pass the
validator to be compliant? I'm thinking of something along the lines
of the Feed Validator[1] or the APE[2], for example.
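A minimal sketch of what such a validator might look like -- the required fields and rules here are purely invented for illustration, not from any actual spec:

```python
# Hypothetical spec validator in the Feed Validator style: it checks a
# candidate document against a handful of made-up rules and reports
# human-readable violations rather than a bare pass/fail.

REQUIRED_FIELDS = {"id", "title", "updated"}  # invented example rules

def validate(doc):
    """Return a list of violations; an empty list means the doc passes."""
    problems = []
    for field in sorted(REQUIRED_FIELDS - doc.keys()):
        problems.append("missing required field: %s" % field)
    if "updated" in doc and not isinstance(doc["updated"], str):
        problems.append("'updated' must be a timestamp string")
    return problems

print(validate({"id": "1", "title": "hi", "updated": "2008-07-26"}))  # []
print(validate({"title": "hi"}))  # missing id, missing updated
```

The design choice worth copying from the Feed Validator is that the output is diagnostic, not a certification: it helps implementers converge without anyone having to run a formal compliance program.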
--Steve
[1] http://feedvalidator.org/
[2] http://www.tbray.org/ongoing/When/200x/2006/08/11/Meet-the-Ape
In case anyone thinks I am in disagreement with any of this, I'm not. :-)
If we _can_ totally axe any implication of compliance I am very much in favour.
Danese
Your instincts I think were dead on. Necessary claims language is
often there because it's the only way that some organizations feel they
can reasonably participate/contribute in an open standard, if they
hold IPR that might be implicated. This is a substantive topic that
should be guided by stated goals/principles of this organization.
There's a tension in creating IPR policies that are both
implementer friendly and IPR-contributor (esp patents) friendly.
Contributors want to have certain limitations around their
contributions, implementers want to have (at least) a well defined
playing field free from IPR concerns. The goal is to maximize the
size of that playing field. How does this organization want to do
that? (Rhetorical question for discussion).
An example of this tension came up in both OpenID and OAuth
discussions - the issue was around language in the non-assert which
"pulled in" IPR from specs that were referenced in the core spec. Does
the non-assert around the core spec "flow down" to all the referenced
specs? (ie would a non-assert around TLS imply a non-assert around all
the encryption methods referenced in the doc)? What about "optional"
features in the spec, that are not required to implement the core
spec? I'm not trying to raise these issues now, but just trying to
point out that if you are trying to get the most people to contribute
their IPR, you have to think about the perceived risk of contribution
by existing IPR holders. I think that's probably the biggest
substantive challenge for any IPR policy/process this organization is
going to craft.
And I disagree that we have to mandate any "compliance program"
within the org for the purposes of supporting necessary claims
language. The "necessary claims" language really would only come up in
the context of an actual dispute. I just think being too ambitious
with changing the status quo (which typically includes necessary
claims language - despite the discomfort it causes many people) is
counter to what I'd like to see. I'd rather codify (mostly) existing
practice and make it reusable for all comers, rather than try to boil
the IPR ocean at this point -- unless I could be convinced that
maximizing the size of the playing field could be done by changing
existing practice.
-Gabe
--
"Sun irrevocably covenants that, subject solely to the reciprocity
requirement described below, it will not seek to enforce any of its
enforceable U.S. or foreign patents against any implementation of the Open
Document Format for Office Applications (OpenDocument) v1.0 Specification,
or of any subsequent version thereof ("OpenDocument Implementation") in
which development Sun participates to the point of incurring an obligation,
as defined by the rules of OASIS, to grant (or commit to grant) patent
licenses or make equivalent non-assertion covenants."
The crux, for this discussion, is the determination of "any implementation"
and how much of some complete software program is viewed as "any
implementation" of the Open Document Format. For that, one needs to
understand conformance in the OpenDocument Format specification (which is
very tolerant of variations and limited implementations), assess the good
will and good faith of the promiser, and also use common sense in the
matter. I, personally, have no qualms about that covenant and would rely on
it. That declaration is not offered as advice.
- Dennis
PS: It is useful, in this context, to differentiate conformance from
compliance, reserving the second for more formal mechanisms. I would expect
Open Web to stay as far away from compliance and certification efforts as
most standards bodies do, any working-group undertaking of test suites and
conformance-confirmation tools notwithstanding (I love that word, probably
use it wrong).
-----Original Message-----
From: Ben Laurie
http://groups.google.com/group/open-web-discuss/msg/054f0c2e652a2a8b?hl=en
Sent: Saturday, July 26, 2008 07:42
To: open-web...@googlegroups.com
Subject: Re: Open Web Foundation characterization
On Sat, Jul 26, 2008 at 3:39 PM, Simon Phipps <web...@gmail.com> wrote:
[ ... ]
> Sun has done that for ODF. It was cheap and easy for Sun to do (in
> terms of legal due diligence), and it declares implementations of ODF
> risk-free (at least from Sun) for open source developers.
Pointer?
[ ... ]
For example (since it came up at OSCON), the Microsoft Open Specification
Promise covenant not to sue is conditioned on implementations being of the
covered specification. They had to clarify what was meant by conformance in
that respect to allow people to implement parts of a spec that made sense in
a particular case, to allow for the fact that there might be bugs that show
up as deviations, etc.
This was covered in a post from Sam Ramji on that topic:
http://port25.technet.com/archive/2008/07/25/oscon2008.aspx
with details (still too much legal-speak, but ...)
http://port25.technet.com/archive/2008/07/25/osp.aspx
[there's still a problem if there is a patent's essential claim that can be
asserted for implementation features that have nothing to do with the bits
that implement some degree of the specification, but that problem exists
everywhere for everything, and it will apply for any statement about this
that Open Web Foundation comes up with.]
-----Original Message-----
From: David Recordon
http://groups.google.com/group/open-web-discuss/msg/3c72b21c2a8d1809?hl=en
Sent: Saturday, July 26, 2008 01:07
To: open-web...@googlegroups.com
Subject: Re: Open Web Foundation characterization
--David
[ ... ]
The aspect of the open-source spirit that provides for free adaptation of
licensed code to any purpose and mutation to some completely different
application can always run across an IP problem if it no longer embodies an
implementation of the covered specification but does run afoul of a patent's
claims. You cannot expect a safe harbor from a covenant that is tied to the
specification if you end up not implementing the specification in some
derivative of implementing code. (Whether this is a real risk rather than a
speculative bogey is going to depend on context that we don't have before
us.) I don't think this is a problem that any use of specification-tied
covenants can resolve. Is this what you are concerned about?
- Dennis
Dennis E. Hamilton
------------------
NuovoDoc: Design for Document System Interoperability
mailto:Dennis....@acm.org | gsm:+1-206.779.9430
http://NuovoDoc.com http://ODMA.info/dev/ http://nfoWorks.org
-----Original Message-----
From: DeWitt Clinton
http://groups.google.com/group/open-web-discuss/msg/7fcb455deff01a88?hl=en
Sent: Saturday, July 26, 2008 09:01
To: open-web...@googlegroups.com
Subject: Re: Open Web Foundation characterization
[ ... ]
I'm in agreement with much of what Simon says. So I think "Compliant
Portions" will need to be carefully but aggressively (re)considered in the
OWF IPR policies; there is an inherent difficulty in defining the term
"Compliant" without introducing a politicized morass that runs counter to
the open source spirit*. And agreed, that term is extremely poorly defined
in the current generation of open IPR policies.
[ ... ]
My contention is that necessary claims language, at least as practised
in current IPR policies, reduces the playing field to essentially zero
unless you are well-heeled enough to do a lot of due diligence.
In practice, many open source developers just wing it, but surely
that's what we're here to fix?
So, I am not arguing for a compliance program - indeed, I think it is
both impractical and a barrier to progress. I am pointing out the
opposition that we may face and making it clear that I do have some
awareness of the underlying issues. How much we care about that
opposition and where we draw the line is, indeed, an interesting
challenge.
My concern is simply that existing IPR agreements do not cover
implementations of the specification sufficiently well. I agree that
if they are tied to the specifications, then they are tied to the
specifications, but the problem is that mostly they actually cover
some subset of the specification, nothing it references and only when
there wasn't some other way to do it, no matter how stupid. In other
words, from the POV of a developer with nothing at hand except the
specification: nothing at all.
And that's when they got the language right so there aren't gaping holes in it.
Are there any IPR statements that come close to what you are looking for?
I'm interested in what would be satisfactory in closing the gap.
- Dennis
-----Original Message-----
From: Ben Laurie
http://groups.google.com/group/open-web-discuss/msg/e3d35fc1d7c21106?hl=en
Sent: Saturday, July 26, 2008 11:06
To: open-web...@googlegroups.com
Subject: Re: Open Web Foundation characterization
[ ... ]
Danese Cooper wrote:
> I think it's important to mention wrt ODF that Google paid the SFLC a
> fair amount of money to review ODF
When I said that a broad, simple patent covenant is "easy" I meant
that it is cheap and easy for the entity making the covenant. If they
believe that the application of the specification should be without
fear for the open source developer, they make the covenant. No
enumeration of claims is needed and no searches are essential, since
the covenant is scoped on topic and not on the patents themselves.
I've no idea why Google decided to get SFLC's opinion but I'd like to
think that it was because there had never been such a straightforward
non-assert before.
> I'm going to have to defer to Chris DiBona to explain the scope of
> what was decided.
No need. The SFLC position is available at http://www.softwarefreedom.org/resources/2006/OpenDocument.html
Ben Laurie wrote:
> My non-lawyerly assessment is that in some sense it could actually be
> worse - if I only implement part of a specification, would I be
> covered by Sun's non-assert? "Implements the XXX specification" would
> suggest to me that an implementation had to implement all of it to be
> covered.
I wondered this as well and was told that since the scope was not
limited in the non-assert to "compliant" or "complete" implementation
this was not a problem. IANAL & TINLA of course.
> To be clear, I am not questioning Sun's intent with these non-asserts
> - but I am wondering if they are perhaps too brief...
The simplicity is what makes it so powerful. The lack of qualification
and exception makes it very easy to rely upon.
> I'd also note that it is much easier for a company to make this kind
> of agreement after the fact, when it is clear what you are giving
> away, than when a spec is in the process of being defined. I am
> curious whether Sun always did these in retrospect?
Yes, they were each established in retrospect of an early version. The
ODF covenant continues to apply to each new version for as long as Sun
is involved in its production (obviously earlier non-asserts would
continue to apply in the unlikely event of Sun pulling out of the
OASIS ODF TC). I've not studied the other covenants.
Gabe Wachob wrote:
> Necessary claims language is
> often there because it's the only way that some organizations feel they
> can reasonably participate/contribute in an open standard,
And there's the crux of the matter. The core question is, do we want
OWF to accommodate corporations who out of their own caution wish to
impose legal uncertainty on open source developers? My vote would be
"no". Unless, of course, we want OWF to be a standards body...
As Ben says, "In practice, many open source developers just wing it,
but surely that's what we're here to fix?"
David Recordon wrote:
> It's my opinion that formal compliance can come later. I think the
> IETF's
> model of two independent interoperable implementations is a pretty
> decent
> start toward compliance. :)
I don't disagree with the "compliance comes later" sentiment but I'd
vote for "much, much later" personally. But I regard the IETF's "two
interoperable implementations" benchmark as outmoded in the age of
open source. I would rather have all implementations be derived from a
reference implementation created as open source by a diverse community
under a class A license (Apache, BSD, MIT etc). By using the same code
at each end of the wire, interoperability is almost guaranteed. Maybe
this is cheating, of course :-)
S.
Yeah, but. Suppose I want C and you wrote it in Java (a common problem
I face), or Ruby or Python? Or suppose I want it in Java but I don't
like your monolithic behemoth?
Interoperability proves that the spec is correct. Reference
implementations as the only basis detract from that. Bug-for-bug
compliance sucks.
Well, subject to analysis from lawyers, and if Simon's claims of wide
applicability turn out to be something lawyers in cynical mode agree
with, Sun's sounds like a good start.
Well let me ask you this about 'compliance'.
Isn't a testing and QA lab really necessary? We can believe in the distributed web, avoid putting all our eggs in one basket, and not rely upon BigCos, but the nitty-gritty requirement remains: with all these different vendors out there implementing specs, we need to make sure that they all work together.
I think OpenID has been pretty lucky so far, but what if Microsoft adds a bit and then MySpace changes something and all of a sudden there are two flavors of OpenID?
I know you're working real hard to make sure that doesn't happen - but now let's multiply that by OAuth, OpenSocial, Portable Contacts and so on and so on.
With each new spec that the OWF incubates we increase the complexity of interop and solutions coming together smoothly.
DiSO, Google Friend Connect and other 'packaged' sets of standards all rely upon clean interop and solid compatibility. Now add in Facebook Connect, MySpace Data Availability and Microsoft's ever expanding universe of APIs and Live Mesh.
Can we afford not to have a testing and QA lab?
And isn't this EXACTLY what taking BigCo money is all about?
The Bush administration was able to game 'global warming' and 'greenhouse effects' and literally declare that there was nothing bad going on - but it's gonna be real hard for Sun or Microsoft to game a testing lab on interop between all these specs.
And they have people who work for them who do nothing but test compatibility - all day long.
Certainly everything we do from now on has to be tested AGAINST XMPP, OAuth, OpenID and RSS!
Now what this has to do with OWF, I'm not sure. But I'd sure like to see the OpenID Foundation help solve this issue.
And what IS the official relationship between the OpenID Foundation and OWF?
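For what it's worth, the pairwise testing being asked for here is mechanically simple to set up. Below is a hypothetical sketch: the three toy "implementations" and their wire framings are invented stand-ins (not real OpenID or OAuth code), with one deliberately diverged flavor to show how a matrix exposes the "MySpace changes something" scenario:

```python
import itertools

def encode_v1(msg):
    # Frames a message per the (imaginary) v1 spec.
    return "v1:" + msg

def decode_v1(wire):
    # Accepts only v1 framing; anything else is an interop failure.
    prefix, _, body = wire.partition(":")
    if prefix != "v1":
        raise ValueError("unsupported framing: " + prefix)
    return body

def encode_v2(msg):
    # The diverged flavor: same idea, incompatible framing.
    return "v2:" + msg

def decode_v2(wire):
    prefix, _, body = wire.partition(":")
    if prefix != "v2":
        raise ValueError("unsupported framing: " + prefix)
    return body

# Each "implementation" is an (encoder, decoder) pair of functions.
IMPLS = {
    "A": (encode_v1, decode_v1),
    "B": (encode_v1, decode_v1),
    "C": (encode_v2, decode_v2),  # the diverged flavor
}

def interop_matrix():
    """Exercise every ordered (sender, receiver) pair; True means a
    message round-trips intact across the wire."""
    results = {}
    for s, r in itertools.product(IMPLS, repeat=2):
        send, _ = IMPLS[s]
        _, recv = IMPLS[r]
        try:
            results[(s, r)] = recv(send("hello")) == "hello"
        except ValueError:
            results[(s, r)] = False
    return results
```

A lab (or even a cron job) would run this matrix against the real implementations and publish the grid; the lone compatible cell for C against itself is exactly the "two flavors" problem made visible.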
Dude, you are sucking all the fun out of this :-)
More seriously, doesn't testing go on all the time? I know that as an
engineer in a BigCo we are constantly reminded when our
implementations don't fit the spec.
> Yeah, but. Suppose I want C and you wrote it in Java (a common problem
> I face), or Ruby or Python? Or suppose I want it in Java but I don't
> like your monolithic behemoth?
>
> Interoperablity proves that the spec is correct. Reference
> implementations as the only basis detract from that. Bug-for-bug
> compliance sucks.
Completely agree. Just saying that the IETF's "two independent
interoperable implementations" benchmark dates from before the days
when we'd all have the chance to use the same code.
S.
This is probably OT for OWF at this point; I just don't want this to get
into the conventional wisdom without challenge.
- Dennis
-----Original Message-----
From: Simon Phipps
http://groups.google.com/group/open-web-discuss/msg/07c1537a495bc66a?hl=en
Sent: Saturday, July 26, 2008 13:12
To: open-web...@googlegroups.com
Subject: Re: Open Web Foundation characterization
[ ... ]
Just saying that the IETF's "two independent
interoperable implementations" benchmark dates from before the days
when we'd all have the chance to use the same code.
[ ... ]
Reference implementations are valuable, and can provide useful guidance
about implementations in conjunction with a specification. They can also
raise important questions and clarifications needed in a specification.
But I have to say that "the [mono-]code rules" is a bad idea in the context
of *open* interoperability:
http://orcmid.com/blog/2008/07/interoperability-no-code-need-apply.asp
-----Original Message-----
From: Ben Laurie
http://groups.google.com/group/open-web-discuss/msg/1b6f47ba8053b5c7?hl=en
Sent: Saturday, July 26, 2008 13:03
To: open-web...@googlegroups.com
Subject: Re: Open Web Foundation characterization
On Sat, Jul 26, 2008 at 8:39 PM, Simon Phipps <web...@gmail.com> wrote:
http://groups.google.com/group/open-web-discuss/msg/dbf146c1dc21f025?hl=en
[ ... ]
> I don't disagree with the "compliance comes later" sentiment but I'd
> vote for "much, much later" personally. But I regard the IETF's "two
> interoperable implementations" benchmark as outmoded in the age of
> open source. I would rather have all implementations be derived from a
> reference implementation created as open source by a diverse community
> under a class A license (Apache, BSD, MIT etc). By using the same code
> at each end of the wire, interoperability is almost guaranteed. Maybe
> this is cheating, of course :-)
Yeah, but. Suppose I want C and you wrote it in Java (a common problem
I face), or Ruby or Python? Or suppose I want it in Java but I don't
like your monolithic behemoth?
> Interoperability proves that the spec is correct. Reference
implementations as the only basis detract from that. Bug-for-bug
compliance sucks.
[ ... ]
So what qualifies as "sufficiently complicated"? Not SSL/TLS, it
seems. Not HTTP. Not JavaScript. Not C or C++. I can think of many
very complicated specs with no testing and QA labs that seem to get on
just fine (for some value of "fine"). What _does_ have one?
On Sat, Jul 26, 2008 at 8:42 AM, Steve Ivy <stev...@gmail.com> wrote:
Perhaps a suggestion for new specs would be the development of a
testing suite or validator, and implementations must pass the
validator to be compliant? I'm thinking of something along the lines
of the Feed Validator[1] or the APE[2], for example.
--Steve
[1] http://feedvalidator.org/
[2] http://www.tbray.org/ongoing/When/200x/2006/08/11/Meet-the-Ape
So a validator for your spec would be a checklist item? Nice.
Reminds me of the XP rule to "Code the Unit Test First".
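To make the suggestion concrete, here is a minimal sketch of what such a checklist-style validator could look like. The rules and document shape are hypothetical stand-ins, not drawn from any actual spec:

```python
# A spec ships a set of rule functions; an implementation's output must
# pass all of them to claim conformance. Everything below is invented
# for illustration.

def rule_has_version(doc):
    """The document must declare which spec version it implements."""
    return "version" in doc

def rule_version_supported(doc):
    """Only versions this validator knows about are accepted."""
    return doc.get("version") in ("1.0", "1.1")

def rule_has_identifier(doc):
    """The document must carry a non-empty identifier."""
    return bool(doc.get("id"))

RULES = [rule_has_version, rule_version_supported, rule_has_identifier]

def validate(doc):
    """Run every rule; return (passed, names of failing rules)."""
    failures = [rule.__name__ for rule in RULES if not rule(doc)]
    return (not failures, failures)
```

An implementation would run its own output through validate() as part of its test suite, which is roughly what the Feed Validator does for feeds, and the failing-rule names double as the conformance checklist.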
> The Bush administration was able to game 'global warming' and 'greenhouse effects' and literally declare that there was nothing bad going on - but it's gonna be real hard for Sun or Microsoft to game a testing lab on interop between all these specs.
> And they have people who work for them who do nothing but test compatibility - all day long.
> Certainly everything we do from now on has to be tested AGAINST XMPP, OAuth, OpenID and RSS!
Bug-for-bug compliance *works*. Specs are generally useless - they
can describe something brilliant that either doesn't get built for
years or that gets implemented poorly by each implementation
(*cough* web browsers). Reference implementations and even "bug-for-bug
compliance" ensure that everyone knows what's going on and that it
*actually works* after being *actually built*. Rather important,
imho.
--
- Stephen Paul Weber (Singpolyma)
Web: http://singpolyma.net/
Twitter: http://twitter.com/singpolyma
IM: singp...@gmail.com
Multiple implementations simply show that the spec can be implemented
by more than one organisation. That's important as a basic benchmark
of the readability of a spec. If they can interwork to at least some
worthwhile degree, that's a good indication of being basically useful. It
doesn't necessarily have to be linked to conformance statements or a
conformance process.
I'm sure in some cases one reference implementation can work. But in
those cases the implementation is what will become the standard, not
the specification. Is that the desired outcome?
My understanding is that, at present, OWF would encourage community members
to develop specifications that are usable by whatever means seems
appropriate (plugfests, demos, libraries, test cases, all or none of
the preceding), but it is not mandating any normative compliance
statements or methods. Graduating specs would need to adopt the
conformance model required by the standards body they are submitted
into anyway.
S
+1 for somewhat strict values of 'interoperable'
wrt the standards themselves: yes. This is great.
wrt OWF and IPR: bad. Reading this thread reminds me of why Adobe PDF
is still somewhat dangerous: you must be fully conformant or their
lawyers come after you.
Reminds me of TCMs: conformance good; forcing conformance by abusing
IP law, bad.
On Jul 26, 2008, at 10:01, Joe Andrieu wrote:
>
> Wouldn't compliance be an obvious and fair quid-pro-quo for IP
> contributions? That would make it at least one kind of restriction on
> declared patents that makes sense.
It makes sense at first sight, but opens up a hole for gaming.
Measuring "compliance" is really, really hard, and introducing any
kind of dependency for IP grant ("compliance" and "necessary claims"
being examples) immediately renders open source developers unsafe due
to uncertainty.
* Adding "necessary claims" language (where the grant of rights
applies only when there is no way to implement your software except
by using the patent) requires an outside expert to help determine eligibility.
* Requiring "compliance" renders the rapidly iterative "use &
improve" approach of open source impossible as only the final,
"compliant" version will be eligible for the grant (and even then only
after following some form of onerous certification process).
I recommend that OWF not allow either "necessary claims" or
"compliance" as predicates to IP grant. A straightforward,
unconditional, sublicensable, non-expiring and ownership-change-
surviving non-assert is the answer in my view. Plenty of dragons to
tame in those words, mind you.
S.