"Public Test Results Summary" and CCRs


David Blevins

Feb 2, 2021, 10:58:22 PM
to Micro Profile
Recapping/expanding the conversation from today's hangout.

The mindset to put yourself in is that we want to create "informative download pages." There are some things we can require, and some things we can't require but that you should really do anyway.

The goal of the "informative download page" (public tck results summary linked in your certification request) is to make your certification/compliance clear and something the user can kind of dig into a little bit so they don't have to just take your word for it. In your "informative download page" you should have all the standard stuff you'd expect from a download page:

- The name and version of your implementation, with a link to download it (requirement)
- Good branding (not our place to require that, but you should do it)

Since you being compliant is a first-class citizen on this download page you should have information on exactly what you are compliant with and some basic proof/evidence:

- Specification Name, Version
- TCK used
- Total number of tests run and passed
- Java or other runtimes on which you've passed those tests
- OS used

We haven't formally settled the above, but this is something we can nail down together. We could add more things to the list, tweak the list, discuss how we want to communicate them to users, etc. Lots of discussion we can have here. We might decide there are additional requirements that make sense for specific APIs. For example, we might want people to test MP JWT impls against a real JWT provider, or MP Metrics impls against an actual store. If we did require that in the future, we'd probably want the name/version of the API gateway or metrics store you used to certify.
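As a purely hypothetical illustration of how that information might appear on a download page (every product name, version, number, and URL below is invented for the example, not prescribed by this thread):

```text
Acme Server 1.2.3  (hypothetical product)
Download: https://example.com/acme-server-1.2.3.zip

MicroProfile Certification
  Specification:  MicroProfile Config 2.0
  TCK:            microprofile-config-tck-2.0.jar
  Tests:          312 run, 312 passed, 0 failed, 4 skipped (optional)
  Java runtimes:  OpenJDK 8u282, OpenJDK 11.0.10 (one full run each)
  OS:             Ubuntu 20.04 x86_64
```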

The bottom line is we're trying to create more transparency for / educate users so we can create an industry that understands, values and demands being compliant. It should empower users to call BS on a potential bad player who claims it falsely. It should not be done in a way that is "trust us, the right people approved our certification, you don't need to know those details", but "here's our proof to you the user."

We don't want to make the requirements too cumbersome, but it'd be super fantastic if we as implementors even allowed people to see what optional tests we pass or skip. For example, MP JWT has several optional aspects around integration with various Jakarta EE APIs like EJB and Enterprise Security. There's currently zero public record of who is actually running and passing those tests. This kind of information is ideally at users' fingertips. It's definitely a gap we should address someday. For now we just need to start small.

The actual certification request really could just be an issue with a link to the "informative download page", or Public TCK Results Summary as we named it on the Jakarta side. If we do a good job with them collectively, we really don't need to be duplicating all that in the actual certification request github issue.

Since this is all about backing up our claims to the public, one of the things I called out in the last round of votes is that several of the certification requests claimed "passes on java 8 and 11", but then had one set of test results. Which test run did those results come from, 8 or 11? There'd logically be two runs, but you're just showing me data from one run. You should disclose the results from all the runs claimed in your request. The spirit isn't "trust me" the spirit is "show me."

Similarly, if people start running on multiple operating systems we should be presenting users with results from all of those runs. That can certainly be a lot of test results to put on one page, so we can definitely explore allowing those to be links, as long as people don't have to dig terribly deep.

These "informative download pages" also need to be permanent, like any download page would be. If you release version 1.2.3 of your software and put all this data on it you really should supply a link in your certification request for that *exact* version and that page should live forever. Don't supply a generic page like "https://our.suff/downloads.html" that will obviously get overwritten with newer releases. Similarly, don't supply a link like "https://our.suff/microprofile-5-certification.html"... unless of course you never ever intend to release any updates to your software that can pass the MP 5 TCK, for example. New certified release == new page.

As these pages need to be user-friendly and consumable by the average person, like a download page, that also means using a link to a CI job is not a great fit. Aside from the fact they don't last forever, you'd never point a user at the CI job that released your software and say, "it's in there to download if you dig enough, go have fun. I'm sure you can figure it out." Would you use a CI job to generate a nice page or publish your binaries to a more consumable place? Definitely. Same applies here.

Additionally, snapshots or other non-final builds do not work for certification requests. What you're certifying needs to be something permanent that is not going to be deleted. What you certify should actually be something you're ok with people downloading and using.

If you're the first implementation to be certified and are therefore used in the specification's release ballot, we need that build to be reproducible from source. When someone says, "I challenge this test and don't see how anyone could have passed it!" we'll need to be able to see how you passed it; not using similar source from that day, but the actual source from that build. This also means that compatible implementations used in a ballot to release a spec must be open source. The open source definition requires there "must be a well-publicized means of obtaining the source code" for that exact build and binary. Snapshots and a link to the master/main branch of your github repo don't cut it and are not legally compliant with the open source definition. Date stamps don't cut it either; try scheduling a meeting at "4pm on Tuesday" with your friends in Australia, Canada and London and see who shows up.

Compatible implementations that come after ballot can be closed source proprietary software. Knowing that this is possible makes it all the more important for us to set a good example. We definitely do not want closed source impls disclosing their certification results poorly, in very hard to consume ways, such that no one is really looking or can really call BS on whether they truly pass. We want as many eyes as possible on those test results. More eyes keeps everyone honest. That's very important, as we have no means to correct dishonesty other than public pressure. We need to create that pressure.

In short, aim for something that is like a well-branded download page (it can actually be your download page) that has information on your certification presented in a way the average user can appreciate/consume and (hopefully) will read.


--
David Blevins
http://twitter.com/dblevins
http://www.tomitribe.com


Nathan Rauh

Feb 3, 2021, 3:29:29 PM
to MicroProfile
While I'm in agreement with most of what David says here, as someone who is currently struggling to put together a certification request for one of the MicroProfile standalone specs, I've found that one aspect of it is unworkable for implementations that repackage and ship the MicroProfile spec binaries in question:

"Additionally, snapshots or other non-final builds do not work for certification requests."

The problem here is that I can't just merge changes into the product in which I'm providing the compatible implementation that pick up the not-yet-approved final release spec binary from the staging location, just to get a final build of it. Doing that would put our product into a position where our users are running against binaries that aren't, and might not ever be, published to the final location in Maven. Submitting builds with automated test runs of the TCK against my non-merged pull request is perfectly fine. Merging it prematurely is not. But the proposal here asks that I merge first and then run against a final build image, leaving me in a circular situation: I can't create the compatible implementation final build without a publish to Maven, but I can't publish to Maven without the certification approved, which requires the compatible implementation final build.

I understand the reasoning behind the requirement for a final build, and the reasons are good ones. Could we address them another way, maybe by initially allowing the certification request approval with the non-final build, so that we can get the approval and publish to Maven, and then allowing the compatible implementation to come back later and update the results with the final build that later gets released?

David Blevins

Feb 8, 2021, 11:20:08 PM
to Micro Profile
Thanks, Nathan, for the follow up. I'll give my thoughts, but it would be great to have more voices even if to say "sounds good", "maybe x", or any other comments. I have a lot of thoughts, but I don't want that to scare people away from contributing to the discussion or asking questions.

> On Feb 3, 2021, at 12:29 PM, Nathan Rauh <natha...@us.ibm.com> wrote:
>
> While I'm in agreement with most of what David says here, as someone who is currently struggling to get a certification request together for one of the Microprofile standalone specs, I've found that one aspect of it is unworkable for implementations that repackage and ship the MicroProfile spec binaries in question:
> "Additionally, snapshots or other non-final builds do not work for certification requests."
>
> The problem here is that I can't just merge in changes to the product in which I'm providing the compatible implementation that pick up the not-yet-approved final release spec binary from the staging location in order to get a final build of it. Doing that would put our product into a position where we have our users running against binaries that aren't, and might not ever be, published to the final location in maven. Submitting builds with automated tests runs of the TCK against my non-merged pull request is perfectly fine. Merging it prematurely is not. But now the proposal here is asking that I need to merge first and then run against a final build image, leaving me in a situation where I can't create the compatible implementation final build without a publish to maven, but I can't publish to maven without the certification approved which requires the compatible implementation final build.
>
> I understand the reasoning behind why you want to have the requirement for the final build, and they are good reasons. Could we address them another way, maybe by allowing the certification request approval initially with the non-final build so that we can get the approval and publish to maven, and then allowing the compatible implementation to come back later and update results with the final build that later gets released?

I don't know if this clarification helps, but the build does not need to be GA or published to Maven Central.

They can be milestones, alphas, betas, etc. Any form of release works. They can be in a staging repo or any temporary location as long as they're promoted somewhere permanent if/when the vote passes. The bare minimum is that we need a binary people can use, source people can build, and a test results page. The source shouldn't have Maven SNAPSHOTs in it, so the build is repeatable.
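As a sketch of one way to guard against stray SNAPSHOTs (this uses the standard Maven Enforcer plugin and its requireReleaseDeps rule; nothing in this thread mandates this particular mechanism), a project can fail its own build whenever a SNAPSHOT dependency is present:

```xml
<!-- Sketch only: goes under <build><plugins> in the project's pom.xml.
     Fails the build if any dependency is a SNAPSHOT, so a released
     source tree stays repeatable. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>3.4.1</version>
  <executions>
    <execution>
      <id>no-snapshots</id>
      <goals><goal>enforce</goal></goals>
      <configuration>
        <rules>
          <requireReleaseDeps>
            <message>Certified releases must not depend on SNAPSHOTs</message>
          </requireReleaseDeps>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```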

On the Jakarta side the compatible implementation releases that get specs up for ballot are often milestones. The same compatible implementation is often working on and releasing milestones of their implementation long before the final spec vote. We also do not have any concept of "the official API jar." That is actually up to the implementation to supply. We have tests that verify all the expected API classes are there and the signatures are correct.
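As a toy illustration of what such a check boils down to (the real TCKs use dedicated signature-test tooling; the class and method below are JDK stand-ins, not anything from a MicroProfile API), verifying that an expected class and method signature exist is essentially reflection:

```java
// Toy sketch of a signature check. A real signature test targets the
// implementation's API jar; here a JDK class stands in so the sketch
// is self-contained.
import java.lang.reflect.Method;

public class SignatureCheck {

    // True if the named class has a public method with the given name,
    // erased return type, and erased parameter types.
    static boolean hasMethod(String className, String methodName,
                             Class<?> returnType, Class<?>... paramTypes) {
        try {
            Method m = Class.forName(className).getMethod(methodName, paramTypes);
            return m.getReturnType().equals(returnType);
        } catch (ReflectiveOperationException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Optional.orElse(T) erases to orElse(Object) returning Object.
        System.out.println(hasMethod("java.util.Optional", "orElse",
                Object.class, Object.class)); // prints "true"
    }
}
```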

In practice the compatible implementation does use the API jars that are staged for the final spec vote and the compatible implementation is itself in some form of staging. Both will be trashed and recreated if the vote fails.

Does any of the above help open any new options?

I don't have an answer to the issue of creating temporary builds from PRs that are not merged. All the arguments you make about "what guarantee do we have this will be released" apply equally in reverse: the working group as a whole has no guarantee the PR will be merged, that a build will be created in a timely manner, or that the build won't contain other changes that might cause the release to not pass the tests.

The very definition of being a compatible implementation used in a spec ballot is that you have some comfort level with implementing a spec that is not yet final. I don't really know how to eliminate that, but if there are ways we can make that easier let's definitely discuss.

Open to any and all thoughts.


-David

David Blevins

Feb 8, 2021, 11:50:46 PM
to Micro Profile
> On Feb 8, 2021, at 8:20 PM, David Blevins <dble...@tomitribe.com> wrote:
>
> The very definition of being a compatible implementation used in a spec ballot is that you have some comfort level with implementing a spec that is not yet final. I don't really know how to eliminate that, but if there are ways we can make that easier let's definitely discuss.

Would it help if we published an RC of the API jar just before the vote and you could prepare some form of release based on that?


-David

Martin Stefanko

Feb 9, 2021, 12:18:22 PM
to MicroProfile
David,

so in other words, we are not able to avoid doing some form of release? Personally, I also don't like releasing artifacts with a hardcoded staging repository that is going to be outdated two weeks after the release.

In meeting minutes from the last hangout there is a response - "The Compatible Implementation must be long-lived and available via a long-lived URL.  An RC in Maven Central is an easy way to provide for this.  Or, having a download site for the Compatible Implementation with a long-lived link and artifact."

If I understand this, then we can just reference a commit in the history and publish the link to the commit together with the binaries built from this commit in the TCK/CCR result at the implementation's site? Or is it required that the commit is tagged (named)? In that case, can we just tag in GitHub without releasing to any repository and publish the binaries only on the site?

Thanks,
Martin

David Blevins

Feb 9, 2021, 8:47:25 PM
to Micro Profile
> On Feb 9, 2021, at 9:18 AM, Martin Stefanko <xstef...@gmail.com> wrote:
>
> David,
>
> so in other words we are not able to avoid doing some form of the release?

That's the spirit of the process, yes. The intent is we're preparing releases of the spec, tck and at least one compatible implementation all staged in their final form and ready to be pushed to the public the moment the vote passes.

There are often press releases staged and ready to go that praise the compatible implementation(s) and encourage people to download and try them.

If things can't be in their best shape for the vote, the spirit is that we try to figure out how to improve that so we can do better next time.

> Personally, I also don't like releasing artifacts with a hardcoded staging repository that is going to be outdated in two weeks after the release.

If this is the main concern, you can put staging repositories in ~/.m2/settings.xml:

- https://maven.apache.org/guides/mini/guide-multiple-repositories.html

This would allow the release to be no different than any other release in terms of source and binary. I hesitate to mention it as I don't want the perception that this thread is telling vendors how they must cut releases.
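For illustration, such a profile might look like the following (the profile id, repository id, and URL are placeholders invented for the example, not real coordinates):

```xml
<!-- ~/.m2/settings.xml sketch: resolve staged artifacts without
     hardcoding the staging repository in the project's pom.
     All ids and the URL below are placeholders. -->
<settings>
  <profiles>
    <profile>
      <id>mp-staging</id>
      <repositories>
        <repository>
          <id>staging</id>
          <url>https://example.com/staging/orgmicroprofile-1234</url>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>mp-staging</activeProfile>
  </activeProfiles>
</settings>
```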

> In meeting minutes from the last hangout there is a response - "The Compatible Implementation must be long-lived and available via a long-lived URL. An RC in Maven Central is an easy way to provide for this. Or, having a download site for the Compatible Implementation with a long-lived link and artifact."
>
> If I understand this then we can just reference a commit in the history and publish the link to the commit together with the binaries built from this commit in the TCK/CCR result at the implementation site? Or is it required that the commit is tagged(named)? In that case, can we just tag in GitHub without releasing it to any repository and publish the binaries just on the site?

These things would meet the letter of the law, but maybe not the spirit.

If we're going to go down the path of doing the least amount of work on our compatible implementation releases for votes -- just enough to check a box -- and "real" releases after the vote, then I think we just need to talk about it as it affects everything else. That "checkbox" build is not likely something we'd want to promote in an announcement, tweet about or add to start.microprofile.io. That'd mean we're holding our tweets, announcements and other activities around promoting the release until at least one real compatible implementation release is ready.

If we're going to go that route, we'd all have to be on the same page and work out the details and timing.


I'm enjoying this thread, btw. I think it's providing some good context and feel very happy that the rationale behind whatever we do will be well documented. Thanks so much to you and Nathan for engaging. Let's keep this going.

Nathan Rauh

Feb 10, 2021, 9:59:42 AM
to microp...@googlegroups.com
Thanks David - I like that suggestion. I think that would solve the problem. If the vote were instead based on a final release candidate of the API/TCK, we could definitely merge changes to the compatible implementation and even have a released product version based on that. If the vote is approved, then the RC gets promoted to final, with no additional changes. If the vote is not approved, then the RC remains like any other RC, and a new final release candidate correcting whatever was wrong would be needed before requesting a new vote.

This would line up really well with what our spec did because after we published our final release candidate, we had at least one compatible implementation pick it up and put it into a beta release, fully passing the TCK.  And we made no further changes to the spec between that point and needing to request final approval.
If, at this point, I could have just requested the approval based on that, that would have been ideal. But instead, the current process had me proceed to create a final release driver (identical to the RC, except marked final), which is stashed away in the staging area pending approval but otherwise unavailable as an official artifact to be used in a release. If the approval goes through, it will of course become officially available at that point. But if it doesn't, we would have shipped with a version of the spec that customers can't get anywhere, which is a situation compatible implementations aren't willing to put themselves into.
--
You received this message because you are subscribed to a topic in the Google Groups "MicroProfile" group.
To unsubscribe from this topic, visit
https://groups.google.com/d/topic/microprofile/6v65mBjIXiQ/unsubscribe.
To unsubscribe from this group and all its topics, send an email to microprofile...@googlegroups.com.
To view this discussion on the web visit
https://groups.google.com/d/msgid/microprofile/EE6F70AB-D918-4FB9-9A01-ABBB04C15221%40tomitribe.com.




David Blevins

unread,
Feb 10, 2021, 11:30:49 AM2/10/21
to Micro Profile
> On Feb 10, 2021, at 6:59 AM, Nathan Rauh <natha...@us.ibm.com> wrote:
>
> Thanks David - I like that suggestion. I think that would solve the problem. If the vote were instead based on a final release candidate of the API/TCK, we could definitely merge changes to the compatible implementation and even have a released product version based on that. If the vote is approved, then the RC gets promoted to final, with no additional changes. If the voted is not approved, then the RC remains like any other RC, and a new final release candidate would be needed correcting whatever was wrong before requesting a new vote.

Right. And just to be super clear, the spec project would cut an additional final jar from source in the regular way, but the contract would be that there can be no changes of any kind between the RC and the final jar that goes up for vote. Specification teams could even cut them both literally at the same time; do the RC1 release and push that straight to Maven Central, then immediately release the proposed final to staging, with no other commits or time elapsed in between. The commit history is public, so we can all verify the rules were followed. If we find an issue that requires a fix, we do the process again, but now it's RC2 in Maven Central and a new proposed final in a new staging repo.

I'm just brainstorming here, we'd need to get everyone's buy-in that this is something they'd want to do. Hopefully others will chime in and give some feedback on if they think it sounds good or terrible.

If you and Martin like it, maybe we can team up and write a short proposal in the form of a guide/process doc and put it up for discussion in a thread with a more obvious title. If people like it, we publish the doc so all the committers have a guide to follow.

Thoughts?


-David

Nathan Rauh

Feb 10, 2021, 12:03:33 PM
to microp...@googlegroups.com
+1
That's a great idea, creating the RC and final back-to-back to rule out someone accidentally pushing changes in between.









Martin Stefanko

Feb 11, 2021, 3:49:17 AM
to MicroProfile
> If you and Martin like it, maybe we can team up and write a short proposal in the form of a guide/process doc and put it up for discussion in a thread with a more obvious title. If people like it, we publish the doc so all the committers have a guide to follow.

+1, let me know how I can help.

> That's a great idea creating the RC and final back-to-back to rule out someone accidentally pushing changes in between.

That is surely one possible way, but I just want to say that we shouldn't establish it as the only possible solution. I think it would be better if we could list all acceptable ways of delivering a compatible implementation, so that each implementation can find the one closest to their current process. In Narayana, for instance (which I am doing now for the LRA 1.0 CCR), there isn't an established process for beta releases (RCs, Ms, etc.), so we shouldn't require an implementation that wants to help deliver a final version of the specification to change their release process if they don't want to (however, of course, they can :)). In other words, make it as easy as possible for an implementation to support a final release of an MP specification.

Thank you both for talking this through. I now have a clear idea of what needs to be done.

Martin

Emily Jiang

Feb 11, 2021, 6:25:53 AM
to MicroProfile
Hi David,

Doing the CCR against the RCx before the final certainly makes the CCR a lot easier and less fragile. I guess this is also applicable to dependency management. For instance, MP Rest Client has a dependency on MP Config. MP Config does a 2.1-RC1 release and then stages a 2.1 final release. MP Rest Client can do a 2.1-RC1 against MP Config 2.1-RC1 for its CCR, with the MP Rest Client 2.1 final release depending on the staged MP Config 2.1.

I have a few questions about the CCR criteria:
1. Specification Name, Version and download URL on CCR
See an example of the current CCR here.

With your current proposal, under Specification Name, Version and download URL: do we put in the to-be-pushed final release or the RC release? If we put the to-be-pushed final release, the link will be broken for some time before the release is pushed. If we put the RC release, future CCRs for certifying runtimes will have different input. I guess we need to differentiate the CCRs: the one used for the release ballot and the one a runtime uses to certify.

2. The build for CCR
>Additionally, snapshots or other non-final builds do not work for certification requests. What you're certifying needs to be something that is permanent and not going to be deleted.

If you want the build to be permanent, the snapshots or other non-final builds used for the CCR can be moved somewhere permanent where they will not be deleted, though. If the request is for a permanent build, we should just ask that the build be kept and not deleted, instead of asking for an official build. Please clarify.

3. The source is accessible for the build for CCR
> What you certify should actually be something you're ok with people downloading and using. If you're the first implementation to be certified and are therefore used in the specification's release ballot, we need that build to be reproducible from source. When someone says, "I challenge this test and don't see how anyone could have passed it!" we'll need to be able to see how you passed it; not using similar source from that day, but the actual source from that build.

Why can't people run the tests against the binary to prove the tests actually pass? I am not sure whether attaching source is meant to prove it is open source or to make sure the source is buildable. I know the source zip is required for serviceability reasons, but I am not convinced why it is necessary to keep the source jar (since it consumes MP spec RCx) forever.

Thanks
Emily

David Blevins

Feb 16, 2021, 9:08:41 PM
to Micro Profile
> On Feb 11, 2021, at 3:25 AM, 'Emily Jiang' via MicroProfile <microp...@googlegroups.com> wrote:
>
> I have a few questions with the CCR's criteria:
> 1. Specification Name, Version and download URL on CCR
> See an example of the current CCR here.
>
> With your current proposal, under the Specification Name, Version and download URL: do we put in the to-be-pushed final release or RC release? If we put the to-be-pushed final release, the link will be broken for some time before the release is pushed. If we put RC release, the future CCR for certifying runtimes will have different input. I guess we need to differentiate the CCRs: the one used for release and the one used for runtime to certify.

I think perhaps the CCR requirements in general can be eliminated, as they are largely a duplicate of the Public Test Results page (i.e. a download page with certification information). Perhaps we need to stop using the term Public Test Results page and actually call it a download page, or document it in a way that makes clear we recommend implementations include this information on their download pages but are free to dedicate a separate page to the purpose.

Anyway, in the context of the "Public Test Results" what we really need is a link back to a page on the microprofile.io website dedicated to that specification version. This specification version page should have all the specification assets on it: pdf, api jars, tck, etc.

The spirit is the compatible implementations should have a link back to the spec it says it implements. Something more specific than the microprofile.io website or the specification team's section of the website. Something not as specific as just one aspect of the spec, like the API jar or PDF.

That was the thought at least. So in that context, we'd link to the future spec page on the microprofile.io website. Perhaps that's even a current page on the microprofile.io website if we can get them up early when specs are still in draft form.

> 2. The build for CCR
> >Additionally, snapshots or other non-final builds do not work for certification requests. What you're certifying needs to be something that is permanent and not going to be deleted.
>
> If you want the build to be permanent, the snapshots or other non-final builds for using CCR can be moved to somewhere permanent that are not being deleted though. If the request is for a permanent build, we should just ask the build needs to be kept and not be deleted instead of asking for an official build. Pleasae clarify.

The spirit here is that we want releases of all the necessary assets for a specification: API, doc, TCK, compatible implementation. They should all be in their best shape and ready for inclusion in press releases and a big splash. We want people to come running and kick the tires on our available compatible implementations. If things can't be in their best shape for the vote, the spirit is that we try to figure out how to improve that so we can do better next time.

In that regard, is a "saved" snapshot or other non-release the best we can do? Speaking for myself only, I hope we can do better and give the world a more professional impression. Any form of release people would feel comfortable trying is good. Alpha, beta, milestone are all fine, IMO.

The other aspect is that we need at least one open source compatible implementation in our ecosystem. That's connected to your next question.

> 3. The source is accessible for the build for CCR
> > What you certify should actually be something you're ok with people downloading and using. If you're the first implementation to be certified and are therefore used in the specification's release ballot, we need that build to be reproducible from source. When someone says, "I challenge this test and don't see how anyone could have passed it!" we'll need to be able to see how you passed it; not using similar source from that day, but the actual source from that build.
>
> Why can't people run the test against the binary to prove tests actually pass? I am not sure whether attaching source is to prove it is open source or make sure the source is builtable. I know the source zip is required for servicability reasons, I am not convinced why it is necessary to have the source jar (since it consumes MP spec RCx) kept forever.

Per the EFSP we need at least one open source compatible implementation so the world at large can see how it is possible to pass the TCK (which is also open source). This gives us a fully open source ecosystem and is a compromise that allows us to avoid having RIs.

We definitely can have compatible implementations that are not open source and builds that are not open source compliant. They can even be referenced in the ballot. They just can't help us meet our minimum requirement of one open source compatible implementation.

A binary that can't be built from source is not open source. Once you've stamped an open source license header on your code, you still have additional requirements to meet at release time. Specifically, the Open Source Definition[1] requires you to either distribute the source with the binary itself or provide a "well-publicized means of obtaining the source code" for that exact binary. A link to the main branch on GitHub wouldn't be sufficient.

The spirit of the requirement is we're presenting an open source release that's in the best shape for the public to build and use. If there's something we don't like about the requirement that would make us want to cut corners, we should talk about that and see if there's a way to address the concerns.


-David

[1] https://opensource.org/osd-annotated

David Blevins

Feb 16, 2021, 9:36:26 PM
to Micro Profile
> On Feb 11, 2021, at 12:49 AM, Martin Stefanko <xstef...@gmail.com> wrote:
>
> > If you and Martin like it, maybe we can team up and write a short proposal in the form of a guide/process doc and put it up for discussion in a thread with a more obvious title. If people like it, we publish the doc so all the committers have a guide to follow.
>
> +1, let me know how can I help.

If you have the bandwidth to kick-start a draft document we can use as a guide/process doc, that'd be really fantastic. I fear I may be a bottleneck on that aspect and definitely do not want to hold up the show.

> > That's a great idea creating the RC and final back-to-back to rule out someone accidentally pushing changes in between.
>
> That is surely one possible way but I just want to say that we shouldn't establish it as the only possible solution.

I tend to agree. Time always reveals new options. As long as what we're trying to achieve is clear and agreed on, we should allow ourselves the flexibility to evolve the "how."
The "how" is often a best guess that ages poorly. I've done my best to state the spirit of the requirements deliberately so people can suggest multiple "hows." If we always keep that mentality in our community, we'll evolve very well.

If we don't we end up with cargo-cult processes that over time fewer and fewer people understand but everyone still follows.

> I think it would be better if we could list all acceptable ways of delivering a compatible implementation so that each implementation can find the one closest to its current process. In Narayana, for instance (where I am doing this now for the LRA 1.0 CCR), there isn't an established process for beta releases (RCs, Ms, etc.), so we shouldn't require an implementation that wants to help deliver a final version of the specification to change its release process if it doesn't want to (though, of course, it can :)). In other words, make it as easy as possible for an implementation to support a final release of a MP specification.

Feel free to throw any and all ideas into the doc.

> Thank you both for talking this through. I now have a clear idea of what needs to be done.

Thank you for all the questions/comments! These threads are extremely critical to creating a shared sense of "why" so many people can help create more "hows" :)

I'm so glad we have this in our archives.


-David

Emily Jiang

Feb 22, 2021, 6:48:38 AM
to MicroProfile
Thank you David for the detailed explanation! I have made the following suggestions inline with your response!

On Wednesday, February 17, 2021 at 2:08:41 AM UTC dble...@tomitribe.com wrote:
> On Feb 11, 2021, at 3:25 AM, 'Emily Jiang' via MicroProfile <microp...@googlegroups.com> wrote:
>
> I have a few questions with the CCR's criteria:
> 1. Specification Name, Version and download URL on CCR
> See an example of the current CCR here.
>
> With your current proposal, under the Specification Name, Version and download URL: do we put in the to-be-pushed final release or RC release? If we put the to-be-pushed final release, the link will be broken for some time before the release is pushed. If we put RC release, the future CCR for certifying runtimes will have different input. I guess we need to differentiate the CCRs: the one used for release and the one used for runtime to certify.

I think perhaps the CCR requirements in general can be eliminated, as they are largely a duplicate of the Public Test Results page (i.e., a download page with certification information). Perhaps we need to stop using the term "Public Test Results page" and actually call it a download page, or maybe document it in a way that makes it clear we're recommending implementations include this information on their download pages, while they remain free to dedicate another page to this purpose.

+1
Anyway, in the context of the "Public Test Results" what we really need is a link back to a page on the microprofile.io website dedicated to that specification version. This specification version page should have all the specification assets on it: pdf, api jars, tck, etc.

The spirit is that each compatible implementation should have a link back to the spec it says it implements. Something more specific than the microprofile.io website or the specification team's section of the website, but not as specific as just one aspect of the spec, like the API jar or PDF.

That was the thought at least. So in that context, we'd link to the future spec page on the microprofile.io website. Perhaps that's even a current page on the microprofile.io website if we can get them up early when specs are still in draft form.

If we just use the CCR for releasing specs only, I think the CCR should directly link to the artifacts the CCR tests. If it tests the pre-final RCx release, the CCR should explicitly say so instead of saying it tests the final version while it actually pulls in the RCx release. There should be a comment saying the RCx is identical to the staged final. Does this sound fair?
 
> 2. The build for CCR
> >Additionally, snapshots or other non-final builds do not work for certification requests. What you're certifying needs to be something that is permanent and not going to be deleted.
>
> If you want the build to be permanent, the snapshots or other non-final builds used for the CCR can be moved somewhere permanent where they won't be deleted. If the request is for a permanent build, we should just ask that the build be kept and not deleted, instead of asking for an official build. Please clarify.

The spirit here is that we want releases of all the necessary assets for a specification: API, doc, TCK, compatible implementation. They should all be in their best shape and ready for inclusion in press releases and a big splash. We want people to come running and kick the tires on our available compatible implementations. If things can't be in their best shape for the vote, the spirit is we're trying to figure out how to improve that so we can do better next time.

In that regard, is a "saved" snapshot or other non-release the best shape we can offer? Speaking for myself only, I hope we can do better and give the world a more professional impression. Any form of release people would feel comfortable trying is good. Alpha, beta, milestone are all fine, IMO.

The other aspect is that we need at least one open source compatible implementation in our ecosystem. That's connected to your next question.

I thought about this further. In order to make more open source projects (Open Liberty, Payara, etc.) feasible as candidates for CCRs, I think we should do this gradually. It is reasonable to assert that the builds used for a CCR should be clickable and that all links work. The fact is that some product builds that just consume the MP RCx might not be kept forever, and the project might not want to do an alpha/beta/GA release for them either. In order to get the ballot out without much delay, why not use an ordinary build (e.g. a daily build) for the ballot vote, with that build kept until it is replaced by a better beta or GA build? After the release is declared, the CCR would be updated to use the finally released MP spec version, and that release needs to be either beta or GA as part of the post-release process. In that case, the previous build can be deleted and doesn't need to be kept forever. Additionally, the build used for the ballot can be produced within a day, instead of waiting at least 4 weeks (most products have 4-week or longer release cycles). Thoughts?

> 3. The source is accessible for the build for CCR
> > What you certify should actually be something you're ok with people downloading and using. If you're the first implementation to be certified and are therefore used in the specification's release ballot, we need that build to be reproducible from source. When someone says, "I challenge this test and don't see how anyone could have passed it!" we'll need to be able to see how you passed it; not using similar source from that day, but the actual source from that build.
>
> Why can't people run the tests against the binary to prove the tests actually pass? I am not sure whether attaching source is meant to prove it is open source or to make sure the source is buildable. I know the source zip is required for serviceability reasons, but I am not convinced why the source jar (since it consumes the MP spec RCx) needs to be kept forever.

Per the EFSP we need at least one open source compatible implementation so the world at large can see how it is possible to pass the TCK (which is also open source). This gives us a fully open source ecosystem and is a compromise that allows us to avoid having RIs.

We definitely can have compatible implementations that are not open source and builds that are not open source compliant. They can even be referenced in the ballot. They just can't help us meet our minimum requirement of one open source compatible implementation.

A binary that can't be built from source is not open source. Once you've stamped an open source license header on your code, you still have additional requirements to meet at release time. Specifically, the Open Source Definition[1] requires you to either distribute the source with the binary itself or provide "well-publicized means of obtaining the source code" for that exact binary. A link to the main branch on GitHub wouldn't be sufficient.


Thanks for your explanation! I was commenting on the purpose of attaching the source jar. Actually, for achieving the open source compatible implementation requirement, we can add the commit id on the CCR as you indicated (I overlooked that :o). I think it is ok to add the commit id of the build in the CCR to ensure the build is reproducible.
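As a minimal sketch of how a recorded commit id enables that kind of verification: anyone challenging a result could rebuild from the exact commit named in the CCR and compare checksums against the certified binary. This assumes the build is bit-for-bit reproducible; the file paths below are made-up examples, not real MicroProfile artifacts.

```python
# Hypothetical sketch: confirm a rebuild from the recorded commit id matches
# the certified binary by comparing SHA-256 checksums. Paths are examples.
import hashlib

def sha256(path):
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# certified = sha256("downloads/impl-1.0.0.jar")  # binary referenced in the CCR
# rebuilt   = sha256("target/impl-1.0.0.jar")     # rebuilt from the recorded commit
# assert certified == rebuilt, "rebuild does not reproduce the certified binary"
```

In practice jar builds embed timestamps, so bit-identical output usually needs a reproducible-build setup; the point is only that a commit id in the CCR gives challengers a concrete source snapshot to verify against.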