Re: new starr


Lukas Rytz

Jul 6, 2012, 5:42:34 AM
to Gruntz Dominik, Josh Suereth, Eugene Burmako, scala-i...@googlegroups.com
I agree, when a commit has a new starr, there should be a parent revision that is the source of the starr
(not sure if that's always possible, but when it is, it should be done). the commit message should refer to
that parent ("new starr based on abc123").

another thing is that in the past, there used to be a difference between "scala-library/scala-compiler" jar
files from "pack" and the ones in "starr". not sure what that difference was. to build a new starr, one was
supposed to use the "newstarr" ant target. as far as i'm concerned, we can just use the jars in "pack".

Lukas




On Thu, Jul 5, 2012 at 7:50 PM, Gruntz Dominik <dominik...@fhnw.ch> wrote:

Here's Josh's reply:

 

Personally, I like to have the git commit that *generated* the STARR in the history. I get nervous with what Eugene does because sometimes intermediate STARRs have no source they can be regenerated from in the event of an emergency. So, I'd like to two-fer this pull: first you submit a bootstrapping commit, then we make a new starr from that and remove the bad implementation for a dummy one.

 

- Josh

- Dominik

 

 

From: Gruntz Dominik
Sent: Thursday, July 5, 2012 19:20
To: 'Lukas Rytz'
Subject: RE: new starr

 

here it is:

 

1) Make a new build, which produces the jar files for the compiler and the library

2) Replace the jar files in lib/ with the new jar files

3) Make sure that the ant script does not download the jar files (adjust the init.jars target of build.xml accordingly)

4) Recompile the system with the new compiler (in order to run the tests)

5) Upload the new jar files with push-binary-libs.sh (password required => Josh)

6) The script will upload all modified jars to the Typesafe repositories (it takes a while)

7) After the script succeeds, it will update the *.sha files next to the modified jars (see the sketch after this list).

8) Commit the updated *.sha files along with the code.

9) You're done; (re)submit a pull request
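
For concreteness, a rough sketch of what the *.sha bookkeeping in steps 5-8 amounts to: each uploaded jar gets a sibling .desired.sha1 file recording the jar's SHA-1, which the build later uses to fetch the matching binary. The real work is done by push-binary-libs.sh; the helper below and the exact file format are assumptions, not the actual script.

    // hypothetical sketch only; the real logic lives in push-binary-libs.sh
    import java.io.File
    import java.nio.file.{Files, Paths}
    import java.security.MessageDigest

    def writeDesiredSha1(jar: File): File = {
      val bytes = Files.readAllBytes(jar.toPath)                    // read the jar
      val sha1  = MessageDigest.getInstance("SHA-1").digest(bytes)  // hash it
      val hex   = sha1.map("%02x".format(_)).mkString               // hex-encode the digest
      val out   = Paths.get(jar.getPath + ".desired.sha1")
      Files.write(out, s"$hex  ${jar.getName}\n".getBytes("UTF-8")) // record hash + jar name
      out.toFile
    }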

 

Dominik

 

 

From: Lukas Rytz [mailto:lukas...@epfl.ch]

Sent: Thursday, July 5, 2012 19:06
To: Gruntz Dominik
Subject: Re: new starr

 

i could use that explanation right now :)

2012/7/5 Gruntz Dominik <dominik...@fhnw.ch>

ok. Eugene explained it to me, but I haven't done it yet.

Best regards

Dominik

 

 

From: Lukas Rytz [mailto:lukas...@epfl.ch]
Sent: Thursday, July 5, 2012 18:44
To: Gruntz Dominik
Subject: new starr

 

once you've figured out how to build and publish a new starr, it would be good to

document it in a mail to scala-internals - adriaan, for example, doesn't know how

it works either :)

 

thanks

 


Gruntz Dominik

Jul 6, 2012, 5:48:31 AM
to Lukas Rytz, Josh Suereth, Eugene Burmako, scala-i...@googlegroups.com

In the meantime I have adjusted my provisional list of tasks for building/uploading a new starr: it now uses "ant replacestarr" to build the jar files (instead of simply copying them over).

Hubert Plociniczak

Jul 6, 2012, 5:50:31 AM
to scala-i...@googlegroups.com
On 07/06/2012 11:42 AM, Lukas Rytz wrote:
I agree, when a commit has a new starr, there should be a parent revision that is the source of the starr
(not sure if that's always possible, but when it is, it should be done). the commit message should refer to
that parent ("new starr based on abc123").

another thing is that in the past, there used to be a difference between "scala-library/scala-compiler" jar
files from "pack" and the ones in "starr". not sure what that difference was. to build a new starr, one was
supposed to use the "newstarr" ant target. as far as i'm concerned, we can just use the jars in "pack".

pack contains msil/fjbg and some other minor classes. starr doesn't.

Lukas Rytz

Jul 6, 2012, 5:51:50 AM
to scala-i...@googlegroups.com
On Fri, Jul 6, 2012 at 11:50 AM, Hubert Plociniczak <hubert.pl...@epfl.ch> wrote:
On 07/06/2012 11:42 AM, Lukas Rytz wrote:
I agree, when a commit has a new starr, there should be a parent revision that is the source of the starr
(not sure if that's always possible, but when it is, it should be done). the commit message should refer to
that parent ("new starr based on abc123").

another thing is that in the past, there used to be a difference between "scala-library/scala-compiler" jar
files from "pack" and the ones in "starr". not sure what that difference was. to build a new starr, one was
supposed to use the "newstarr" ant target. as far as i'm concerned, we can just use the jars in "pack".

pack contains msil/fjbg and some other minor classes. starr doesn't.

do you know what's the advantage of keeping them separate?

Hubert Plociniczak

Jul 6, 2012, 6:09:15 AM
to scala-i...@googlegroups.com
On 07/06/2012 11:51 AM, Lukas Rytz wrote:


On Fri, Jul 6, 2012 at 11:50 AM, Hubert Plociniczak <hubert.pl...@epfl.ch> wrote:
On 07/06/2012 11:42 AM, Lukas Rytz wrote:
I agree, when a commit has a new starr, there should be a parent revision that is the source of the starr
(not sure if that's always possible, but when it is, it should be done). the commit message should refer to
that parent ("new starr based on abc123").

another thing is that in the past, there used to be a difference between "scala-library/scala-compiler" jar
files from "pack" and the ones in "starr". not sure what that difference was. to build a new starr, one was
supposed to use the "newstarr" ant target. as far as i'm concerned, we can just use the jars in "pack".

pack contains msil/fjbg and some other minor classes. starr doesn't.

do you know what's the advantage of keeping them separate?

Size? Not everyone needs msil.jar, I guess? Plus they are different packages. It has been like that for a really long time, so I don't know the exact reason. Also, I wonder if the sbt builder works the same way?
I know that someone once pushed the pack jars as a new starr, and for some reason that caused problems in the Scala IDE (manifests, I think).

Lukas Rytz

Jul 6, 2012, 6:13:59 AM
to scala-i...@googlegroups.com
On Fri, Jul 6, 2012 at 12:09 PM, Hubert Plociniczak <hubert.pl...@epfl.ch> wrote:
On 07/06/2012 11:51 AM, Lukas Rytz wrote:


On Fri, Jul 6, 2012 at 11:50 AM, Hubert Plociniczak <hubert.pl...@epfl.ch> wrote:
On 07/06/2012 11:42 AM, Lukas Rytz wrote:
I agree, when a commit has a new starr, there should be a parent revision that is the source of the starr
(not sure if that's always possible, but when it is, it should be done). the commit message should refer to
that parent ("new starr based on abc123").

another thing is that in the past, there used to be a difference between "scala-library/scala-compiler" jar
files from "pack" and the ones in "starr". not sure what that difference was. to build a new starr, one was
supposed to use the "newstarr" ant target. as far as i'm concerned, we can just use the jars in "pack".

pack contains msil/fjbg and some other minor classes. starr doesn't.

do you know what's the advantage of keeping them separate?

Size? Not everyone needs msil.jar I guess?

well, they are there anyway in lib/, just as separate jars. maybe the reason is that we need
those separate jars for fjbg etc. for the other compilers in the build (locker, quick).

Eugene Burmako

Jul 6, 2012, 7:07:17 AM
to scala-i...@googlegroups.com
In my experience it's rarely possible to have a starr as a separate commit. If we could have extra branches for starrs (or a dedicated repository), I'd contribute to them. In fact, this is what I always do internally when submitting patches with modified starrs.

Josh Suereth

Jul 6, 2012, 7:22:39 AM
to scala-i...@googlegroups.com

Really? How can we call it the "stable reference" if we don't even have stable commits for them? I think not having a commit with the Starr sources is a very bad habit we must break. Either that, or invest time in regenerating Starr from deployed sources. However, I know Starr doesn't include reflect or compiler src artifacts.

As for newstarr and fjbg, etc.: as you may have noticed, these are now always built. I was waiting for a new Starr to remove these jars. As far as the IDE is concerned, I think we can fix them up.

I'm slowly cleaning up our process, and we need to be more rigorous about starr.

The historical reason for not rebuilding was speed. However, not rebuilding in Jenkins was a big issue that has since been corrected.

Eugene Burmako

Jul 6, 2012, 7:45:26 AM
to scala-i...@googlegroups.com
I suggest we update the replacestarr script to include sources of new starrs.

When I change fundamental stuff, the starr and the commit it supports frequently have almost nothing in common, so submitting them side by side would distort history. Therefore I suggest either using a special repo for starr sources or (much simpler) attaching the sources to the starrs.

Lukas Rytz

Jul 6, 2012, 7:58:55 AM
to scala-i...@googlegroups.com
I'm in a situation right now that needs a new starr, but I can't just build the sources

I need to move the annotation class "scala.cloneable" to "scala.annotation.cloneable".
In Definitions.scala, we have

    lazy val CloneableAttr = requiredClass[scala.cloneable]

No matter in which order I do changes, I can't have a set of sources that passes the normal
build process.

 - If I move the annotation class, the starr compiler will crash (not find the class)
 - If I change the compiler source to requiredClass[scala.annotation.cloneable], the locker
   compiler will crash

I could hack the compiler to search in both places, but we don't want that code to be
committed either.

So what I need to do is

 1. build locker with the modified compiler
 2. move the annotation class
 3. build quick, pack, and make a new starr
 4. push the starr and get the .desired.sha1

Let me know about better ways :)
Lukas

Josh Suereth

Jul 6, 2012, 8:03:22 AM
to scala-i...@googlegroups.com
Sources really should have been included from the beginning.  I'll start doing so in replace-starr tasks.

Still... We have this fundamental issue.

Josh Suereth

Jul 6, 2012, 8:08:16 AM
to scala-i...@googlegroups.com
On Fri, Jul 6, 2012 at 7:58 AM, Lukas Rytz <lukas...@epfl.ch> wrote:
I'm in a situation right now that needs a new starr, but I can't just build the sources

I need to move the annotation class "scala.cloneable" to "scala.annotation.cloneable".
In Definitions.scala, we have

    lazy val CloneableAttr = requiredClass[scala.cloneable]

No matter in which order I do changes, I can't have a set of sources that passes the normal
build process.

 - If I move the annotation class, the starr compiler will crash (not find the class)
 - If I change the compiler source to requiredClass[scala.annotation.cloneable], the locker
   compiler will crash

I could hack the compiler to search in both places, but we don't want that code to be
committed either.

So what I need to do is

 1. build locker with the modified compiler
 2. move the annotation class
 3. build quick, pack, and make a new starr
 4. push the starr and get the .desired.sha1


Sounds like a build process... I could probably set up a mechanism whereby you can bootstrap appropriately using the ant build.

i.e.

(1) You modify the compiler sources
(2) We make a src/bootstrap/{library,reflect,compiler,...} where we can put *changes* to files. If I find a file here with the same name as one in the raw source package, I drop the one in the raw package. If you need to "remove" a file, just "touch src/bootstrap/file-to-remove" (see the sketch right after this list).
(3) Regular ant build process (including Jenkins testing) can work against starr to ensure the STARR we deploy is really stable.
(4) newstarr can automatically move src/bootstrap changes into regular source code. That way generating a new starr also cleans up bootstrapping-related changes.
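
To make (2) and (4) concrete, here is a rough sketch of the override rule in plain Scala (the names below are hypothetical; the real mechanism would live in the ant build): files under the bootstrap directory shadow same-named files under the raw source tree, and an empty override file drops that source file.

    // hypothetical illustration, not part of the actual build:
    // files under `overrides` shadow same-named files under `raw`;
    // an empty override file means the raw file is dropped entirely.
    import java.io.File

    def bootstrapSources(raw: File, overrides: File): Seq[File] = {
      def walk(d: File): Seq[File] =
        Option(d.listFiles).toSeq.flatten.flatMap(f => if (f.isDirectory) walk(f) else Seq(f))
      def rel(base: File, f: File) = base.toURI.relativize(f.toURI).getPath

      val shadow = walk(overrides).map(f => rel(overrides, f) -> f).toMap
      val kept   = walk(raw).filterNot(f => shadow.contains(rel(raw, f)))
      kept ++ shadow.values.filter(_.length > 0)   // empty override => file removed
    }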

WDYT?   

Remember, let's make the build work *for* us, rather than working against the build.

- Josh

Lukas Rytz

Jul 6, 2012, 8:28:12 AM
to scala-i...@googlegroups.com
I don't understand all the details. Would we commit the sources in "src/bootstrap"?

Not sure if there's a "one size fits all" solution to this, I doubt it. I don't like the idea of
"newstarr" changing / overwriting source files. Doing what I described by hand doesn't
seem like too much effort either.

The important thing (obviously) is that every revision committed to the repo builds &
passes tests. I absolutely agree that it would be good to know, for each starr, what its
sources are. Eugene's idea of putting them in a repo seems valid.

Josh Suereth

Jul 6, 2012, 8:46:33 AM
to scala-i...@googlegroups.com
Right, but just putting the sources in a jar *doesn't* fix the workflow you outline below, because you can't automatically rebuild from those sources.

(a) you need a STARR to build from.  WHICH ONE TO USE?
(b)  You can't build locker with the same source as quick.

What I'm proposing is to change the nature of quick/locker builds such that *quick* can have source overrides to locker, or vice versa. These are temporary bootstrapping hacks so that we can have automated builds with no manual intervention. *This* means we could rebuild from source.

So, an example for your use case:

src/library -> Contains full changes
src/compiler -> Contains full changes
src/locker/library -> Contains a file with the old location of the annotation. Could also have an "empty file" for the new annotation location.

Now, when building, we use files in src/locker/library *over* sources in src/library for building locker.   When building quick, we use the raw sources.

Replacestarr, in addition to changing all the .desired.sha1 files for the commit, will also *remove* all the src/locker files, to ensure that bootstrapping works without the hacks once the new starr is in place.

I think this is a far better mechanism than by hand, but feel free to disagree and let me know why :)

Paul Phillips

Jul 6, 2012, 9:36:52 AM
to scala-i...@googlegroups.com


On Fri, Jul 6, 2012 at 4:58 AM, Lukas Rytz <lukas...@epfl.ch> wrote:
I could hack the compiler to search in both places, but we don't want that code to be
committed either.

Why not? It seems better than the alternatives.  It's how it has been done before (see below, which is still in trunk) but now it can be written more easily as getClassIfDefined(name1) orElse getRequiredClass(name2).

    /** getModule2/getClass2 aren't needed at present but may be again,
     *  so for now they're mothballed.
     */
    // def getModule2(name1: Name, name2: Name) = {
    //   try getModuleOrClass(name1.toTermName)
    //   catch { case ex1: FatalError =>
    //     try getModuleOrClass(name2.toTermName)
    //     catch { case ex2: FatalError => throw ex1 }
    //   }
    // }
    // def getClass2(name1: Name, name2: Name) = {
    //   try {
    //     val result = getModuleOrClass(name1.toTypeName)
    //     if (result.isAliasType) getClass(name2) else result
    //   }
    //   catch { case ex1: FatalError =>
    //     try getModuleOrClass(name2.toTypeName)
    //     catch { case ex2: FatalError => throw ex1 }
    //   }
    // }
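
Applied to the case at hand, Paul's one-liner would look roughly like this (a minimal sketch, assuming the string-based getClassIfDefined/getRequiredClass helpers in Definitions.scala; a transitional hack, not something to keep):

    // transitional lookup: prefer the new location, fall back to the old one
    lazy val CloneableAttr =
      getClassIfDefined("scala.annotation.cloneable") orElse
      getRequiredClass("scala.cloneable")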

Lukas Rytz

Jul 6, 2012, 10:01:10 AM
to scala-i...@googlegroups.com
probably i'm a bit lazy here, but why is it so important to have the starr build reproducible?
every revision comes with "some scala compiler" that is able to build the sources in the repo, and from there you can rebuild everything, including a new starr.

Paul Phillips

Jul 6, 2012, 10:06:53 AM
to scala-i...@googlegroups.com


On Fri, Jul 6, 2012 at 7:01 AM, Lukas Rytz <lukas...@epfl.ch> wrote:
probably i'm a bit lazy here, but why is it so important to have the starr build reproducible?
every revision comes with "some scala compiler" that is able to build the sources in the repo, and from there you can rebuild everything, including a new starr.

Having a binary blob in your bootstrap is like having a "then a miracle occurs" step in your proof.  I know, I know, we don't have the source to the microcode of the cpu or what-have-you, but it is still something to be avoided.

Josh Suereth

Jul 6, 2012, 10:07:07 AM
to scala-i...@googlegroups.com
So... let's say our STARR repo gets corrupted... Now what? If I have a few starrs that are ok, do I have any way to reproduce the others? Right now that's a big no.

If I have a linear track of how starrs are created, I can at least automate a system to go back and make the starrs.   Reproducible builds are about reproducibility and transparency.   If you make a STARR, I should be able to as well.   Not only can I then verify what you added (and make sure there aren't insidious pieces of code), I can also help debug problems you might be experiencing.  Finally, I can actually achieve automation.

- Josh

Lukas Rytz

Jul 6, 2012, 4:37:38 PM
to scala-i...@googlegroups.com
ok, i think i could do my new starr with that approach. one problem - but i'd have the same one if
i did it manually - is the following. the new compiler sources do

 requiredClass[scala.annotation.cloneable]

if we have the old library sources (from src/locker/library) on the classpath while building
the compiler sources, that new class is not there, so it would not even compile.

so i'd need a different version of the compiler just to do the starr anyway, it seems, e.g. what
paul suggests.

so i agree that if we want reproducibility, your proposal seems to work. we'd also commit
the temporary files in src/locker/..., and remove them in the next commit, right?

lukas

Josh Suereth

Jul 6, 2012, 4:43:33 PM
to scala-i...@googlegroups.com

Yes. Or we can let them hang out for a bit and do less frequent Starr releases. That can help reduce the disk space we use for these and help those without Starr repo access get stuff done.

Paul Phillips

Jul 6, 2012, 4:46:52 PM
to scala-i...@googlegroups.com
On Fri, Jul 6, 2012 at 1:37 PM, Lukas Rytz <lukas...@epfl.ch> wrote:
the new compiler sources do

 requiredClass[scala.annotation.cloneable]

if we have the old library sources (from src/locker/library) on the classpath while building
the compiler sources, that new class is not there, so it would not even compile.

You have to leave the old class in place during transition.
 

Paul Phillips

Jul 6, 2012, 4:47:38 PM
to scala-i...@googlegroups.com
On Fri, Jul 6, 2012 at 1:46 PM, Paul Phillips <pa...@improving.org> wrote:
You have to leave the old class in place during transition.

And/or add the new class; I mean, both will be present.  It won't hurt anyone, I've done it a bunch of times.
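
During the transition, both definitions would simply coexist in the library, roughly like this (a hypothetical sketch: the deprecation message and version are made up, and cloneable is assumed to be a plain StaticAnnotation):

    // both locations present for one starr cycle; the old one is deprecated
    package scala {
      @deprecated("use scala.annotation.cloneable instead", "2.10.0")
      class cloneable extends annotation.StaticAnnotation
    }

    package scala.annotation {
      class cloneable extends StaticAnnotation
    }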
 