Begin forwarded message:

From: Chris Bieneman via cfe-dev <cfe...@lists.llvm.org>
Subject: [cfe-dev] Raising CMake minimum version to 3.4.3
Date: April 26, 2016 at 3:01:23 PM PDT
To: cfe-dev <cfe...@lists.llvm.org>, llvm...@lists.apple.com
Cc: Galina Kistanova <gkist...@gmail.com>
Reply-To: Chris Bieneman <be...@apple.com>

Hello llvm-dev and cfe-dev,
I want to open up the discussion of raising our minimum required CMake version.
In the past when we’ve discussed this, one of the major reasons for not moving forward was that the Ubuntu LTS release was on CMake 2.8.12.x. A few days ago Ubuntu 16.04, the new LTS, was released, and it contains CMake 3.5.1.
I have a couple of motivations for this, the biggest of which is that I’ve hit a wall trying to overcome some limitations in the CMake ExternalProject module which I can’t get past without being on CMake 3.4 or newer. These limitations make using ExternalProject to build the LLVM test-suite and runtime libraries (compiler-rt, libcxx, etc) difficult.
The other big motivation I have for this is the ability to clean up code. We have a lot of CMake code that checks CMAKE_VERSION and enables or disables features based on that version. None of the places where we currently have CMAKE_VERSION checks should impact build correctness, but most people don’t realize that if you’re using Ninja your build will be faster on a newer CMake than on an older one. It would be nice if we just pushed the bar up and could remove a bunch of those conditionals.
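The kind of version-gated conditional being described can be sketched like this (an illustrative example, not code quoted from the LLVM tree; the target name is made up):

```cmake
# Sketch of a CMAKE_VERSION check of the kind described above.
if(NOT CMAKE_VERSION VERSION_LESS 3.2)
  # USES_TERMINAL (CMake >= 3.2) runs the command in Ninja's console
  # pool, giving live output for long-running targets.
  set(extra_target_args USES_TERMINAL)
endif()
add_custom_target(check-example
  COMMAND ${CMAKE_CTEST_COMMAND}
  ${extra_target_args})
```

Raising cmake_minimum_required above the checked version would let conditionals like this collapse to their unconditional form.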
To do this we need bot maintainers to update their CMake installations, so we need some consensus that (1) we want to do this and (2) what a reasonable timeline for doing it is.
What I’d like to do is raise the minimum version for all LLVM projects to CMake 3.4.3 at the end of May. Setting the date at the end of May gives bot maintainers and developers lots of time to update, and CMake 3.4.3 is the last CMake 3.4 release and it is widely available. For reference here’s a list of linux distributions and their CMake versions:
Ubuntu Wily -> 3.2.2
Ubuntu Xenial -> 3.5.1
Ubuntu Yakkety -> 3.5.1
Debian jessie -> 3.0.2
Debian stretch -> 3.5.1
Debian sid -> 3.5.2
FreeBSD 10.2 -> 3.5.0
FreeBSD HEAD -> 3.5.2
Feedback?
Thanks,
-Chris
_______________________________________________
cfe-dev mailing list
cfe...@lists.llvm.org
http://lists.llvm.org/cgi-bin/mailman/listinfo/cfe-dev
On Apr 26, 2016, at 3:28 PM, Rafael Espíndola <rafael.e...@gmail.com> wrote:
Why not 3.5.1?
Cheers,
Rafael
On Apr 26, 2016, at 3:47 PM, John Criswell <jtcr...@gmail.com> wrote:
On 4/26/16 6:01 PM, Chris Bieneman via cfe-dev wrote:
[original message quoted above; snip]
How did you determine that FreeBSD 10.2 is using cmake 3.5.0? On my FreeBSD 10.3 system, I have cmake version 3.4.1 (I think it got installed from ports when I installed other software).
Also, I see that you sent this to llvm...@lists.apple.com. Shouldn't this also go to llvm...@lists.llvm.org (or are they the same list)?
Third, just to nitpick, FreeBSD is not a Linux distribution. :)
Regards,
John Criswell
--
John Criswell
Assistant Professor
Department of Computer Science, University of Rochester
http://www.cs.rochester.edu/u/criswell
On Apr 26, 2016, at 3:47 PM, John Criswell <jtcr...@gmail.com> wrote:
On 4/26/16 6:01 PM, Chris Bieneman via cfe-dev wrote:
[original message quoted above; snip]
How did you determine that FreeBSD 10.2 is using cmake 3.5.0? On my FreeBSD 10.3 system, I have cmake version 3.4.1 (I think it got installed from ports when I installed other software).
I have a fully updated FreeBSD 10.2 machine and ran “pkg search cmake”. It came back with 3.5.0. I’m not sure what version it released with, but the updated version is in the ports collection.
We can't assume every Linux is Ubuntu, nor that every platform has
packages for this or that release.
You are asking for a move from packaged CMake to CMake built from
source, and that's a big move. We can't control CMake's progress nor
its repository.
This is bigger than a simple version upgrade, regardless of which
version the new Ubuntu or FreeBSD ships, or the fact that other
systems already need to build it.
Cheers,
Renato
_______________________________________________
LLVM Developers mailing list
llvm...@lists.llvm.org
http://lists.llvm.org/cgi-bin/mailman/listinfo/llvm-dev
This is a limited picture that we have addressed on previous similar
threads about CMake. Please search the archives for CMake, Ninja and
"versions" to see the whole context.
Now, to your points...
Not everyone has the ability to pick and choose whatever they want.
Also, local development is different than buildbot deployment, and we
do have a lot of obscure targets with obscure operating systems.
Most CMake development has been done in a selective way so far, and
it's working well. IFF you have CMake X.Y.Z, enable this feature. IFF
you have Ninja 1.5.X, enable that one. This is the best way to go for
now.
> CMake itself is released and packaged by CMake. You can download binaries
> from their website for major platforms.
> And it doesn’t have to be a DEB or RPM package, it’s a portable tar.gz file
> that can be unpacked with statically linked
> binaries. You don’t even need to be root to use it and it could even be
> automated for most platforms using some build
> scripts that fetch the dependencies.
You're assuming everyone can just install whatever they want on their
company servers...
> For the other platforms, you could build it from source, which is really
> easy. Newer versions of CMake still support some
> really ancient platforms, so I don’t think anyone will have issues doing
> that. If you’re using an exotic platform, you should
> be used to compiling your own software anyway, so I don’t think this will be
> an issue for them.
You're assuming it's easy to compile based on your experience, but
have you tested cross-compiling it to old distributions?
> Do Windows devs get stuck because the Windows packaging system doesn’t come
> with the latest version of CMake?
> They don’t have any, so they download the installer, use the updated version
> and don’t complain (too) much.
Is this an argument that Linux users shouldn't be worried about binary
installers just because Windows users do that all the time? Because if
it is, it doesn't hold water.
cheers,
--renato
It needs a compatible version of C and C++ libraries. If you build on
a modern machine but run on an older Linux, you'll get at least
libstdc++ clashes.
Also, compiling CMake and Ninja is not the most reliable way of
deploying buildbots.
What's wrong with using newer CMake features IFF you have that
version? Then you can choose where to incur the extra deployment cost
or not based on your need of that feature.
cheers,
--renato
+1
Same for OS X.
I work on linux most of the time and fully support upgrading cmake to
whatever version people writing our cmake files find useful. I have no
idea what cmake version the distribution I use ships, and I see no
point in looking that up. Building cmake is truly trivial. Anyone (or
any bot) that can build llvm can build cmake.
Cheers,
Rafael
It doesn’t link dynamically with libstdc++, and it apparently requires glibc 2.2.5, which dates from 2002.
They make sure it works on ancient systems when they do a release.
If you want to build a bleeding edge compiler on a system from 2002, you may have bigger issues
than trying to run CMake.
>
> Also, compiling CMake and Ninja is not the most reliable way of
> deploying buildbots.
Ninja is not the problem here though and I would agree that it’s not reliable.
Fortunately, CMake can generate projects compatible with quite old versions of Make already.
>
> What's wrong with using newer CMake features IFF you have that
> version? Then you can choose where to incur the extra deployment cost
> or not based on your need of that feature.
Maintenance burden.
Few people will understand why there are two code paths in the build script and will duplicate the wrong
one when creating a new module and looking for "inspiration".
>
> cheers,
> --renato
/Florent
If you're cross compiling, you have less of a choice over what's on
the target and host.
> Maintenance burden.
> Few people will understand why there’s 2 code paths in the build script and will duplicate the wrong
> one when creating a new module and looking for “inspiration".
That would be quickly caught by buildbots. We deal with that issue already.
To be clear, I'm not against moving the version up, I just want to
make sure that people understand that this is not *just* a version
upgrade, but a development philosophy move for all Linux developers
and production environments (of which we have plenty). This move was
proposed before and was rejected for the reasons I pointed out:
maintenance.
Once we go the path of accepting compiled versions of CMake, then
anyone will be able to add any feature and put the maintenance burden
on who had nothing to do with it, ie, production environment
maintainers. Having a cap on CMake/Ninja has the great advantage that
production environments will remain stable for as long as possible and
I can plan my migrations.
So far, the argument for a new CMake is that it's "nicer", not that it
has a feature that we cannot go without, and that's not strong enough,
especially when against increased production environment maintenance
cost.
--renato
Yes. It is a move to put linux developers in the same position as
windows and OS X ones, which is a *very* reasonable thing to do.
Cheers,
Rafael
On 27 April 2016 at 15:39, Rafael Espíndola <rafael.e...@gmail.com> wrote:
> Yes. It is a move to put linux developers in the same position as
> windows and OS X ones, which is a *very* reasonable thing to do.
Apart from the fact that neither Windows nor OSX users have a choice.
We have discussed this before, Rafael, and I don't think doing it
again will yield different results unless new evidence is brought to
light.
Unless there is a feature in a newer CMake that we *must* have, I see
no need to upgrade the version. In the same way, we've been holding
off on C++11/14 functionality because MSVC couldn't cope with anything
until recently.
Trying to go too fast on the cool tools can create a patchwork of
versions and functionality that will make it very hard to debug
buildbot problems.
LLVM is not a toy any more, we can't just expect that everyone can get
the same environment we have on our own machines, or we'll fall into
the "works for me" style of closing bugs that is pervasive of open
source projects where only a few people ever commit code to.
Now, back to Chris' point:
Is the ExternalProject the only sane way to build compiler-rt and
libc++? Because this IS a big point.
Will it allow a way to build Compiler-RT and libc++ for *all*
supported platforms (as in -DLLVM_TARGETS_TO_BUILD)? Would it be
possible (even if not nice) to do so with an older version of CMake?
How much worse would it be, if possible?
Until we can answer all these questions, build times and personal
preferences have no impact on such a big decision.
I agree, this is one of the strong points towards the move. Much more
relevant than both personal preferences.
> I'm not really sure how pushing towards newer released and stable versions
> will cause *more* of a patchwork of versions than following the distro
> installed versions. It seems to involve strictly fewer versions of CMake.
> But maybe I'm not understanding your concern here?
I may digress a little, so bear with me. But just to be sure, I want
you to know that I don't disagree with the upgrade, I just want to
make the reasons clear.
Today, installing CMake (or any dep) from packages means they were
validated and released by people that did the validation.
Right now, I *need* CMake 3.x and Ninja 1.5.* (because of LINK_POOL),
and I have no access on my ageing boards to install them from
packages. So I have gone to the trouble of downloading a *stable* copy
of the sources and building from source.
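For reference, a LINK_POOL-style setup of the kind mentioned here can be expressed roughly as below (the pool name and size are illustrative; job pools require CMake 3.0+ with the Ninja generator):

```cmake
# Sketch: cap concurrent link jobs on memory-constrained build machines.
if(CMAKE_GENERATOR STREQUAL "Ninja")
  set_property(GLOBAL APPEND PROPERTY JOB_POOLS link_jobs=2)
  set(CMAKE_JOB_POOL_LINK link_jobs)  # assign link steps to the pool
endif()
```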
This, however, adds to the maintenance of adding new boards, or
putting them back online when they fail. Work done at those times is
stressful, because we're out of coverage until I can put them back in
production. It's not a lot of work, but it's a very stressful one. But
I'm willing to consciously pay the price and I do so with great care.
But asking people to build CMake encourages them to build whatever
stable, or even unstable, versions of CMake. Not that they can't do it
today, but if building it becomes the recommended way, people will
think less about that step and just grab whatever is easier. That's
where unknown bugs in unstable versions will catch us, because no one
will know it's an unstable CMake version on a buildbot until a lot of
time has been wasted.
I agree, however, that this is not a very strong argument. But it's
stronger than personal preference. That's why I was asking for the
real reasons.
> - Most major Linux distros have support out of the box in their "current"
> version (OpenSUSE, Fedora, Ubuntu, Debian, Arch), and several in a released
> version (Ubuntu, Debian, Arch).
Note that this is *current* and mostly x86_64. Life on ARM, MIPS and
probably PPC is not as shiny.
At least on ARM, different boards may have different (old) systems
that they were validated with and will *not* upgrade, at least not
officially. Modulo some serious validation, there's no way I can
upgrade to a current version and expect it to be stable.
> But clearly we *do* need a fallback option for folks that are on an OS that
> doesn't provide CMake or that can't install packages using the package
> manager into the primary OS. I'm totally supportive of that use case.
Right. And I totally want to support Chris and others to make our
build system better.
I am definitely willing to take the hit and upgrade CMake on *all* my
builders (current and future) to a *stable* CMake release if we can
make it clear what we want.
If it's package based, having a minimum version is necessary, since
some new distros don't have the old one. But it's also ok, because
distros validate their packages and control their back-ports, so we
know any update from them is reliable, no matter how new.
If it has to be compiled, then things are a little more complicated.
We need to make sure people understand that *any* production
environment (buildbot/Jenkins, release testing, etc) needs to use a
stable release. We don't control our validation or release test
environments as tightly as we should, anyone can add a bot or be a
tester, so having a loose requirement list makes the binaries worth
less, and we do get bug reports on them due to incompatibilities. We
want to decrease that problem, not increase it.
I also want to make the releases a community process, where
stakeholders (like Android, Apple, ARM, Sony, etc) can chime in and
validate in their production environments, so that we know all people
that use LLVM have a good, stable and robust base to build upon.
Making it more flaky will counter those goals.
I don't want us to block people from using experimental CMake
versions, but I think we must do so for the CMake scripts that go
upstream. In a nutshell, we should mark a range, not a minimum.
Preferably, a range includes a CMake that exists in Debian Jessie,
since this is going to be the production release of most ARM and
AArch64 machines for the foreseeable future. But stronger arguments
may trump that, I understand.
> If it would help
> we could even bundle a copy of the CMake source code at a known good
> snapshot in the LLVM project and write a wrapper script to bootstrap it into
> a known location and then use it for subsequent builds ... but that seems
> unnecessary to me.
No, having a range and *requiring* stable releases should be more than enough.
> I think removing impedance from the development of the CMake build, or
> enabling new functionality such as properly bootstrapped runtime libraries,
> are absolutely things to weigh against the cost of using a newer tool.
I absolutely agree, but so far there were no technical arguments to
support any of that.
All version proposals were hand-wavy and based on their availability
in this or that OS. I think we need to get a list of the features we
need, match it to the *stable* versions available from source, and
make an informed choice. Just like we did with the compiler versions.
> The folks working on our build infrastructure have limited time. Making them cope with designing the usage of CMake in such a way that it gracefully degrades across many major versions of CMake makes their work substantially harder, and even if it is theoretically possible to do, it may become sufficiently annoying or slow to do that we simply don't get improvements to how we build things.
>
> And I think that the same tradeoff holds for C++11 features. We didn't *need* any of them, and we actually pushed the Windows platform harder than all of the others because it was the one holding us back. And I think that was good because it made the developers substantially more productive. In this case, it's just the build infrastructure and not the entire codebase, but I think a similar argument holds. If the functionality in CMake 3.4 makes Chris's job on CMake substantially easier, and it at least seems reasonable to get access to that version of CMake, I'm very supportive of the change.
Thanks. I think this is a perfect summary. At the end of the day it is a tradeoff of who spends time on what. And upgrading cmake takes very little time compared to having to support old versions.
Cheers,
Rafael
I am definitely willing to take the hit and upgrade CMake on *all* my
builders (current and future) to a *stable* CMake release if we can
make it clear what we want.
I think we shouldn't try to go beyond 3.4.x, then.
Renato, in your most recent email you comment about differentiating *stable* vs random versions of CMake built from source. I believe as a community our recommendation should be that people download CMake sources from https://CMake.org/download/ instead of git. Then they get the source of a known stable release. If people choose to pull random git hashes or tags, that is their decision, but since the minimum version I'm proposing is widely available I think it will be uncommon.
So let's talk about ExternalProject.
Is it the only way? No. There are actually several approaches that could be taken here, but let me explain why I want to do it this way.
At a very basic level there are really two high-level approaches to solve the problem of building runtime libraries multiple times:
(1) Do the "Darwin" thing and hack building multiple targets from the same CMake configuration.
(2) Configure the runtimes multiple times, once per target.
I've been working to try and un-do the horrible Darwin approach we have because I think it has some serious problems, but an alternative solution would be to extend the Darwin approach to every other platform. If you do this you don't need ExternalProject, but you do need to do some really dirty hacks. If you want to understand those hacks, just look at CompilerRTDarwinUtils.cmake in compiler-rt. That is a bit of evil that I’m responsible for, and I’ll be paying down that debt for a long time to come.
Apart from this we also have another problem. The build system makes configuration decisions based on the compiler *at configuration time*. The problem is you really want to configure compiler-rt (and the other runtimes) *after* you've already built clang so the runtimes are configured and built with the clang you just built. The only way to do that is with some mechanism similar to ExternalProject.
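A minimal sketch of that mechanism, assuming the just-built clang ends up in the build tree's bin directory (paths, target names, and options here are illustrative, not the actual LLVM scripts):

```cmake
# Sketch: configure compiler-rt only after clang is built, so the
# runtime is configured and compiled with the just-built compiler.
include(ExternalProject)
ExternalProject_Add(compiler-rt
  SOURCE_DIR ${CMAKE_SOURCE_DIR}/projects/compiler-rt
  CMAKE_ARGS
    -DCMAKE_C_COMPILER=${CMAKE_BINARY_DIR}/bin/clang
    -DCMAKE_CXX_COMPILER=${CMAKE_BINARY_DIR}/bin/clang++
  DEPENDS clang                # defer configuration until clang exists
  USES_TERMINAL_CONFIGURE 1    # one of the options added in CMake 3.4
  INSTALL_COMMAND ""           # skip install in this sketch
  )
```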
We could roll our own. At this point I'm going to put on my "non-constructive reality" hat. I'm not going to do that. I don't mean that from a philosophical stance; I love reinventing me some wheels. I mean that from a "there is no way my boss is going to let me waste that much of my time solving an already solved problem" stance. Even using ExternalProject, this work is largely a side-project of mine, so it is going to take me a while to untangle it.
Replicating ExternalProject would be a lot of work, and it would be a huge maintenance burden to the community. As a data point, the one place in LLVM where we don't use ExternalProject (and we should be) is the cross-compiling goop, and it is a gigantic mess (see: CrossCompile.cmake and every mention of LLVM_NATIVE_BUILD in TableGen.cmake). It is filled with bugs and in desperate need of disentangling. This is another one of my sins that I need to repent.
I feel I should also point out that ExternalProject isn't a silver bullet. It isn't going to solve all our problems, and since it is one of the more actively changing parts of CMake we may find ourselves re-visiting this conversation in a year or two talking about a newer version of CMake.
-Chris
I'm guilty of cloning the git repo on my local tree, because it's
easier to do that than to find the link, download, untar, clean up,
etc. Then there's a bug, and you need to upgrade, "git pull" is much
simpler than repeating the whole process again. :)
Also, recent "stable", as Chuck has shown, is less stable than
previous stable. I'd like to err on the side of caution.
> I've been working to try and un-do the horrible Darwin approach we have because I think it has some serious problems, but an alternative solution would be to extend the Darwin approach to every other platform. If you do this you don't need ExternalProject, but you do need to do some really dirty hacks. If you want to understand those hacks, just look at CompilerRTDarwinUtils.cmake in compiler-rt. That is a bit of evil that I’m responsible for, and I’ll be paying down that debt for a long time to come.
Point taken, it is horrible. (sorry)
> Apart from this we also have another problem. The build system makes configuration decisions based on the compiler *at configuration time*. The problem is you really want to configure compiler-rt (and the other runtimes) *after* you've already built clang so the runtimes are configured and built with the clang you just built. The only way to do that is with some mechanism similar to ExternalProject.
<devil's advocate>We could come up with a set of rules that makes it
possible (not preferable) to do so at config time.</da>
Is that where the current complication comes from? What makes it so
horrible and ends up as a home brew version of ExternalProject?
> We could roll our own. At this point I'm going to put on my "non-constructive reality" hat. I'm not going to do that.
Absolutely not. Ten shots to the head before we start rolling our own
things that are perfectly covered by a tool we already use.
> Replicating ExternalProject would be a lot of work...
Just to be clear, I didn't mean that. If you're saying that, to get
the desired functionality we'll either need ExternalProject or
something identical to it, then the choice is clear. I was asking if
there was another (good, but less good) solution to the problem.
Rolling our own is not even a solution, much less a good one.
> As a data point, the one place in LLVM where we don't use ExternalProject (and we should be) is the cross-compiling goop, and it is a gigantic mess (see: CrossCompile.cmake and every mention of LLVM_NATIVE_BUILD in TableGen.cmake). It is filled with bugs and in desperate need of disentangling. This is another one of my sins that I need to repent.
Oh, so ExternalProject would also fix cross compilation!? I'm sold! :)
> I feel I should also point out that ExternalProject isn't a silver bullet. It isn't going to solve all our problems, and since it is one of the more actively changing parts of CMake we may find ourselves re-visiting this conversation in a year or two talking about a newer version of CMake.
Right, this is an interesting point. I don't want to do this every
year, so let's be as reasonable as we can.
Is there anything in 3.5 that can really save us a lot of effort in
the current work?
I'm now using CMake 3.2.2 on all buildbots. If I have to move it up to
3.4.3, I'm fairly confident that there will be absolutely no problem
with it.
But as Chuck said, there were some crashes on 3.5.2 that needed 3.5.3,
and by using it on so many different platforms, we may uncover bugs,
and may have to roll back to 3.4 temporarily, and then forwards to
3.5, etc.
I really want to avoid that. But I also want to avoid doing this over
again next year.
Finally, Linaro cares more about Linux on ARM (v7+) than bare-metal,
but we can't just break the rest. I know there are people out there
that have infrastructure to test that kind of restricted environment,
not only on ARM, but also MIPS and old PPC32 stuff. I think they
should voice their concerns before we take any harsh decision.
On Apr 27, 2016, at 11:17 AM, Renato Golin <renato...@linaro.org> wrote:On 27 April 2016 at 18:41, Chris Bieneman <be...@apple.com> wrote:Renato, in your most recent email you comment about differentiating *stable* vs random versions of CMake built from source. I believe as a community our recommendation should be that people download CMake sources from https://CMake.org/download/ instead of git. Then they get the source of a known stable release. If people choose to pull random git hashes or tags, that is their decision, but since the minimum version I'm proposing is widely available I think it will be uncommon.
I'm guilty of cloning the git repo on my local tree, because it's
easier to do that than to find the link, download, untar, clean up,
etc. Then there's a bug, and you need to upgrade, "git pull" is much
simpler than repeating the whole process again. :)
Also, recent "stable", as Chuck has shown, is less stable than
previous stable. I'd like to err in the side of caution.I've been working to try and un-do the horrible Darwin approach we have because I think it has some serious problems, but an alternative solution would be to extend the Darwin approach to every other platform. If you do this you don't need ExternalProject, but you do need to do some really dirty hacks. If you want to understand those hacks, just look at CompilerRTDarwinUtils.cmake in compiler-rt. That is a bit of evil that I’m responsible for, and I’ll be paying down that debt for a long time to come.
Point taken, it is horrible. (sorry)Apart from this we also have another problem. The build system makes configuration decisions based on the compiler *at configuration time*. The problem is you really want to configure compiler-rt (and the other runtimes) *after* you've already built clang so the runtimes are configured and built with the clang you just built. The only way to do that is with some mechanism similar to ExternalProject.
<devil's advocate>We could come up with a set of rules that makes it
possible (not preferable) to do so at config time.</devil's advocate>
Is that where the current complication comes from? What makes it so
horrible and ends up as a home brew version of ExternalProject?
We could roll our own.

At this point I'm going to put on my "non-constructive reality" hat. I'm not going to do that.
Absolutely not. Ten shots to the head before we start rolling our own
things that are perfectly covered by a tool we already use.

Replicating ExternalProject would be a lot of work...
Just to be clear, I didn't mean that. If you're saying that, to get
the desired functionality we'll either need ExternalProject or
something identical to it, then the choice is clear. I was asking if
there was another (good, but less good) solution to the problem.
Rolling our own is not even a solution, much less a good one.
As a data point, the one place in LLVM where we don't use ExternalProject (and we should be) is the cross-compiling goop, and it is a gigantic mess (see: CrossCompile.cmake and every mention of LLVM_NATIVE_BUILD in TableGen.cmake). It is filled with bugs and in desperate need of disentangling. This is another one of my sins that I need to repent.
Oh, so ExternalProject would also fix cross compilation!? I'm sold! :)
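For the curious, the ExternalProject-based alternative to the CrossCompile.cmake goop could be sketched roughly like this (every name below is illustrative, not what the tree actually does today):

```cmake
# Cross-compiling LLVM needs a *native* llvm-tblgen before the cross
# build can configure. An ExternalProject can build one with the host
# toolchain, then the cross-configured tree consumes the binary.
include(ExternalProject)

ExternalProject_Add(native-llvm
  SOURCE_DIR ${CMAKE_SOURCE_DIR}
  CMAKE_ARGS -DCMAKE_BUILD_TYPE=Release   # host toolchain, no cross flags
  BUILD_COMMAND ${CMAKE_COMMAND} --build . --target llvm-tblgen
  INSTALL_COMMAND ""                      # only the one binary is needed
  )

# The cross build would then be configured with something like:
#   -DLLVM_TABLEGEN=<native-build-dir>/bin/llvm-tblgen
```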
I feel I should also point out that ExternalProject isn't a silver bullet. It isn't going to solve all our problems, and since it is one of the more actively changing parts of CMake we may find ourselves re-visiting this conversation in a year or two talking about a newer version of CMake.
Right, this is an interesting point. I don't want to do this every
year, so let's be as reasonable as we can.
Is there anything in 3.5 that can really save us a lot of effort in
the current work?
I'm now using CMake 3.2.2 on all buildbots. If I have to move it up to
3.4.3, I'm fairly confident that there will be absolutely no problem
with it.
But as Chuck said, there were some crashes on 3.5.2 that needed 3.5.3,
and by using it on so many different platforms, we may uncover bugs,
and may have to roll back to 3.4 temporarily, and then forwards to
3.5, etc.
I really want to avoid that. But I also want to avoid doing this over
again next year.
> Replicating ExternalProject would be a lot of work...
That certainly seems to be the case. I'm happy with 3.4.x.
> I don’t believe so. I’ve read the release notes
> (https://cmake.org/cmake/help/v3.5/release/3.5.html), and I don’t think
> there is anything really important to us in them. I am interested in a few
> things being discussed on the cmake-developers list though
> (http://public.kitware.com/pipermail/cmake-developers/2016-January/027370.html).
> It looks like that might make CMake 3.6, and it could be interesting for us.
> That said, this is likely a case where we could easily support the old and
> new versions without much problem and it doesn’t impact build correctness.
Is CMake like Linux that the even releases are stable and odd are
experimental (or vice versa)?
If not, I think we can safely update to CMake M.N as soon as M.N+1 is
declared "stable". This worked before with 2.8, and should work with
3.4.
> While I don’t want to advocate for us updating frequently “just because”, I
> do think that if there are compelling reasons to update every year or two,
> we shouldn’t be afraid to.
This is a big issue now because we're still relying on what's on the distros.
As soon as we officially move to built-in, and as long as we keep our
documentation in order, and make sure to mention strict versions and
build methods (no git, only tarballs), then moving CMake versions
won't be a big issue.
This could still be dangerous if the user has a git clone.
I'd rather follow what Chris said, and update the CMake version every
year or two, than have to rely on multiple configuration scenarios
that not everyone tests.
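If we document strict versions, we could even enforce them mechanically. A hypothetical bootstrap-style check (nothing LLVM actually ships; the script name and defaults are made up) might look like:

```shell
#!/bin/sh
# Refuse to proceed with a CMake older than the documented minimum.
# 'sort -V' does the dotted-version comparison.

version_ge() {  # succeeds if $1 >= $2
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

MIN=3.4.3
# In a real script CUR would come from: cmake --version | awk 'NR==1{print $3}'
CUR=${1:-3.5.1}

if version_ge "$CUR" "$MIN"; then
  echo "CMake $CUR is new enough (minimum $MIN)"
else
  echo "error: CMake $CUR found, but $MIN or newer is required" >&2
  exit 1
fi
```

This would catch the distro-is-too-old case at configure time instead of with a confusing failure later.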
The problem I see with doing this in the current situation is that it isn't just the CMake module we need. If you look at the changes I called out in my earlier email, there are associated CMake C++ source changes too.

I also think that one of the limitations I frequently come up against with CMake 2.8.12 is that newer CMake versions accept generator expressions in more places. This is entirely implemented in the CMake C++ code, and there is no way to work around it in CMake scripts other than not using generator expressions.
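A hypothetical illustration of the generator-expression point (the target and definition names are made up):

```cmake
# $<...> generator expressions are evaluated at generate time, once per
# configuration. Which commands and arguments accept them is hard-coded
# in CMake's C++ sources, so no amount of scripting can extend the set;
# on an old CMake the only fallback is branching on CMAKE_BUILD_TYPE.
add_library(mylib STATIC mylib.c)
target_compile_definitions(mylib PRIVATE
  $<$<CONFIG:Debug>:MYLIB_EXPENSIVE_CHECKS=1>)
```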
> Is CMake like Linux that the even releases are stable and odd are
> experimental (or vice versa)?
It would be nice to have at least a month of lead time (which the OP did provide). Upgrading every year or so isn't a hardship.
Jim Rowan
j...@codeaurora.org
Qualcomm Innovation Center, Inc. is a member of Code Aurora Forum, hosted by the Linux Foundation
The "let's update every year just because" does have ripple effects for us non-traditional platforms.
John
Thank you John, with some concrete arguments to my earlier attempt to
hold the horses.
As far as ARM is concerned, it's probably fine to upgrade (we've done
most of the work last week), but there are platforms that make ARM
testing look easy. :)
I'm adding some MIPS, PPC and release folks to make sure there's
nothing else we're missing.
cheers,
--renato
I really hope nobody decides not to move to a more recent version of
cmake because of IA-64.
It depends on the extent of our changes and the extent of the yearly differences in CMake. Some might merge easily, some might require re-engineering if the underlying code has major changes. Having a "fetch me a shrubbery" exercise once a year and hoping we aren't surprised is just something I'd rather not have to do. We already have to worry about any out-of-tree merges into newer LLVMs. At some point as we get farther along, I'll come back here and see what sort of stuff we can check into the tree. For example, getting our own triple into the tree would be a good start. However, I suspect we're over a year away from any of those discussions.
Of course the real solution would be to work with the CMake folks and provide/support some OpenVMS support or at least follow THEIR development discussions. Just more work for my small team.
From: James Y Knight [mailto:jykn...@google.com]
Sent: Tuesday, May 03, 2016 9:56 AM
To: John Reagan
Cc: llvm-dev; llvm-dev...@lists.llvm.org
Subject: Re: [llvm-dev] [cfe-dev] Fwd: Raising CMake minimum version to 3.4.3
It sounds like your problem is with having CMake working at all, not with which version is required... So I'm not sure how requiring an upgrade every year could make that any worse.
I will only be using CMake when we have working OpenVMS x86 systems. The fact that I'm starting on IA-64 hosts is not relevant.
-----Original Message-----
From: C Bergström [mailto:cberg...@pathscale.com]
Sent: Tuesday, May 03, 2016 11:12 AM
To: James Y Knight
Cc: John Reagan; llvm-dev; llvm-dev...@lists.llvm.org
Subject: Re: [llvm-dev] [cfe-dev] Fwd: Raising CMake minimum version to 3.4.3
4/23 - I will send another follow-up email reminding everyone of this change and timeline
4/30 - I will send a final notice an hour before making the change to the LLVM, Clang, Compiler-RT, Clang-Tools-Extra, LibCXX, LibCXXABI and Test-Suite repos
During the week of 4/30 I will revert as necessary if bots fail. Hopefully having the change permanently landed by the middle of the week.
Does this sound agreeable to everyone?
Thanks,
-Chris
On May 3, 2016, at 9:41 AM, Smith, Kevin B <kevin....@intel.com> wrote:

-----Original Message-----
From: cfe-dev [mailto:cfe-dev...@lists.llvm.org] On Behalf Of Chris Bieneman via cfe-dev
Sent: Tuesday, May 03, 2016 9:07 AM
To: LLVM Dev <llvm...@lists.llvm.org>; Clang Dev <cfe-d...@lists.llvm.org>
Cc: Chris Matthews <cmatt...@apple.com>; Galina Kistanova <gkist...@gmail.com>
Subject: Re: [cfe-dev] [llvm-dev] Fwd: Raising CMake minimum version to 3.4.3

Since there seem to be no strong objections remaining, I'd like to propose the following timeline and process:

4/23 - I will send another follow-up email reminding everyone of this change and timeline

4/30 - I will send a final notice an hour before making the change to the LLVM, Clang, Compiler-RT, Clang-Tools-Extra, LibCXX, LibCXXABI and Test-Suite repos

During the week of 4/30 I will revert as necessary if bots fail. Hopefully having the change permanently landed by the middle of the week.
Do you mean 5/23 and 5/30?
First I don’t think anyone is suggesting we should update *ever* “just because”. In fact, I think as a community we’ve held a pretty high bar for updating the CMake dependency favoring keeping it stable. Notice my failed attempt to move to CMake 3.1 last year.
Second, I’d really like to keep discussions of future updates to the version separate from the current update. I know I started this all in my email by stating “...we may find ourselves re-visiting this conversation in a year or two…”, but let’s please not entangle the two.
I want to stress that what I was suggesting originally was that we may find compelling reasons to update our CMake version in the future. The CMake developers are actively adding new and useful features and we *might* find that new features are valuable enough to raise our minimum version again in another year or two. Alternatively we might find that we’re happy on 3.4.3 for the next decade.
Despite my problems keeping track of dates, my love of DeLoreans, and my affinity toward driving at 88 MPH, I am not a time traveler. I have no idea what the future holds. I just don’t think we should hold back our project by being unwilling to revisit this conversation. It is still a conversation. Nobody is suggesting we should commit to a regular update schedule for kicks.
-Chris
I'd like to propose a different approach. How about we do this the
other way around? Maybe we should try "move first, fix later"
rather than "break first, despair later".
This week we (Linaro) have finished our buildbot migration, all of
them running on CMake 3.4.3. I'd like to see if other bot maintainers
could take the same effort before a certain date.
So, we can still have the same dates (in May, of course), but with
different "labels":
Soon, you send an announcement:
* Saying the consensus is to move to CMake 3.4.3 as well as a
compiled CMake for all platforms,
* That the technical reason is strong enough to do so, even if it
incurs more work to some people,
* That *ALL* bot owners should migrate as soon as possible (no later
than 30th).
On 23rd, you send a reminder to the list informing all bot owners that
time is running out.
* If they can't do it, who could help them
* If they need push in the CMake community, or a new stable release,
it's better to have it now than then
On 30th, you do the migration as you proposed.
It's all the same, but with the difference that we're involving all
bot owners, and hopefully not having to revert too many times such a
troublesome change (CMake changes always mess up the bots anyway).
Anyway, for ARM/AArch64, we're good to go.
cheers,
--renato
This approach sounds great to me.
Chris M let me know separately that GreenDragon is on CMake 3.5.
I believe Galina and Takumi maintain the largest chunk of remaining public bots. Do either of you have any thoughts on this?
Thanks,
-Chris
Our team manages a couple of lldb buildbots still using cmake 2.8.
We'll get started on upgrading them though.
pl
On 5 May 2016 at 01:08, Galina Kistanova via cfe-dev