Chromium Compile Time


Nicolae

unread,
Feb 10, 2016, 4:00:36 AM2/10/16
to chromi...@chromium.org
Hi, 

Not sure where to ask this. I have a machine with 12 cores + HT, so 24 CPUs show up in htop.

I did a few tests compiling Chromium on Arch Linux by following the corresponding instructions.

ninja -C out/Debug chrome
Start: 21:00
End:   21:35


ninja -j 200 -C out/Release chrome
Start: 22:20
End:   22:54


ninja -j 10 -C out/Release chrome
Start: 00:01
End:   00:48



The build time when I use all the cores/threads doesn't seem to go below 35 min.
This is a Xeon E5-2697 v2 @ 2.70GHz.

Prior to this I did a test with a Skylake 6820HK and the time was better! It took about 30 min.
Same amount of RAM, SSDs, etc.

Curious to see what others say about timing... is this a good time for this CPU, or am I doing something horribly wrong?
I had the Skylake laptop and thought the build took too long; I had the money, so I went and bought the Xeon build (this time it is a laptop too: http://www.eurocom.com/ec/configure(1,234,0)ec). But to my shock the timing is worse. I was thinking that with parallel compiling, the extra cores would cut the time at least in half, if not more.

Any ideas?

PhistucK

unread,
Feb 10, 2016, 4:09:44 AM2/10/16
to nicolae...@gmail.com, chromi...@chromium.org
Did you check whether your CPU was even fully used in the Skylake case? Perhaps there is a different part that is maxed out, like your SSD, your RAM and so on.


PhistucK


Peter Kasting

unread,
Feb 10, 2016, 4:18:06 AM2/10/16
to PhistucK Productions, nicolae...@gmail.com, chromi...@chromium.org
On Wed, Feb 10, 2016 at 1:07 AM, PhistucK <phis...@gmail.com> wrote:
Did you check whether your CPU was even fully used in the Skylake case? Perhaps there is a different part that is maxed out, like your SSD, your RAM and so on.

Yeah, if you really want to utilize 24 virtual cores I'd think you'd need massive amounts of RAM.  Certainly more than the 32 GB that laptop supposedly maxes out at.

Also it's weird that the log you give shows building Debug Chrome without -j and then Release Chrome with -j, instead of comparing all Release builds.

PK

Nico Weber

unread,
Feb 10, 2016, 7:42:55 AM2/10/16
to nicolae...@gmail.com, chromi...@chromium.org
Is the question that `ninja -C out/Debug chrome` isn't faster than `ninja -j 200 -C out/Debug chrome`? If so, I think the answer is that ninja defaults to using all your cores. If you want to test how long building with a single core takes, run `ninja -j 1 -C out/Debug chrome` and it should take much longer.


Daniel Bratell

unread,
Feb 10, 2016, 11:43:40 AM2/10/16
to chromi...@chromium.org, Nicolae
On Wed, 10 Feb 2016 09:58:55 +0100, Nicolae <nicolae...@gmail.com> wrote:

I was thinking that with parallel compiling, the extra cores would cut the time at least in half, if not more.

Any ideas?

I've found Chromium compilations to be a mix of very parallel compilation (where good I/O, plenty of RAM and lots of CPU cores are important) and long linking steps where single-thread performance is most important. Your Xeon might fall behind in the linking phases. Your numbers are not worse than what I see from a variety of hardware around me, and if there were a silver bullet here I would be so happy.

Since you are on Linux I would suggest getting ccache working (a bit tricky with clang, but it is possible). You will still have some very long compilations, but sometimes you can cut the compilation time to a fraction of that.
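For reference, a minimal sketch of one way to wire that up (assuming clang and the gyp/ninja build; the exact variables depend on your setup):

# Route compiler invocations through ccache, then regenerate the ninja files.
export CC="ccache clang" CXX="ccache clang++"
# Commonly needed so clang's output stays cacheable.
export CCACHE_CPP2=yes
export CCACHE_SLOPPINESS=time_macros
export CCACHE_MAXSIZE=50G   # Chromium object files are large; give the cache room.
build/gyp_chromium && ninja -C out/Debug chrome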

/Daniel

--
/* Opera Software, Linköping, Sweden: CET (UTC+1) */

Lucas Gadani

unread,
Feb 10, 2016, 12:44:31 PM2/10/16
to bra...@opera.com, chromi...@chromium.org, Nicolae
(reposting from the correct address).

I've found that on Linux, the linking step can be improved a lot by using component=shared_library.
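For reference, the gyp invocation would be along these lines (the gn equivalent is is_component_build=true):

build/gyp_chromium -D component=shared_library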



WC Leung

unread,
Feb 10, 2016, 1:08:17 PM2/10/16
to Chromium-dev, nicolae...@gmail.com

Prior to this I did a test with a Skylake 6820HK and the time was better! It took about 30 min.
Same amount of RAM, SSDs, etc.

The Skylake 6820HK is incredibly fast (IMHO the time is unrealistic)! I'm using an i7-4790, 16GB RAM, and a SanDisk Ultra II 120GB SSD (Windows is on another hard disk), and it took me 2 hours to compile!

Daniel Bratell

unread,
Feb 10, 2016, 5:44:19 PM2/10/16
to Chromium-dev, WC Leung, nicolae...@gmail.com
That seems similar to what I have on one computer, and I don't think it takes me 2 hours. You may want to check whether you're spending too much time fighting for RAM. Some RAM for the OS and tools, ~4-6 GB for file caches, and a few GB per logical core, and you easily run out with "only" 16 GB. It's possible 32 GB would make a big difference.
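As a rough, back-of-the-envelope illustration (the per-job figure is only an assumption): 2 GB for the OS and tools + 5 GB of file cache + 8 parallel compile jobs × 1.5 GB each ≈ 19 GB, which already overflows a 16 GB machine and pushes the build into swapping.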

WC Leung

unread,
Feb 11, 2016, 12:25:52 AM2/11/16
to Chromium-dev, lwc...@gmail.com, nicolae...@gmail.com
The "System and compressed memory" process has some activities, but I don't think RAM is ever eaten up. I guess it can be different with the selection of build targets, different OS, the version of Visual Studio, and anti-virus software. So here's mine:

OS: Windows
Visual Studio: 2015 (I heard that compile time is worse than with 2013.)
Build targets: ninja -C out/Debug chrome unit_tests
Antivirus: AVG Free  ("Identity Protection" of this one may break timers, but I'm not sure.)

Peter Kasting

unread,
Feb 11, 2016, 12:31:39 AM2/11/16
to WC Leung, Chromium-dev, nicolae...@gmail.com
On Wed, Feb 10, 2016 at 9:25 PM, WC Leung <lwc...@gmail.com> wrote:
Antivirus: AVG Free  ("Identity Protection" of this one may break timers, but I'm not sure.)

Tangent: Never, ever, ever use AVG antivirus.


PK 

Daniel Bratell

unread,
Feb 11, 2016, 4:07:01 AM2/11/16
to Chromium-dev, WC Leung, nicolae...@gmail.com
On Thu, 11 Feb 2016 06:25:52 +0100, WC Leung <lwc...@gmail.com> wrote:

The "System and compressed memory" process has some activities, but I don't think RAM is ever eaten up.

It can be a bit hard to see, since you don't just need free memory but enough free memory for an efficient file cache, and the operating system will not tell you the file cache hit rate. The only difference will be performance. Anyway, if you have enough, more will not help.

I guess it can differ with the selection of build targets, the OS, the version of Visual Studio, and anti-virus software. So here's mine:

OS: Windows
Visual Studio: 2015 (I heard that compile time is worse than with 2013.)
Build targets: ninja -C out/Debug chrome unit_tests
Antivirus: AVG Free  ("Identity Protection" of this one may break timers, but I'm not sure.)

In addition to the general problems with AVG that pkasting mentions, there is a lot to gain from excluding build directories (both in the Visual Studio installation and the actual checkout) from realtime antivirus scanning and other live monitoring (Mac Spotlight, Windows file system indexing, ...). Even if it only slows down I/O a little bit, it makes a big difference, since a build probably does millions of file accesses.
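As an illustration (assuming Windows 10's built-in Defender and a checkout at C:\src\chromium; adjust the path to your setup), an exclusion can be added from an elevated PowerShell prompt:

Add-MpPreference -ExclusionPath "C:\src\chromium"   # skip realtime scanning of the checkout and build output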

Nicolae

unread,
Feb 11, 2016, 5:33:29 AM2/11/16
to Chromium-dev
Hi,

I did a bit more digging, and all I can say so far is that there does not seem to be a bottleneck like SSD/swap/RAM. The Xeon E5-2697 v2 CPU seems to work at full capacity, and all it manages is to be almost on par with a Skylake 6820HK (I had an MSI GTS 219 before this one).


To summarize: I guess "single-thread performance is most important" in today's compile cycles. I did not see anything on the net with real benchmarks & numbers, except spam like CPUBoss and the like, so I decided to do it on my own: got the Skylake with 4 cores/8 threads and took timings, got the Xeon with 12 cores/24 threads and took timings. Lesson learned :)) more expensive does not mean more value. This CPU model is probably a bad catch, because the next one, the v3, looks better on paper, but nobody offers laptops with it. Eurocom were the last ones to do that, and I grabbed literally the last model they had; even for that one I had to wait for parts. All you can find these days is gaming-oriented hardware marketed as top performance.


I have answered the concerns below:

PhistucK - yes, the CPU was fully used; the RAM and SSD do not seem to be the bottleneck.

Peter Kasting - but the RAM is not even half used; I have the 32 GB.
About the provided logs, I did a few tests in Release & Debug, with and without the -j option.
I just provided a summary of the timings under different conditions. The time doesn't change.

Nico Weber - sorry, no, the question is not whether ninja is faster with the -j option.
The concern is that I had a machine before with a Skylake 6820HK in it, and it had basically
the same timing as this Xeon E5-2697 v2. One had 4 cores (8 threads) and this one has 12 cores (24 threads). This one was supposed to kick ass in compile time, but it barely manages to keep up. The first laptop was $3.5k; the Xeon laptop is $8k :)
 
Daniel Bratell - yes, what you said is what I think is happening ->
"long linking steps where single-thread performance is most important. 
Your Xeon might fall behind in the linking phases."  
I will try to play with ccache.

Lucas Gadani - I will try to play with this option "component=shared_library"

WC Leung - this is the timing I got.
The config was 64GB RAM, RAID0 SSDs, 6820HK CPU.
I also did a "tmpfs" test, meaning compiling everything in RAM just to take disk access out of the equation, and the time was the same as compiling on the RAID SSDs. You may want to exclude the folder where all your compiling happens from your antivirus. The antivirus might check every file access before and after a file is created/updated... which results in an I/O bottleneck.




Here is the initial conversation with the support team from Eurocom. But in the end it's kind of not their fault that the CPU does not have the expected performance. The best I got out of them so far is a BIOS update to a custom version. I did receive a special package from them, flashed it, and the BIOS was updated with new features. Maybe this improved something, but nothing related to my main issue of timing.

I have installed Arch Linux (rolling/bleeding edge), Then Ubuntu 15.10.
On both I have compiled chromium browser using these instructions: https://chromium.googlesource.com/chromium/src/+/master/docs/linux_build_instructions.md

The compilation of Chromium will take advantage of all the cores and HT.
The point of buying 12 cores/24 threads was to cut the compile time.

Before buying this one I did the same tests on an i7 Skylake 6820HK and the timing was about 30 min (+/- 2 min).
I also had an SSD RAID0 config for it, just to take the HDD bottleneck out of the equation.


Now for this laptop, I did the same test, compiling Chromium with the same sources, same build scripts, etc.

Test One
ninja -C out/Debug chrome

Timing:
Start 6:45 PM
End   7:24 PM
Total:  40 min

Test Two (added some optimizations based on https://chromium.googlesource.com/chromium/src/+/master/docs/linux_faster_builds.md). This is basically to take the HDD out of the equation: since I did not have SSDs in RAID0 here, I put everything into RAM (tmpfs).
Shared library build: build/gyp_chromium -D component=shared_library
No WebKit symbols: build/gyp_chromium -D remove_webcore_debug_symbols=1
Write to RAM (skip HDD): sudo mount -t tmpfs -o size=20G,nr_inodes=40k,mode=1777 tmpfs out/Debug

ninja -C out/Debug chrome

Start 7:39 PM
End   8:15 PM
Total:  36 min 


I was expecting at least a 50% boost in compile time, if not more, compared to the Skylake build, as this has 3 times more cores and the per-core frequency is almost identical.

The testing shows basically the same results as the Skylake, or even worse, which is a bit shocking to me.






WC Leung

unread,
Feb 11, 2016, 5:51:07 AM2/11/16
to Chromium-dev, lwc...@gmail.com, nicolae...@gmail.com
Thanks! BTW, what antivirus product do you recommend? I won't pay for one anyway, so only free products.
(P.S. Read your blog. Nice stuff! I don't have a CS degree myself though.)

Peter Kasting

unread,
Feb 11, 2016, 6:11:20 AM2/11/16
to WC Leung, Chromium-dev, nicolae...@gmail.com
On Thu, Feb 11, 2016 at 2:51 AM, WC Leung <lwc...@gmail.com> wrote:
Thanks! BTW, what antivirus product do you recommend? I won't pay for one anyway, so only free products.
(P.S. Read your blog. Nice stuff! I don't have a CS degree myself though.)

Honestly, for a technically savvy person, I recommend as little antivirus as possible.  On Windows, for example, I use MSE (or Defender or whatever Microsoft has renamed it to) and nothing else, largely because MSE does very little.

As Tavis Ormandy and others have shown, it's just way too easy for AV products to make your system less secure, and seemingly all the major AV products do so.  Couple this with the fact that even the most aggressive such products have fairly poor detection rates and can cause severe performance impacts, and I think you are better off in almost every way not running one, as long as you practice safe computing in general.  I'm sure you already know all these:

* Run under a user account without admin/root privileges.
* For Windows, turn on UAC.
* Use a firewall.
* Use Chrome to browse.  If you get a SafeBrowsing alert in Chrome, take it seriously.  If you are super paranoid, set plugins to not run unless you manually launch them and/or install some kind of extension with NoScript-like effects to limit JS, but I find these steps hinder usability too much.
* Be very cautious with links in emails, attached documents, downloaded files, etc.  Do not run files from anywhere you're not sure you can trust.  Do not open attachments from others unless it's very clear they weren't emailed by a worm.
* Do not let others use your computer while logged in as you.  Log them in as a guest or similar.

PK

WC Leung

unread,
Feb 11, 2016, 7:05:59 AM2/11/16
to Chromium-dev, lwc...@gmail.com, nicolae...@gmail.com
Thanks a lot! Anyway I've a few rants...


* Use Chrome to browse.  If you get a SafeBrowsing alert in Chrome, take it seriously.  If you are super paranoid, set plugins to not run unless you manually launch them and/or install some kind of extension with NoScript-like effects to limit JS, but I find these steps hinder usability too much.

Actually I used Firefox + NoScript for years, but I stopped that habit a week ago. Anyway, NoScript has started to become unusable, as it is grossly incompatible with CDNs, meaning you need to whitelist something every time, which is a security problem on top of the usability problems.
 
* Be very cautious with links in emails, attached documents, downloaded files, etc.  Do not run files from anywhere you're not sure you can trust.  Do not open attachments from others unless it's very clear they weren't emailed by a worm.

The "do not run" part is very difficult. Nowadays almost NOBODY from proprietary software can be trusted. But I can't switch to Linux because of the reliance on MS Word with equations (heard that old equations are broken in Office 2016, so it's a lol anyway). And well, everyone of my family/colleague is pushing on something like QQ, WeChat, 360 (something worse than AVG), and AliPay (the holy plugin by Alibaba).
 
* Do not let others use your computer while logged in as you.  Log them in as a guest or similar.

Impossible at home. I do have a couple of strategies to protect myself though:
(1) Never store anything that would be compromising if exposed.
(2) Do not do any transactions on the internet. Well, this is very feasible in Hong Kong. (Unless I'm using an OS from a Live CD/USB. But now I'm unsure, because trojans can easily be stored in UEFI.)

WC Leung

unread,
Feb 11, 2016, 7:06:40 AM2/11/16
to Chromium-dev, nicolae...@gmail.com
I see. Your system seems to be 3X or more as expensive as mine, and its memory bandwidth is 2X mine. This can make a big difference.

PhistucK

unread,
Feb 11, 2016, 7:33:37 AM2/11/16
to Peter Kasting, Chromium-dev
I understand that Google employees have really fast clobbered builds (around 20 minutes or less?). Can you share the specification of your machine?


PhistucK


Torne (Richard Coles)

unread,
Feb 11, 2016, 12:15:45 PM2/11/16
to phis...@gmail.com, Peter Kasting, Chromium-dev
Googler build times are typically using goma and so aren't something you can achieve with just a powerful machine, you need a distributed build farm :)

PhistucK

unread,
Feb 11, 2016, 12:23:08 PM2/11/16
to Torne (Richard Coles), Peter Kasting, Chromium-dev
I like farms, too. :)
Can you confirm that a normal (not distributed) build on your most powerful machine takes about half an hour?
Well, I am more interested in Windows, you probably build Android or Linux, so perhaps Peter can chime in?


PhistucK

Nicolae

unread,
Feb 11, 2016, 12:58:06 PM2/11/16
to chromi...@chromium.org


==> Can you confirm that a normal (not distributed) build on your most powerful machine takes about half an hour?

I'm interested in this too; at least it will give some reference numbers for others to relate to.

Marc-Antoine Ruel

unread,
Feb 14, 2016, 3:18:02 PM2/14/16
to nicolae...@gmail.com, chromium-dev
It's better to have a larger amount of RAM (the 64 GB Nicolae mentioned should be enough) than RAID SSDs. A single high-quality SSD will be better than 2 cheaper SSDs in RAID0 for the same price.

Then you are CPU bound, which scales nearly linearly.
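As a rough illustration of that scaling (the numbers are assumed for the example only): if the parallel compile portion takes ~30 minutes on 8 threads, it would take roughly 30 × 8/24 ≈ 10 minutes on 24 threads, while any serial steps such as the final link stay the same (Amdahl's law), which is why the observed total shrinks by less than the core count suggests.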

PhistucK, on Windows, you can't do miracles.

M-A

2016-02-11 12:57 GMT-05:00 Nicolae <nicolae...@gmail.com>:


==> Can you confirm that a normal (not distributed) build on your most powerful machine takes about half an hour?

I'm interested in this too; at least it will give some reference numbers for others to relate to.




--
M-A

PhistucK

unread,
Feb 14, 2016, 4:32:11 PM2/14/16
to Marc-Antoine Ruel, nicolae...@gmail.com, chromium-dev
But how long is the non-miracle? :)


PhistucK

Daniel Bratell

unread,
Feb 14, 2016, 5:56:01 PM2/14/16
to Marc-Antoine Ruel, PhistucK, nicolae...@gmail.com, chromium-dev
On Sun, 14 Feb 2016 22:30:40 +0100, PhistucK <phis...@gmail.com> wrote:

But how long is the non-miracle? :)

It depends on a lot of things. What exact product are you building? 32-bit or 64-bit? Optimized or not? Are you building tests or not? Also, most of the time you will be able to reuse some parts from an earlier build, making it hard to get hard data.

I would not be surprised by any number between 30 minutes and 2 hours. It's also not getting any better over time. Actually, I think compilation times have probably doubled in the last 2-3 years. I see a very steady increase (but I only have a few months of data in front of me).

PhistucK

unread,
Feb 15, 2016, 12:32:05 AM2/15/16
to Daniel Bratell, Marc-Antoine Ruel, nicolae...@gmail.com, chromium-dev
A clobbered, clean build, 32 bit, release mode, not official build and whatever is built when you run ninja -C out\Release chrome.
Same for debug builds.
(On a very powerful machine)

Can anyone provide an approximate duration for the mentioned configuration from their recent experiences?


PhistucK

Peter Kasting

unread,
Feb 15, 2016, 12:35:54 AM2/15/16
to PhistucK Productions, Daniel Bratell, Marc-Antoine Ruel, nicolae...@gmail.com, chromium-dev
On Sun, Feb 14, 2016 at 9:29 PM, PhistucK <phis...@gmail.com> wrote:
A clobbered, clean build, 32 bit, release mode, not official build and whatever is built when you run ninja -C out\Release chrome.
Same for debug builds.
(On a very powerful machine)

Can anyone provide an approximate duration for the mentioned configuration from their recent experiences?

On my home box, which is a 4.5 GHz 4-core (= 8 HT) machine with 16 GB of RAM, building this configuration on Windows takes a couple of hours.  It's definitely much worse than when I built this machine (to compile Chrome!) 4 years ago or so.

In particular, files in Blink seem to take a very long time individually to compile.  I don't know why.

I am somewhat curious if our compile times would be noticeably improved if we could run a tool like Include-What-You-Use and trim down extraneous #includes some.

PK 

PhistucK

unread,
Feb 15, 2016, 12:47:12 AM2/15/16
to Peter Kasting, Daniel Bratell, Marc-Antoine Ruel, nicolae...@gmail.com, chromium-dev
In May 2009, it took my then-machine (an overheating ~2.5GHz dual core with 2 GB RAM) about five hours. I think I was lucky. :P

Well, I understand that the incremental component build is very fast, hopefully that will stay this way.
(I am planning on buying a new machine and I want to make sure it will let me build Chromium again (since 2009) and maybe start fixing bugs and not just triaging them :) without being discouraged by extremely slow build times)

Thank you for your insight!


PhistucK

Peter Kasting

unread,
Feb 15, 2016, 1:47:12 AM2/15/16
to PhistucK, Daniel Bratell, Marc-Antoine Ruel, nicolae...@gmail.com, chromium-dev
On Sun, Feb 14, 2016 at 9:45 PM, PhistucK <phis...@gmail.com> wrote:
Well, I understand that the incremental component build is very fast, hopefully that will stay this way.

Unfortunately even a component build ends up rebuilding most of the project if you resync, even after only a couple hours, so really the only solution is "never sync unless you're prepared to build for hours".
 
(I am planning on buying a new machine and I want to make sure it will let me build Chromium again (since 2009) and maybe start fixing bugs and not just triaging them :) without being discouraged by extremely slow build times)

I work from home most of the time these days, so the machine I described above is my "primary work machine" at this point.  It's definitely doable.  But you do need something pretty beastly.  If I were building the system today, I'd probably get an i7-5930K (a 5960X is too rich for my blood) and a really good cooler, overclock as far as possible, and have at least 32 GB RAM.

PK 

PhistucK

unread,
Feb 15, 2016, 2:02:18 AM2/15/16
to Peter Kasting, Daniel Bratell, Marc-Antoine Ruel, nicolae...@gmail.com, chromium-dev
Wow... Working on Chromium is hard. :(

I am planning on getting the next generation once it is out, but Intel seems to be delayed (they usually announce or release it around September every year and they have not since 2014 :(). The last Extreme edition had 8 cores, so I was planning on getting the next generation non-Extreme 8 core (hopefully it will not be an Extreme edition, as it costs twice as much) with some hydrocooling. I was thinking of 16 GB RAM, but I guess 32 GB RAM will be worth it.


PhistucK

Jakob Kummerow

unread,
Feb 15, 2016, 4:08:39 AM2/15/16
to PhistucK Productions, Peter Kasting, Daniel Bratell, Marc-Antoine Ruel, nicolae...@gmail.com, chromium-dev
If you care about compile time, you should seriously consider working on Linux. Especially on beefier machines, it can be ~twice as fast on identical hardware. Last time I built at home (a couple of months ago) on an i7-2600K overclocked to 4GHz, 16GB RAM, magnetic HDD, "ninja -C out/Debug chrome" for a component build took just under an hour.

IIRC, back when I had access to two identical machines at work (16 cores / 32 threads at 2.9GHz, 64GB RAM, SSDs), with distributed compilation turned off the Linux machine took around 30 minutes and the Windows machine more than an hour. I don't remember exact numbers, but I do remember that the difference was shockingly large.

Colin Blundell

unread,
Feb 15, 2016, 5:12:14 AM2/15/16
to jkum...@chromium.org, PhistucK Productions, Peter Kasting, Daniel Bratell, Marc-Antoine Ruel, nicolae...@gmail.com, chromium-dev
Of course, there are engineers who (1) work on Windows-specific UI and/or (2) don't want to give up the Windows development environment. I believe that for PK both of these factors are in play :).

If those factors aren't in play, I can't +1 this comment hard enough. I used to work primarily on a Mac and since I've switched to Linux the difference has been through the roof.

Daniel Bratell

unread,
Feb 15, 2016, 10:07:01 AM2/15/16
to PhistucK Productions, Peter Kasting, Marc-Antoine Ruel, nicolae...@gmail.com, blink-dev
On Mon, 15 Feb 2016 06:34:59 +0100, Peter Kasting <pkas...@chromium.org> wrote:

In particular, files in Blink seem to take a very long time individually to compile.  I don't know why.

(Moving to blink-dev, chromium-dev on BCC)

I know why. Every compilation unit in Blink needs LayoutObject and/or Document. Those header files in turn include every other header file in Blink. A non-trivial amount of Blink's code is in header files due to templates and forced inlining.

So basically, for every compilation unit you compile 5% of Blink. For 3000 compilation units, that adds up to a lot.
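As an illustration, you can get a rough feel for this from ninja's dependency log after a build has run (a sketch; the header name is just an example):

ninja -C out/Release -t deps | grep -c LayoutObject.h   # roughly how many compilation units pull in LayoutObject.h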

Precompiled headers help. On Windows, PCH can cut a common recompilation from 20 minutes to 5 minutes. There is a patch at https://codereview.chromium.org/1167523007/ (and we use it at Opera), but every time it has landed, one of the bots has developed broken dependencies after 2-3 days, resulting in strange compilation errors. I will try again when we have VS2015, in the hope that the problem is a bug in VS2013. I've also not updated this patch with gn support, and I don't even know if gn supports this yet.

I am somewhat curious if our compile times would be noticeably improved if we could run a tool like Include-What-You-Use and trim down extraneous #includes some.

Not easily. The only include I found that contributed a lot by itself was <algorithm> (in C++11) and it would still only save <1 minute if removed from all header files.

Ways forward for compilation times:
* Precompiled headers.
* Unity builds.
* Make Document.h and LayoutObject.h very fast to compile (without replacing them with something equally slow)
* Untangle the dependency mess one class at a time and cut down the number of compile-time dependencies. This might mean fewer templates and less forced inlining, and it requires a lot of knowledge about how Blink works and should work.

PhistucK

unread,
Feb 15, 2016, 1:20:05 PM2/15/16
to Jakob Kummerow, Peter Kasting, Daniel Bratell, Marc-Antoine Ruel, nicolae...@gmail.com, chromium-dev

On Mon, Feb 15, 2016 at 11:07 AM, Jakob Kummerow <jkum...@chromium.org> wrote:
IIRC, back when I had access to two identical machines at work (16 cores / 32 threads at 2.9GHz, 64GB RAM, SSDs), with distributed compilation turned off the Linux machine took around 30 minutes and the Windows machine more than an hour. I don't remember exact numbers, but I do remember that the difference was shockingly large.


I searched for a single processor with 16 cores and 2.9GHz and could not find one. Was it an "Intel Xeon E5-2690 - Dual Processor 16-Core X 2.9GHz"?
Lately I figured that I probably need more than an Intel Core and that I should start looking into Xeons, but those are hugely expensive (over $1000 or even over $2000 just for the processor. And for two - for the dual one... damn).

Thank you for the Linux tip. I am a Windows guy, so I am much more used to it, unfortunately (really unfortunately). But I was going to install Linux as well anyway (once I buy a beastly machine), so I might just end up using it exclusively when it comes to Chromium development.


PhistucK

Torne (Richard Coles)

unread,
Feb 15, 2016, 1:22:21 PM2/15/16
to phis...@gmail.com, Jakob Kummerow, Peter Kasting, Daniel Bratell, Marc-Antoine Ruel, nicolae...@gmail.com, chromium-dev
Yes, most of our workstations at Google are dual 8-core or dual 10-core Xeon machines, which are incredibly expensive :/


PhistucK

unread,
Feb 15, 2016, 1:24:58 PM2/15/16
to Torne (Richard Coles), Jakob Kummerow, Peter Kasting, Daniel Bratell, Marc-Antoine Ruel, nicolae...@gmail.com, chromium-dev
Oh. Well. :) Keeping all of the fun to yourself... I would really not be mad if you slipped one of those beasts into my place.



PhistucK

WC Leung

unread,
Feb 15, 2016, 1:56:53 PM2/15/16
to Chromium-dev, phis...@gmail.com, bra...@opera.com, mar...@chromium.org, nicolae...@gmail.com
I wonder if it would be possible to just compile the files (i.e. *.cc) I've changed, instead of building the whole Chromium. Unless I'm working on javascript part of Chromium (e.g. userpod), I don't see a need to compile the whole program.


On Monday, February 15, 2016 at 2:47:12 PM UTC+8, Peter Kasting wrote:
On Sun, Feb 14, 2016 at 9:45 PM, PhistucK <phis...@gmail.com> wrote:
Well, I understand that the incremental component build is very fast, hopefully that will stay this way.

Unfortunately even a component build ends up rebuilding most of the project if you resync, even after only a couple hours, so really the only solution is "never sync unless you're prepared to build for hours".

PK
 

Daniel Bratell

unread,
Feb 15, 2016, 2:24:25 PM2/15/16
to Chromium-dev, WC Leung, phis...@gmail.com, mar...@chromium.org, nicolae...@gmail.com
On Mon, 15 Feb 2016 19:56:53 +0100, WC Leung <lwc...@gmail.com> wrote:

I wonder if it would be possible to just compile the files (i.e. *.cc) I've changed, instead of building the whole Chromium. Unless I'm working on javascript part of Chromium (e.g. userpod), I don't see a need to compile the whole program.

That is how it's supposed to work, but unfortunately every day there will be people changing enough shared files to force an almost full recompile. As long as you stay away from shared headers, though, it's not too bad. With component builds, linking can be fast as well. This is what keeps developers working on Chromium sane. I think. Don't quote me. :-)

WC Leung

unread,
Feb 15, 2016, 11:31:26 PM2/15/16
to Chromium-dev, lwc...@gmail.com, phis...@gmail.com, mar...@chromium.org, nicolae...@gmail.com
My point is that I want to compile those *.cc files into *.obj, without getting the files linked into a .exe/.dll/.lib. That way I can check for syntax errors early (e.g. check for errors whenever I save a file).

Peter Kasting

unread,
Feb 16, 2016, 12:10:17 AM2/16/16
to WC Leung, Chromium-dev, PhistucK Productions, Marc-Antoine Ruel, nicolae...@gmail.com
On Mon, Feb 15, 2016 at 8:31 PM, WC Leung <lwc...@gmail.com> wrote:
My point is that I want to compile those *.cc files into *.obj, without getting the files linked into a .exe/.dll/.lib. That way I can check for syntax errors early (e.g. check for errors whenever I save a file).

That seems like a feature of any decent IDE.  I use Visual Studio to develop, for example, and it will live-highlight syntax errors, plus I can ctrl-F7 to compile the file at any time.

PK

WC Leung

unread,
Feb 16, 2016, 1:12:48 AM2/16/16
to Chromium-dev, lwc...@gmail.com, phis...@gmail.com, mar...@chromium.org, nicolae...@gmail.com
Wow... you're able to run VS with Chromium? Splendid! Did you also get the correct include paths?

Anyway, I'm trying something different:
ninja -C out\Debug -t targets | grep [target obj file]

And I get something like this:
ninja -t msvc -e environment.x86 -- "C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\bin\amd64_x86\cl.exe" /nologo /showIncludes /FC @obj\chrome\browser\profiles\browser.avatar_menu.obj.rsp /c ..\..\chrome\browser\profiles\avatar_menu.cc /Foobj\chrome\browser\profiles\browser.avatar_menu.obj /Fdobj\chrome\browser.cc.pdb

But how can I obtain the .rsp file?

WC Leung

unread,
Feb 16, 2016, 1:39:10 AM2/16/16
to Chromium-dev, lwc...@gmail.com, phis...@gmail.com, mar...@chromium.org, nicolae...@gmail.com

Anyway, I'm trying something different:
ninja -C out\Debug -t targets | grep [target obj file]

The command is wrong... sorry. It should be
ninja -C out\Debug -t commands | grep [target obj file]
instead.

Peter Kasting

unread,
Feb 16, 2016, 3:30:39 AM2/16/16
to WC Leung, Chromium-dev, PhistucK Productions, Marc-Antoine Ruel, nicolae...@gmail.com
On Mon, Feb 15, 2016 at 10:12 PM, WC Leung <lwc...@gmail.com> wrote:
Wow... you're able to run VS with Chromium? Splendid! Did you also get the correct include paths?

I don't understand the question.  What issue is there with include paths?

Anyway, I'm trying something different:
ninja -C out\Debug -t targets | grep [target obj file]

I do not know how to compile individual files with direct ninja invocations on Windows.

PK 

WC Leung

unread,
Feb 16, 2016, 8:36:27 AM2/16/16
to Chromium-dev, lwc...@gmail.com, phis...@gmail.com, mar...@chromium.org, nicolae...@gmail.com


On Tuesday, February 16, 2016 at 4:30:39 PM UTC+8, Peter Kasting wrote:
On Mon, Feb 15, 2016 at 10:12 PM, WC Leung <lwc...@gmail.com> wrote:
Wow... you're able to run VS with Chromium? Splendid! Did you also get the correct include paths?

I don't understand the question.  What issue is there with include paths?

If you don't get the correct include paths, then the file won't build because there's a missing #include. You may look at https://codereview.chromium.org/1631373003/diff/20001/chrome/browser/profiles/profile_attributes_entry.h and expand the first group of messages to see an example of such an error. (The trybot log files have been deleted. Sorry for that!)

 
Anyway, I'm trying something different:
ninja -C out\Debug -t targets | grep [target obj file]

I do not know how to compile individual files with direct ninja invocations on Windows.

Interested in getting a script for doing this? I've got a way to get the .rsp file already...
by modifying the "rule cc" in build.ninja, so the remaining puzzle should be just coding...

Of course I still need to know how to build the auto-generated headers, like "out/Debug/gen/*.h".
 
PK 

Nico Weber

unread,
Feb 16, 2016, 10:29:39 AM2/16/16
to WC Leung, Chromium-dev, PhistucK Productions, Marc-Antoine Ruel, nicolae...@gmail.com
Run `ninja -C out\Debug obj\base\strings\string16.obj` to execute the build command that produces string16.obj, or run `ninja -C out\Debug ..\..\base\strings\string16.cc^` to produce an obj file that has base/strings/string16.cc as input. Look e.g. at tools/vim/ninja.vim for a "compile this file" thingy for vim.

(If you want to look at the rsp file for some reason, you can pass `-d keeprsp` to ninja to make it keep the rsp files around after compilations. You can use `ninja -C out\Debug -n -d keeprsp base` to write all rsp files used to build base and do nothing else. But generally you shouldn't have to muck with .rsp files yourself.)


WC Leung

unread,
Feb 16, 2016, 11:44:21 AM2/16/16
to Chromium-dev, lwc...@gmail.com, phis...@gmail.com, mar...@chromium.org, nicolae...@gmail.com
Thanks a lot!

Thiago Farina

unread,
Feb 16, 2016, 1:12:45 PM2/16/16
to pkas...@chromium.org, Daniel Bratell, chromium-dev


On Monday, February 15, 2016, Peter Kasting <pkas...@chromium.org> wrote:
On Sun, Feb 14, 2016 at 9:29 PM, PhistucK <phis...@gmail.com> wrote:
A clobbered, clean build, 32 bit, release mode, not official build and whatever is built when you run ninja -C out\Release chrome.
Same for debug builds.
(On a very powerful machine)

Can anyone provide an approximate duration for the mentioned configuration from their recent experiences?

On my home box, which is a 4.5 GHz 4-core (= 8 HT) machine with 16 GB of RAM, building this configuration on Windows takes a couple of hours.  It's definitely much worse than when I built this machine (to compile Chrome!) 4 years ago or so.

In particular, files in Blink seem to take a very long time individually to compile.  I don't know why.
I was told it is because of their heavy use of templates. I think Daniel Bratell knows more about this; he was the one I talked to about the poor compile times of Blink source files. I recently started to notice this with V8 as well.


--
Thiago Farina

WC Leung

unread,
Feb 17, 2016, 12:35:26 PM2/17/16
to Chromium-dev, lwc...@gmail.com, phis...@gmail.com, mar...@chromium.org, nicolae...@gmail.com
Just tried it with save-commands in Atom. Voila! It works perfectly! Anyway, remember to add the quotation marks when you try the command, i.e.

ninja -C out\Debug "..\..\base\strings\string16.cc^"

PhistucK: looks like you can try some refactoring tasks, because those do not require a complete build of Chromium. Still not fixing bugs, though.

BTW, just being curious: Who are the external contributors here? (i.e. non-Google, non-Opera, non-Intel, non-Samsung.)
Let's hope that it's not only PhistucK and me...


On Tuesday, February 16, 2016 at 11:29:39 PM UTC+8, Nico Weber wrote:

PhistucK

unread,
Feb 17, 2016, 12:38:55 PM2/17/16
to WC Leung, Chromium-dev, Marc-Antoine Ruel, nicolae...@gmail.com
There are more.


PhistucK

Nico Weber

unread,
Feb 17, 2016, 12:55:42 PM2/17/16
to WC Leung, Chromium-dev, PhistucK Productions, Marc-Antoine Ruel, nicolae...@gmail.com
On Wed, Feb 17, 2016 at 9:35 AM, WC Leung <lwc...@gmail.com> wrote:
Just tried it with save-commands in Atom. Voila! It works perfectly! Anyway, remember to add the quotation marks when you try the command, i.e.
ninja -C out\Debug "..\..\base\strings\string16.cc^"

Ah, sorry, should've mentioned that :-) The problem is that '^' is the escape character in cmd.exe. Quoting works, or you can write ^^ (similar to how \\ gets you a single backslash in posix shells):

C:\src\chrome\src>echo ^^
^

FWIW, I often build Chrome on my Mac laptop, and while a full build takes over an hour, incremental builds are on the order of a few seconds. So I usually sync in the evening and make the laptop do a full build overnight. You don't _have_ to have a beefy Linux box that, with the help of goma, can do full builds in 2 minutes -- it certainly helps, but it's not necessary. There are a few tweaks you can do to speed up your build:

* If you set disable_nacl=1 (gyp) / enable_nacl=false (gn), gyp will run 30% faster and your build will be ~15% faster (iirc)
* If you set fastbuild=1 (gyp) / symbol_level=1 (gn), your build will be faster, but you won't have full debug information
* On OS X, Release builds are noticeably faster than Debug builds (I think on Linux too, but haven't measured there since my Linux box is beefy. Not sure on Windows.)
* If you set component=shared_library (gyp) / is_component_build=true (gn), incremental builds will be way way faster (like 10x as fast)
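For reference, a sketch of how these might be set (gyp shown for Linux/Mac; use `set` in cmd.exe, and the gn argument names are as of early 2016):

export GYP_DEFINES="disable_nacl=1 fastbuild=1 component=shared_library"
build/gyp_chromium   # regenerate the ninja files with the new defines

gn gen out/Default --args='enable_nacl=false symbol_level=1 is_component_build=true'
ninja -C out/Default chrome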



PhistucK: looks like you can try some refactoring tasks, because those do not require a complete build of Chromium. Still not fixing bugs, though.

BTW, just being curious: Who are the external contributors here? (i.e. non-Google, non-Opera, non-Intel, non-Samsung.)
Let's hope that it's not only PhistucK and me...

On Tuesday, February 16, 2016 at 11:29:39 PM UTC+8, Nico Weber wrote:
Run `ninja -C out\Debug obj\base\strings\string16.obj` to execute the build command that produces string16.obj, or run `ninja -C out\Debug ..\..\base\strings\string16.cc^` to produce an obj file that has base/strings/string16.cc as input. Look e.g. at tools/vim/ninja.vim for a "compile this file" thingy for vim.

(If you want to look at the rsp file for some reason, you can pass `-d keeprsp` to ninja to make it keep the rsp files around after compilations. You can use `ninja -C out\Debug -n -d keeprsp base` to write all rsp files used to build base and do nothing else. But generally you shouldn't have to muck with .rsp files yourself.)

Nicolae

unread,
Feb 17, 2016, 2:16:52 PM2/17/16
to Chromium-dev
Where do you find this info, and what do they do? It would be good to know them.

Daniel Bratell

unread,
Feb 18, 2016, 5:25:05 AM2/18/16
to Chromium-dev, Nicolae
On Wed, 17 Feb 2016 20:14:45 +0100, Nicolae <nicolae...@gmail.com> wrote:

Where do you find this info, and what do they do? It would be good to know them.

Such information is spread all over the place. Documentation oscillates between "too much information in too little space, so nobody reads/grasps it" and "instructions too simplified for the expert".

I could find fastbuild information at http://www.chromium.org/developers/gyp-environment-variables but that was because I knew what I was looking for. Trying to remember what it was like when I was new in this project, I think I picked up information gradually over time. From forum posts, documentation pages, commit messages, ... It's probably the same for a lot of people in the project.

To comment on "what they do", Nico explained it but I will expand on it a bit for future reference. See below.



On Wed, Feb 17, 2016 at 9:54 AM, Nico Weber <tha...@chromium.org> wrote:
If you set disable_nacl=1 (gyp) / enable_nacl=false (gn), gyp will run 30% faster and your build will be ~15% faster (iirc)

nacl (NaCl, Native Client) is a sandbox and system for running compiled machine code inside a Chromium-based browser. We don't use this in Opera products, and unless you are actively working on NaCl there is no reason to have it enabled, as it adds quite a bit of development overhead.

* If you set fastbuild=1 (gyp) / symbol_level=1 (gn), your build will be faster, but you won't have full debug information

Exactly what Nico says. Composing the debug information can take a considerable amount of time, and if you don't have a use for it, then use fastbuild/symbol_level. The downside is that you will have some trouble debugging those builds, but sometimes that is not a concern.

* On OS X, Release builds are noticeably faster than Debug builds (I think on Linux too, but haven't measured there since my Linux box is beefy. Not sure on Windows.)
* If you set component=shared_library (gyp) / is_component_build=true (gn), incremental builds will be way way faster (like 10x as fast)

component=shared_library/is_component_build splits Chromium up into a million .dll/.so files. Each of them is much smaller than a full binary, so linking is much faster, but on the other hand you get something that will be a bit different from what end-users would run. Some special care needs to be taken in code when jumping between .dll/.so files, but there's no need to worry about that until you encounter the problems.

Nicolae

unread,
Feb 18, 2016, 2:10:39 PM2/18/16
to Chromium-dev
Thanks!