
windows build anti-virus exclusion list?


Ben Kelly

Mar 16, 2017, 11:26:20 PM
to dev-pl...@lists.mozilla.org
Hi all,

I'm trying to configure my new windows build machine and noticed that
builds were still somewhat slow. I did:

1) Put it in high performance power profile
2) Made sure my mozilla-central dir was not being indexed for search
3) Excluded my mozilla-central directory from windows defender

Watching the task monitor during a build, though, I still saw MsMpEng.exe
(antivirus) running during the build.

I ended up adding some very broad exclusions to get this down close to
zero. I am now excluding:

- mozilla-central checkout
- mozilla-build install dir
- visual studio install dir
- /users/bkelly/appdata/local/temp
- /users/bkelly (because temp dir was not enough)

I'd like to narrow this down a bit. Does anyone have a better list of
things to exclude from virus scanning for our build process?

Thanks.

Ben

Ben Kelly

Mar 16, 2017, 11:34:15 PM
to dev-pl...@lists.mozilla.org
On Thu, Mar 16, 2017 at 11:26 PM, Ben Kelly <bke...@mozilla.com> wrote:

> - mozilla-build install dir
> - visual studio install dir
> - /users/bkelly/appdata/local/temp
> - /users/bkelly (because temp dir was not enough)
>

FWIW, adding all these extra exclusions dropped my build time from ~22
minutes to ~14 minutes.

I'm hoping I can narrow my home directory exclusion down to things like
.bash_profile, .cargo, etc.

David Major

Mar 16, 2017, 11:37:34 PM
to dev-pl...@lists.mozilla.org
Try using Sysinternals Process Monitor to see what files MsMpEng.exe is
reading.

On Fri, Mar 17, 2017, at 04:26 PM, Ben Kelly wrote:
> Hi all,
>
> I'm trying to configure my new windows build machine and noticed that
> builds were still somewhat slow. I did:
>
> 1) Put it in high performance power profile
> 2) Made sure my mozilla-central dir was not being indexed for search
> 3) Excluded my mozilla-central directory from windows defender
>
> Watching the task monitor during a build, though, I still saw MsMpEng.exe
> (antivirus) running during the build.
>
> I ended up adding some very broad exclusions to get this down close to
> zero. I am now excluding:
>
> - mozilla-central checkout
> - mozilla-build install dir
> - visual studio install dir
> - /users/bkelly/appdata/local/temp
> - /users/bkelly (because temp dir was not enough)
>
> I'd like to narrow this down a bit. Does anyone have a better list of
> things to exclude from virus scanning for our build process?
>
> Thanks.
>
> Ben
> _______________________________________________
> dev-platform mailing list
> dev-pl...@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-platform

Michael Hoye

Mar 16, 2017, 11:40:12 PM
to Ben Kelly, dev-platform
Depending on your AV, if you don't exempt mozilla-central some of our tests
will get quarantined and you won't be able to build at all.

- mhoye


On Mar 16, 2017 20:34, "Ben Kelly" <bke...@mozilla.com> wrote:

On Thu, Mar 16, 2017 at 11:26 PM, Ben Kelly <bke...@mozilla.com> wrote:

> - mozilla-build install dir
> - visual studio install dir
> - /users/bkelly/appdata/local/temp
> - /users/bkelly (because temp dir was not enough)
>

FWIW, adding all these extra exclusions dropped my build time from ~22
minutes to ~14 minutes.

I'm hoping I can narrow my home directory exclusion down to things like
.bash_profile, .cargo, etc.

Ben Kelly

Mar 17, 2017, 12:05:39 AM
to Michael Hoye, dev-platform
On Thu, Mar 16, 2017 at 11:40 PM, Michael Hoye <mh...@mozilla.com> wrote:

> Depending on your AV, if you don't exempt mozilla-central some of our
> tests will get quarantined and you won't be able to build at all.
>

I guess I was hoping someone familiar with our build might know the answer.
:-)

In any case, I managed to narrow my exclusions down to:

c:\devel
c:\mozilla-build
c:\program files\microsoft visual studio 12.0
c:\users\bkelly\.bash_profile
c:\users\bkelly\.cargo
c:\users\bkelly\.multirust
c:\users\bkelly\.rustup
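
If it helps anyone script this instead of clicking through the Defender UI,
something like the sketch below should add the same exclusions (untested on
my machine; it shells out to Defender's Add-MpPreference cmdlet, needs an
elevated process, and the paths are just my list from above):

# Untested sketch: add Windows Defender exclusions programmatically.
import subprocess

EXCLUSIONS = [
    r"c:\devel",
    r"c:\mozilla-build",
    r"c:\program files\microsoft visual studio 12.0",
    r"c:\users\bkelly\.cargo",
    r"c:\users\bkelly\.rustup",
]

for path in EXCLUSIONS:
    subprocess.check_call([
        "powershell", "-NoProfile", "-Command",
        "Add-MpPreference -ExclusionPath '%s'" % path,
    ])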

And then in my .bash_profile I did this to avoid having to exclude all of
%TEMP%:

export TMP=/c/devel/.tmp
export TEMP=$TMP
export TMPDIR=$TMP

I couldn't figure out how to re-map /tmp in msys. The fstab was not
working for me.

I guess if that all seems reasonable I can try to add some text to the wiki
suggesting it:

https://developer.mozilla.org/en-US/docs/Mozilla/Developer_guide/Build_Instructions/Windows_Prerequisites

Thanks.

Ben

Ben Kelly

Mar 17, 2017, 12:06:19 AM
to Michael Hoye, dev-platform
On Fri, Mar 17, 2017 at 12:05 AM, Ben Kelly <bke...@mozilla.com> wrote:

> On Thu, Mar 16, 2017 at 11:40 PM, Michael Hoye <mh...@mozilla.com> wrote:
>
> Depending on your AV, if you don't exempt mozilla-central some of our
>> tests will get quarantined and you won't be able to build at all.
>>
>
> I guess I was hoping someone familiar with our build might know the
> answer. :-)
>

Sorry. I think I replied to the wrong person here.

Mike Hoye

Mar 17, 2017, 12:27:04 AM
to Ben Kelly, dev-platform
Hah! I thought "yeah, harsh but fair."

- mhoye

Honza Bambas

Mar 17, 2017, 4:46:09 AM
to dev-pl...@lists.mozilla.org
I have a very similar setup, with even way more exclusions added, but
none of them has the desired effect. Unfortunately, the only way to make
MsMpEng shut up is to disable real-time protection completely for the
duration of the build. I think it's a bug in Defender.

On a 20-core system it saves ~30 seconds; on 8 cores it could be much more.

-hb-


On 3/17/17 4:26 AM, Ben Kelly wrote:
> Hi all,
>
> I'm trying to configure my new windows build machine and noticed that
> builds were still somewhat slow. I did:
>
> 1) Put it in high performance power profile
> 2) Made sure my mozilla-central dir was not being indexed for search
> 3) Excluded my mozilla-central directory from windows defender
>
> Watching the task monitor during a build, though, I still saw MsMpEng.exe
> (antivirus) running during the build.
>
> I ended up adding some very broad exclusions to get this down close to
> zero. I am now excluding:
>
> - mozilla-central checkout
> - mozilla-build install dir
> - visual studio install dir
> - /users/bkelly/appdata/local/temp
> - /users/bkelly (because temp dir was not enough)
>
> I'd like to narrow this down a bit. Does anyone have a better list of
> things to exclude from virus scanning for our build process?
>
> Thanks.
>
> Ben

Chris Peterson

Mar 17, 2017, 1:12:23 PM
On 3/17/2017 1:45 AM, Honza Bambas wrote:
> I have a very similar setup, with even way more exceptions added, but
> none of them has the desired effect. Unfortunately, the only way to make
> MsMpEng shut up is to disable run-time protection completely for the
> time of the build. I think it's a bug in Defender.

Maybe `mach build` can temporarily disable Defender when building?

Ted Mielczarek

Mar 17, 2017, 1:36:40 PM
to dev-pl...@lists.mozilla.org
You can't programmatically control Windows Defender, even as an
Administrator. This is a security precaution from Microsoft. It's
configured with a special user account. I looked into this recently
because I thought it would be nice if *something* in the build system or
bootstrap could at least let you know if your build directories were not
in the list of exclusions.
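
Something along these lines might be enough for that warning (a rough,
untested sketch; it assumes Defender's Get-MpPreference cmdlet will report
ExclusionPath to the build user, which I haven't verified):

import os
import subprocess

def defender_exclusions():
    # Ask Defender for its configured exclusion paths via PowerShell.
    out = subprocess.check_output([
        "powershell", "-NoProfile", "-Command",
        "(Get-MpPreference).ExclusionPath",
    ])
    return [line.strip().lower()
            for line in out.decode("utf-8", "replace").splitlines()
            if line.strip()]

def warn_if_not_excluded(topsrcdir):
    # Warn if the source directory isn't covered by any exclusion.
    srcdir = os.path.normpath(topsrcdir).lower()
    if not any(srcdir.startswith(p) for p in defender_exclusions()):
        print("warning: %s is not excluded from Windows Defender; "
              "builds will likely be slower." % srcdir)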

Back to the original topic, I recently set up a fresh Windows machine
and I followed the same basic steps (enable performance power mode,
whitelist a bunch of stuff in Windows Defender) and my build seemed
basically CPU-bound[1] during the compile tier. Disabling realtime
protection in Defender made it *slightly* better[2] but didn't have a
large impact on the overall build time (something like 20s out of ~14m
total for a clobber).

Ideally we should have this stuff as part of `mach bootstrap` or similar
so everyone gets their machine configured properly for the fastest
builds possible.

Related, my next step was to figure out how to gather an xperf profile of
the entire build process to see if there are any obvious system-level
speedups left (the resource usage graph shows the known inefficiencies:
configure and the non-compile tiers), but UIforETW hung when I tried to
use it and I haven't followed up yet.

-Ted

1. http://people.mozilla.org/~tmielczarek/build-resource-usage.svg
2.
https://people-mozilla.org/~tmielczarek/build-resource-usage-no-defender.svg

Ben Kelly

Mar 17, 2017, 2:20:11 PM
to Ted Mielczarek, dev-pl...@lists.mozilla.org
On Fri, Mar 17, 2017 at 1:36 PM, Ted Mielczarek <t...@mielczarek.org> wrote:

> Back to the original topic, I recently set up a fresh Windows machine
> and I followed the same basic steps (enable performance power mode,
> whitelist a bunch of stuff in Windows Defender) and my build seemed
> basically CPU-bound[1] during the compile tier. Disabling realtime
> protection in Defender made it *slightly* better[2] but didn't have a
> large impact on the overall build time (something like 20s out of ~14m
> total for a clobber).
>

The 14min measurement must have been for a partial build. With defender
disabled the best I can get is 18min. This is on one of the new lenovo
p710 machines with 16 xeon cores.

I definitely observed periods where it was not CPU bound. For example, at
the end of the js lib build I observed a single cl.exe process sit for ~2
minutes while no other work was being done. I also saw link.exe take a
long time without parallelism, but I think that's a known issue.

Eric Rahm

Mar 17, 2017, 2:20:42 PM
to Ted Mielczarek, dev-platform
I filed meta bug 1326328 [1] a few months ago for tracking things we can do
to improve Windows build times. It would be great to file / block existing
bugs against the meta bug to track these issues. There hasn't been much
traction on any of the bugs though; perhaps it's a good topic for the next
all-hands.

A quick summary of what we have:

- Bug 1095293 [2] - Use msvc's '/MP' option
- Bug 1321922 [3] - Use symlinks on windows to avoid copying files into
the objdir
- Bug 1326329 [4] - Each time we call 'cl' we spawn 5 processes
- Bug 1326333 [5] - Build from legit msvc project files to leverage
msbuild's multiprocess improvements
- Bug 1326353 [6] - Reduce the amount of console output when building

Integrating suggestions for first-time builders into `mach bootstrap` would
certainly be great (even if it's just a link to a "10 crazy things that
will make your windows build faster!" article). I didn't see a bug for
programmatically disabling power save mode while building, but I recall
that being a potential option.
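
If someone wants to experiment with that, a rough sketch of a build wrapper
that switches to the stock High Performance plan and restores the previous
one afterwards might look like this (untested; it leans on powercfg's
SCHEME_MIN alias and is not something mach does today):

import re
import subprocess

def current_power_scheme():
    # powercfg prints e.g. "Power Scheme GUID: <guid>  (Balanced)".
    out = subprocess.check_output(["powercfg", "/getactivescheme"]).decode()
    return re.search(r"GUID:\s*(\S+)", out).group(1)

def build_with_high_performance(build_cmd):
    previous = current_power_scheme()
    # SCHEME_MIN is powercfg's alias for the built-in "High performance" plan.
    subprocess.check_call(["powercfg", "/setactive", "SCHEME_MIN"])
    try:
        subprocess.check_call(build_cmd)
    finally:
        subprocess.check_call(["powercfg", "/setactive", previous])

# e.g.: build_with_high_performance(["./mach", "build"])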

-e

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=1326328
[2] https://bugzilla.mozilla.org/show_bug.cgi?id=1095293
[3] https://bugzilla.mozilla.org/show_bug.cgi?id=1321922
[4] https://bugzilla.mozilla.org/show_bug.cgi?id=1326329
[5] https://bugzilla.mozilla.org/show_bug.cgi?id=1326333
[6] https://bugzilla.mozilla.org/show_bug.cgi?id=1326353

On Fri, Mar 17, 2017 at 10:36 AM, Ted Mielczarek <t...@mielczarek.org> wrote:

> On Fri, Mar 17, 2017, at 01:12 PM, Chris Peterson wrote:
> You can't programmatically control Windows Defender, even as an
> Administrator. This is a security precaution from Microsoft. It's
> configured with a special user account. I looked into this recently
> because I thought it would be nice if *something* in the build system or
> bootstrap could at least let you know if your build directories were not
> in the list of exclusions.
>
> Back to the original topic, I recently set up a fresh Windows machine
> and I followed the same basic steps (enable performance power mode,
> whitelist a bunch of stuff in Windows Defender) and my build seemed
> basically CPU-bound[1] during the compile tier. Disabling realtime
> protection in Defender made it *slightly* better[2] but didn't have a
> large impact on the overall build time (something like 20s out of ~14m
> total for a clobber).
>
> Ideally we should have this stuff as part of `mach bootstrap` or similar
> so everyone gets their machine configured properly for the fastest
> builds possible.
>
> Related, my next steps were that I was planning to figure out how to
> gather an xperf profile of the entire build process to see if there were
> any obvious speedups left from a system perspective (the resource usage
> graph shows the obvious inefficiencies left that are already known:
> configure + the non-compile tiers), but UIforETW hung when I tried to
> use it to do so and I haven't followed up yet.
>
> -Ted
>
> 1. http://people.mozilla.org/~tmielczarek/build-resource-usage.svg
> 2.
> https://people-mozilla.org/~tmielczarek/build-resource-
> usage-no-defender.svg

Eric Rahm

Mar 17, 2017, 2:30:14 PM
to Ben Kelly, dev-pl...@lists.mozilla.org
It looks like there's bug 1188823 [1] for enabling fastlink to improve link
times, but that's in a bad way right now.

FWIW, on my ThinkPad laptop I have clobber build times in the 35-40 minute
range, so even things that seem small for a 16-core desktop are a bigger
win for me.

-e

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=1188823

On Fri, Mar 17, 2017 at 11:20 AM, Ben Kelly <bke...@mozilla.com> wrote:

> On Fri, Mar 17, 2017 at 1:36 PM, Ted Mielczarek <t...@mielczarek.org>
> wrote:
>
> > Back to the original topic, I recently set up a fresh Windows machine
> > and I followed the same basic steps (enable performance power mode,
> > whitelist a bunch of stuff in Windows Defender) and my build seemed
> > basically CPU-bound[1] during the compile tier. Disabling realtime
> > protection in Defender made it *slightly* better[2] but didn't have a
> > large impact on the overall build time (something like 20s out of ~14m
> > total for a clobber).
> >
>
> The 14min measurement must have been for a partial build. With defender
> disabled the best I can get is 18min. This is on one of the new lenovo
> p710 machines with 16 xeon cores.
>
> I definitely observed periods where it was not CPU bound. For example, at
> the end of the js lib build I observed a single cl.exe process sit for ~2
> minutes while no other work was being done. I also saw link.exe take a
> long time without parallelism, but I think that's a known issue.

Ted Mielczarek

Mar 17, 2017, 2:44:20 PM
to Ben Kelly, dev-pl...@lists.mozilla.org
On Fri, Mar 17, 2017, at 02:20 PM, Ben Kelly wrote:
> On Fri, Mar 17, 2017 at 1:36 PM, Ted Mielczarek <t...@mielczarek.org>
> wrote:
>
> > Back to the original topic, I recently set up a fresh Windows machine
> > and I followed the same basic steps (enable performance power mode,
> > whitelist a bunch of stuff in Windows Defender) and my build seemed
> > basically CPU-bound[1] during the compile tier. Disabling realtime
> > protection in Defender made it *slightly* better[2] but didn't have a
> > large impact on the overall build time (something like 20s out of ~14m
> > total for a clobber).
> >
>
> The 14min measurement must have been for a partial build. With defender
> disabled the best I can get is 18min. This is on one of the new lenovo
> p710 machines with 16 xeon cores.

Nope, full clobber builds: `./mach clobber; time ./mach build`. (I have
the same machine, FWIW.) The svg files I uploaded were from `mach
resource-usage`, which has nice output but not a good way to share the
resulting data externally. I didn't save the actual output of `time`
anywhere, but going back through my IRC logs the first build I did on
the machine took 15:08.01, the second (where all the source files ought
to be in the filesystem cache) took 14:58.24, and then another build I
did with Defender's real-time protection disabled took 14:27.73. We should
figure out what the difference is between our system configurations;
3-3.5 minutes is a good chunk of time to be leaving on the table!
Similarly, I heard from someone (I can't remember who it was) who said
they could do a Linux Firefox build in ~8(?) minutes on the same
hardware. (I will try to track down the source of that number.) That
gives us a fair lower-bound to shoot for, I think.

> I definitely observed periods where it was not CPU bound. For example,
> at
> the end of the js lib build I observed a single cl.exe process sit for ~2
> minutes while no other work was being done. I also saw link.exe take a
> long time without parallelism, but I think that's a known issue.

Yeah, I specifically meant "CPU-bound during the compile tier", where we
compile all the C++ code. If you look at the resource usage graphs I
posted it's pretty apparent where that is (the full `mach
resource-usage` HTML page has a nicer breakdown of tiers). The stuff
before and after compile is not as good, and the tail end of compile
gets hung up on some long-pole files, but otherwise it does a pretty
good job of saturating available CPU. I also manually monitored disk and
memory usage during the second build, and didn't see much there. The
disk usage showed ~5% active time, presumably mostly the compiler
generating output, and memory usage seemed to be stable at around 9GB
for most of the build (I didn't watch during libxul linking, I wouldn't
be surprised if it spikes then).

-Ted

Ben Kelly

Mar 17, 2017, 2:52:53 PM
to Ted Mielczarek, dev-pl...@lists.mozilla.org
On Fri, Mar 17, 2017 at 2:43 PM, Ted Mielczarek <t...@mielczarek.org> wrote:

> > The 14min measurement must have been for a partial build. With defender
> > disabled the best I can get is 18min. This is on one of the new lenovo
> > p710 machines with 16 xeon cores.
>
> Nope, full clobber builds: `./mach clobber; time ./mach build`. (I have
> the same machine, FWIW.) The svg files I uploaded were from `mach
>

Sorry. I misread. I thought you were referring to my earlier email about
dropping from 20+ minutes to 14 minutes. My 14 minute measurement was
erroneous. I seem to get 18 minute clobber builds.


> Yeah, I specifically meant "CPU-bound during the compile tier", where we
> compile all the C++ code. If you look at the resource usage graphs I
> posted it's pretty apparent where that is (the full `mach
> resource-usage` HTML page has a nicer breakdown of tiers). The stuff
> before and after compile is not as good, and the tail end of compile
> gets hung up on some long-pole files, but otherwise it does a pretty
> good job of saturating available CPU. I also manually monitored disk and
> memory usage during the second build, and didn't see much there. The
> disk usage showed ~5% active time, presumably mostly the compiler
> generating output, and memory usage seemed to be stable at around 9GB
> for most of the build (I didn't watch during libxul linking, I wouldn't
> be surprised if it spikes then).
>

That "long pole file" at the end of the js lib is over 10% of my compile
time. That's not very good parallelism in the compile stage IMO.

Ben Kelly

Mar 17, 2017, 3:17:07 PM
to Ted Mielczarek, dev-pl...@lists.mozilla.org
This is the part of the build I'm talking about:

15:17.80 Unified_cpp_js_src8.cpp
15:17.90 Unified_cpp_js_src38.cpp
15:18.33 Unified_cpp_js_src40.cpp
15:19.96 Unified_cpp_js_src41.cpp
15:21.41 Unified_cpp_js_src9.cpp
16:59.13 Interpreter.cpp
16:59.15 js_static.lib
16:59.99 module.res
17:00.04 Creating Resource file: module.res
17:00.81 StaticXULComponentsStart.cpp
17:00.99 nsDllMain.cpp

For the 1:38 between Unified_cpp_js_src9.cpp and Interpreter.cpp only a
single cl.exe process is running. I guess that's closer to 8% of the total
build time. Still seems very weird to me.

Ted Mielczarek

Mar 17, 2017, 3:41:11 PM
to Ben Kelly, dev-pl...@lists.mozilla.org
Yeah, the JS engine uses a lot more complex C++ features than the rest
of the code in our tree, so it takes longer to compile. This is also why
the `FILES_PER_UNIFIED_FILE` setting is lower in js/src than the rest of
the tree. We do try to build js/src pretty early in the build, although
the exact workings of the compile tier are not something I currently
understand. One thing we could try here would be to hack up some
instrumentation to record the time taken to compile each source file,
which would let us determine if we need to tweak
`FILES_PER_UNIFIED_FILE` lower, at least.
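
For instance, a dumb wrapper around cl.exe could log wall-clock time per
invocation (untested sketch; wiring it into the build and the log location
are hand-waved here):

#!/usr/bin/env python
# Untested sketch: time each compiler invocation so long-pole source files
# stand out.
import os
import subprocess
import sys
import time

start = time.time()
ret = subprocess.call(["cl.exe"] + sys.argv[1:])
elapsed = time.time() - start

sources = [a for a in sys.argv[1:] if a.lower().endswith((".c", ".cc", ".cpp"))]
log_path = os.path.join(os.environ.get("TEMP", "."), "compile-times.log")
with open(log_path, "a") as f:
    f.write("%8.2f  %s\n" % (elapsed, " ".join(sources) or "(no source args)"))

sys.exit(ret)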


-Ted


Ben Kelly

Mar 17, 2017, 3:43:41 PM
to Ted Mielczarek, dev-pl...@lists.mozilla.org
On Fri, Mar 17, 2017 at 3:40 PM, Ted Mielczarek <t...@mielczarek.org> wrote:

> Yeah, the JS engine uses a lot more complex C++ features than the rest of
> the code in our tree, so it takes longer to compile. This is also why the
> `FILES_PER_UNIFIED_FILE` setting is lower in js/src than the rest of the
> tree. We do try to build js/src pretty early in the build, although the
> exact workings of the compile tier are not something I currently understand.
> One thing we could try here would be to hack up some instrumentation to
> record the time taken to compile each source file, which would let us
> determine if we need to tweak `FILES_PER_UNIFIED_FILE` lower, at least.
>

Hmm. The "we do try to build js/src pretty early in the build" statement
doesn't seem to match what I am seeing. Its one of the last things to
build on my machine. Also, it seems to be serialized with building other
libraries.

Ben

Ted Mielczarek

Mar 17, 2017, 3:44:39 PM
to Ben Kelly, dev-pl...@lists.mozilla.org
On Fri, Mar 17, 2017, at 02:43 PM, Ted Mielczarek wrote:
> Similarly, I heard from someone (I can't remember who it was) who said
> they could do a Linux Firefox build in ~8(?) minutes on the same
> hardware. (I will try to track down the source of that number.) That
> gives us a fair lower-bound to shoot for, I think.

Aha, it was ttaubert, and it was on Twitter:
https://twitter.com/ttaubert/status/838790894937595904

10 minute clobber builds on the same hardware on Linux, so honestly 14
minutes seems very reasonable to me, although obviously making it faster
would be nice.

-Ted

Boris Zbarsky

Mar 17, 2017, 3:53:22 PM
On 3/17/17 3:40 PM, Ted Mielczarek wrote:
> We do try to build js/src pretty early in the build

We do? It's always the last thing I see building before we link libxul.
Seeing the js/src stuff appearing is how I know my build is about done...

-Boris

Steve Fink

Mar 17, 2017, 3:57:19 PM
to dev-pl...@lists.mozilla.org
On 03/17/2017 12:40 PM, Ted Mielczarek wrote:
> On Fri, Mar 17, 2017, at 03:16 PM, Ben Kelly wrote:
>
>> For the 1:38 between Unified_cpp_js_src9.cpp and Interpreter.cpp only
>> a single cl.exe process is running. I guess that's closer to 8% of the
>> total build time. Still seems very weird to me.
>
> Yeah, the JS engine uses a lot more complex C++ features than the rest
> of the code in our tree, so it takes longer to compile. This is also why
> the `FILES_PER_UNIFIED_FILE` setting is lower in js/src than the rest of
> the tree. We do try to build js/src pretty early in the build, although
> the exact workings of the compile tier are not something I currently
> understand. One thing we could try here would be to hack up some
> instrumentation to record the time taken to compile each source file,
> which would let us determine if we need to tweak
> `FILES_PER_UNIFIED_FILE` lower, at least.

If I'm understanding correctly, that's not really coming into play here.
(Though maybe it would, if we had left it at the default.) The only
thing that matters in this case is Interpreter.cpp, which is the slowest
file to compile in the JS engine, and quite possibly in the whole
browser. And FILES_PER_UNIFIED_FILE is not going to be able to break
that one file down any more. (It is already compiled separately, and
does not get unified with anything else.)

That's not even the largest file; js/src/jit/IonBuilder.cpp is 2.5x
larger. But interpreters are really hard on compilers. Really, it's the
single *function* that takes up half that file that is expensive. It's a
JS interpreter. Starting it early in the build would certainly be good.

Mike Hommey

Mar 17, 2017, 6:31:45 PM
to Boris Zbarsky, dev-pl...@lists.mozilla.org
We don't try very hard, but it's also not listed last in the makefile that
drives the build dependencies. In fact, it's in the middle of the
dependencies for libxul... so I doubt even moving it earlier in that list
is going to affect the outcome much... At this point, someone needs to look
at how Make actually orders the things it builds.

It also doesn't help that Make (or ninja, etc. for that matter) is not
aware of how long each target is going to take to build. If it did, it
could decide ahead of time that anything it needs to build that is not a
dependency of libxul could be built while libxul links. It could also
decide that things that take longer should go as early as possible so
that they don't remain pegging only one CPU if they happen last before
what depends on them.

Mike

Nathan Froyd

Mar 17, 2017, 7:13:15 PM
to Mike Hommey, Boris Zbarsky, dev-platform
On Fri, Mar 17, 2017 at 6:31 PM, Mike Hommey <m...@glandium.org> wrote:
> On Fri, Mar 17, 2017 at 03:53:14PM -0400, Boris Zbarsky wrote:
> We don't try very hard, but it's also not listed to be last in the
> makefile that drives the build dependencies. In fact, it's in the middle
> of the dependencies for libxul... so I doubt even trying to move it
> there is going to affect the outcome much... At this point, someone
> needs to look at how Make actually orders the things it builds.

It is at least before all the libxul-specific code (i.e. code not in
mozglue/mfbt/external libs/etc.), but apparently that does not help
very much.

> It also doesn't help that Make (or ninja, etc. for that matter) is not
> aware of how long each target is going to take to build.

When this has come up in the context of ninja, the developer's
response has been that you should order your dependencies such that
things that take longer to build should appear earlier in the
dependency list. I'd guess this is probably the same heuristic make
uses, although our recursive build structure probably doesn't play
very well with that.

-Nathan

Mike Hommey

Mar 17, 2017, 8:19:25 PM
to Nathan Froyd, Boris Zbarsky, dev-platform
On Fri, Mar 17, 2017 at 07:13:03PM -0400, Nathan Froyd wrote:
> On Fri, Mar 17, 2017 at 6:31 PM, Mike Hommey <m...@glandium.org> wrote:
> > On Fri, Mar 17, 2017 at 03:53:14PM -0400, Boris Zbarsky wrote:
> > We don't try very hard, but it's also not listed to be last in the
> > makefile that drives the build dependencies. In fact, it's in the middle
> > of the dependencies for libxul... so I doubt even trying to move it
> > there is going to affect the outcome much... At this point, someone
> > needs to look at how Make actually orders the things it builds.
>
> It is at least before all the libxul-specific code (i.e. code not in
> mozglue/mfbt/external libs/etc.), but apparently that does not help
> very much.

Search for js/src/target in $objdir/root-deps.mk.

> > It also doesn't help that Make (or ninja, etc. for that matter) is not
> > aware of how long each target is going to take to build.
>
> When this has come up in the context of ninja, the developer's
> response has been that you should order your dependencies such that
> things that take longer to build should appear earlier in the
> dependency list. I'd guess this is probably the same heuristic make
> uses, although our recursive build structure probably doesn't play
> very well with that.

I doubt that's the whole story, at least for Make.

Mike

Jean-Yves Avenard

Mar 27, 2017, 2:56:36 AM
to Ben Kelly, dev-pl...@lists.mozilla.org
Hi.

I have received the new Dell XPS 15 9560 and got very puzzled as to why
compiling central was so slow on this machine. This is comparing against a
Gigabyte Aero 14 with a 6th-gen Intel CPU (2.6GHz i7-6600HQ) vs the Dell's
7th-gen (2.8GHz i7-7700HQ).

On the Aero 14, compiling central takes 24 minutes. On the XPS 15, the
first go took 38 minutes.

The XPS 15 came with McAfee anti-virus, and Windows Defender is disabled.
An exclusion list made almost no difference. Disabling McAfee entirely:
the time dropped to 28 minutes. Uninstalling McAfee completely and enabling
Windows Defender with an exclusion list as mentioned in the first post:
26.5 minutes. Disabling Windows Defender entirely, not just using an
exclusion list: 25 minutes. Interestingly, on the Aero, disabling Windows
Defender versus having just an exclusion list made almost no difference in
compilation time. I can't explain the reason. Maybe because big brother is
watching all the time!

After following the instructions listed here:
http://www.ultrabookreview.com/14875-fix-throttling-xps-15/
compilation time dropped to 23.8 minutes. The main factor was adding
thermal pads to the MOSFETs. Undervolting the CPU by 125mV added 50s of
compilation time, but dropped the processor temperature by 10C (max 78C vs
68C) and my guess is it will also add quite a lot of battery life.

So if you're in a hurry, you may want to try disabling Windows Defender
completely.

FWIW, on those same machines running Ubuntu 16.10 Linux:
Aero 14: 14 minutes
XPS 15: 13 minutes
That's out of the box, no tweaks of any kind.

JY
-------
If it ain't broken, please fix it

On Fri, Mar 17, 2017 at 4:26 AM +0100, "Ben Kelly" <bke...@mozilla.com> wrote:

Hi all,

I'm trying to configure my new windows build machine and noticed that
builds were still somewhat slow. I did:

1) Put it in high performance power profile
2) Made sure my mozilla-central dir was not being indexed for search
3) Excluded my mozilla-central directory from windows defender

Watching the task monitor during a build, though, I still saw MsMpEng.exe
(antivirus) running during the build.

I ended up adding some very broad exclusions to get this down close to
zero. I am now excluding:

- mozilla-central checkout
- mozilla-build install dir
- visual studio install dir
- /users/bkelly/appdata/local/temp
- /users/bkelly (because temp dir was not enough)

I'd like to narrow this down a bit. Does anyone have a better list of
things to exclude from virus scanning for our build process?

Thanks.

Ben

Gregory Szorc

Mar 29, 2017, 3:56:42 PM
to Jean-Yves Avenard, dev-platform, Ben Kelly
On Sun, Mar 26, 2017 at 11:56 PM, Jean-Yves Avenard <jyav...@mozilla.com>
wrote:

> Hi.
>
> I have received the new Dell XPS 15 9560 and got very puzzled as to why
> compiling central was so slow on this machine. This is comparing against
> a Gigabyte Aero 14 with a 6th-gen Intel CPU (2.6GHz i7-6600HQ) vs the
> Dell's 7th-gen (2.8GHz i7-7700HQ).
>
> On the Aero 14, compiling central takes 24 minutes. On the XPS 15, the
> first go took 38 minutes.
>
> The XPS 15 came with McAfee anti-virus, and Windows Defender is disabled.
> An exclusion list made almost no difference. Disabling McAfee entirely:
> the time dropped to 28 minutes. Uninstalling McAfee completely and
> enabling Windows Defender with an exclusion list as mentioned in the
> first post: 26.5 minutes. Disabling Windows Defender entirely, not just
> using an exclusion list: 25 minutes. Interestingly, on the Aero, disabling
> Windows Defender versus having just an exclusion list made almost no
> difference in compilation time. I can't explain the reason. Maybe because
> big brother is watching all the time!
>

The way that A/V scanning typically works on Windows is that the A/V
software inserts a "file system filter driver" into the kernel. This driver
sees pretty much all I/O operations and has the opportunity to change their
behavior, delay their fulfillment (by calling out into a blocking API in
userland), etc.

The file system filter driver for A/V is always installed. Exclusion list
processing needs to be handled by that driver. So just having the driver
installed means there is some overhead. If the exclusion list processing
code is slow, this will slow down I/O, even if a path is in the exclusion
list. From your numbers, I suspect McAfee may have a less optimal exclusion
list processor than Microsoft. That wouldn't surprise me.

You can see which file system filters are loaded by running `fltmc.exe
filters`. Note: you'll have to do this from a command prompt with
administrator access. You'll likely have to plug the names into a search
engine to figure out what they do.

I'm not sure about McAfee, but Windows Defender scans files during the
CloseHandle() API. But only if the file was written or appended to. This
scanning makes CloseHandle() go from ~1us to ~1ms. When you are writing
thousands of files, this scanning can add up! FWIW, Mercurial mitigates
this by having a thread pool that just sits around closing file
descriptors. This change made fresh `hg update` operations on Windows >60s
faster IIRC. I recommend using Process Monitor to record system call times.
That's how I discovered that Windows Defender was mucking about with
CloseHandle(). If you find system calls that other A/V (like McAfee) is
mucking with, we could potentially mitigate their impact in the build
system similar to what Mercurial has done.
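
To illustrate the Mercurial trick (just a sketch of the idea, not
Mercurial's actual code): writers hand files to a small pool of threads
whose only job is to call close(), so the ~1ms scan in CloseHandle()
overlaps with other work instead of serializing it:

import queue
import threading

class BackgroundCloser(object):
    """Close file objects on worker threads so slow CloseHandle() calls
    (e.g. Defender scanning a just-written file) don't block the caller."""

    def __init__(self, num_threads=4):
        self._queue = queue.Queue()
        self._threads = []
        for _ in range(num_threads):
            t = threading.Thread(target=self._worker)
            t.daemon = True
            t.start()
            self._threads.append(t)

    def _worker(self):
        while True:
            fh = self._queue.get()
            if fh is None:
                return
            fh.close()  # the expensive scan happens here, off the caller's thread

    def close(self, fh):
        self._queue.put(fh)

    def join(self):
        for _ in self._threads:
            self._queue.put(None)
        for t in self._threads:
            t.join()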


> After following the instructions listed here:
> http://www.ultrabookreview.com/14875-fix-throttling-xps-15/
> compilation time dropped to 23.8 minutes. The main factor was adding
> thermal pads to the MOSFETs. Undervolting the CPU by 125mV added 50s of
> compilation time, but dropped the processor temperature by 10C (max 78C
> vs 68C) and my guess is it will also add quite a lot of battery life.
>

I'll be blunt: you should not be performing full builds on a laptop because
of issues like thermal throttling. Laptops are OK for limited uses (like
traveling). But if you are paid to hack on Gecko, Mozilla should buy you a
desktop. Firefox Desktop developers (who can use artifact builds) can get
away with a laptop for building.

Marco Bonardo

Apr 18, 2017, 5:18:26 AM
to dev-pl...@lists.mozilla.org
On Fri, Mar 17, 2017 at 7:20 PM, Ben Kelly <bke...@mozilla.com> wrote:

> On Fri, Mar 17, 2017 at 1:36 PM, Ted Mielczarek <t...@mielczarek.org>
> wrote:
>
> With defender
> disabled the best I can get is 18min. This is on one of the new lenovo
> p710 machines with 16 xeon cores.
>

Just to add one datapoint, I just replaced my desktop with an AMD Ryzen
1700X (OC @3.8GHz) and I can clobber in less than 18 minutes on something
that may be cheaper than those Xeon machines (an OC 1700 can probably do
the same). I'm currently using -j20; I haven't yet tried to bump that up to
see where the limit is, and I'm positive I can get better times out of this
machine, but I need to find some time for testing.
I'm using Panda Free, with the SSD containing the sources in its exclusion
list, and didn't do anything to Defender (but I guess it's disabled by
having another AV in the system).
You can also disable Defender with a GPO, by setting "Turn off Windows
Defender" to Enabled.
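
For reference, I believe that GPO just sets a policy value in the registry,
so something like this should be equivalent from an elevated process
(untested assumption on my part, and newer Windows builds may ignore or
revert it):

import winreg

# Assumption: "Turn off Windows Defender" maps to this policy value.
key = winreg.CreateKeyEx(
    winreg.HKEY_LOCAL_MACHINE,
    r"SOFTWARE\Policies\Microsoft\Windows Defender",
    0,
    winreg.KEY_SET_VALUE,
)
winreg.SetValueEx(key, "DisableAntiSpyware", 0, winreg.REG_DWORD, 1)
winreg.CloseKey(key)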

Jeff Gilbert

Apr 20, 2017, 11:03:07 PM
to Marco Bonardo, dev-pl...@lists.mozilla.org
Can confirm for Ryzen: FWIW, my stock R7 1800X (RAM @2133MHz for now
:( ) did a Windows debug clobber in 15:32.40 with the default -j16 (any
directory that showed up as read by MsMpEng in Resource Monitor was
excluded from Defender).

I'll give it a run through with Defender disabled, and see if there's
any change.