Slower performance when moving from 5.3.2 to 6.0.2


Jim Sellers

Sep 29, 2020, 9:55:31 AM
to Dependency Check
Hello all.

I've noticed a significant slowdown going from 5.3.2 to 6.0.2.

Using the H2 database, the total report-generation time goes from 50 minutes to 120 minutes for a report that I run daily against the versions of all apps in prod.

I wiped the H2 databases, so they should be clean.

They are all multi-module projects, and the command I run is:

mvn org.owasp:dependency-check-maven:6.0.2:check \
    org.owasp:dependency-check-maven:6.0.2:aggregate \
    -Dformat=ALL \
    -DskipProvidedScope=true \
    -DautoUpdate=false \
    -DnuspecAnalyzerEnabled=false \
    -DassemblyAnalyzerEnabled=false \
    -DnodeAuditAnalyzerEnabled=false \
    -DnodeAnalyzerEnabled=false \
    -DrubygemsAnalyzerEnabled=false \
    -DbundleAuditAnalyzerEnabled=false

Has anyone else noticed a significant slowdown?
Jim

Alexander von Buchholtz

Sep 29, 2020, 11:00:26 AM
to Dependency Check

Hi Jim,

I can confirm a ~30-40% time increase (internal NVD mirror) after upgrading from 5.3.2 to 6.0.2 for our projects using the Maven plugin, with only Java dependencies and no test dependencies checked.

Cheers,

Alex

--
You received this message because you are subscribed to the Google Groups "Dependency Check" group.
To unsubscribe from this group and stop receiving emails from it, send an email to dependency-che...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/dependency-check/9d6e9761-b143-4efe-b758-e337ef82fe30n%40googlegroups.com.

John Patrick

Sep 29, 2020, 2:23:55 PM
to Alexander von Buchholtz, Dependency Check
I've also noticed increased execution times, maybe a ~25% increase.

I'm using a Jenkins OpenShift slave Docker image, so the database is
built fresh on each execution.

I'm also getting random failures, but re-executing the build from the
same git commit passes a few minutes later.

John

Jeremy Long

Sep 29, 2020, 6:54:42 PM
to John Patrick, Alexander von Buchholtz, Dependency Check
If you want a faster build in your pipeline, cache the data directory.
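A minimal sketch of what that caching could look like in a shell-based pipeline step. The cache location below is hypothetical; `dataDirectory` is the Maven plugin property that points dependency-check at it, but verify the exact wiring against your CI system:

```shell
# Hedged sketch: cache the dependency-check data directory between pipeline
# runs so the NVD data is reused rather than rebuilt from scratch.
# The cache path is hypothetical; adapt it to your CI's cache mechanism.
DC_DATA_DIR="${HOME}/.cache/dependency-check-data"
mkdir -p "$DC_DATA_DIR"

# In the pipeline step (shown commented out here), reuse the cached directory:
# mvn org.owasp:dependency-check-maven:6.0.2:check \
#     -DdataDirectory="$DC_DATA_DIR"
echo "reusing data directory: $DC_DATA_DIR"
```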

Alexander von Buchholtz

Sep 30, 2020, 2:56:43 AM
to Dependency Check

Hi Jeremy,

I'm already caching the data directory in my pipelines. I personally see no issue with the increased processing time of DependencyCheck as it is mostly nightly builds that only run 2-5 mins each and run in parallel.

However, a 30-40% increase is rather unusual, and for teams that incorporate DependencyCheck in a regular release pipeline, where every minute counts if you release/deploy frequently, it could be worthwhile to profile where the increased time originates.

Cheers,

Alex

Jim Sellers

Oct 1, 2020, 6:09:19 PM
to Alexander von Buchholtz, Dependency Check
Hi all.

I created a test project that just has libraries.
Version 5.3.2 takes 01:35 min
Version 6.0.2 takes 01:58 min

Log files of the debug output are in the root of the project.

I hope that this helps.
Jim



Hans Aikema

Oct 1, 2020, 7:31:13 PM
to Jim Sellers, Alexander von Buchholtz, Dependency Check
Jim,

Your project inspired me to run a quick local comparison. I made some changes to your project's run script: first let it run in (resolving artifacts and populating caches, to keep network and cache build-up overhead from skewing the run time), and then run it again, that time with Maven debug logging and Java Flight Recorder enabled.

The stats for that run are, timing-wise, slightly in favor of DependencyCheck 6.0.2:

DC 5.3.2 - total recording duration: 1 min 5 s
Heap usage: avg 389 MiB, max 1.59 GiB
CPU usage: avg 43%, max 97%
GC pause time: avg 22.774 ms, max 68.786 ms

DC 6.0.2 - total recording duration: 1 min
Heap usage: avg 244 MiB, max 755 MiB
CPU usage: avg 52.8%, max 98.4%
GC pause time: avg 16.998 ms, max 58.910 ms


My modified run-test.sh shell script is attached.
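For anyone who wants to reproduce this kind of comparison, here is a hedged sketch of enabling Java Flight Recorder for a Maven run. The recording filename and duration are illustrative, and on Oracle JDK 8 you may additionally need `-XX:+UnlockCommercialFeatures`:

```shell
# Hedged sketch: profile a dependency-check run with Java Flight Recorder.
# The filename and duration are illustrative choices, not values from the
# thread; MAVEN_OPTS passes the JVM flags to the Maven process.
export MAVEN_OPTS="-XX:StartFlightRecording=duration=120s,filename=dc-profile.jfr"
echo "MAVEN_OPTS=$MAVEN_OPTS"

# Then run the build under the recorder (shown commented out here):
# mvn org.owasp:dependency-check-maven:6.0.2:check -X   # -X = Maven debug logging
# Afterwards, open dc-profile.jfr in JDK Mission Control to inspect heap,
# CPU, GC pauses, and blocked-thread events.
```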

John Patrick

Oct 2, 2020, 4:43:54 AM
to Dependency Check, Hans Aikema, Jim Sellers, Alexander von Buchholtz
So, Hans, from your output did v6 take 1 min and v5 take 1 min 5
seconds? So v6 was actually quicker than v5 for you?

I executed the original run-test.sh three times in quick succession and
got the following times:
v5 @ 01:31 min | 01:16 min | 01:20 min
v6 @ 02:47 min | 01:49 min | 02:18 min

Running Hans's version, also three times in quick succession, I get these times:
v5 @ 02:11 min | 01:05 min | 01:10 min
v6 @ 02:34 min | 01:38 min | 01:25 min

With the run_in files reporting:
v6 @ 01:07 min | 43.630 s | 46.098 s
v5 @ 01:26 min | 01:16 min | 01:15 min

Just to confirm: I'm using a MacBook Pro (macOS 10.15.7) with an SSD and
16 GB RAM, Maven 3.6.3, and Java 1.8.0_192.

John
>
> Test-system:
> Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f)
> Java version: 1.8.0_191, vendor: Oracle Corporation, runtime: /Library/Java/JavaVirtualMachines/jdk1.8.0_191.jdk/Contents/Home/jre
> Default locale: en_GB, platform encoding: UTF-8
> OS name: "mac os x", version: "10.14.6", arch: "x86_64", family: "mac"
>
> Processor: Intel(R) Xeon(R) CPU E5-1620 v2 @ 3.70GHz, Vendor: GenuineIntel (4-core, 8 hardware threads, Ivy Bridge)
> Memory: 64 GB
>
> Given my system environment (fast PCIe SSD storage, close proximity of a Maven repository mirror (ping RTT 1-1.5ms)) my main suspects for the difference you see would be disk- or network I/O pattern differences
>
> kind regards,
> Hans Aikema

Hans Aikema

Oct 2, 2020, 7:53:00 AM
to John Patrick, Dependency Check, Jim Sellers, Alexander von Buchholtz
I decided to spin up my MacBook Pro and check there… and I get similar delays for 6.0.2 on it (1 min 8 s versus 1 min 35 s).

Looking at the flight-recorder stats, it seems that the CveDB is a location with a much higher block rate in 6.0.2 compared to 5.3.2 on my MacBook. On my main system they are on par.

MacBook Pro run: CveDB estimated total blocked time 1 min 7 s for 6.0.2 versus 6.3 s for 5.3.2

Mac Pro run: CveDB estimated total blocked time 2.8 s for 6.0.2 versus 14.6 s for 5.3.2

Hans Aikema

Oct 2, 2020, 2:58:58 PM
to John Patrick, Dependency Check, Jim Sellers, Alexander von Buchholtz
With debug logging disabled (Maven's jansi was also a big contributor to blocked threads in the profiling), I get similar results on the Mac Pro:
Total runtime: 40 seconds for 5.3.2 versus 1 min 3 s for 6.0.2

Most blocked time is in CveDB for 6.0.2: 2 min 13 s versus 13 s for 5.3.2,
due to significantly more blocked threads in getCPEs (505 instances versus 90 when profiling with a Java Monitor Blocked threshold of 2 ms).

I will keep digging when time allows.

kind regards,
Hans

Hans Aikema

Oct 3, 2020, 4:10:14 PM
to Jim Sellers, Dependency Check, John Patrick, Alexander von Buchholtz
Some further digging has surfaced the root cause: an apparently accidentally removed index in the H2 database.

I've opened an issue (https://github.com/jeremylong/DependencyCheck/issues/2861) and am working on the PR to get it back in.

@Jim, thanks for raising this, and thanks to all involved for testing and confirming it.

kind regards,
Hans
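For readers curious what such a fix involves, here is a purely illustrative sketch of restoring an index in a local H2 database. The table and column names are hypothetical, not the actual dependency-check schema; see the issue linked above for the real change:

```shell
# Purely illustrative: the general shape of re-adding a lookup index to an
# H2 database. Table and column names are hypothetical, NOT the actual
# dependency-check schema; the real fix lives in the issue-2861 branch.
SQL="CREATE INDEX IF NOT EXISTS idxEntryLookup ON someTable (someColumn)"
echo "$SQL"

# With the h2 jar on hand, a statement like this could be applied via H2's
# command-line shell (jar path and JDBC URL below are placeholders):
# java -cp h2.jar org.h2.tools.Shell -url "jdbc:h2:file:/path/to/odc" -sql "$SQL"
```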



Jim Sellers

Oct 3, 2020, 4:12:32 PM
to Hans Aikema, Alexander von Buchholtz, Dependency Check, John Patrick
Thanks all and thank you Hans for finding the root cause. That’s great!

Jim

John Patrick

Oct 4, 2020, 3:52:07 AM
to Jim Sellers, Hans Aikema, Alexander von Buchholtz, Dependency Check
Nice spot on the missing index.

So, building the issue-2861 branch locally, I'm seeing:

v5.3.2 @ 01:01 min | 01:09 min | 01:16 min
v6.0.3-SNAPSHOT @ 01:22 min | 01:37 min | 01:46 min

So that's a 34%, 40% and 39% increase over the v5 execution.

A definite improvement over the 83%, 43% and 72% increases I saw last time with 6.0.2 vs 5.3.2.

John