How to speed up loading of large scripts


edhenolin

Jul 24, 2015, 5:43:42 AM
to v8-users
I'm using Node and V8 to run a relatively big program on an embedded system. Measurements indicate that compiling the user JavaScript code sometimes takes up to 3 seconds, which makes for slow startup times of the JavaScript application.

Is there any way to speed this up? I have been trying to use ScriptCompiler::CachedData, producing it on the first compilation and then saving it to disk, but I have been running into problems.
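
For reference, this is roughly what I do (only a sketch against the API as I understand it; exact signatures differ between V8 versions, names like source_text, saved_bytes and saved_length are placeholders, and error handling is omitted):

// First run: compile with kProduceCodeCache and persist the cache bytes.
v8::ScriptCompiler::Source source(source_text);  // source_text: Local<String>
v8::Local<v8::Script> script = v8::ScriptCompiler::Compile(
    isolate, &source, v8::ScriptCompiler::kProduceCodeCache);
const v8::ScriptCompiler::CachedData* cache = source.GetCachedData();
if (cache != nullptr) {
  // Write cache->data (cache->length bytes) to disk.
}

// Subsequent runs: read the bytes back and compile with kConsumeCodeCache.
v8::ScriptCompiler::CachedData* cached_data =
    new v8::ScriptCompiler::CachedData(
        saved_bytes, saved_length,
        v8::ScriptCompiler::CachedData::BufferNotOwned);
v8::ScriptCompiler::Source cached_source(source_text, cached_data);  // takes ownership
v8::Local<v8::Script> cached_script = v8::ScriptCompiler::Compile(
    isolate, &cached_source, v8::ScriptCompiler::kConsumeCodeCache);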

1)
If I use ScriptCompiler::kProduceCodeCache I often get problems when serializing, with a failure in CodeSerializer::SerializeObject that I don't understand:

// The code-caches link to context-specific code objects, which
// the startup and context serializes cannot currently handle.
DCHECK(!heap_object->IsMap() ||
       Map::cast(heap_object)->code_cache() ==
           heap_object->GetHeap()->empty_fixed_array());

Can anyone explain why I get this? I don't understand the comment.

If compilation and serialization succeed, I get a CachedData that I save to disk. On subsequent runs of the process, however, it seems I can't use this saved cache. Does anyone know if this is because the generated code has hardcoded memory addresses that will not work once the process has been restarted?

2)
If I use CompileOptions::kProduceParserCache it seems to work just fine (a sketch of what I'm doing is below). However, I notice two problems.

a) Some files will not produce any data in the cache.

b) Those that do only generate about 20 bytes of data, and loading this on subsequent compilations doesn't seem to speed up compilation at all.

Can anyone explain those two points?
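
For reference, the flow I use is the same as for the code cache above, only with the parser-cache flags (again just a sketch; variable names are placeholders):

// Produce the parser cache on the first compilation...
v8::ScriptCompiler::Source source(source_text);
v8::ScriptCompiler::Compile(isolate, &source,
                            v8::ScriptCompiler::kProduceParserCache);
// ...save source.GetCachedData() to disk, then on a later run consume it:
v8::ScriptCompiler::CachedData* cached_data =
    new v8::ScriptCompiler::CachedData(
        saved_bytes, saved_length,
        v8::ScriptCompiler::CachedData::BufferNotOwned);
v8::ScriptCompiler::Source cached_source(source_text, cached_data);
v8::ScriptCompiler::Compile(isolate, &cached_source,
                            v8::ScriptCompiler::kConsumeParserCache);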

Any other ideas on how to speed up the initial loading of larger scripts? I assume it must be possible to cache something, but so far I haven't had much success.

Thanks in advance.



Yang Guo

Aug 11, 2015, 7:44:42 AM
to v8-users
Hi,

Which V8 version are you using? I don't think you can hit that assertion with the code serializer in the most up-to-date V8. kProduceParserCache doesn't produce any cached code, just hints to speed up parsing, and in some cases there are no hints the parser can take advantage of.

What you should use instead is a custom start-up snapshot. You create a snapshot of the heap after executing the startup script, so that later start-ups already begin with the heap in the snapshotted state. Here's a short explanation: http://www.hashseed.net/2015/03/improving-v8s-performance-using.html
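
Roughly, with the embedder API it looks like this (a minimal sketch assuming the 4.x-era entry points; the exact API has moved around between versions, and the file I/O is left out):

// Build step, run once: execute the start-up script and capture the heap.
// startup_script_source is a const char* with the (concatenated) script text.
v8::StartupData blob = v8::V8::CreateSnapshotDataBlob(startup_script_source);
// Persist blob.data (blob.raw_size bytes); the caller owns that buffer.

// Application start-up: install the blob before initializing V8.
v8::StartupData snapshot = { data_read_from_disk, size_read_from_disk };
v8::V8::SetSnapshotDataBlob(&snapshot);
v8::V8::Initialize();
// New isolates and contexts now start with the heap already in the
// snapshotted state, so the start-up script does not need to be recompiled.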

Yang

edhenolin

Aug 17, 2015, 5:11:23 AM
to v8-users


On Tuesday, August 11, 2015 at 1:44:42 PM UTC+2, Yang Guo wrote:
Hi,

Which V8 version are you using? I don't think you can hit that assertion with the code serializer in the most up-to-date V8. kProduceParserCache doesn't produce any cached code, just hints to speed up parsing, and in some cases there are no hints the parser can take advantage of.

I was using version 3.28.73, which was the version released with Node at that point. I tried switching to 4.3 and, yes, the problem went away; the serializer now seems to work. Thanks.

 
What you should use instead is a custom start-up snapshot. You create a snapshot of the heap after executing the startup script, so that later start-ups already begin with the heap in the snapshotted state. Here's a short explanation: http://www.hashseed.net/2015/03/improving-v8s-performance-using.html

That seems like a great idea. However, I am not sure I understand exactly how it would work. Can I only pass one script? For instance, if I know I need all of the Node.js lib/*.js files, would I be able to capture the state of all of them, or can I only pick one script? I read somewhere that the blob will not work with the Node.js require function, as that one is implemented in native C++.

 

Yang Guo

Aug 17, 2015, 8:59:27 AM
to v8-users
Nope. Require won't work. Just concatenate all the JS files you need into one.

edhenolin

Aug 17, 2015, 9:13:30 AM
to v8-users
Thanks a lot for the information.