
How to reduce startup time of perl script / perlcc / bytecode ?


Joost

Mar 1, 2000

Hi,

For a website I've written a Perl CGI script which is already 100k in
size. Even on a fast server, the script seems to take about half a
second just to load, apparently due to the time required for
parsing the script into perl bytecode. I need to speed this up, because
it's just a waste of CPU time.

I'd rather not use things like FastCGI, cgisock or mod_cgi, since they
all require me (as far as I know) to modify the script. Instead I'd
like to bypass just the parsing of my .pl file. There seem to be
several ways to do it, but somehow I can't get it working or I'm
lacking the documentation. So far I've found:


- B::Bytecode
If I do
perl -MO=Bytecode test.pl > test.plc
I get a bytecode dump of my perl script. But how can I run it?
The B documentation says I should use the 'byteperl' binary, but I can't
find it anywhere. In a newsgroup I read that it has been replaced by
the ByteLoader module, but I can't find documentation on either of those.
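The closest thing to an invocation I've been able to piece together is something like this (completely untested, and the -H option is my guess from the B::Bytecode man page of this perl version, so treat it as a sketch):

```
perl -MO=Bytecode,-H test.pl > test.plc   # -H supposedly prepends a ByteLoader header
perl test.plc                             # ByteLoader should then load the bytecode
```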


- perlcc
This seemed to be just what I was looking for. I managed to compile my
script into a binary, and although it is over 2 MB, it loaded about 20x
faster than the original .pl script! However, I got the following error
when initiating a database connection from the compiled script:
Can't locate object method "connect" via package "DBI"
at test.pl line 146.
while the original .pl script worked fine. So how do I get perlcc to
work with DBI?


- core dumping
I heard some rumours about a way to speed up the parsing by letting a
perl script dump core right after loading and then running the script
from the core file, or something like that. Just a rumour?


I saw other people asking similar questions in newsgroups, but the
only answers they got were along the lines of 'Why do you want to
compile your script? You really don't want to. Read the FAQ.' That's
not the answer I'm looking for; I've read the FAQs and already spent
two hours on Deja News...

Any help appreciated!

Regards,
Joost
--
PLEASE NOTE: comp.infosystems.www.authoring.cgi is a
SELF-MODERATED newsgroup. aa.net and boutell.com are
NOT the originators of the articles and are NOT responsible
for their content. You can SELF-APPROVE your first posting
by writing the word 'passme' on a line by itself.


Steve

Mar 1, 2000
I know this isn't what you want to hear...

But, why write one huge script?

If you don't compile it, but keep it as a script and use 'require' and a
series of .lib libraries which are only loaded when needed, then the perl
compiler will only compile what's needed.

I run a 600KB script, but have split it down into 6 .cgi and 20 .lib. No
cgi is more than 30K, and no compile requires more than 2 of the 20 .libs.

Before I did this, even when the script was only 120K, it always ran like a
three-legged dog.


Joost <joo...@newhouse.remove-this.nl> wrote in message
news:38bf4144...@news.et.tudelft.nl...
: Hi,

Joost

Mar 2, 2000
On 1 Mar 2000 10:43:07 -0800, "Steve" <smeth...@compuserve.com>
wrote:
:I know this isn't what you want to hear...

Well..

:But, why write one huge script?

Call it personal preference? I think it's easy to maintain this way:
I don't have to think every time which file a certain subroutine is
in.

And I'm using self-referencing forms, where the form action is the
same script. So even if I split it up, I would still prefer a
single .cgi.

:If you don't compile it, but keep it as a script and use 'require' and a
:series of .lib libraries which are only loaded when needed, then the perl
:compiler will only compile what's needed.

Really? I always thought a 'require' just includes the file, so all
.libs still have to be compiled at run time? I checked the perl
documentation and it says that a require just does an 'eval' of the
file. How would perl otherwise know which libs to include and which
not?

:I run a 600KB script, but have split it down into 6 .cgi and 20 .lib. No
:cgi is more than 30K, and no compile requires more than 2 of the 20 .libs.
:
:Before I did this, even when the script was only 120K, it always ran like a
:three-legged dog.

Currently mine still runs fine, and the 0.5 second delay is hardly
noticeable if a user is on a dialup connection. But it will become a
problem when the site gets busier and the server load increases.

I'd still prefer a solution that does not require me to modify the
script...

Wouldn't it be nice if perl had an option that saved a pre-compiled
(bytecode) version of the .pl file in the same directory? The next time
the script is invoked, it would check the file dates and run the
pre-compiled version if it is newer.
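In pseudo-shell, with made-up file names, the check I have in mind would look something like this (the perlcc/Bytecode commands in the comments are only illustrative):

```shell
# Sketch of the idea (file names made up): run the cached compiled form
# only if it is newer than the source, otherwise rebuild it first.
src=test.pl
bin=test.plc
if [ "$bin" -nt "$src" ]; then
    echo "running cached $bin"       # e.g. perl "$bin"
else
    # in a fresh directory this branch is taken, since no cache exists yet
    echo "recompiling $src first"    # e.g. perl -MO=Bytecode "$src" > "$bin"
fi
```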

Joost

:Joost <joo...@newhouse.remove-this.nl> wrote in message

Joost

Mar 2, 2000
On Thu, 2 Mar 2000 19:47:00 +0930, "Wyzelli" <wyz...@yahoo.com>
wrote:
:If you are running ActivePerl on IIS (which I am guessing because you are
:referring to .pl...)

Nope, using Linux/Apache...

:You could try Perl for ISAPI. This still loads and compiles the script, but
:does not need to start up a new instance of the Perl compiler each time. It
:is reportedly much faster, particularly for repeated hits to the same
:script, as it can share objects.

That saves the time required for loading the perl binary. But perl
probably still has to parse and compile the script itself every time?
In my case, the latter takes by far the most time.

:I have only just started testing this myself so cannot offer you any real
:performance, and it is environment specific, ie requires an ISAPI compliant
:server, so if you are not in that environment, this all is no use to you.

But thanks for your reply!

Steve

Mar 2, 2000

Joost <joo...@newhouse.remove-this.nl> wrote in message
news:38c04326...@news.et.tudelft.nl...
: On 1 Mar 2000 10:43:07 -0800, "Steve" <smeth...@compuserve.com>
: wrote:
: :I know this isn't what you want to hear...
:
: Well..
:
: :But, why write one huge script?
:
: Call it personal preference? I think it's easy to maintain this way:
: I don't have to think every time which file a certain subroutine is
: in.
:
: And I'm using self-referencing forms, where the form action is the
: same script. So even if I would split it up, I would still prefer a
: single .cgi.
:
: :If you don't compile it, but keep it as script and use 'require' and a
: :series of .lib libraries which are only loaded if needed, then the perl
: :compiler will only compile what's needed.
:
: Really? I always thought a 'require' just includes the file, so all
: .libs still have to be compiled at run time? I checked the perl
: documentation and it says that a require just does an 'eval' of the
: file. How would perl otherwise know which libs to include and which
: not?

This is a fundamental point, and maybe somebody will tell me I'm wrong, but
I believe that if you set up a series of if/elsif/else statements in your
script, each based upon a different form action, and link each to a separate
lib using require, e.g.:

if    ($form{'action'} eq 'one') { require 'one.lib'; &one; }
elsif ($form{'action'} eq 'two') { require 'two.lib'; &two; }
etc.

then only the required libs get compiled, based upon the initiating form
action.

Steve

(snip)

Barry Hemphill

Mar 4, 2000
In article <38bf4144...@news.et.tudelft.nl>,
joo...@newhouse.remove-this.nl (Joost) wrote:
:

: For a website I've written a Perl CGI script which is already 100k in
: size. Even on a fast server, the script seems to take about half a
: second just to load, apparently due to the amount of time required for
: parsing the script into perl bytecode. I need to speed this up, cause
: it's just a waste of CPU load.
:
: I'd rather not use things like fastcgi, cgisock or mod_cgi, since they
: all require me -as far as i know- to modify the script.

One you didn't mention is mod_perl. It requires little or no
modification of your program, and it would reduce the startup time quite
a bit. I haven't configured mod_perl in a while, but I believe that it
also caches recently executed CGIs, which would be a big help.
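If memory serves, wiring a CGI directory into mod_perl 1.x is just a few lines of httpd.conf; the paths and names here are only illustrative, so check the mod_perl guide before copying:

```
# Illustrative httpd.conf fragment (mod_perl 1.x): scripts under /perl/
# are compiled once per Apache child by Apache::Registry, and the
# compiled code is reused on later hits instead of being reparsed.
Alias /perl/ /var/www/perl/
<Location /perl/>
    SetHandler perl-script
    PerlHandler Apache::Registry
    PerlSendHeader On
    Options +ExecCGI
</Location>
```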

Of course, this is only part of the solution. As someone else has
already said, split this monster up. In fact, if you do that in
conjunction with mod_perl, you can take a good chunk (as much as
possible, if you ask me) out of the program and put it into one or more
modules, which you can then have preloaded by Apache/mod_perl. I know
you don't want to split up the program, but this is one of those
situations where, if splitting it up is going to be important for the
speed, you have to choose: split it up and speed it up, or live with
the slow load time.

Hope this helps,

Barry

--
Dr. Strangeweb
Or How I Learned To Stop Worrying And Love The Net


Sent via Deja.com http://www.deja.com/
Before you buy.

Joost

Mar 4, 2000
On 4 Mar 2000 00:48:36 -0800, Barry Hemphill <u...@drstrangeweb.com>
wrote:
:In article <38bf4144...@news.et.tudelft.nl>,
: joo...@newhouse.remove-this.nl (Joost) wrote:
::
:: For a website I've written a Perl CGI script which is already 100k in
:: size. Even on a fast server, the script seems to take about half a
:: second just to load, apparently due to the amount of time required for
:: parsing the script into perl bytecode. I need to speed this up, cause
:: it's just a waste of CPU load.
::
:: I'd rather not use things like fastcgi, cgisock or mod_cgi, since they
:: all require me -as far as i know- to modify the script.
:
:One you didn't mention is mod_perl. It requires little or no
:modification of your program, and it would reduce the startup time quite
:a bit. I haven't configured mod_perl in a while, but I believe that it
:also caches recently executed CGI's, which would be a big help.

Does that mean that it caches the pre-compiled script, or the output
of the script? The latter would be of no use to me.

But I'll look into mod_perl, sounds interesting.

:Of course, this is only part of the solution. As someone else has
:already said, split this monster up. In fact, if you do that in
:conjunction with mod_perl, you can take a good chunk (as much as
:possible if you ask me) out of the program and put it into a module(s),
:which you can then have preloaded by Apache/mod_perl. I know you don't
:want to split up the program, but this is one of those situations where
:if splitting it up is going to be important for the speed, you have to
:choose. Split it up and speed it up, or live with the slow load time.

Yeah, I guess you're right.
But the thing that bothers me is that compiling the script
every time is just a BIG waste of time and could be done far more
efficiently by saving a precompiled script somehow...

Thanks
Joost
